orangeduck/Motion-Matching

How much loss is acceptable?

Maekdzp opened this issue · 2 comments

I'm a rookie with neural networks, so this question may be stupid. I changed the training data from the referenced dataset and found that the decompressor loss reaches 1.9 after 500k iterations. The generated .bvh files also show obvious differences. So I wonder if I chose bad training data, and what I can do to reduce the loss.

Ultimately the visual result is what matters when deciding how much difference is acceptable. I'm surprised you get an obvious difference on the referenced dataset. When I train this locally, yes, there is a small difference between the generated bvh files, but it is extremely subtle. Did you change any of the parameters or data for training?

Yeah, I changed the batch size to 256 and used CUDA instead of the CPU to accelerate training, but that shouldn't affect the training result too much. I have since tried several times using different pieces of the dataset and reached better results. Maybe I just used a bad dataset on the first try. Thank you for your reply.
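One caveat worth noting: raising the batch size can in fact change training results if the learning rate is left untouched, since larger batches reduce gradient noise per step. A common heuristic (not something the Motion-Matching training scripts necessarily apply; the base values below are purely illustrative) is the linear scaling rule — scale the learning rate by the same factor as the batch size:

```python
def scaled_learning_rate(base_lr: float, base_batch: int, new_batch: int) -> float:
    """Linear scaling rule: grow the learning rate proportionally
    with the batch size. base_lr/base_batch are whatever the original
    training configuration used (hypothetical values in the example)."""
    return base_lr * (new_batch / base_batch)

# Example: a hypothetical base configuration of lr=1e-3 at batch 32,
# with the batch size raised to 256 as in the comment above.
print(scaled_learning_rate(1e-3, 32, 256))  # -> 0.008
```

If the loss plateaus higher than expected after such a change, re-tuning the learning rate is usually the first thing to try.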