radekd91/inferno

Issue with Training EMOTE - Losses Not Converging in Second Stage

whwjdqls opened this issue · 7 comments

Hi there,

First of all, thank you for your incredible work on EMOTE!
I've been experimenting with training EMOTE and encountered some issues during the second stage. Here's a summary of the problem:

Issue Description:
In the first stage, which involves only the vertex-level loss, everything seemed to work smoothly: the loss descended as expected and converged to a stable value. However, when I moved on to the second stage, which adds both the disentanglement loss and the lip-reading loss, the vertex-level, lip-reading, and disentanglement losses started behaving erratically. They don't seem to descend; instead, they oscillate.
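For reference, this is roughly how I combine the second-stage terms; the weights and names below are placeholders I picked for illustration, not the exact values from the repo or its configs:

```python
import torch

# Placeholder weights for illustration only; the real values come from the
# EMOTE training config and are most likely different.
W_VERTEX, W_LIP, W_DISENTANGLE = 1.0, 1e-5, 1.0

def second_stage_loss(loss_vertex: torch.Tensor,
                      loss_lip_reading: torch.Tensor,
                      loss_disentangle: torch.Tensor) -> torch.Tensor:
    """Weighted sum of the three second-stage terms being monitored."""
    return (W_VERTEX * loss_vertex
            + W_LIP * loss_lip_reading
            + W_DISENTANGLE * loss_disentangle)
```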

My Question:
I'm wondering if you, or anyone else using EMOTE, have encountered similar issues during the second stage of training.

It's also possible I made a mistake when implementing a custom renderer with PyTorch3D; I'm not sure, hence this issue.
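For context, my renderer is set up along these lines. This is only a minimal PyTorch3D sketch with simplified placeholder camera, lighting, and resolution settings, not the exact EMOTE configuration:

```python
import torch
from pytorch3d.renderer import (
    FoVPerspectiveCameras, RasterizationSettings, MeshRasterizer,
    MeshRenderer, SoftPhongShader, PointLights, TexturesVertex,
)
from pytorch3d.structures import Meshes

device = "cuda" if torch.cuda.is_available() else "cpu"

cameras = FoVPerspectiveCameras(device=device)
raster_settings = RasterizationSettings(
    image_size=224,        # placeholder resolution
    blur_radius=0.0,
    faces_per_pixel=1,
)
lights = PointLights(device=device, location=[[0.0, 0.0, 3.0]])

renderer = MeshRenderer(
    rasterizer=MeshRasterizer(cameras=cameras, raster_settings=raster_settings),
    shader=SoftPhongShader(device=device, cameras=cameras, lights=lights),
)

def render_vertices(verts: torch.Tensor, faces: torch.Tensor) -> torch.Tensor:
    """Render a batch of predicted meshes.

    verts: (B, V, 3) predicted vertices; faces: (F, 3) template topology.
    Returns (B, H, W, 4) RGBA images, differentiable w.r.t. verts.
    """
    textures = TexturesVertex(verts_features=torch.ones_like(verts))  # flat grey texture
    meshes = Meshes(
        verts=verts,
        faces=faces.unsqueeze(0).expand(verts.shape[0], -1, -1),
        textures=textures,
    )
    return renderer(meshes)
```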

Thanks in advance for any insights or assistance you can provide!

Hi there,

thanks for your interest. You make very good observations. The losses can indeed behave a little strangely. In my experience with the model, I have only finetuned the second stage for two epochs (maybe even one is enough), so I haven't analyzed the convergence of the perceptual losses beyond the fact that they have a clear impact on the result - do they in your case?

Because the finetuning runs through the differentiable renderer, the second stage is quite slow, so I never finetuned for long. This is also not necessary, since the model is already well converged from stage one and only needs a little "nudge" from the perceptual losses.

To conclude, I would not worry about the seemingly confusing convergence of the perceptual losses as long as you can see their impact on the result. If you want, paste some convergence plots here and I can take a look and tell you whether anything looks broken.
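For a rough picture, capping the second stage at a couple of epochs in a generic PyTorch Lightning setup looks like this. It is a sketch only; the module and datamodule names are hypothetical and this is not the actual inferno training entry point:

```python
import pytorch_lightning as pl

# Sketch only: the real training loop is driven by inferno's own configs/scripts.
trainer = pl.Trainer(
    max_epochs=2,          # the second stage only needs a short "nudge"
    accelerator="gpu",
    devices=1,
)
# trainer.fit(second_stage_module, datamodule=talking_head_datamodule)  # hypothetical names
```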

Thanks for the prompt reply.

These are the plots of the loss in the second stage.
[screenshots: second-stage loss curves]

As you can see, all the losses seem to oscillate. Although a small batch size like 4 can lead to noisy losses, it is hard to see any kind of convergence. If you can take a look, that would be wonderful!
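One way to make the trends easier to judge despite the batch-size-4 noise is to smooth the logged values with an exponential moving average; a minimal sketch (the smoothing factor and the logged array name are arbitrary):

```python
import numpy as np

def ema(values, alpha=0.98):
    """Exponential moving average for smoothing noisy per-step loss logs.

    alpha is an arbitrary smoothing factor chosen for illustration.
    """
    values = np.asarray(values, dtype=float)
    smoothed = np.empty_like(values)
    running = values[0]
    for i, v in enumerate(values):
        running = alpha * running + (1.0 - alpha) * v
        smoothed[i] = running
    return smoothed

# e.g. plt.plot(ema(lip_reading_loss_per_step))  # hypothetical logged array
```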

Hi, I'm coming here from another question: the processed MEAD dataset metadata.pkl file (https://download.is.tue.mpg.de/emote/mead_25fps/processed/metadata.pkl) can't be downloaded. Could I trouble you to share a copy? Thank you very much.

I did not use the data provided by this repo, so I'm afraid I cannot help with your situation.

@sogoojoy , I believe this issue should now be fixed. Please let me know if it works

Hello, this file (https://download.is.tue.mpg.de/emote/mead_25fps/processed/metadata.pkl) still can't be downloaded.

When I tried to train EMOTE with my dataset, I ran into some problems. Would it be possible to get your contact information so we can discuss them? Thank you! @whwjdqls

My email: shirley_x0708@163.com