Memory leak when the model and trainer are reinitialized
earthlovebpt commented
tomaarsen commented
Hello!
I'm able to reproduce this. I'll try and see which object is not being cleaned up as it should be.
- Tom Aarsen
earthlovebpt commented
Thank you!!!
Are there any alternatives to avoid reinitializing the model and trainer?
tomaarsen commented
When doing seed optimization? Not really - this is the safest option. During "normal" training, you usually only have to initialize a model and trainer once.
I've discovered the issue and will be creating a patch soon. In the meantime, instead of the del trainer, model, you can add this line right at the end of the for-loop code block:

model.model_card_data.trainer = None
It should help a lot with the memory usage.
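For reference, here is a rough sketch of what such a seed-optimization loop with the workaround could look like. The model name, dataset, loss, and training arguments below are placeholders, not taken from the original report:

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

# Placeholder training data; any (anchor, positive) pairs work here.
train_dataset = Dataset.from_dict({
    "anchor": ["A query", "Another query"],
    "positive": ["A matching passage", "Another matching passage"],
})

for seed in range(5):
    # Reinitialize the model and trainer for every seed.
    model = SentenceTransformer("all-MiniLM-L6-v2")
    args = SentenceTransformerTrainingArguments(
        output_dir=f"output/seed-{seed}",
        seed=seed,
        num_train_epochs=1,
    )
    trainer = SentenceTransformerTrainer(
        model=model,
        args=args,
        train_dataset=train_dataset,
        loss=MultipleNegativesRankingLoss(model),
    )
    trainer.train()

    # Workaround until the patch lands: drop the reference that the model card
    # data keeps to the trainer, so the model and trainer can be garbage collected.
    model.model_card_data.trainer = None
```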
- Tom Aarsen