microsoft/archai

[BUG] why is training_epochs a float in algos.ipynb

lovettchris opened this issue · 2 comments

Describe the bug

This code looks weird to me:

partial_tr = PartialTrainingValAccuracy(training_epochs=0.001, progress_bar=True)

I've never thought of the number of epochs as a floating-point number before. What is the value of making it a float?

I don't have a clear answer, but my hunch is that the plan was to have a float that designates a fraction of the full training?

Something like this:
training_epochs=10, partial_training_ratio=0.1

@piero2c do you know anything about this?

Hi @lovettchris and @gugarosa, training_epochs=0.001 means training for 0.1% of a full epoch. I added an inline comment to explain this a little bit better. Thanks
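For readers skimming this thread: a fractional training_epochs reduces to a step budget once you know how many batches one epoch contains. Below is a minimal sketch of that conversion, not Archai's actual implementation; the function steps_for and the value steps_per_epoch are hypothetical names for illustration.

```python
import math

def steps_for(training_epochs: float, steps_per_epoch: int) -> int:
    """Convert a (possibly fractional) epoch count into a number of
    training steps, clamped to at least one step. Hypothetical helper,
    not part of Archai."""
    return max(1, math.ceil(training_epochs * steps_per_epoch))

# Example: assume a dataset yields 50,000 batches per epoch.
steps_per_epoch = 50_000
print(steps_for(0.001, steps_per_epoch))  # 50 steps, i.e. 0.1% of one epoch
print(steps_for(10, steps_per_epoch))     # 500,000 steps, ten full epochs
```

Under this reading, training_epochs=0.001 and the proposed training_epochs=10 with partial_training_ratio=0.1 are just two spellings of the same step budget.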