Loss is not going down hw1.py
jmamath opened this issue · 3 comments
jmamath commented
Hello,
I tried to implement the HW1 assignment myself and was getting a flat loss and a flat accuracy of around 20% in the default case, i.e. num_classes=5, num_samples=1, meta_batch_size=16.
I then ran your TensorFlow implementation and got the same result.
Is this expected?
Thanks.
Luvata commented
It's expected! The loss actually starts to go down around epoch 3000–4000.
How many epochs did you train your model for?
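For reference, a flat accuracy of about 20% with num_classes=5 is exactly chance level, so before the loss starts moving the model is doing no better than random guessing. A quick sanity check (the variable names here are illustrative, not from the homework code):

```python
import numpy as np

num_classes = 5
n_trials = 100_000

# Simulate a classifier that guesses uniformly at random among 5 classes.
rng = np.random.default_rng(0)
labels = rng.integers(0, num_classes, size=n_trials)
preds = rng.integers(0, num_classes, size=n_trials)

chance_acc = (labels == preds).mean()
print(round(chance_acc, 2))  # close to 1 / num_classes = 0.20
```

So a plateau at ~20% early in training is consistent with the network not yet having learned anything, rather than with a bug.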
jmamath commented
OK, I see. I had trained it for only 3,200 epochs. When training for the full 50,000 epochs I can see the network learning. Thanks.
Luvata commented
Glad it works! Peace.