The net does not seem to train well with the pretrained weight replknet31_base_224_pt1k_basecls.pkl
dejavudejavu opened this issue · 1 comment
dejavudejavu commented
I trained on two binary classification datasets, one of construction cracks vs. background and the other of cats vs. dogs. Both experiments used the default parameters and replknet31_base_224_pt1k_basecls.pkl. In both cases the loss did not decline and the evaluation accuracy stayed unchanged, equal to the proportion of one particular class, which suggests the RepLKNet had no classification capability after training.
The proportion of cat pictures is also about 49%.
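A symptom like this (accuracy stuck at one class's proportion) can happen when the pretrained weights fail to load, e.g. because the .pkl is saved in a different framework's format. A minimal sketch for sanity-checking the checkpoint before training, assuming the file is a plain pickle (the helper names and the key heuristic below are mine, not from either repo; a BaseCls/MegEngine checkpoint may additionally need that framework installed to unpickle):

```python
import pickle


def inspect_checkpoint(path):
    """Load a .pkl checkpoint and return its top-level keys (or its type name).

    May raise if the pickle references classes from a framework
    (e.g. MegEngine) that is not installed locally.
    """
    with open(path, "rb") as f:
        ckpt = pickle.load(f)
    if isinstance(ckpt, dict):
        return sorted(ckpt.keys())
    return type(ckpt).__name__


def looks_like_pytorch_state_dict(keys):
    """Heuristic: PyTorch state_dicts use dotted parameter names
    ending in suffixes such as .weight or .bias."""
    return any(k.endswith((".weight", ".bias")) for k in keys)
```

Usage: call `inspect_checkpoint("replknet31_base_224_pt1k_basecls.pkl")` and pass the result to `looks_like_pytorch_state_dict`; if it returns False, the file is likely not a PyTorch checkpoint and will not load into this repo's training script.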
DingXiaoH commented
Hi, this seems to be a BaseCls model, so please raise an issue at the MegEngine version's repo: https://github.com/megvii-research/RepLKNet/issues