facebookresearch/ConvNeXt-V2

ImageNet-22k (21k) training loss at the end of training

Opened this issue · 0 comments

I am training ConvNeXt-Tiny on ImageNet-22k (the winter release, not fall) with the command below, using the timm-style train script (train.py):

```shell
python -m torch.distributed.launch --nproc_per_node=4 main.py \
    --model convnext_tiny --drop_path 0.1 \
    --batch_size 64 --lr 4e-3 --update_freq 16 \
    --warmup_epochs 5 --epochs 90 \
    --data_set image_folder --nb_classes 19167 --disable_eval true \
    --data_path /path/to/imagenet-22k \
    --output_dir /path/to/save_results
```
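For reference, the flags above imply a total effective batch size of 4 GPUs × 64 per GPU × 16 accumulation steps = 4096. A minimal sketch of that arithmetic, including the linear learning-rate scaling rule that ConvNeXt-style recipes commonly use (the base LR/batch constants here are assumptions, not taken from this repo's code):

```python
# Effective batch size implied by the command-line flags above.
gpus = 4            # --nproc_per_node
batch_per_gpu = 64  # --batch_size (per GPU)
update_freq = 16    # gradient accumulation steps (--update_freq)

effective_batch = gpus * batch_per_gpu * update_freq
print(effective_batch)  # 4096

# Linear LR scaling rule (assumed reference point: lr 4e-3 at batch 4096,
# as in typical ConvNeXt recipes). At this effective batch size the
# scaled LR equals the base LR.
base_lr, base_batch = 4e-3, 4096
scaled_lr = base_lr * effective_batch / base_batch
print(scaled_lr)  # 0.004
```

This is mainly a sanity check: if you change any of the three flags, rescale the learning rate proportionally to keep the recipe comparable.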

I have a question: roughly what training loss should I expect at the end of training? Thank you.

[Image: training loss plot at epoch 15]