IGNF/FLAIR-2

Reproduce baseline results to calculate the inference time


I'm currently attempting to reproduce the baseline on an NVIDIA A100 GPU (in order to measure inference time) using the configuration file flair-2-config.yml. However, the score I managed to achieve was 0.5218, which falls short of the officially reported result of 0.57580. It appears that I might be missing some crucial configuration settings. Any guidance or suggestions would be greatly appreciated.

Thank you in advance for your assistance.

Thank you for your inquiry @ivicadimitrovski. Indeed, a score of 0.52 falls well below the results we have achieved. The configuration of the baseline is explained in the data paper. One potential difference is the effective batch size: the baseline was trained on a cluster of 12 GPUs with a per-GPU batch size of 10. The seed was also fixed at 2022. Aside from that, the hyperparameters of the baseline are identical to those in the flair-2-config.yml file.
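For anyone trying to match this setup on fewer GPUs, a minimal sketch of the effective-batch-size arithmetic may help. The 12-GPU / batch-size-10 figures and seed 2022 come from the comment above; the gradient-accumulation workaround for a single GPU is an assumption, not part of the official baseline:

```python
# Sketch: matching the baseline's effective batch size on different hardware.
# Baseline (per the maintainer's comment): 12 GPUs, per-GPU batch size 10, seed 2022.
# The seed would be fixed e.g. via pytorch_lightning.seed_everything(2022).

def effective_batch_size(per_gpu_batch: int, num_gpus: int, accum_steps: int = 1) -> int:
    """Effective batch size under data-parallel training with optional
    gradient accumulation."""
    return per_gpu_batch * num_gpus * accum_steps

# Baseline cluster: 12 GPUs x batch 10 -> effective batch of 120.
baseline = effective_batch_size(10, 12)

# Hypothetical single-A100 equivalent: accumulate gradients over 12 steps.
single_gpu = effective_batch_size(10, 1, accum_steps=12)

assert baseline == single_gpu == 120
```

Note that gradient accumulation matches the optimizer-step batch size but not batch-norm statistics, so results may still differ slightly from the multi-GPU run.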

Thanks @agarioud for your answer. My teammate and I have the same problem reaching the baseline score with just your baseline config; we'll try with the same number of GPUs and batch size!