M5-methods

Data, benchmarks, and winning methods submitted to the M5 forecasting competition.

This repository contains:

"validation": Code used for producing the forecasts of the benchmarks (both of "Accuracy" and "Uncertainty" competitions).

"Scores and Ranks.xlsx": Scores and ranks of the top 50 submissions of the M5 "Accuracy" and M5 "Uncertainty" competitions. The scores of the benchmarks are also provided.

"M5-Competitors-Guide.pdf": Provides information about the set-up of the competition, the data set, the evaluation measures, the prizes, the submission files, and the benchmarks.

The following link includes the above-mentioned items, plus:

"Dataset": The data set of the competition, i.e., unit sales (train and test set) and information about calendar, promotions, and prices. The data set is provided for the validation (public leaderboard) and evaluation (private leaderboard) phases of the competition separately. The weights used for computing the scores (WRMSSE and WSPL) are also provided per case.

"Accuracy Submissions": The forecasts of the 24 benchmarks of the M5 "Accuracy" competition and the submissions made by the top 50 performing methods.

"Uncertainty Submissions": The forecasts of the 6 benchmarks of the M5 "Uncertainty" competition and the submissions made by the top 50 performing methods.

"Working papers": Working papers describing the setup and data set of the M5 competition, as well as the results, findings and winning submissions of the "Accuracy" and "Uncertainty" challenges.

https://drive.google.com/drive/folders/1D6EWdVSaOtrP1LEFh1REjI3vej6iUS_4?usp=sharing