M5-methods

Data, benchmarks, and winning methods submitted to the M5 forecasting competition.

"validation": Code used to produce the forecasts of the benchmarks (for both the "Accuracy" and the "Uncertainty" competition).
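Several of the M5 benchmarks are simple statistical methods such as the seasonal naive forecast, which repeats the last observed season. As a minimal sketch of that idea (the weekly season length of 7 and the 28-day horizon are illustrative choices, not taken from the repository code):

```python
import numpy as np

def seasonal_naive(history, horizon, season=7):
    """Seasonal naive forecast: repeat the last full season of observations.

    season=7 assumes weekly seasonality in daily unit sales; this is an
    illustrative benchmark sketch, not the repository's implementation.
    """
    history = np.asarray(history, dtype=float)
    last_season = history[-season:]
    reps = int(np.ceil(horizon / season))
    return np.tile(last_season, reps)[:horizon]

# Four weeks of daily sales, forecast the next 28 days.
sales = [3, 5, 4, 6, 2, 8, 7] * 4
forecast = seasonal_naive(sales, horizon=28)
```

The forecast simply tiles the final week across the horizon, which makes it a useful baseline to beat.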

"Scores and Ranks.xlsx": Scores and ranks of the top 50 submissions of the M5 "Accuracy" and M5 "Uncertainty" competitions. The scores of the benchmarks are also provided.

"M5-Competitors-Guide.pdf": Provides information about the set-up of the competition, the data set, the evaluation measures, the prizes, the submission files, and the benchmarks.
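The competitors' guide details the evaluation measures; the "Accuracy" competition was scored with the weighted RMSSE (WRMSSE), whose per-series core is the Root Mean Squared Scaled Error. A minimal sketch of RMSSE for one series (an illustration of the formula, not the official scoring code):

```python
import numpy as np

def rmsse(train, actual, forecast):
    """Root Mean Squared Scaled Error for a single series.

    The squared forecast error over the test period is scaled by the
    in-sample squared error of the one-step naive forecast.
    """
    train, actual, forecast = (np.asarray(x, dtype=float)
                               for x in (train, actual, forecast))
    naive_sq = np.mean(np.diff(train) ** 2)  # one-step naive errors in-sample
    return float(np.sqrt(np.mean((actual - forecast) ** 2) / naive_sq))

# Toy example: training history, actuals, and a forecast for two periods.
score = rmsse(train=[1, 2, 3, 4], actual=[5, 5], forecast=[4, 6])
```

In the competition each series' RMSSE is additionally weighted by its dollar sales, producing the WRMSSE reported in the scores file.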

The following link includes the above-mentioned items PLUS:

"Dataset": The data set of the competition, i.e., unit sales (train and test set) and information about calendar, promotions, and prices. The data set is also available for R users in a .Rdata format.
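The unit-sales files are stored in a wide format, with one column per day, so a common first step is reshaping them to a long format. A sketch of that step on a tiny stand-in frame (the `id` and `d_*` column names follow the public M5 CSVs but should be treated as assumptions here, not repository code):

```python
import pandas as pd

# Tiny stand-in for the wide sales file: one row per item,
# one column per day (d_1, d_2, ...).
sales = pd.DataFrame({
    "id": ["ITEM_A", "ITEM_B"],
    "d_1": [3, 0],
    "d_2": [5, 1],
})

# Melt to long format so each row is (item, day, units sold);
# the long frame can then be joined to the calendar and price tables.
long = sales.melt(id_vars="id", var_name="d", value_name="units")
```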

"Accuracy Submissions": The forecasts of the 24 benchmarks of the M5 "Accuracy" competition and the submissions made by the top 50 performing methods.

"Uncertainty Submissions": The forecasts of the 6 benchmarks of the M5 "Uncertainty" competition and the submissions made by the top 50 performing methods.
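The "Uncertainty" competition scored quantile forecasts, and its scaled pinball loss builds on the standard pinball (quantile) loss. A minimal sketch for a single observation and quantile level (illustrative only, not the official scoring code):

```python
def pinball_loss(y, q_forecast, q):
    """Pinball (quantile) loss for one observation at quantile level q.

    Under-forecasts are penalised by q, over-forecasts by (1 - q),
    so high quantiles punish underestimation more heavily.
    """
    diff = y - q_forecast
    return q * diff if diff >= 0 else (q - 1) * diff

# Underestimating by 2 units costs more at the 0.9 quantile than at the median.
loss_median = pinball_loss(10, 8, 0.5)
loss_upper = pinball_loss(10, 8, 0.9)
```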

https://drive.google.com/drive/folders/1D6EWdVSaOtrP1LEFh1REjI3vej6iUS_4?usp=sharing