StatMixedML/XGBoostLSS

Out-of-memory error

Closed this issue · 2 comments

Hi, I'm trying to train XGBoostLSS on the M5 dataset, but I keep getting an out-of-memory error (`MemoryError`) from the hyper_opt method. Is there a parameter to reduce the amount of memory this method uses?

@DaanFerdinandusse Thanks for your interest.

Indeed, using cross-validation in combination with the M5 data can easily lead to OOM. I am afraid there is no single parameter that alleviates the problem.

One thing you can try is distributed training via Dask or Ray. This is not readily implemented, though; for it to work, they would need to support custom evaluation and metric functions. It would definitely be a nice feature to add to the framework.

If this is too much of a workaround, you can try subsetting the data or using the eval_set option in the train function.
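As a rough sketch of the subsetting idea (assuming the M5 data has been loaded into a pandas DataFrame; the column names and the 10% fraction below are hypothetical, not part of the XGBoostLSS API):

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for the M5 training data.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "feature_1": rng.normal(size=100_000),
    "feature_2": rng.normal(size=100_000),
    "sales": rng.poisson(lam=3.0, size=100_000),
})

# Keep a random 10% of the rows to reduce memory pressure
# during hyperparameter optimization.
df_small = df.sample(frac=0.10, random_state=42)

X_train = df_small.drop(columns=["sales"])
y_train = df_small["sales"]
```

A subset drawn this way can be used to run hyper_opt within memory limits, after which the final model can be refit on the full data with the tuned parameters.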

Hey @StatMixedML, Thank you for your response. I will try out the methods you suggested :)