huggingface/notebooks

PatchTSTMixer fix

henryennis opened this issue · 4 comments

The context_length parameter is not being set, which causes scaling issues during both training and inference.

What does this PR do?
The context_length param of the TimeSeriesPreprocessor is not being set correctly, so it defaults to a context_length of 64.
This PR fixes PatchTSTMixer so that inference works properly when constructing a pipeline using a pretrained TimeSeriesPreprocessor.
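To see why a stale default context_length matters, here is a hypothetical toy example (plain Python, not the actual TimeSeriesPreprocessor API): when the scaling window the preprocessor uses (the default 64 mentioned above) disagrees with the window the model was trained on (512 is an assumed value for illustration), the same series scales to very different inputs. The function name `scale_last_window` is invented for this sketch.

```python
# Toy illustration (not the tsfm API): how the context_length used for
# scaling changes the model's inputs.

def scale_last_window(series, context_length):
    """Standard-scale a series using stats from its last `context_length` points."""
    window = series[-context_length:]
    mean = sum(window) / len(window)
    var = sum((x - mean) ** 2 for x in window) / len(window)
    std = var ** 0.5 or 1.0  # guard against zero std
    return [(x - mean) / std for x in window]

# A series with a recent level shift: the last 64 points look very
# different from the last 512, so the scaling stats diverge.
series = [0.0] * 448 + [10.0] * 64

short = scale_last_window(series, 64)   # the unintended default window
long = scale_last_window(series, 512)   # a hypothetical trained context

# Scaled over only the last 64 points, the shifted segment flattens to
# all zeros; over 512 points it keeps its offset from the earlier regime.
print(short[0])              # 0.0
print(round(long[-1], 3))    # 2.646
```

Because the scaled inputs differ this much, a model trained with one context_length produces degraded forecasts when the preprocessor silently uses another, which is the failure mode the fix addresses.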

I have a fix on my fork of notebooks.

@NielsRogge

Can you also add it to the blog? https://github.com/huggingface/blog/

Sure, no problem.