PatchTSTMixer fix
henryennis opened this issue · 4 comments
henryennis commented
The context_length parameter is not being set, resulting in scaling issues when running inference and training the model.
What does this PR do?
The context_length param of the TimeSeriesPreprocessor is not being set correctly, so it falls back to the default context_length of 64.
The PR fixes patchtstmixer so that inference works properly when constructing a pipeline using a pretrained TimeSeriesPreprocessor.
I have a fix on my fork of notebooks.
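To illustrate the failure mode for other readers: here is a minimal, hypothetical sketch (not the actual tsfm/transformers API) of how a preprocessor whose context_length silently defaults to 64 can disagree with the context length the pretrained model expects, and how passing the value explicitly fixes it. The class name and `window` method are made up for illustration only.

```python
# Hypothetical stand-in for a preprocessor with a context_length default of 64.
class TimeSeriesPreprocessorSketch:
    def __init__(self, context_length=64):  # the problematic default
        self.context_length = context_length

    def window(self, series):
        # Build the input window from the last `context_length` points.
        return series[-self.context_length:]


# Assumed value for illustration: the pretrained model's expected context length.
model_context_length = 512

# Buggy path: the default of 64 is used, so windows are too short for the model.
tsp_bad = TimeSeriesPreprocessorSketch()

# Fixed path: pass the model's context length through explicitly.
tsp_good = TimeSeriesPreprocessorSketch(context_length=model_context_length)

series = list(range(1000))
assert len(tsp_bad.window(series)) == 64
assert len(tsp_good.window(series)) == model_context_length
```

The fix in the PR amounts to making sure the explicit value reaches the preprocessor instead of the silent default.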
henryennis commented
kashif commented
can you also add it to the blog? https://github.com/huggingface/blog/
henryennis commented
Sure, no problem.
henryennis commented