Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting

Paper

Link: https://arxiv.org/pdf/2012.07436.pdf
Year: 2021

Summary

  • reduce space complexity: a query sparsity measurement keeps only the dominant queries, cutting memory to O(L log L)
  • reduce time complexity: ProbSparse self-attention computes attention only for those selected queries, O(L log L) instead of O(L^2) (see the sketch below)
  • predict the sequence in one batch: a generative-style decoder emits the entire long output sequence in a single forward pass, instead of step-by-step dynamic decoding (see the decoder-input sketch under Methods)
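
A minimal sketch of how the query sparsity measurement and ProbSparse selection could look. The function name and the `factor` hyperparameter are assumptions for illustration, and the scores are computed densely here for readability; the paper approximates the measurement with sampled keys to actually keep the O(L log L) bound.

```python
import math
import torch

def probsparse_attention(q, k, v, factor=5):
    """q: (L_q, d); k, v: (L_k, d). Returns (L_q, d). Hypothetical helper."""
    L_q, d = q.shape
    scores = q @ k.transpose(0, 1) / math.sqrt(d)   # (L_q, L_k)

    # Query sparsity measurement: M(q_i, K) = max_j(s_ij) - mean_j(s_ij).
    # A large M means the query's attention is far from uniform ("active").
    m = scores.max(dim=-1).values - scores.mean(dim=-1)

    # Keep only the top-u active queries, u = c * ln(L_q).
    u = min(L_q, int(factor * math.ceil(math.log(L_q))))
    top = m.topk(u).indices

    # Lazy queries fall back to mean(V); active queries get full attention.
    out = v.mean(dim=0).expand(L_q, d).clone()
    attn = torch.softmax(scores[top], dim=-1)       # (u, L_k)
    out[top] = attn @ v
    return out
```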

Methods

(figure not preserved)
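
The generative-style decoder consumes a single input built from a slice of known history (a "start token") concatenated with zero placeholders for the horizon, then predicts the whole horizon at once. A minimal sketch of building that input, assuming the helper name and argument layout (illustrative, not the authors' API):

```python
import torch

def build_decoder_input(history, label_len, pred_len):
    """history: (batch, seq_len, features) of known values.
    Returns (batch, label_len + pred_len, features). Hypothetical helper."""
    # "Start token": the most recent label_len steps of known history.
    start_token = history[:, -label_len:, :]
    # Zero placeholders standing in for the pred_len future targets.
    placeholder = torch.zeros(history.size(0), pred_len, history.size(2),
                              dtype=history.dtype, device=history.device)
    return torch.cat([start_token, placeholder], dim=1)
```

One forward pass over this concatenated input replaces step-by-step dynamic decoding, which both speeds up inference and avoids cumulative error over long horizons.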

Results

  • same time and space complexity as Reformer, but needs only one forward pass for inference
  • significantly better results than RNN/LSTM baselines
  • outperforms Reformer
  • achieves lower MSE than DeepAR, ARIMA, and Prophet