
KDD Cup 2022 - Baidu Spatial Dynamic Wind Power Forecasting

This is the Teletraan team's solution for Baidu KDD Cup 2022, which won 3rd place among 2,490 teams. The task is to predict the wind farm's active power for the next 48 hours at 10-minute intervals.


Solution summary

  • A single BERT-style model built with the tfts library, which I created
  • A sliding window is used to generate more training samples (see the sketch after this list)
  • Only 2 raw features are used: wind speed and wind direction
  • A daily fluctuation is added in post-processing so the predictions follow the daily periodicity (also sketched below)
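
A minimal sketch of the two data tricks above, assuming a NumPy array of shape (time, features) with the target in the last column; the window lengths and blending weight are illustrative placeholders, not the exact settings used in src/train:

import numpy as np

def make_sliding_windows(series, input_len=144, output_len=288, stride=1):
    """Cut one long series into (encoder input, target) training pairs.

    series: array of shape (time, features); the target is assumed to be the last column.
    output_len=288 corresponds to 48 hours at 10-minute steps; input_len and stride are placeholders.
    """
    inputs, targets = [], []
    for start in range(0, len(series) - input_len - output_len + 1, stride):
        inputs.append(series[start : start + input_len])
        targets.append(series[start + input_len : start + input_len + output_len, -1])
    return np.stack(inputs), np.stack(targets)

def add_daily_fluctuation(pred, daily_profile, weight=0.1):
    """Blend a mean daily profile (144 slots per day at 10-minute resolution)
    into the raw prediction so the output follows the daily periodicity."""
    slots = np.arange(len(pred)) % len(daily_profile)
    return (1 - weight) * pred + weight * daily_profile[slots]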

How to reproduce it

  1. Prepare the TensorFlow environment
pip install -r requirements.txt
  2. Download the data from Baidu AI Studio and put it in ./data/raw
  3. Train the model
cd src/train
python nn_train.py
  4. Run the inference code located in ./submit; the file result.zip created in ./weights/ can then be submitted.
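
A quick sanity check after step 2, assuming the download places one or more CSV files under ./data/raw (the column names for wind speed, wind direction, and active power should be matched against the actual file):

import glob
import pandas as pd

# Pick up whatever CSV the Baidu AI Studio download placed in ./data/raw
path = glob.glob("./data/raw/*.csv")[0]
df = pd.read_csv(path)

print(df.shape)
print(df.columns.tolist())             # wind speed / direction / active power columns should appear here
print(df.isna().mean().sort_values())  # fraction of missing values per column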

Contributor

Reference

  • [1] Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. 2018. BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018).
  • [2] Haixu Wu, Jiehui Xu, Jianmin Wang, and Mingsheng Long. 2021. Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting. Advances in Neural Information Processing Systems 34 (2021), 22419–22430.
  • [3] Jingbo Zhou, Shuangli Li, Liang Huang, Haoyi Xiong, Fan Wang, Tong Xu, Hui Xiong, and Dejing Dou. 2020. Distance-aware molecule graph attention network for drug-target binding affinity prediction. arXiv preprint arXiv:2012.09624 (2020).
  • [4] Haoyi Zhou, Shanghang Zhang, Jieqi Peng, Shuai Zhang, Jianxin Li, Hui Xiong, and Wancai Zhang. 2021. Informer: Beyond efficient transformer for long sequence time-series forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence, Vol. 35. 11106–11115.