This is a PyTorch implementation of IDA-LSTM, a recurrent model for radar echo extrapolation (precipitation nowcasting), as described in the following paper:
A Novel LSTM Model with Interaction Dual Attention for Radar Echo Extrapolation, by Chuyao Luo, Xutao Li, Yongliang Wen, Yunming Ye, Xiaofeng Zhang.
Required Python libraries: torch (>=1.3.0) + opencv + numpy + scipy (==1.0.0) + jpype1. Tested on Ubuntu with an NVIDIA Titan GPU and CUDA (>=10.0).
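As a quick, optional sanity check of the environment (this snippet is only an illustration, not part of the repository), the following Python code prints the installed library versions and confirms that CUDA is visible to PyTorch:

```python
# Optional environment check: print library versions and CUDA availability.
import torch
import cv2
import numpy as np
import scipy
import jpype

print("torch:", torch.__version__)        # expected >= 1.3.0
print("opencv:", cv2.__version__)
print("numpy:", np.__version__)
print("scipy:", scipy.__version__)        # expected == 1.0.0
print("jpype1:", jpype.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```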
We conduct experiments on the CIKM AnalytiCup 2017 dataset, which can be downloaded from: CIKM_AnalytiCup_Address or CIKM_Rardar
Use any '.py' run script to train these models. To train the proposed model on the radar data, simply run cikm_inter_dst_predrnn_run.py or cikm_dst_predrnn_run.py.
To change the parameters and settings, modify the variable 'args' in each model's run script.
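For orientation, run scripts of this kind typically gather their settings into a single 'args' object. The sketch below only illustrates that pattern; the parameter names ('lr', 'batch_size', 'max_iterations', 'save_dir') are hypothetical placeholders and may differ from the ones actually defined in cikm_inter_dst_predrnn_run.py, so check the script itself before editing:

```python
# Illustrative sketch of an argparse-style `args` block; the parameter names
# here are hypothetical placeholders, not the exact ones used in the run scripts.
import argparse

parser = argparse.ArgumentParser(description='IDA-LSTM training (illustrative)')
parser.add_argument('--lr', type=float, default=1e-3, help='learning rate')
parser.add_argument('--batch_size', type=int, default=4, help='mini-batch size')
parser.add_argument('--max_iterations', type=int, default=80000, help='training iterations')
parser.add_argument('--save_dir', type=str, default='checkpoints/', help='where to store models')
args = parser.parse_args([])   # edit the defaults above, or pass flags on the command line

print(vars(args))
```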
The preprocessing method and the data root path can be modified in the data/data_iterator.py file.
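As an illustration of the kind of preprocessing this involves (the 'DATA_ROOT' path and 'load_frame' helper below are hypothetical, not the repository's actual API), a radar echo map stored as an 8-bit grayscale image can be read with OpenCV and normalized to [0, 1]:

```python
# Illustrative preprocessing sketch; DATA_ROOT and the file layout are
# hypothetical placeholders, not the repository's actual configuration.
import os
import cv2
import numpy as np

DATA_ROOT = '/path/to/CIKM_radar_data'   # set to your data root, as configured in data/data_iterator.py

def load_frame(filename):
    """Read one radar echo map as grayscale and scale pixel values to [0, 1]."""
    img = cv2.imread(os.path.join(DATA_ROOT, filename), cv2.IMREAD_GRAYSCALE)
    if img is None:
        raise FileNotFoundError(filename)
    return img.astype(np.float32) / 255.0
```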
All trained models are available. You can download them from the following address: trained model
We provide two approaches to evaluate our models.
The first (and faster) method is to evaluate all predictions by running the Java code under CIKM_Eva/src. You need to modify the relevant paths and build a .jar file to run it.
The second method is to run evaluation.py under data_provider/CIKM/.
5 frames are predicted given the last 10 frames.
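For reference, radar echo extrapolation results are commonly scored with thresholded skill measures such as CSI and HSS. The sketch below only illustrates such metrics on a single predicted/observed frame pair; it is not claimed to reproduce evaluation.py or the Java evaluator, and the 0.5 binarization threshold is an arbitrary assumption:

```python
# Illustrative CSI/HSS computation for one predicted frame versus ground truth.
# The binarization threshold is an arbitrary assumption for demonstration only.
import numpy as np

def csi_hss(pred, truth, threshold=0.5):
    """Binarize both frames at `threshold`, then compute CSI and HSS."""
    p = pred >= threshold
    t = truth >= threshold
    hits = np.sum(p & t)
    misses = np.sum(~p & t)
    false_alarms = np.sum(p & ~t)
    correct_negatives = np.sum(~p & ~t)

    csi = hits / max(hits + misses + false_alarms, 1)
    num = 2.0 * (hits * correct_negatives - misses * false_alarms)
    den = ((hits + misses) * (misses + correct_negatives)
           + (hits + false_alarms) * (false_alarms + correct_negatives))
    hss = num / den if den > 0 else 0.0
    return csi, hss

# Example on random frames scaled to [0, 1] (CIKM radar maps are 101x101 pixels):
pred = np.random.rand(101, 101).astype(np.float32)
truth = np.random.rand(101, 101).astype(np.float32)
print(csi_hss(pred, truth))
```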
In addition, we also provide the prediction results of several models, including ConvGRU, TrajGRU, PredRNN++, and MIM: Download Address