NVlabs/conv-tt-lstm

Own dataset

Closed this issue · 2 comments

Hi, this is very interesting work.

Is there an example I could use to feed in my own data/videos? A demo.py or an .ipynb that shows how to adapt your work to other datasets or to inference?

Thanks

Can you provide a link for loading the Moving MNIST-2 dataset?

Sorry for the late reply.
For Moving MNIST-2, we generated the dataset using this code: https://github.com/jthsieh/DDPAE-video-prediction/blob/master/data/moving_mnist.py.
The dataset is saved in npz format.
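For reference, here is a minimal sketch of the npz save/load pattern the generated dataset would follow. The array key `"clips"` and the shape `(num_clips, num_frames, 64, 64)` are assumptions for illustration, not the exact format produced by the DDPAE script; a tiny synthetic array stands in for the real digits.

```python
import numpy as np

# Fabricate a tiny stand-in "dataset": 2 clips of 20 grayscale 64x64 frames.
# (Shape and key name are assumptions, not the DDPAE script's documented format.)
clips = np.random.randint(0, 256, size=(2, 20, 64, 64), dtype=np.uint8)
np.savez_compressed("moving_mnist_2.npz", clips=clips)

# Load it back the way a dataloader would.
with np.load("moving_mnist_2.npz") as data:
    frames = data["clips"]

print(frames.shape)  # (2, 20, 64, 64)
```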

For KTH action, we used the raw video files directly.
You just need to convert each video file to npz, then use code/dataloader.py to load and process the data.

  • Video loading: line 79
  • Data processing: lines 81-87
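The conversion step described above can be sketched as follows. This is a hedged example, not the repo's actual preprocessing script: the key name `"frames"`, the `(T, H, W)` grayscale layout, and the output filename are assumptions, and a synthetic array stands in for decoded video frames (real decoding would use a video reader such as `cv2.VideoCapture`).

```python
import numpy as np

def video_to_npz(frames: np.ndarray, out_path: str) -> None:
    """Save a (T, H, W) uint8 frame stack as a compressed npz file.

    The key name "frames" is an assumption for this sketch; match whatever
    key code/dataloader.py actually reads.
    """
    np.savez_compressed(out_path, frames=frames)

# Stand-in for decoded KTH frames: 50 grayscale frames at 120x160.
fake_frames = np.zeros((50, 120, 160), dtype=np.uint8)
video_to_npz(fake_frames, "kth_clip_example.npz")  # hypothetical filename

# Verify round-trip.
with np.load("kth_clip_example.npz") as data:
    restored = data["frames"]

print(restored.shape)  # (50, 120, 160)
```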