The PyTorch implementation of the paper
Social Ways: Learning Multi-Modal Distributions of Pedestrian Trajectories with GANs
Javad Amirian,
Jean-Bernard Hayet,
Julien Pettre
Presented at the Precognition Workshop at CVPR 2019 (
[arxiv],
[slides],
[poster]
)
This work improves on Social-GAN by applying the following changes:
- Replacing Max-Pooling with Attention Pooling
- Introducing new social features between pairs of agents:
- Bearing angle
- Euclidean Distance
- Distance of Closest Approach (DCA)
- Replacing the L2 loss function with an information loss, an idea inspired by InfoGAN
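The three pairwise features listed above can be computed in closed form. A minimal sketch (the variable names are mine, not the repo's):

```python
import numpy as np

def social_features(pos_i, vel_i, pos_j, vel_j):
    """Bearing angle, Euclidean distance, and DCA of agent j w.r.t. agent i."""
    pos_i, vel_i = np.asarray(pos_i, float), np.asarray(vel_i, float)
    pos_j, vel_j = np.asarray(pos_j, float), np.asarray(vel_j, float)
    x_rel = pos_j - pos_i          # line of sight from i to j
    v_rel = vel_j - vel_i          # relative velocity

    # Euclidean distance between the two agents
    dist = np.linalg.norm(x_rel)

    # Bearing angle: direction toward j relative to i's heading
    bearing = np.arctan2(x_rel[1], x_rel[0]) - np.arctan2(vel_i[1], vel_i[0])

    # DCA: minimum future distance if both agents keep their current velocities
    vv = v_rel @ v_rel
    t_star = 0.0 if vv == 0.0 else max(0.0, -(x_rel @ v_rel) / vv)
    dca = np.linalg.norm(x_rel + t_star * v_rel)
    return bearing, dist, dca
```

For example, two agents approaching head-on ((0, 0) moving along +x, (10, 0) moving along -x) have a DCA of 0, since under constant velocities they collide.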
The system is composed of two main components: the Trajectory Generator and the Trajectory Discriminator. To generate a prediction sample for the Pedestrian of Interest (POI), the generator needs the following inputs:
- the observed trajectory of POI,
- the observed trajectories of the surrounding agents,
- the noise signal (z),
- and the latent codes (c)
The Discriminator takes a pair of observation and prediction samples and decides whether the given prediction sample is real or fake.
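In code, the two components' interfaces might look like the following hedged sketch (layer sizes and module internals are assumptions, and the attention pooling of the surrounding agents' trajectories is omitted for brevity; see the repo for the actual models):

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps an observed trajectory plus noise z and latent code c to a prediction."""
    def __init__(self, obs_len=8, pred_len=12, noise_dim=16, code_dim=2, hidden=64):
        super().__init__()
        self.pred_len = pred_len
        self.encoder = nn.LSTM(2, hidden, batch_first=True)
        # In the full model, pooled neighbor features would also be concatenated here.
        self.decoder = nn.Linear(hidden + noise_dim + code_dim, pred_len * 2)

    def forward(self, obs_traj, z, c):
        # obs_traj: (batch, obs_len, 2); z: (batch, noise_dim); c: (batch, code_dim)
        _, (h, _) = self.encoder(obs_traj)
        feat = torch.cat([h[-1], z, c], dim=1)
        return self.decoder(feat).view(-1, self.pred_len, 2)

class Discriminator(nn.Module):
    """Scores an (observation, prediction) pair as real or fake."""
    def __init__(self, obs_len=8, pred_len=12, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear((obs_len + pred_len) * 2, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, obs_traj, pred_traj):
        full = torch.cat([obs_traj, pred_traj], dim=1).flatten(1)
        return self.net(full)  # one real/fake score per sample
```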
We designed a toy trajectory dataset to assess the capability of the generator to preserve the modes of the trajectory distribution. There are six groups of trajectories, each starting from a specific point located on a circle (blue dots). When approaching the circle center, each group splits into 3 subgroups. Their endpoints are the green dots.
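The geometry just described can be sketched roughly as follows. This is a hypothetical illustration of the pattern, not the repo's create_toy.py; the radius, step count, and branching angles are made up:

```python
import numpy as np

def toy_trajectory(start_angle, mode, n_steps=16, radius=8.0):
    """One toy path: start on a circle, approach the center, branch by mode (0..2)."""
    start = radius * np.array([np.cos(start_angle), np.sin(start_angle)])
    t = np.linspace(0.0, 1.0, n_steps)[:, None]
    half = n_steps // 2

    # First half: move straight toward the circle center
    approach = start * (1.0 - 2.0 * t[:half])

    # Second half: leave the center at a mode-dependent angle
    exit_angle = start_angle + np.pi + (mode - 1) * np.pi / 6
    exit_dir = np.array([np.cos(exit_angle), np.sin(exit_angle)])
    depart = approach[-1] + 2.0 * radius * t[: n_steps - half] * exit_dir
    return np.vstack([approach, depart])
```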
To create the toy example trajectories, run:
$ python3 create_toy.py --npz [output file]
This will store the required data in a .npz file. The default parameters are:
n_conditions = 8
n_modes = 3
n_samples = 768
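To check what ended up in the file, one can enumerate its arrays. The key names and shapes depend on create_toy.py itself, so nothing is assumed about them here:

```python
import numpy as np

def inspect_npz(path):
    """Return a {key: shape} summary of an .npz archive."""
    with np.load(path) as data:
        return {key: data[key].shape for key in data.files}
```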
You can also store the raw trajectories in a .txt file with the following command:
$ python3 create_toy.py --txt [output file]
To see an animation of the toy agents, call:
$ python3 create_toy.py --anim
To train the model, edit train.py to select the dataset you want to train on; the next few lines in that file define the most critical parameter values. Then execute:
$ python3 train.py
To visualize the results, run:
$ python3 visualize.py
If you use this code in your work, please cite:
@inproceedings{amirian2019social,
title={Social ways: Learning multi-modal distributions of pedestrian trajectories with GANs},
author={Amirian, Javad and Hayet, Jean-Bernard and Pettr{\'e}, Julien},
booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)},
pages={0--0},
year={2019}
}