
Generating dance with Mixture Density Recurrent Neural Networks


Motion-MDRNN

Please view the notebook here!

The notebook sampling-from-MDRNN.ipynb contains video examples generated by a Mixture Density Recurrent Neural Network trained on a dataset of improvised dance motion-capture data, from which novel movement sequences can be generated.
By applying several different sampling strategies, we examine the variations that emerge and explore the effect each strategy has on the generated motion.
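One common family of sampling strategies for mixture density networks adjusts a "temperature" on the mixture weights and on the component variances before drawing a sample. The sketch below is a minimal, hypothetical illustration of that idea for a 1-D Gaussian mixture; the function name, parameters, and behaviour are assumptions for illustration, not the exact strategies used in the notebook or paper.

```python
import numpy as np

def sample_mdn(pi, mu, sigma, pi_temp=1.0, sigma_temp=1.0, rng=None):
    """Draw one sample from a 1-D Gaussian mixture (hypothetical MDN output).

    pi         : mixture weights, shape (K,)
    mu         : component means, shape (K,)
    sigma      : component standard deviations, shape (K,)
    pi_temp    : temperature on the mixture weights; < 1 sharpens the choice
    sigma_temp : scale on component variance; < 1 gives more conservative samples
    """
    rng = rng or np.random.default_rng()
    # Re-weight the mixture probabilities with temperature, then renormalise
    # (softmax over temperature-scaled log-weights).
    logits = np.log(np.asarray(pi, dtype=float)) / pi_temp
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()
    # Pick a mixture component, then scale its variance before drawing.
    k = rng.choice(len(weights), p=weights)
    return rng.normal(mu[k], sigma[k] * np.sqrt(sigma_temp))
```

Lowering both temperatures towards zero collapses the output onto the mean of the most likely component, while values above one produce more diverse (and noisier) samples.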

Viewing the notebook

Use Jupyter Notebook Viewer to view the notebook with video examples.

Running the notebook

This is an SoS (sos-notebook) notebook, which can run different kernels in different cells. Please see the docs for installation guides. The trained model can be downloaded here.

Read the paper

Read the full publication here for additional details. To cite this work, use the following:

@inproceedings{wallace2021exploring,
  title={Exploring the effect of sampling strategy on movement generation with generative neural networks},
  author={Wallace, Benedikte and Martin, Charles P and T{\o}rresen, Jim and Nymoen, Kristian},
  booktitle={Artificial Intelligence in Music, Sound, Art and Design: 10th International Conference, EvoMUSART 2021, Held as Part of EvoStar 2021},
  pages={344--359},
  year={2021}
}


Other dependencies: