Music Generation through interactive dancing with Deep Learning.
This project requires a Kinect (v1) sensor.
The dependencies considered here are only those needed to run the pre-trained models through `interpreter.py` and `dream.py`:
- FFmpeg
- Pthread (`sudo apt-get install libpthread-stubs0-dev`)
- libfreenect
- Python 3.4+
- NumPy
- TensorFlow
- OpenCV 2
- Pure Data Extended
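As a convenience, the Python modules in the list above can be sanity-checked with a short script like the one below. This is only a sketch and is not part of the repository; `numpy`, `tensorflow`, and `cv2` are the standard import names for NumPy, TensorFlow, and OpenCV.

```python
# check_deps.py -- quick check that the Python dependencies are importable.
# Convenience sketch only; not part of the repository.
import importlib

for module in ("numpy", "tensorflow", "cv2"):
    try:
        importlib.import_module(module)
        print("{}: OK".format(module))
    except ImportError as err:
        print("{}: MISSING ({})".format(module, err))
```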
First, get the trained data from here. Then clone this repo, unzip the trained data, and `cd` into the project:
```bash
git clone https://github.com/aristizabal95/SinestesIA.git
unzip experiments.zip
cd SinestesIA
```
This step is only necessary if `interpreter.py` is to be used. To compile the program, simply run:

```bash
gcc -Wall -o bin/main src/*.c -lpthread -lfreenect
```
The Interpreter takes data from the Kinect in real time and generates sound instructions that are sent to Pure Data (a sketch of that idea follows the list below). To run this script you must:

- Open `pd/performer.pd` with Pure Data
- Have the Kinect running with `./bin/main`
- Start the Interpreter with `python3 mains/interpreter.py`
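For context on the "sound instructions" mentioned above: Pure Data patches commonly receive network messages through a `[netreceive]` object, which accepts semicolon-terminated (FUDI) messages over TCP. The sketch below only illustrates that general mechanism; the host, the port (3000 here), and the message itself are assumptions for illustration, and the actual protocol is defined by `pd/performer.pd` and `mains/interpreter.py`.

```python
# Minimal sketch: send one semicolon-terminated (FUDI) message to Pure Data over TCP.
# The port (3000) and the message contents are assumptions for illustration only;
# the real port and message vocabulary are defined by pd/performer.pd and interpreter.py.
import socket

def send_to_pd(message, host="localhost", port=3000):
    """Send a single FUDI message to a [netreceive] object in Pure Data."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall((message + ";\n").encode("ascii"))

if __name__ == "__main__":
    send_to_pd("note 60 100")  # hypothetical instruction: pitch and velocity
```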
The `mains/dream.py` script generates sequences of dance and music. To run it you must:

- Open `pd/performer.pd` with Pure Data
- Start the Dream generator with `python3 mains/dream.py`
- Optionally, set the duration of each dream with the argument `-l` (default: 150) and use `-r` to specify whether the program should add random influences to the dream generation or not (default: 1); a sketch of how these flags could be parsed follows the example below.

Example:

```bash
python3 mains/dream.py -l 300 -r 0 # Make the length of each dream 300 and disable randomness in the dreams
```
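For reference, here is a sketch of how flags with those defaults could be declared using Python's standard `argparse` module. It mirrors the documented defaults but is not the actual code in `mains/dream.py`.

```python
# Sketch of command-line flags matching the documented defaults (-l 150, -r 1).
# Not the repository's implementation; mains/dream.py may parse arguments differently.
import argparse

parser = argparse.ArgumentParser(description="Generate sequences of dance and music.")
parser.add_argument("-l", type=int, default=150,
                    help="duration of each dream (default: 150)")
parser.add_argument("-r", type=int, default=1, choices=(0, 1),
                    help="whether to add random influences to the dream generation (default: 1)")
args = parser.parse_args()
print("dream length:", args.l, "randomness:", args.r)
```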