Sonification-Tutorials

Jupyter Notebooks detailing methods of sonification to MIDI in Python


Hello!

Here is a small project to show some sonification techniques and code I developed in an effort to better turn the world around us into sound.
This repository will probably only be shared directly with people I know, so I'll be brief in my descriptions.

Dependencies: The following are needed to run the code in this repository. All code is written in Python 3.

- Pandas
- Numpy
- Matplotlib
- MidiUtil: Here is a link to the docs. Note that this dependency does not play well with conda and should be installed via pip (see the install command just below this list).
    https://midiutil.readthedocs.io/en/1.2.1/
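
The package is published on PyPI under the name MIDIUtil, so installing it outside of conda should just be:

```
pip install MIDIUtil
```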

Note: If you cannot get MidiUtil to function, the code can still be used to output lists of MIDI notes. You will then simply need to use
some other MIDI writer; Mido is a good alternative.
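
For anyone who has not used MidiUtil before, here is a minimal sketch (not taken from the notebooks) of writing a list of MIDI note numbers to a .mid file. The pitch list, tempo, and filename are placeholders:

```python
from midiutil import MIDIFile

# Placeholder list of MIDI note numbers, e.g. produced by one of the notebooks.
pitches = [60, 62, 64, 65, 67]

track, channel = 0, 0
time = 0       # start time, in beats
duration = 1   # each note lasts one beat
tempo = 120    # BPM
volume = 100   # 0-127

midi = MIDIFile(1)                 # a single-track MIDI file
midi.addTempo(track, time, tempo)

# Place one note per beat, walking down the list of pitches.
for i, pitch in enumerate(pitches):
    midi.addNote(track, channel, pitch, time + i, duration, volume)

with open("example.mid", "wb") as f:
    midi.writeFile(f)
```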

Description of Notebooks:
1. Turns random data into MIDI notes (a minimal sketch of this kind of data-to-MIDI mapping follows this list).
2. Turns random data into MIDI notes with functions.
3. Turns a 3D Brownian motion into 3 sets of controller change (CC) values.
4. Turns a basic 3D surface into sound.
5. Turns a complex 3D surface into sound.
6. Takes real data from the HJ Andrews Research Forest and outputs MIDI to be made into scores.
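
The notebooks are not reproduced here, but a common step in this kind of sonification is scaling a data series into the 0-127 MIDI range, either as note numbers or as controller change values. Here is a minimal sketch of that mapping; the data, pitch range, and variable names are made up for illustration:

```python
import numpy as np

# Made-up data series; the notebooks use random data, Brownian motion,
# 3D surfaces, and forest measurements instead.
data = np.random.randn(32)

# Normalize to 0..1, then map into a chosen MIDI pitch range (here 48-84).
low, high = 48, 84
normalized = (data - data.min()) / (data.max() - data.min())
pitches = np.round(low + normalized * (high - low)).astype(int)

# The same normalization can also be scaled to 0-127 for controller change
# values, in the spirit of the Brownian motion notebook.
cc_values = np.round(normalized * 127).astype(int)

print(pitches.tolist())    # MIDI note numbers, ready for MidiUtil or Mido
print(cc_values.tolist())  # MIDI CC values
```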

Here is a link to a Google Drive folder with all the sonifications from notebook 6.
https://drive.google.com/drive/folders/14w3dRCGiSovdPP72jxExtOEkS37EyUn1?usp=sharing

Here is a SoundCloud link to some of the sonifications from notebook 6.
https://soundcloud.com/illb3bach/sets/mckenzie-river-symphony

The output of each notebook will have a corresponding MP3 file (WAV is too big) that shows one method of taking the MIDI data and putting it into a DAW.
I will include a short description of what I did in my DAW to help connect the dots between the output file and the final sound.

Final Thoughts: This was made primarily so I could begin building a robust toolkit for making music based on reality and the data we measure.
In time, as I improve, I hope to make space symphonies composed using themes generated by data, have forest raves where the EDM is made from the
data of populations moving and changing, and so on. I feel much of sonification has focused on the scientific aspects of its use; I think it can also be
aesthetically pleasing without too much compromise to accuracy. As a vessel for communicating science it is incredible, and this project attempts to make sonifications
that don't sound like bad computer music, but that could be listened to and appreciated without even knowing the music was made by something else.

I think it would be incredible to combine aspects of sonification with standard (or wild) composing methods: to go to a modern symphony and hear the music of
the planets, or the mountains, or even of the small streams nearby, as I do in this project. It brings people closer to the science, introduces new ways of considering
music as information, and maintains the emotional aspect of art.

If you have questions or wish to chat, please email cconaway [at] ucsd.edu or conawaychristian [at] gmail.com.