
ComposeForChucK

Compose is a framework for creating music in the programming language ChucK. This is a weekend project inspired by the idea of building a platform for composing music with artificial intelligence.

Important Note

This framework is still a work in progress; it started as a weekend project and still needs many hours of work. If anyone is interested in collaborating on or helping develop this framework, that would be awesome.

Introduction

Compose is a framework that offers high-level operations for writing music with ChucK: from playing individual notes, to working with scales and modes, to creating artificial improvisation.

Please feel free to modify and use Compose in your musical projects. This first prototype of the framework lets the user play any note, choose scales, and play scales. It offers a starting point built on a foundation of music theory: all the notes have been programmed, along with the most common scales and modes. The idea is to use this framework to create theoretically complex music with ChucK.
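To illustrate what "working with scales" means at the ChucK level, here is a plain-ChucK sketch (no framework calls, since Compose's method names are not yet documented) that plays one octave of a C major scale by hand. This is the kind of note and scale arithmetic that Compose aims to wrap in high-level operations:

```chuck
// Plain ChucK, no framework: play one octave of C major.
// A scale is a list of semitone offsets from the root note.
[0, 2, 4, 5, 7, 9, 11, 12] @=> int major[];  // whole/half-step pattern
60 => int root;                              // MIDI note 60 = middle C

SinOsc osc => dac;                           // sine oscillator to sound card
0.5 => osc.gain;

for (0 => int i; i < major.cap(); i++)
{
    Std.mtof(root + major[i]) => osc.freq;   // MIDI note number -> frequency
    250::ms => now;                          // hold each note for 250 ms
}
```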

Installation

If you do not have ChucK installed, the following link has everything you need: http://chuck.cs.princeton.edu/

To install the Compose framework:
- Open miniAudicle
- Start the virtual machine
- Run the Compose code
- The framework is now up and running; you can write your program in another tab and work with the Compose framework

Note: every time you run a program that uses Compose, you need to run the Compose code first.
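As a quick check that the virtual machine is running, you can play a single tone from a second tab. This sketch is plain ChucK with no Compose calls, so it works whether or not the framework is loaded:

```chuck
// Minimal ChucK test program: a one-second 440 Hz sine tone.
SinOsc osc => dac;    // connect a sine oscillator to the sound card
440.0 => osc.freq;    // concert A
0.5 => osc.gain;      // keep the volume moderate
1::second => now;     // advance time so the tone actually sounds
```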

Future

Compose started as a weekend project with the idea of creating an intelligent program that composes music by itself: the user would input a certain feeling, and the program would create a musical piece that interprets that feeling. At first this could be a description of a human feeling like sad, happy, or excited. Another idea could be to translate a color that the user feels describes their mood.

The final masterpiece of this program would be one where the user puts on a neuro headset and the program creates a musical piece that interprets the user in real time. The neuro headset could pick up various inputs that could be translated into combinations of notes and melodies. Another important input could be the user's pulse rate: the beats per minute of the music could be synchronized in real time with the user's pulse, which could drastically change the mood of the piece.

The Mona Lisa:
Imagine a program like the one described as the final masterpiece. Now imagine using this program as an artistic experience. Now imagine taking it to the next level and putting on an Oculus Rift that gives you amazing visuals that interpret the music being played. An artistic experience that represents yourself. It would be like looking into the mirror of your inner self.

Today is a gift; that's why we call it the present:
Let's interpret the present moment. Another taking-it-to-the-next-level idea... Imagine a system that could interpret the present moment: the system installed in a box with sensors and speakers, so that wherever you take this magic box, it plays a never-ending song that interprets its surroundings in real time. Rain, cold, warmth, wind, sun, moon, darkness: whatever the system interprets from the current moment in time and space, turned into music. Wow...

## Methods

I divided Compose into three sub-groups, depending on how high- or low-level the logic of the methods is.

#### Working with notes

TODO

#### Working with scales and modes

TODO

#### Artificial Improvisation

TODO

## Example

TODO