This is the code repository for the project "Emotion-guided Piano Accompaniment Generation". Baseline models are from the AccoMontage repo. The training and validation sets are derived from the POP909 dataset, and the emotion-guided music generation test set is part of the Nottingham Dataset.

The harmonization module was updated on 2023/2/2. The paper is accepted at the International Joint Conference on Neural Networks 2023 (IJCNN 2023): https://arxiv.org/abs/2307.04015.
To generate accompaniment for your own melodies:

- Click here to download the data folder and put it in `./data`.
- Put the melodies (MIDI format) in `./original_midi`.
- Run `demo.py`.
- Wait a while and the accompaniment will be saved in `./generate_midi`.
- For more emotional-flow-guided generation samples, please refer to https://soundcloud.com/ko9isjyplxrb/sets/demos-of-emotion-guided-generated-accompaniment.
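Assuming `demo.py` takes no arguments and simply processes every file it finds in `./original_midi` (as the steps above suggest), the workflow can be wrapped in a small helper. `stage_melodies` and `run_demo` are hypothetical names for illustration, not part of this repo:

```python
import shutil
import subprocess
from pathlib import Path

def stage_melodies(melody_paths, repo_root="."):
    """Copy melody MIDI files into ./original_midi, creating it if needed."""
    in_dir = Path(repo_root) / "original_midi"
    in_dir.mkdir(parents=True, exist_ok=True)
    staged = []
    for p in melody_paths:
        dest = in_dir / Path(p).name
        shutil.copy(str(p), dest)
        staged.append(dest)
    return staged

def run_demo(repo_root="."):
    """Run demo.py in the repo root; results land in ./generate_midi."""
    subprocess.run(["python", "demo.py"], cwd=repo_root, check=True)
    return sorted((Path(repo_root) / "generate_midi").glob("*.mid"))
```

Calling `stage_melodies([...])` followed by `run_demo()` then reproduces the manual steps above in one place.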
To train the model yourself:

- Click here to download the processed dataset and put it in `./data`.
- Run `train.py`.
- Log files will be generated automatically in the root directory; use TensorBoard to monitor the training process.
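Since the logs are written to the repository root, one way to confirm that training is producing logs (before pointing TensorBoard at them with `tensorboard --logdir .`) is to locate the newest event file. The `events.out.tfevents.*` pattern is TensorBoard's default file naming, and `latest_event_file` is a hypothetical helper:

```python
from pathlib import Path

def latest_event_file(log_root="."):
    """Return the most recently modified TensorBoard event file, or None."""
    files = sorted(
        Path(log_root).rglob("events.out.tfevents.*"),
        key=lambda p: p.stat().st_mtime,
    )
    return files[-1] if files else None
```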
-
Q: Sometimes the generated chords don't fully harmonize the melody. Why?

A: The randomness and controllability of the LSTM-based harmonizer are not fully satisfactory. You can use rule-based music theory for harmonization instead: modify the `solve()` function in `./utils/Accompaniment_Generator.py` for your own usage.
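As a starting point for such a rule-based replacement, here is a minimal sketch. It is not the repo's actual `solve()` logic, and the key (C major), chord vocabulary, and tie-breaking priority are all illustrative assumptions: for each bar, it picks the diatonic triad whose pitch classes cover the most melody notes.

```python
# Diatonic triads in C major, as sets of pitch classes (0 = C).
DIATONIC_TRIADS = {
    "I": {0, 4, 7}, "ii": {2, 5, 9}, "iii": {4, 7, 11},
    "IV": {5, 9, 0}, "V": {7, 11, 2}, "vi": {9, 0, 4}, "vii°": {11, 2, 5},
}
# Ties go to the more common chord function, in this order.
PRIORITY = ["I", "IV", "V", "vi", "ii", "iii", "vii°"]

def harmonize_bar(melody_pitches):
    """Return the triad name covering the most melody pitch classes."""
    pcs = {p % 12 for p in melody_pitches}
    return max(PRIORITY, key=lambda name: len(DIATONIC_TRIADS[name] & pcs))
```

For example, a bar containing MIDI pitches 60, 64, 67 (C, E, G) maps to "I". A real replacement would also need to handle key detection and chord inversions, which the sketch deliberately omits.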