
Midi-Model

MIDI event transformer for symbolic music generation

Updates

  • v1.3: Added MIDITokenizerV2 and a new MidiVisualizer.
  • v1.2: Optimized the tokenizer and dataset. The dataset was filtered with MIDITokenizer.check_quality, and training on this higher-quality dataset significantly improves the model's performance (see the filtering sketch after this list).
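To illustrate the kind of filtering described for v1.2, here is a minimal sketch. It assumes MIDITokenizer can be imported from a midi_tokenizer module, that MIDI files are parsed with a MIDI.midi2score helper, and that tokenize and check_quality behave roughly as shown; the actual module paths, signatures, and return values in this repo may differ.

# Minimal dataset-filtering sketch (assumed API, see note above).
import glob

import MIDI                                # MIDI parsing helper (assumed)
from midi_tokenizer import MIDITokenizer   # assumed module path

tokenizer = MIDITokenizer()
paths = glob.glob("dataset/**/*.mid", recursive=True)
kept = []
for path in paths:
    try:
        with open(path, "rb") as f:
            score = MIDI.midi2score(f.read())   # raw MIDI bytes -> score structure
        tokens = tokenizer.tokenize(score)      # score -> event-token sequence
        result = tokenizer.check_quality(tokens)
        ok = result[0] if isinstance(result, tuple) else result  # handle (bool, reason) or bool
        if ok:
            kept.append(path)
    except Exception:
        pass                                    # skip unreadable or malformed files

print(f"kept {len(kept)} / {len(paths)} files")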

Demo

Pretrained model

Hugging Face

Dataset

projectlosangeles/Los-Angeles-MIDI-Dataset

Requirements

  • Install PyTorch (version 2.0 or newer is recommended)
  • Install FluidSynth >= 2.0.0
  • pip install -r requirements.txt (an environment-check sketch follows this list)
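Before running the app, you can sanity-check the setup with the short sketch below. It assumes the FluidSynth Python binding pulled in by requirements.txt is pyfluidsynth (imported as fluidsynth); if the repo uses a different binding, adjust that import.

# Quick environment check (sketch).
import torch

print("torch", torch.__version__, "| CUDA available:", torch.cuda.is_available())
major, minor = (int(x) for x in torch.__version__.split("+")[0].split(".")[:2])
assert (major, minor) >= (2, 0), "PyTorch >= 2.0 is recommended"

try:
    import fluidsynth  # provided by the pyfluidsynth package (assumed binding)
    print("FluidSynth binding loaded")
except ImportError:
    print("FluidSynth binding missing: install the system FluidSynth library and pyfluidsynth")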

Run app

python app.py

Train

python train.py

Citation

@misc{skytnt2024midimodel,
  author = {SkyTNT},
  title = {Midi Model: Midi event transformer for symbolic music generation},
  year = {2024},
  howpublished = {\url{https://github.com/SkyTNT/midi-model}},
}