
MMM Mutopia Guitar

Tutorial to train a GPT-2 model from scratch using Hugging Face and publish it as a Gradio demo using Spaces. The model generates guitar music. To encode the guitar MIDI files of the Mutopia Project, I use the excellent implementation by Dr. Tristan Behrens of the paper MMM: Exploring Conditional Multi-Track Music Generation with the Transformer.
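To give a feel for what the encoding step produces, here is a minimal, hedged sketch of turning a list of notes into MMM-style event tokens. The token names (`PIECE_START`, `NOTE_ON=...`, `TIME_DELTA=...`, etc.) follow the general shape of the MMM representation; the exact vocabulary and timing grid in Dr. Behrens's implementation may differ.

```python
# Sketch only: encode one guitar track as a flat sequence of MMM-style
# string tokens. Token names and the time grid are assumptions, not the
# exact vocabulary of the tutorial's encoder.

def encode_track(notes, instrument=24):
    """notes: list of (pitch, start, duration) tuples, times in sixteenths.
    Returns a list of string tokens for a single track."""
    # Expand each note into on/off events, then sort by time.
    events = []
    for pitch, start, dur in notes:
        events.append((start, f"NOTE_ON={pitch}"))
        events.append((start + dur, f"NOTE_OFF={pitch}"))
    events.sort(key=lambda e: e[0])

    tokens = ["PIECE_START", "TRACK_START", f"INST={instrument}", "BAR_START"]
    now = 0
    for time, tok in events:
        if time > now:  # advance time with an explicit delta token
            tokens.append(f"TIME_DELTA={time - now}")
            now = time
        tokens.append(tok)
    tokens += ["BAR_END", "TRACK_END"]
    return tokens

# Example: two quarter notes, C4 then E4.
print(encode_track([(60, 0, 4), (64, 4, 4)]))
```

A flat string vocabulary like this is what lets a plain language model such as GPT-2 be trained on music with no architectural changes.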

By the end of this tutorial, you should have a Gradio demo similar to this one.

To start the tutorial, please visit the first notebook: 1. Collecting the data.

The dataset is built from the Mutopia Project.

This tutorial is a work in progress. You can take a look at the project on Hugging Face at the following:
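Training "from scratch" means starting from a randomly initialized GPT-2 rather than a pretrained checkpoint. A minimal sketch with the Hugging Face Transformers library is below; the hyperparameters are illustrative, not the ones used in the tutorial notebooks.

```python
# Sketch only: build a small, randomly initialized GPT-2 with Hugging Face
# Transformers. All hyperparameter values here are assumptions.
from transformers import GPT2Config, GPT2LMHeadModel

config = GPT2Config(
    vocab_size=1000,  # size of the MMM token vocabulary (assumption)
    n_positions=512,  # maximum token sequence length
    n_embd=256,
    n_layer=6,
    n_head=8,
)
model = GPT2LMHeadModel(config)  # random weights, i.e. "from scratch"
print(f"{model.num_parameters():,} parameters")
```

From here the usual Hugging Face `Trainer` loop applies, with the tokenized Mutopia sequences as the language-modeling dataset.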

After generating the music in the Hugging Face widget, you can listen to the results using this notebook.