
MGT-python


The Musical Gestures Toolbox for Python is a collection of tools for visualization and analysis of audio and video.


Usage

The easiest way to get started is to take a look at the Jupyter notebook MusicalGesturesToolbox, which shows examples of how to use the toolbox.

The notebook can also be run in Google Colab.

The standard installation is via pip: paste and execute the following command in the Terminal (macOS, Linux) or PowerShell (Windows):

pip install musicalgestures

MGT is developed in Python 3 and relies on FFmpeg and OpenCV. See the wiki documentation for more details on the installation process.
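Since FFmpeg is an external program rather than a Python package, pip will not install it for you. A minimal sketch for checking that both dependencies are available before running the toolbox (the check itself is an illustration, not part of MGT):

```python
# Verify MGT's external dependencies. FFmpeg must be on the system PATH;
# OpenCV ships as the opencv-python wheel that pip pulls in with the toolbox.
import importlib.util
import shutil

print("FFmpeg on PATH:", shutil.which("ffmpeg") is not None)
print("OpenCV importable:", importlib.util.find_spec("cv2") is not None)
```

If either line prints False, install FFmpeg via your system package manager or rerun the pip installation before continuing.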

Description

A 10-minute video introduction to the toolbox is available.

MGT can generate both dynamic and static visualizations of video files, including motion videos, history videos, average images, motiongrams, and videograms. It can also extract various features from video files, including the quantity, centroid, and area of motion. The toolbox also integrates well with other libraries, such as OpenPose for skeleton tracking, and Librosa for audio analysis. All the features are described in the wiki documentation.
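As a hedged sketch of what such an analysis can look like in code: the MgVideo class and the method names below follow the toolbox documentation, but may differ between versions (check the wiki for exact signatures), and "dance.avi" is a placeholder for your own video file.

```python
# Usage sketch for MGT-python. MgVideo and the methods below follow the
# toolbox documentation but are not guaranteed for every version;
# "dance.avi" is a placeholder filename. The guard lets the snippet
# run even where musicalgestures or the video file is unavailable.
try:
    import musicalgestures

    mg = musicalgestures.MgVideo("dance.avi")  # load a video for analysis
    mg.motion()   # motion video plus quantity, centroid, and area of motion
    mg.average()  # average image of all frames
    mg.history()  # history video with trailing frames
except Exception as err:
    print("Skipping demo:", err)
```

Each call writes its output files next to the source video, so the results can be inspected directly or fed into further analysis.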

History

This toolbox builds on the Musical Gestures Toolbox for Matlab, which in turn builds on the Musical Gestures Toolbox for Max.

The software is currently maintained by the fourMs lab at RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion at the University of Oslo.

Reference

If you use this toolbox in your research, please cite the associated article.

Credits

Developers: Balint Laczko, Joachim Poutaraud, Frida Furmyr, Marcus Widmer, Alexander Refsum Jensenius

License

This toolbox is released under the GNU General Public License v3.0.