WebAudio/web-midi-api

MIDI 2.0

rektide opened this issue · 8 comments

Hello hello.

There are a lot of new features in the recently approved MIDI 2.0 specification. Would love to see these features make their way to the web.

I'm not sure what my intents are here, but I started a web-midi-2 repo for this initiative & have raised this idea with @WICG.

@rektide have you seen that the UMP specs have been released?
USB MIDI 2.0 is also likely to be released soon.

UMP presents some interesting challenges, especially with JS. UMP packets come in 32-, 64-, 96-, and 128-bit packet lengths.
To process UMP in JS I'm currently using arrays of 32-bit integers (I should be using Uint32Array).
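For what it's worth, here's a sketch of how that can look with a Uint32Array: the Message Type nibble (the top 4 bits of the first word) tells you how many 32-bit words the packet spans. The word-count table below is how I read the UMP spec; the reserved Message Types have defined sizes too.

```javascript
// Word count per UMP Message Type (upper nibble of the first 32-bit word),
// as I read the table in the UMP spec; MTs marked "reserved" have sizes
// defined so future messages can be skipped safely.
const UMP_WORDS_BY_MT = [
  1, 1, 1, 2,  // 0x0 Utility, 0x1 System, 0x2 MIDI 1.0 Channel Voice, 0x3 SysEx7/Data
  2, 4, 1, 1,  // 0x4 MIDI 2.0 Channel Voice, 0x5 Data 128, 0x6-0x7 reserved (32 bit)
  2, 2, 2, 3,  // 0x8-0xA reserved (64 bit), 0xB reserved (96 bit)
  3, 4, 4, 4,  // 0xC reserved (96 bit), 0xD Flex Data, 0xE reserved, 0xF UMP Stream
];

// Split a buffer of received 32-bit words into individual UMP packets.
function splitUmpPackets(words) {
  const packets = [];
  let i = 0;
  while (i < words.length) {
    const mt = (words[i] >>> 28) & 0xf;         // Message Type nibble
    const len = UMP_WORDS_BY_MT[mt];            // packet length in words
    packets.push(words.subarray(i, i + len));
    i += len;
  }
  return packets;
}
```

For example, a buffer holding a 32-bit MIDI 1.0 note-on followed by a 64-bit MIDI 2.0 note-on splits into a one-word packet and a two-word packet.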

There are also very specific ways to scale values up and down when converting between MIDI 1.0 protocol messages and MIDI 2.0 UMP packets. These kinds of utilities are also going to be required.
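A sketch of that scaling as I understand it from the MIDI 2.0 spec: scaling up is not a plain bit shift, because zero must stay zero, the old center must map to the new center, and the old maximum must map to the new maximum; above center, the low bits are filled by repeating the source bits. Scaling down is a plain truncating shift.

```javascript
// Scale a value up from srcBits to dstBits resolution using the
// "min-center-max" bit-repeat scheme described in the MIDI 2.0 spec
// (as I understand it).
function scaleUp(srcVal, srcBits, dstBits) {
  const scaleBits = dstBits - srcBits;          // how many bits we are adding
  let bitShifted = srcVal << scaleBits;
  const srcCenter = 1 << (srcBits - 1);
  if (srcVal <= srcCenter) return bitShifted >>> 0;
  // Above center: fill the new low bits by repeating the source bits
  // below the top bit, so the source maximum maps to the new maximum.
  const repeatBits = srcBits - 1;
  let repeatValue = srcVal & ((1 << repeatBits) - 1);
  repeatValue = scaleBits > repeatBits
    ? repeatValue << (scaleBits - repeatBits)
    : repeatValue >> (repeatBits - scaleBits);
  while (repeatValue !== 0) {
    bitShifted |= repeatValue;
    repeatValue >>= repeatBits;
  }
  return bitShifted >>> 0;
}

// Scaling down is a plain truncating shift.
function scaleDown(srcVal, srcBits, dstBits) {
  return srcVal >>> (srcBits - dstBits);
}
```

So a 7-bit velocity of 127 becomes 0xFFFF at 16 bits, and the 7-bit center 64 becomes exactly 0x8000, rather than the slightly-off values a naive shift would give.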

Is there a place where I can follow the progress of MIDI 2.0 support?

I'm developing a software synth that runs on video-card shaders in the browser, and I'm very interested in the higher precision of MIDI 2.0. You can follow my progress on

https://shadersynth.com

I'm hoping to build a community like the one on shadertoy.com (I got inspired by the sound shaders there), but for audio instruments and effects. People could write their own audio shaders, publish them, and others could comment on them and like them. Further down the road I'm hoping to turn it into a full DAW for music sharing à la audiotool.com. MIDI 2.0 support would be awesome to have by then.

As I understand MIDI 2.0, it is layered on top (or at least, layer-able on top) of MIDI 1.0 - features such as Property Exchange are accomplished via sysex exchange. It would be possible to build this implementation in the short term on top of Web MIDI 1.0, then, for features such as higher precision. As Web MIDI is already a low-level API, I don't expect to provide the higher-level semantics of MIDI 2.0 in the near term, but this can be built (IIUC) on top as a component layer, which would be a good lead-in to defining what Web MIDI 2.0 might look like (and native MIDI APIs for v2, for that matter).
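As a concrete sketch of that layering: a MIDI-CI Discovery inquiry can already be sent as a Universal SysEx message over today's Web MIDI API. The byte layout below follows the MIDI-CI v1.1 Discovery message as I read the spec; every identity value in it (MUID, manufacturer, family, model, revision, capability bits) is a placeholder.

```javascript
// Sketch: a MIDI-CI Discovery inquiry as Universal SysEx over Web MIDI 1.0.
// Layout per MIDI-CI v1.1 Discovery as I read the spec; values are placeholders.
function ciDiscovery(sourceMuid) {
  // A 28-bit MUID is carried as 4 x 7-bit bytes, LSB first.
  const muid7 = (v) => [v & 0x7f, (v >> 7) & 0x7f, (v >> 14) & 0x7f, (v >> 21) & 0x7f];
  return new Uint8Array([
    0xf0, 0x7e, 0x7f, 0x0d, 0x70, // Universal SysEx, MIDI-CI, Discovery sub-ID2
    0x01,                          // MIDI-CI message version
    ...muid7(sourceMuid),          // source MUID
    ...muid7(0x0fffffff),          // destination MUID: broadcast
    0x7d, 0x00, 0x00,              // manufacturer SysEx ID (0x7D: prototyping)
    0x00, 0x00,                    // device family (placeholder)
    0x00, 0x00,                    // device model (placeholder)
    0x00, 0x00, 0x00, 0x00,        // software revision (placeholder)
    0x1c,                          // capability bits (illustrative only)
    ...muid7(512),                 // max SysEx size receivable
    0xf7,
  ]);
}
// With an output opened via navigator.requestMIDIAccess({ sysex: true }):
//   output.send(ciDiscovery(Math.floor(Math.random() * 0x0fe00000)));
```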

MIDI 2.0 is backwards compatible and can transport MIDI 1.0 messages. Controllers can query whether the other side supports MIDI 2.0 and switch to the new protocol. Messages get bigger, mostly doubling in size (from 16 to 32 bits and from 32 to 64 bits); the 7-bit limit is removed so the whole byte is usable. This gives things like:

  • Velocity gets 16 bits instead of 7
  • The number and precision of available controls goes up: from 128 controls (NRPN and RPN multiplexed controls not counted) with 7-bit precision to 65,536 controls (among which the NRPN/RPN controls, without multiplexing) with 32-bit precision.
  • There are 256 note attributes which can be changed per note with 16-bit precision
  • There is a timestamp in messages (turned on by query?)
  • There are 16 group numbers to give more room to the 16 channels
  • MIDI 1.0 can be transported inside such a group
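To illustrate the bigger messages: here is a sketch of packing a MIDI 2.0 Note On with 16-bit velocity into a 64-bit UMP packet. The field layout is per the MIDI 2.0 Channel Voice messages as I read the spec; the attribute type/data parameters are optional extras.

```javascript
// Build a 64-bit MIDI 2.0 Note On UMP packet (two 32-bit words).
// Field layout per the MIDI 2.0 Channel Voice spec as I understand it.
function midi2NoteOn(group, channel, note, velocity16, attrType = 0, attrData = 0) {
  const word0 =
    (0x4 << 28) |             // Message Type 0x4: MIDI 2.0 Channel Voice
    ((group & 0xf) << 24) |   // group (0-15)
    (0x9 << 20) |             // opcode 0x9: Note On
    ((channel & 0xf) << 16) | // channel (0-15)
    ((note & 0x7f) << 8) |    // note number is still 7 bits
    (attrType & 0xff);        // attribute type (0 = none)
  const word1 =
    ((velocity16 & 0xffff) << 16) | // 16-bit velocity
    (attrData & 0xffff);            // 16-bit attribute data
  return new Uint32Array([word0 >>> 0, word1 >>> 0]);
}
```

For example, `midi2NoteOn(0, 0, 60, 0xFFFF)` yields the words `0x40903C00, 0xFFFF0000`: middle C at maximum velocity on group 0, channel 0.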

This is what I got from watching this video after I posted the previous comment

https://www.youtube.com/watch?v=Kky1nlwz8-8&ab_channel=TheMIDIAssociationGmail

Some MIDI 2.0 features can be done over 1.0, but not the higher precision.

If Web MIDI would support the new messages by adding more data bytes, then we could do the rest in software (or is this already possible?) in JavaScript. I'm willing to put out an open-source MIDI parser for handling the MIDI 2.0 messages. (I'm planning to open-source the whole of my shadersynth; it's a private repo for now, but all code is readable in the browser, and I'm working on a new engine for the synth as I still see the current version as a prototype.)

Is there a place where I can follow the progress of MIDI 2.0 support?

Chromium has it in their Project Fugu queue as a P3.

Note that there's been very little work towards developing a Web MIDI 2.0 spec. Please consider trying to extend the existing Web MIDI spec with MIDI 2.0 support; this could significantly advance the feasibility of Web MIDI 2.0 becoming a thing.

I would be happy with just supporting the new USB endpoint and UMP as a start.

As I understand MIDI 2.0, it is layered on top (or at least, layer-able on top) of MIDI 1.0 - features such as Property Exchange are accomplished via sysex exchange. It would be possible to build this implementation in the short term on top of Web MIDI 1.0, then, for features such as higher precision. As Web MIDI is already a low-level API, I don't expect to provide the higher-level semantics of MIDI 2.0 in the near term, but this can be built (IIUC) on top as a component layer, which would be a good lead-in to defining what Web MIDI 2.0 might look like (and native MIDI APIs for v2, for that matter).

I know the comment is ancient, but there's confusion out there around this.

Only the MIDI CI features can be layered on top of MIDI 1.0 byte stream messages. The UMP specification is completely different and requires ground-up support for key things like

  • Bidirectional endpoints for devices as the addressable entities instead of ports (most devices will have a single endpoint)
  • Groups in the messages themselves instead of cables that "become" ports in the operating systems
  • Function Blocks and Group terminal blocks for information about the endpoints, use of the groups, and naming of same
  • In-protocol discovery and configuration of endpoints and details about them
  • 32-, 64-, 96-, and 128-bit message packets

Windows MIDI Services will ship in-box this year, and supports MIDI 2.0 protocols and transports. Other operating systems have UMP support already in place at one level or another. https://aka.ms/midirepo

Pete

As I understand MIDI 2.0, it is layered on top (or at least, layer-able on top) of MIDI 1.0 - features such as Property Exchange are accomplished via sysex exchange. It would be possible to build this implementation in the short term on top of Web MIDI 1.0, then, for features such as higher precision.

Recently I have indeed created a MIDI-CI implementation on top of the Web MIDI API (among many other platforms, since it is built with Kotlin/Compose Multiplatform), based on the MIDI 1.0 bytestream:
https://androidaudioplugin.web.app/misc/ktmidi-ci-tool-wasm-first-preview/ (sources are at https://github.com/atsushieno/ktmidi/ )

It is not a very strict implementation (yet), but it supports Profile Configuration, Property Exchange, and Process Inquiry. Its PC and PE interoperability has been tested with JUCE (the juce_midi_ci module and CapabilityInquiryDemo) on desktop (using Compose for Desktop), though it's not perfect. And with MIDI 2.0 Workbench to some extent (it does not really work as a configurable MIDI-CI Responder).

The only input it is missing for full MIDI 2.0 support, compared to the current "MIDI 1.0 only" transport, is the context Group number (well, and probably which pair of input/output makes up the MIDI-CI session connection, which I just choose manually so far). Other than that, no platform support was required.

Once the UMP connection is established, the Endpoint Name would be available via the UMP Endpoint Name Notification message (UMP specification sections 7.1.1–7.1.4). They are part of the "in-protocol" configuration.