superpoweredSDK/web-audio-javascript-webassembly-SDK-interactive-audio

Live analyzing audio input

franzwilding opened this issue · 3 comments

Hey! I'm playing around with the JS SDK to do live analysis of audio input, but unfortunately I can't get Superpowered.Analyzer to work. Loading the wasm code, initializing an audio node and connecting it to the Web Audio user input all work fine:

Superpowered.value = await SuperpoweredGlue.fetch('/superpowered.wasm');
Superpowered.value.Initialize({ ... });

WebAudioManager.value = new SuperpoweredWebAudio(44100, Superpowered.value);
const audioInputStream = await WebAudioManager.value.getUserMediaForAudioAsync({ 'fastAndTransparentAudio': true });
const audioInput = WebAudioManager.value.audioContext.createMediaStreamSource(audioInputStream);
AudioNode.value = await WebAudioManager.value.createAudioNodeAsync(ProcessorUrl, 'AudioProcessor', (message) => {});
audioInput.connect(AudioNode.value);
WebAudioManager.value.audioContext.resume();

However, inside the audio processor the analyzer never produces any data:

import { SuperpoweredWebAudio } from '../superpowered/SuperpoweredWebAudio.js';

class AudioProcessor extends SuperpoweredWebAudio.AudioWorkletProcessor {

    onReady() {
        this.analyzer = new this.Superpowered.Analyzer(44100, 60);
    }

    processAudio(inputBuffer, outputBuffer, buffersize, parameters) {
        if(this.analyzing) {
            this.analyzer.process(inputBuffer, buffersize, -1);
            this.analyzer.makeResults(...);
            this.sendMessageToMainScope({ command: 'ANALYSE', value: {
                bpm: this.analyzer.bpm,
                notes: this.analyzer.notes,
            } });
        }
    }
}

if (typeof AudioWorkletProcessor === 'function') registerProcessor('AudioProcessor', AudioProcessor);
export default AudioProcessor;

bpm is always 0 and notes is undefined.

Thank you for your help!

Every call to processAudio represents a very short chunk of audio, just a couple of milliseconds. The Analyzer class was not designed for live analysis; only the LiveAnalyzer class can do that, and it's not available for the web yet.
The Analyzer class collects audio data with its process method; then you call makeResults once and it's done.
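In other words, the intended pattern is offline: feed the whole recording to process across many callbacks, then call makeResults a single time at the end. A minimal sketch of that control flow, using a hypothetical stub in place of Superpowered.Analyzer (the real constructor and makeResults take tuning parameters documented in the Superpowered docs):

```javascript
// Sketch of the "collect, then finalize once" pattern the Analyzer expects.
// StubAnalyzer is a stand-in for this.Superpowered.Analyzer; the real class
// computes bpm/notes in makeResults -- this stub only models the control flow.
class StubAnalyzer {
  constructor(samplerate, lengthSeconds) {
    this.samplerate = samplerate;
    this.framesCollected = 0;
    this.bpm = 0;
  }
  process(buffer, buffersize) {
    this.framesCollected += buffersize; // accumulate audio, no results yet
  }
  makeResults() {
    this.bpm = 120; // placeholder: the real class computes results here
  }
}

class RecordingProcessor {
  constructor() {
    this.analyzer = new StubAnalyzer(44100, 60);
    this.recording = true;
  }
  // Called once per audio callback (a few ms of audio). While recording, only collect.
  processAudio(inputBuffer, buffersize) {
    if (this.recording) this.analyzer.process(inputBuffer, buffersize);
  }
  // Called exactly once, e.g. triggered by a 'stop' message from the main scope.
  stopAndAnalyze() {
    this.recording = false;
    this.analyzer.makeResults(); // results are valid only after this call
    return this.analyzer.bpm;
  }
}
```

The key difference from the snippet in the question: makeResults is not called on every callback; it is called exactly once after all the audio has been fed in.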

Thank you for the quick response!

So something like semi-live note analysis for play-along features isn't possible on the web at the moment?
Do you plan to bring this feature to the JS SDK anytime soon?

Our LiveAnalyzer class detects the average tempo over a longer period of time; it couldn't be used for "note analysis" even if the web SDK contained it.
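For readers who only need a coarse, periodically refreshed estimate on the web today, one possible workaround (not an official SDK feature) is to keep a rolling buffer of recent samples in the worklet and re-run a full offline analysis over it every few seconds. A minimal, SDK-free sketch of the ring-buffer bookkeeping; the `onWindowFull` callback is hypothetical and would wrap a fresh offline analyzer pass:

```javascript
// Rolling window of the most recent windowFrames samples. Every hopFrames
// new samples (once the window is full), onWindowFull fires with the window,
// where a hypothetical offline analysis pass over recent audio could run.
class RollingWindow {
  constructor(windowFrames, hopFrames, onWindowFull) {
    this.buffer = new Float32Array(windowFrames);
    this.hopFrames = hopFrames;     // how often to trigger an analysis pass
    this.written = 0;               // total frames ever written
    this.sinceLastAnalysis = 0;
    this.onWindowFull = onWindowFull;
  }
  push(samples) {
    const n = samples.length;
    if (n >= this.buffer.length) {
      // incoming chunk covers the whole window: keep only its tail
      this.buffer.set(samples.subarray(n - this.buffer.length));
    } else {
      // shift existing samples left, append the new chunk at the end
      this.buffer.copyWithin(0, n);
      this.buffer.set(samples, this.buffer.length - n);
    }
    this.written += n;
    this.sinceLastAnalysis += n;
    // analyze only once the window is full, at most every hopFrames
    if (this.written >= this.buffer.length &&
        this.sinceLastAnalysis >= this.hopFrames) {
      this.sinceLastAnalysis = 0;
      this.onWindowFull(this.buffer);
    }
  }
}
```

This only approximates live behavior: each analysis pass still sees a fixed window, so rapid tempo changes show up with a delay of up to the window length.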