Why does AudioWorklet disable Atomics.wait?
Closed this issue · 9 comments
I've only tested on Chrome, so this might only matter there.
As the AudioWorklet SharedArrayBuffer example shows, a separate Worker thread produces audio data and the AudioWorklet thread consumes the produced data continuously.
But this sample code has a race condition: when producing the audio data on the Worker thread takes a long time for some reason, the Worklet thread may try to consume the data before it has been produced.
This means the Worklet should check whether the data is ready or not. Two possible solutions (as written here) are:
- Send a message via MessagePort when the output buffer is ready
- Block the AWP until the output buffer is ready (Atomics.compareExchange or a spin-lock)
But both of them seem to have a problem:
- The main UI thread must get involved as a relay in the message passing between the AWP and the DWG. As a result, there can be a delay when the UI thread is busy.
- A performance issue: a spin-lock burns CPU on the rendering thread
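For reference, the non-blocking alternative to both of these is to have the producer publish an atomic counter of how much data is ready, and have the consumer check it and output silence on underrun. A minimal sketch, assuming a single Worker producer and a single Worklet consumer; all the names (`createShared`, `produce`, `consume`, `FRAMES_AVAILABLE`) are illustrative, not taken from the actual example:

```javascript
const FRAMES_AVAILABLE = 0; // index of the published-frame counter

function createShared(capacity) {
  return {
    // One Int32 slot for the counter, plus the shared sample store.
    states: new Int32Array(new SharedArrayBuffer(4)),
    samples: new Float32Array(new SharedArrayBuffer(capacity * 4)),
  };
}

// Producer side (Worker thread): write the samples first, then publish
// them by bumping the counter, so the consumer never reads unwritten data.
function produce(shared, data, offset) {
  shared.samples.set(data, offset);
  Atomics.add(shared.states, FRAMES_AVAILABLE, data.length);
}

// Consumer side (AudioWorkletProcessor.process): never block. If the data
// isn't ready yet, output silence: a glitch, but the graph keeps running.
function consume(shared, out, offset) {
  if (Atomics.load(shared.states, FRAMES_AVAILABLE) < offset + out.length) {
    out.fill(0);
    return false; // underrun
  }
  out.set(shared.samples.subarray(offset, offset + out.length));
  return true;
}
```

The ordering matters: the samples are written before the counter is bumped, so a consumer that sees the counter advance is guaranteed to see the samples.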
Glitching is not the problem here, since a glitch has already happened by the time the consumer runs out of data. Rather, it's how unnatural the workarounds for this situation are that bothers me. If Atomics.wait could be used on the Worklet thread, it could simply wait with Atomics.wait until the data is ready and everything would look fine.
Or is there a technical / design issue that prevents Atomics.wait from being enabled in AudioWorklet?
By design, an AudioWorkletProcessor must be processed synchronously by the "rendering thread". If the user code in the processor blocks that thread (e.g. with Atomics.wait), it stalls the entire graph rendering loop, affecting the other processors' progress. That's why AudioWorkletGlobalScope does not have Atomics.wait.
For real-time audio use case, synchronously blocking the thread is generally not a good idea. I'll update my example with a better shared memory model.
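A common shape for such a lock-free shared memory model is a single-producer/single-consumer ring buffer over a SharedArrayBuffer, where each side only ever fails fast instead of waiting. A minimal sketch under that assumption; the class and field names are illustrative, not the actual example's API, and a production version would also handle index wraparound:

```javascript
const WRITE = 0, READ = 1; // header slots: monotonically increasing positions

class RingBuffer {
  constructor(capacity) {
    this.header = new Int32Array(new SharedArrayBuffer(8));
    this.data = new Float32Array(new SharedArrayBuffer(capacity * 4));
    this.capacity = capacity;
  }
  // Producer (Worker): returns false instead of blocking when full.
  push(samples) {
    const w = Atomics.load(this.header, WRITE);
    const r = Atomics.load(this.header, READ);
    if (w - r + samples.length > this.capacity) return false;
    for (let i = 0; i < samples.length; i++) {
      this.data[(w + i) % this.capacity] = samples[i];
    }
    Atomics.store(this.header, WRITE, w + samples.length);
    return true;
  }
  // Consumer (AudioWorkletProcessor): returns false instead of blocking
  // when empty, so the caller can output silence and keep the graph going.
  pop(out) {
    const w = Atomics.load(this.header, WRITE);
    const r = Atomics.load(this.header, READ);
    if (w - r < out.length) return false;
    for (let i = 0; i < out.length; i++) {
      out[i] = this.data[(r + i) % this.capacity];
    }
    Atomics.store(this.header, READ, r + out.length);
    return true;
  }
}
```

Because there is exactly one producer and one consumer, each index has a single writer, so plain atomic loads and stores are enough and neither side ever waits.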
http://www.rossbencina.com/code/real-time-audio-programming-101-time-waits-for-nothing is the classic reference for this kind of question, but the bottom line is: one should never ever ever wait in a real-time audio callback. So we disabled Atomics.wait, because there is no good reason to have it, and not having it clearly sends a message to authors: if they are trying to wait on anything, they are doing something wrong (of course they can still busy-wait, etc., so it's not foolproof).
Does this also apply to the WASM instruction memory.atomic.wait? If so, it might be impossible/impractical to use multi-threaded code in a worklet at all. Even the compilation of an empty Rust program emits those instructions, and I'd guess a lot of library code (Mutex, memory management, …) does this as well.
Speaking for me: I'd rather experience an audible glitch than having my program crash randomly.
Is there a workaround for this? My audio worklet crashes periodically, complaining that Atomics.wait: cannot be called in this context. I'm not calling Atomics directly; I'm using a library called futures-channel to send events from the main thread to the audio worklet.
There is no workaround; it is very much intended. Your library isn't thread safe and shouldn't be used in an AudioWorkletGlobalScope.
Technically, as far as your calling code is concerned, Atomics.wait(array, offset, value) is basically equivalent to a busy wait (or spin-lock) from a code execution flow POV, so you can use something like this to polyfill it:
while (Atomics.load(array, offset) !== value); // TODO: handle timeout if you need it
However, you typically don't want to do this in a real-time AudioContext. For real-time audio processing, there is almost always a better approach than a blocking wait on the audio thread. It's fine for an OfflineAudioContext though (if you know what you are doing, and are familiar with the performance characteristics of busy waiting vs. properly waiting through the OS thread scheduler).
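If you do go the busy-wait route off the real-time path, the TODO in the polyfill above can be filled in with a deadline check. A sketch following the same spin-until-equal semantics; `spinUntilEquals` is an illustrative name, not a standard API:

```javascript
// Spin until array[offset] becomes `value`, giving up after `timeoutMs`.
// This burns CPU the whole time it spins, so only consider it outside the
// real-time rendering path (e.g. an OfflineAudioContext).
function spinUntilEquals(array, offset, value, timeoutMs = Infinity) {
  const deadline = Date.now() + timeoutMs;
  while (Atomics.load(array, offset) !== value) {
    if (Date.now() >= deadline) return "timed-out";
  }
  return "ok";
}
```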
https://timur.audio/using-locks-in-real-time-audio-processing-safely is a good read on a mix of audio processing and multithreading.
Edit: the article linked earlier is also a good explanation, and I agree with @padenot that futures-channel probably shouldn't be used in a real-time AudioContext like this.
Your library isn't thread safe and shouldn't be used in an AudioWorkletGlobalScope.
Not my library, @padenot, but good to know. Also, thanks @JohnWeisz for the article. It feels like it's really hard for me to find literature on this topic.
http://www.rossbencina.com/code/real-time-audio-programming-101-time-waits-for-nothing contains 99% of what you need to know, but for most intents and purposes it boils down to: do not do anything in your callback that has a duration you don't control.
If / when you're a real pro, like Timur is (linked above), then you can start doing fancy stuff and break rules.
If I may offer, unsolicited, some perspective from a less experienced DSP developer.
I understand & appreciate the advice of that article, @padenot. However, making the audio thread wait-free might not always be the first priority when getting started with DSP development and just trying to make something run. I would guess that many people who run into the Atomics.wait issue compile their code to native first, then try to port it to the web, then notice that they need a major refactoring before they can do that.
Of course, I agree that they should do that refactoring. But if they didn't have to, porting native code to the web could be significantly less painful. It would let them refactor to wait-free code only once they're ready, just like in the native case. 🙂
I also think the mindset of "we know what you want, and it's not that, so we made it impossible" can be slightly dangerous, as there might still be cases where it's either needed or more practical to do it regardless, due to different priorities of the project or a technical reason somewhere in the tech stack.
Just my 2¢, no offense ❤️