Should `Atomics.wait` be available in AudioWorklets associated with an OfflineAudioContext?
JohnWeisz opened this issue · 8 comments
Following up on issue WebAudio/web-audio-api#1848, which briefly discusses why a blocking wait like `Atomics.wait` is detrimental to real-time audio processing, in particular this comment made by @hoch:
> For real-time audio use case, synchronously blocking the thread is generally not a good idea.
... and this comment by @padenot:
> one should never ever ever wait in a real-time audio callback, so we disabled `Atomics.wait` [...] if they are trying to wait on anything, they are doing something wrong
To briefly reflect on these thoughts, I think it should be pointed out that, strictly speaking, neither of these applies to offline audio processing, and consequently, I think there is probably no reason to disable `Atomics.wait` in AudioWorklets associated with an OfflineAudioContext (no reason other than the consistency of available APIs, that is).
In fact, for offline audio rendering that involves cross-thread audio processing, a blocking wait is essentially necessary at some point to guarantee that cross-thread processing has completed, especially if low or zero latency is required, because an offline rendering thread schedules its `process()` calls as fast as computationally possible, without regard for cross-thread audio processing load.
This is currently possible and feasible with a busy wait (and there are plenty of high-level workarounds as well).
Any thoughts on making `Atomics.wait` available in offline rendering, to replace busy waiting in these cases?
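To make the scenario concrete, here is a minimal sketch of such a busy wait, with an assumed SharedArrayBuffer layout passed via `processorOptions` (`stateBuffer` and `dataBuffer` are hypothetical names, not from any existing code):

```js
class CrossThreadProcessor extends AudioWorkletProcessor {
  constructor(options) {
    super();
    // Assumed layout: state[0] = number of frames the producer thread
    // has written into the shared sample buffer so far.
    this.state = new Int32Array(options.processorOptions.stateBuffer);
    this.samples = new Float32Array(options.processorOptions.dataBuffer);
    this.readOffset = 0;
  }

  process(inputs, outputs) {
    // Busy wait until the producer is a full render quantum ahead.
    // Blocking here is harmless offline (no real-time deadline);
    // Atomics.wait(this.state, 0, ...) could replace this spin loop.
    while (Atomics.load(this.state, 0) < this.readOffset + 128) {}
    outputs[0][0].set(
      this.samples.subarray(this.readOffset, this.readOffset + 128)
    );
    this.readOffset += 128;
    return true;
  }
}

registerProcessor('cross-thread-processor', CrossThreadProcessor);
```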
I'd rather not do this. Differences in API or behavior between the offline and real-time contexts are problematic.
Why not just compute everything in the worklet? What use-case is this? Parallel processing is not really a supported use case at the minute, but some people have thoughts about it.
In any case, this is v2 territory.
An `if` statement can be used as a user-defined implementation of wait, by checking whether the value has been written at index N:
```js
// Inside process(): writeOffset is the producer's progress read from the
// SharedArrayBuffer; expectedIndex is the next frame index this node needs.
if (writeOffset < expectedIndex) {
  // Do other stuff: a user-defined wait, e.g. output silence until
  // writeOffset === expectedIndex, or suspend the context.
  return true; // keep the processor alive while waiting
}
// Data is ready: read up to expectedIndex from the SharedArrayBuffer.
```
> What use-case is this?
I am also interested in the purpose of using `Atomics.wait()` here.
Are you trying to achieve the following? https://github.com/GoogleChromeLabs/web-audio-samples/blob/ae5c30553516814fd690bdf6184d33ea81642ae6/audio-worklet/design-pattern/shared-buffer/shared-buffer-worklet-processor.js#L144

> `// Now we have enough frames to process. Wake up the worker.`
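For context, the linked sample inverts the blocking: the worklet only notifies, and a dedicated Worker, where blocking is permitted, performs the `Atomics.wait`. A rough sketch of that pattern, with assumed names (`states`, `REQUEST`, `fillRingBuffer`):

```js
// In the AudioWorkletProcessor (Atomics.wait is disabled here):
Atomics.store(states, REQUEST, 1); // flag that more frames are needed
Atomics.notify(states, REQUEST);   // wake the waiting worker

// In the Worker, where a blocking wait is permitted:
while (true) {
  Atomics.wait(states, REQUEST, 0);  // sleep while no request is pending
  fillRingBuffer();                  // produce the next block of audio
  Atomics.store(states, REQUEST, 0); // mark the request as handled
}
```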
IMO the only way to achieve this is to create an `OfflineAudioWorkletGlobalScope`, and I'd rather not go down that path.
> Differences in API or behavior between the offline and real-time contexts are problematic.
I understand the concern here (which is why I also mentioned it initially), although it might be worth spelling out how/why it's problematic exactly: not being dev-friendly? Web IDL challenges? If it's one particular problem, is there a straightforward way to get around it?
To expand a bit on the point that a behavior difference between offline and real-time contexts is problematic: the way I see it, this is already the case with how `process()` calls are scheduled, and IMO one could argue that is a fairly fundamental difference.
> What use-case is this?
In my case, it's a C++-backed audio processing system in Electron, so I understand this may not exactly be the use case that should weigh in as a factor. Still, maybe it's valuable insight.
PS: I understand there are ways around this problem, but in this particular case, nothing is as generally reliable as a blocking wait, IMO.
In any event, I don't want to insist on pushing the matter if it really doesn't check out. Even for very exotic use cases like the one I described above, the occasional blocking wait is perfectly usable already (not to mention there are other ways around the problem).
I'm merely interested in whether it would be conceptually reasonable to have `Atomics.wait` enabled, but it sounds like it may not be worth it in practice.
I don't gather how `Atomics.wait()` helps for this use case. If you have an input stream of audio, you can read the stream in `process()` by accessing indexes of the `SharedArrayBuffer`. This is possible in "real-time" (https://github.com/guest271314/webtransport/blob/main/webTransportAudioWorkletWebAssemblyMemoryGrow.js), or by beginning the stream before the `AudioWorkletGlobalScope` is created, or by utilizing `suspend()` and `resume()` to effectively pause `process()` execution until conditions are met (https://github.com/guest271314/AudioWorkletStream/blob/message-port-post-message/audioWorklet.js#L18).

What would defining `Atomics.wait()` in `AudioWorkletGlobalScope` provide the capability to do that is not achievable now?
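As an illustration of that last `suspend()`/`resume()` pattern, here is a minimal sketch (the buffer layout, timing, and flag protocol are assumptions, not taken from the linked code) that pauses an OfflineAudioContext at scheduled times until a producer has written the next chunk; note that `Atomics.wait` is not allowed on the main thread either, hence the polling loop:

```js
const ctx = new OfflineAudioContext(2, 44100 * 10, 44100);
// state[0] is set to 1 by a producer (e.g. a Worker) when a chunk is ready.
const state = new Int32Array(new SharedArrayBuffer(4));

for (let t = 1; t < 10; t += 1) {
  ctx.suspend(t).then(async () => {
    // Poll without blocking the main thread, then continue rendering.
    while (Atomics.load(state, 0) !== 1) {
      await new Promise((resolve) => setTimeout(resolve, 0));
    }
    Atomics.store(state, 0, 0); // consume the ready flag
    ctx.resume();
  });
}

ctx.startRendering().then((renderedBuffer) => {
  console.log(`Rendered ${renderedBuffer.duration} seconds offline.`);
});
```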
> I'm merely interested in whether it would be conceptually reasonable to have `Atomics.wait` enabled, but it sounds like it may not be worth it in practice.
AFAICT no tests have been performed that substantiate the claims that defining specific methods, e.g., `fetch()`, `WebAssembly.compileStreaming()`, `WebAssembly.instantiateStreaming()`, et al., in `AudioWorkletGlobalScope` would have adverse performance impacts. We should at least run tests covering all of the methods defined in a `Worker` (module type and shared) as if they were defined in `AudioWorkletGlobalScope`, to base conclusions on evidence rather than conjecture about what would occur if those methods and APIs were defined in the worklet scope of AudioWorklet.
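As a starting point for such tests, a simple probe (a hypothetical sketch, not an existing test) can at least report which globals are currently defined in `AudioWorkletGlobalScope`:

```js
// probe-processor.js — loaded via audioWorklet.addModule('probe-processor.js')
class ScopeProbe extends AudioWorkletProcessor {
  constructor() {
    super();
    const names = ['fetch', 'WebAssembly', 'Atomics', 'TextDecoder', 'setTimeout'];
    // Report which of these exist in this scope, e.g.
    // { fetch: false, WebAssembly: true, Atomics: true, ... }
    this.port.postMessage(
      Object.fromEntries(names.map((n) => [n, n in globalThis]))
    );
  }
  process() {
    return false; // probe only, no audio processing needed
  }
}
registerProcessor('scope-probe', ScopeProbe);
```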
AudioWG virtual F2F:
- We really want no differences between Offline and regular.