ryanheise/just_audio

Generalised API for audio effects

ryanheise opened this issue · 6 comments

Is your feature request related to a problem? Please describe.

The equalizer and volume boost features are two examples of audio effects that change the audio signal in real time. I would anticipate other audio effects being added in the future, such as reverb. Rather than polluting the top level AudioPlayer API with individual methods to enable each of these individual effects, it could be more elegant to have a generalised API for audio effects.

Describe the solution you'd like

An AudioEffect class with a subclass for each type of effect.

Describe alternatives you've considered

Polluting AudioPlayer with methods to enable and disable each effect.

Additional context

N/A

The iOS side will depend on #334.

Proposed API:

```dart
abstract class AudioEffect {
  int get id;
  bool get enabled;
  Future<void> setEnabled(bool enabled);
}
```
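To illustrate how a concrete effect might build on this interface, here is a sketch of a hypothetical subclass; the class name, the `setTargetGain` method, and the method-channel forwarding are all illustrative assumptions, not part of the proposal:

```dart
// Hypothetical concrete effect built on the proposed AudioEffect interface.
// Names and parameters below are illustrative assumptions.
class LoudnessEnhancer extends AudioEffect {
  @override
  final int id;
  bool _enabled = false;
  double _targetGain = 0.0;

  LoudnessEnhancer(this.id);

  @override
  bool get enabled => _enabled;

  @override
  Future<void> setEnabled(bool enabled) async {
    _enabled = enabled;
    // Forward the change to the platform side, e.g. over a method channel.
  }

  /// Sets the desired gain (a platform-specific parameter, assumed here).
  Future<void> setTargetGain(double gain) async {
    _targetGain = gain;
    // Forward to the corresponding native effect instance.
  }
}
```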

And we could provide a method in AudioPlayer to set a list of effects to be applied (each of which can later be enabled/disabled dynamically).

On Android, there appears to be no way to control the order in which effects are applied, while on iOS they are chained together, with the output of one feeding into the input of the next. So a unified API that works on both iOS and Android can simply be a list of effects, which is interpreted more like an unordered set on Android. I doubt these effects are order-independent, so it may be worth experimenting to test whether registering or enabling the audio effects in different orders results in different output, or whether Android has a hardcoded order in its audio processing pipeline.

Edit: The following documentation seems to hint that the audio effects pipeline might be hardcoded on Android:

Creating an AudioEffect object will create the corresponding effect engine in the audio framework if no instance of the same effect type exists in the specified audio session. If one exists, this instance will be used.

Edit 2: Looking at the source suggests that effects are applied in the order of creation, but duplicate effects are ignored:

https://www.programmersought.com/article/99361763687/

I'm settling on this design: you pass an ordered set of audio effects into the AudioPlayer constructor, which establishes the audio processing pipeline. All audio effects are disabled by default, i.e. they will be in the pipeline but bypassed. You can enable/disable them on demand by calling a setter on each effect. E.g.

```dart
_player = AudioPlayer(audioEffects: {effect1, effect2, effect3});
await effect1.setEnabled(true);
```

Now, the native resources for each audio effect will be allocated in `load` and deallocated in `stop` or `dispose`.
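Under this lifecycle, a typical session might look like the following sketch; `setUrl`, `play`, `stop` and `dispose` are existing AudioPlayer methods, while the comments describing when effect resources come and go reflect the proposed behaviour above:

```dart
final player = AudioPlayer(audioEffects: {effect1});
// Native effect resources are allocated when a source is loaded.
await player.setUrl('https://example.com/track.mp3');
// The effect sits in the pipeline but is bypassed until enabled.
await effect1.setEnabled(true);
await player.play();
// Native effect resources are deallocated again on stop/dispose.
await player.stop();
await player.dispose();
```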

Maybe in the future people will want finer control over when the native resources are allocated and deallocated but I think this should suffice to start with.

Now it's looking more like this:

```dart
_player = AudioPlayer(
  audioPipeline: AudioPipeline(
    androidAudioEffects: [effect1, effect2],
    darwinAudioEffects: [effect3, effect4, effect5],
  ),
);
```

So in essence, each platform will have its own audio effects pipeline, because ultimately each platform offers different native audio effects.

There may be a subset of functionality common to both platforms, and there could be helper methods to, for example, create a cross-platform equalizer, while still exposing the native audio effects for flexibility.

This has now been committed on the dev branch. Currently two Android effects are supported: LoudnessEnhancer and Equalizer. There are no audio effects on iOS yet; that will need to wait until #334.
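For reference, usage of the dev-branch effects might look roughly like the sketch below; the class names `AndroidLoudnessEnhancer` and `AndroidEqualizer` and the `setTargetGain` call are assumptions inferred from the effect names mentioned above, not confirmed signatures:

```dart
import 'package:just_audio/just_audio.dart';

// Assumed Dart wrappers for the two supported Android effects.
final loudnessEnhancer = AndroidLoudnessEnhancer();
final equalizer = AndroidEqualizer();

final player = AudioPlayer(
  audioPipeline: AudioPipeline(
    androidAudioEffects: [loudnessEnhancer, equalizer],
  ),
);

Future<void> boostVolume() async {
  await loudnessEnhancer.setEnabled(true);
  // The gain unit is an assumption; Android's native API uses millibels.
  await loudnessEnhancer.setTargetGain(1.0);
}
```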

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs, or use StackOverflow if you need help with just_audio.