Equalizer
nanedev opened this issue · 24 comments
Hello @ryanheise,
First of all thank you for this great plugin!
Currently I am working on an equalizer for my project. I am using the just_audio plugin for audio streaming, and for the equalizer I am using this lib: https://pub.dev/packages/equalizer.
My question: is there a way to get the audio session ID? It is needed to connect the current audio to the equalizer in the plugin I use. Or do you maybe have a suggestion for another way of creating an equalizer with the just_audio plugin?
Thanks
(Edit by @ryanheise: the Android side is now implemented in the class AndroidEqualizer. There is a WIP to rewrite the iOS side based on AVAudioEngine, and this includes an equalizer for iOS: #784.)
That's a good question. There isn't currently a way. We discussed it in #131, although I haven't implemented it yet.
Whether or not we have two separate plugins or a merged plugin, I can definitely see no harm in providing a getter in just_audio to get the audio session id. I'll look into this when I next have a chance.
I have just added an API to get the current AudioSession ID: AudioPlayer.androidAudioSessionId and AudioPlayer.androidAudioSessionIdStream.
The stream is useful because the AudioSession ID can change over time. For example, it changes whenever the AudioAttributes are changed.
Please try it out on Git master and let me know how it goes.
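For anyone wanting to try this out, here is a minimal sketch of how the new getters might be wired up to an external equalizer plugin. The `attachEqualizer` function is a placeholder for whatever API your equalizer plugin of choice exposes; only the just_audio calls follow the API described above.

```dart
import 'package:just_audio/just_audio.dart';

Future<void> setupEqualizer() async {
  final player = AudioPlayer();

  // The session ID can change over time (e.g. whenever the AudioAttributes
  // change), so listen to the stream rather than reading the getter once.
  player.androidAudioSessionIdStream.listen((sessionId) {
    if (sessionId != null) {
      attachEqualizer(sessionId);
    }
  });

  await player.setUrl('https://example.com/track.mp3');
  player.play();
}

// Hypothetical hook: forward the session ID to your equalizer plugin here.
void attachEqualizer(int sessionId) {
  // e.g. call into the pub.dev `equalizer` package (check its docs for
  // the exact method to open an equalizer for a given audio session).
}
```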
@rohansohonee speaking of which, on the question of whether this is better as two separate plugins or building it in as a feature of just_audio directly, another deciding factor could be the iOS implementation. Have you taken a look at https://stackoverflow.com/questions/30218449/xcode-auipodeq-augraph to assess the situation? If it looks too difficult to do as a separate plugin, let me know and I guess we may need to look at implementing in just_audio.
After a cursory look at the linked Apple example (https://developer.apple.com/library/ios/samplecode/AudioTapProcessor/Introduction/Intro.html), it seems that the iOS APIs can't simply attach to an audio session by ID and process audio effects independently; they need to be tightly coupled with the audio player. Let me know what you think.
I have had a "need help for iOS contributions" note in the description of the equalizer plugin, since I do not own an Apple machine and have very little knowledge of Objective-C/Swift. Since just_audio supports a lot of platforms (including the web), it seems best to have the equalizer baked into this plugin. The architecture for implementing the equalizer will now be very different from what I have implemented so far, as I followed the Android docs and assumed the audio session ID was the main parameter for adding audio effects.
Should I remove my equalizer plugin and have the Android code/docs migrated over to just_audio? How can I contribute?
have very little knowledge of Objective-C/Swift
That sounds like me a year ago :-) Ultimately I had to learn a bit of Objective-C to get the iOS implementation off the ground and to build a base that others could then contribute to.
I did spend some further time looking at the link above, and I may have been wrong about the tight coupling. The example is rather complicated, so I haven't completely figured it out yet, but there is a chance it could work perfectly well as an independent plugin.
I just stumbled upon the API needed to implement this on iOS, and it seems that yes, it will probably be easier to do it inside of just_audio.
First, we create an AVMutableAudioMix and set it in AVPlayerItem.audioMix. To this audio mix's inputParameters array we add an instance of AVMutableAudioMixInputParameters. On this instance we can access the audioTapProcessor, through which it should be possible to implement an EQ.
This same API should also allow for the implementation of a visualiser.
The implementation technique on iOS will be related to #97 (just putting an issue reference here for later when I come back to look at this.)
Although I haven't implemented this yet, I've now got a working implementation of the visualizer (see the visualizer branch). This brings me one step closer to being able to write iOS code for an equalizer, since it would be written using the same underlying iOS APIs.
Equalizer functionality would be amazingly helpful for my app. I can't help with the code but perhaps I can contribute a bounty to get this feature bumped up the priority list?
Thanks for offering to support development @leidig54 !
Things are a bit up in the air at the moment while the transition to null safety is happening, and stable null safety is supposed to be coming soon. Probably this feature will end up being based on the visualizer branch after the nnbd commits are merged into it.
If this feature is indeed tied to the visualizer feature on the iOS side, then another way people could help move things along is to help with iOS testing on the visualizer branch (#97 ) - in particular, checking for retain cycles.
Sure. Unfortunately I don't have the know-how to check for retain cycles. I'll keep an eye on this thread anyway and pipe back up when it looks like the focus can be on the visualizer etc. Thanks for a great package.
It looks like this feature request is quite popular. Since a number of popular feature requests involve audio processing, I think that it may be time to switch to an AVAudioEngine-based iOS implementation. I have opened #334 to track progress on that front. The iOS equalizer implementation can then be completed after that.
This is not going to be an easy task, so I wouldn't expect results soon. The earliest estimate would be 1-2 months if I weren't distracted by other things (but I will probably have at least one distraction, which is the upcoming release of audio_service 0.18.0).
(Copying this comment from another issue to get broader interest)
Is this supported on iOS currently?
The waveform visualizer is implemented on iOS but not pitch. You can track the pitch feature here: #329
There is a big question at this point whether to continue with the current AVQueuePlayer-based implementation or switch to an AVAudioEngine-based implementation. For pitch scaling, I really want to take advantage of AVAudioEngine's built-in features, but that requires a rewrite of the iOS side - see #334 and this is a MUCH bigger project.
I would really like to see an AVAudioEngine-based solution see the light of day, but it will probably not happen if I work on it alone. If anyone would like to help, maybe we can pull it off with some solid open source teamwork. One of the attractive solutions is to use AudioKit which is a library built on top of AVAudioEngine which also provides access to pitch adjustment AND provides a ready-made API for a visualizer and equalizer. That is, it provides us with everything we need - BUT it is written in Swift and so that involves a language change and it means we may need to deal with complaints that old projects don't compile (we'd need to provide extra instructions on how to update their projects to be Swift-compatible).
Would anyone like to help me with this? (Please reply on #334)
Hi, AudioPlayer.androidAudioSessionId and AudioPlayer.androidAudioSessionIdStream are always null on Android. All permissions, like
<uses-permission android:name="android.permission.WAKE_LOCK"/>
<uses-permission android:name="android.permission.FOREGROUND_SERVICE"/>
<uses-permission android:name="android.permission.RECORD_AUDIO"/>
are granted.
I have finally completed #398, a generalised API for audio effects. Initially, I have implemented two Android audio effects, Equalizer and LoudnessEnhancer, but this generalised API forms the basis for adding more.
There are not yet any audio effects on the iOS side; this will depend on #334. The audio effects API has been set up to allow a different set of supported audio effects per platform, so that apps can tap into platform-specific features if they desire. On iOS, if we go ahead with an AudioKit-based implementation, then instead of having something equivalent to an Equalizer effect, you will have an AKEqualizerFilter, which corresponds to a single band. Eventually, I may be able to provide some cross-platform versions that take the intersection of features on both platforms.
So, the next blocking issue for this is #334 .
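As a rough sketch of how the generalised effects API described above can be used on Android (based on the AndroidEqualizer class mentioned in the edit at the top of this issue; treat the exact parameter and method names as approximate and check the example/lib directory for the authoritative usage):

```dart
import 'package:just_audio/just_audio.dart';

Future<void> playWithEffects() async {
  final equalizer = AndroidEqualizer();
  final loudnessEnhancer = AndroidLoudnessEnhancer();

  // Effects are supplied per platform when constructing the player,
  // reflecting the platform-specific design of the effects API.
  final player = AudioPlayer(
    audioPipeline: AudioPipeline(
      androidAudioEffects: [loudnessEnhancer, equalizer],
    ),
  );

  await equalizer.setEnabled(true);

  // Inspect the device's bands and flatten the curve as a starting point.
  final parameters = await equalizer.parameters;
  for (final band in parameters.bands) {
    await band.setGain(0.0);
  }

  await player.setUrl('https://example.com/track.mp3');
  player.play();
}
```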
The Android implementation is now published in release 0.8.0.
On pub.dev it is written that your lib has an equalizer for Android, but I didn't understand how to use it.
This lib has unsound null safety: https://pub.dev/packages/equalizer
So I prefer not to use it.
@zatovagul at the moment, the best way to understand how to use it is to study the example in the example/lib directory.
Sorry, how can I use this package with a visualizer? When I use the latest version, there is no visualizer functionality, but when I try to use Ettipats visualizer branch, there is no AndroidEqualizer functionality?
Hey, any update on this topic?
Yes, there has been some progress in #334 with a contributed AVAudioEngine-based implementation, and the good news is that the equalizer has been implemented for iOS.
Wow, that's amazing, this is an awesome project! Is it already in the current version of the package, or does it need to be installed separately?