Simple audio player for sync / async chunked audio streams.
You can add swift-chunked-audio-player to an Xcode project by adding it to your project as a package.

If you want to use swift-chunked-audio-player in a SwiftPM project, it's as simple as adding it to your `Package.swift`:

```swift
dependencies: [
    .package(url: "https://github.com/mihai8804858/swift-chunked-audio-player", from: "1.0.0")
]
```

And then adding the product to any target that needs access to the library:

```swift
.product(name: "ChunkedAudioPlayer", package: "swift-chunked-audio-player"),
```
`ChunkedAudioPlayer` uses the following approach to stream real-time audio:

- Parse `AudioStreamBasicDescription` and split data chunks into audio packets using `AudioFileStreamOpen` and `AudioFileStreamParseBytes`
- Convert audio packets into `CMSampleBuffer`
- Enqueue and play the sample buffers using `AVSampleBufferAudioRenderer` and `AVSampleBufferRenderSynchronizer`
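The first step above can be sketched with the Audio Toolbox C API. This is only an illustrative outline of the parsing stage, not the library's actual implementation — the real player also wraps the parsed packets into `CMSampleBuffer`s and enqueues them for rendering, which is omitted here:

```swift
import AudioToolbox

var streamID: AudioFileStreamID?

// Open a parser; the two callbacks fire as bytes are consumed.
let status = AudioFileStreamOpen(nil, { _, streamID, propertyID, _ in
    // Called when the parser discovers a stream property, e.g. the data format.
    guard propertyID == kAudioFileStreamProperty_DataFormat else { return }
    var asbd = AudioStreamBasicDescription()
    var size = UInt32(MemoryLayout<AudioStreamBasicDescription>.size)
    AudioFileStreamGetProperty(streamID, propertyID, &size, &asbd)
    // asbd can now be used to build a CMAudioFormatDescription.
}, { _, numberBytes, numberPackets, inputData, packetDescriptions in
    // Called with parsed audio packets — these are what get converted
    // into CMSampleBuffers and handed to AVSampleBufferAudioRenderer.
}, kAudioFileMP3Type, &streamID)

// Feed each received chunk into the parser as it arrives.
func parse(_ chunk: Data) {
    guard let streamID else { return }
    chunk.withUnsafeBytes { buffer in
        _ = AudioFileStreamParseBytes(streamID, UInt32(buffer.count), buffer.baseAddress, [])
    }
}
```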
- Create an instance of `AudioPlayer`:

```swift
private let player = AudioPlayer()
```
- Get the audio data stream (can be either `AsyncThrowingStream` or `AnyPublisher`):

```swift
let stream: AsyncThrowingStream<Data, Error> = ...
```
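One way to produce such a stream — not part of the library, just a sketch — is to chunk an HTTP response body with `URLSession`; the URL and the 4 KB chunk size below are placeholders:

```swift
import Foundation

func audioChunks(from url: URL) -> AsyncThrowingStream<Data, Error> {
    AsyncThrowingStream { continuation in
        let task = Task {
            do {
                let (bytes, _) = try await URLSession.shared.bytes(from: url)
                var chunk = Data()
                for try await byte in bytes {
                    chunk.append(byte)
                    if chunk.count >= 4096 {
                        // Flush every 4 KB so the player can start early.
                        continuation.yield(chunk)
                        chunk.removeAll(keepingCapacity: true)
                    }
                }
                if !chunk.isEmpty { continuation.yield(chunk) }
                continuation.finish()
            } catch {
                continuation.finish(throwing: error)
            }
        }
        // Cancel the download if the consumer stops iterating.
        continuation.onTermination = { _ in task.cancel() }
    }
}
```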
- Start playing the audio stream:

```swift
// type parameter is optional, but recommended (if the stream type is known)
player.start(stream, type: kAudioFileMP3Type)
```
- Listen for changes:

```swift
player.$currentState.sink { state in
    // handle player state
}.store(in: &bag)

player.$currentRate.sink { rate in
    // handle player rate
}.store(in: &bag)

player.$currentDuration.sink { duration in
    // handle player duration
}.store(in: &bag)

player.$currentTime.sink { time in
    // handle player time
}.store(in: &bag)

player.$currentError.sink { error in
    if let error {
        // handle player error
    }
}.store(in: &bag)
```
- Control playback:

```swift
// Set stream volume
player.volume = 0.5

// Set muted
player.isMuted = true

// Set stream rate
player.rate = 0.5

// Pause current stream
player.pause()

// Resume current stream
player.resume()

// Stop current stream
player.stop()

// Rewind 5 seconds
player.rewind(CMTime(seconds: 5.0, preferredTimescale: 1000))

// Forward 5 seconds
player.forward(CMTime(seconds: 5.0, preferredTimescale: 1000))

// Seek to specific time
player.seek(to: CMTime(seconds: 60, preferredTimescale: 1000))
```
- SwiftUI Support

`AudioPlayer` conforms to `ObservableObject`, so it can be easily integrated into a SwiftUI `View` and automatically update the UI when properties change (use `@StateObject` when the view owns the player, so it isn't recreated on every view update):

```swift
struct ContentView: View {
    @StateObject private var player = AudioPlayer()

    var body: some View {
        Text("State \(player.currentState)")
        Text("Rate \(player.currentRate)")
        Text("Time \(player.currentTime)")
        Text("Duration \(player.currentDuration)")
        if let error = player.currentError {
            Text("Error \(error)")
        }
    }
}
```
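The playback controls can be wired into the same kind of view. A minimal sketch (the view name is illustrative; it assumes the player instance is owned by the view):

```swift
import SwiftUI

struct PlayerControlsView: View {
    @StateObject private var player = AudioPlayer()

    var body: some View {
        HStack {
            // Each button forwards to the corresponding playback control.
            Button("Pause") { player.pause() }
            Button("Resume") { player.resume() }
            Button("Stop") { player.stop() }
        }
    }
}
```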
This library is released under the MIT license. See LICENSE for details.