StreamPack

Live streaming for RTMP and Secure Reliable Transport (SRT) on Android


StreamPack: RTMP and SRT live streaming SDK for Android

StreamPack is a modular live streaming library for Android focusing on audio/video quality and a great developer experience. It is designed to be used in live streaming apps and games.

Simplify your live streaming experience.

Setup

Get the latest StreamPack core artifacts from Maven Central:

dependencies {
    implementation 'io.github.thibaultbee:streampack:2.5.2'
}

If you want to use RTMP, you need to add the following dependency:

dependencies {
    implementation 'io.github.thibaultbee:streampack-extension-rtmp:2.5.2'
}

If you want to use SRT, you need to add the following dependency:

dependencies {
    implementation 'io.github.thibaultbee:streampack-extension-srt:2.5.2'
}

If you use both RTMP and SRT, you might hit a conflict on libssl.so and libcrypto.so because both extensions bundle them as native dependencies. To solve this, add the following to your build.gradle:

android {
    packagingOptions {
        pickFirst '**/*.so'
    }
}

Features

  • Video:
    • Source: Cameras or Screen recorder
    • Orientation: portrait or landscape
    • Codec: HEVC/H.265 or AVC/H.264
    • Configurable bitrate, resolution, framerate (tested up to 60), encoder level, encoder profile
    • Video only mode
    • Device video capabilities
  • Audio:
    • Codec: AAC-LC or Opus
    • Configurable bitrate, sample rate, stereo/mono, data format
    • Processing: Noise suppressor or echo cancellation
    • Audio only mode
    • Device audio capabilities
  • Network: RTMP/RTMPS or SRT
    • Ultra low-latency based on SRT
    • Network adaptive bitrate mechanism for SRT

Samples

Camera and audio sample

For a source code example showing how to use the camera and audio streamers, check the sample app directory. On first launch, you will have to set the RTMP URL or SRT server IP in the settings menu.

Screen recorder

For a source code example showing how to use the screen recorder streamer, check the sample screen recorder directory. On first launch, you will have to set the RTMP URL or SRT server IP in the settings menu.

Tests with a FFmpeg server

FFmpeg was used as the SRT server, demuxer and decoder for these tests.

RTMP

Tell FFplay to listen on IP 0.0.0.0 and port 1935:

ffplay -listen 1 -i rtmp://0.0.0.0:1935/s/streamKey

In the StreamPack sample app settings, set Endpoint -> Type to Stream to a remote RTMP device, then set the server URL to rtmp://serverip:1935/s/streamKey. At this point, the StreamPack sample app should successfully send audio and video frames, and you should be able to watch the live stream on the FFplay side.

SRT

Check how to build FFmpeg with libsrt in the SRT CookBook. Tell FFplay to listen on IP 0.0.0.0 and port 9998:

ffplay -fflags nobuffer srt://0.0.0.0:9998?mode=listener

In the StreamPack sample app settings, set the server IP to your server IP and the server port to 9998. At this point, the StreamPack sample app should successfully send audio and video frames, and you should be able to watch the live stream on the FFplay side.

Quick start

If you want to create a new application, you should use the StreamPack boilerplate template. In 5 minutes, you will be able to stream live video to your server.

  1. Add permissions to your AndroidManifest.xml and request them in your Activity/Fragment.

  2. Create a SurfaceView to display the camera preview in your layout

As a camera preview, you can use a SurfaceView, a TextureView or any View that can provide a Surface.

To simplify integration, StreamPack provides a StreamerSurfaceView.

<layout>
    <io.github.thibaultbee.streampack.views.StreamerSurfaceView
        android:id="@+id/preview"
        android:layout_width="match_parent" 
        android:layout_height="match_parent"
        app:cameraFacingDirection="back"
        app:enableZoomOnPinch="true" />
</layout>

app:cameraFacingDirection can be back to start the preview on the first back camera, or front to start it on the first front camera. app:enableZoomOnPinch is a boolean that enables pinch-to-zoom gestures.

  3. Instantiate the streamer (the main live streaming class)
val streamer = CameraSrtLiveStreamer(context = requireContext())
  4. Configure the audio and video settings
val audioConfig = AudioConfig(
    startBitrate = 128000,
    sampleRate = 44100,
    channelConfig = AudioFormat.CHANNEL_IN_STEREO
)

val videoConfig = VideoConfig(
    startBitrate = 2000000, // 2 Mb/s
    resolution = Size(1280, 720),
    fps = 30
)

streamer.configure(audioConfig, videoConfig)
  5. Attach the streamer to the camera preview
/**
 * preview: where to display the preview. It could be a SurfaceView, a TextureView, ...
 */
preview.streamer = streamer
  6. Start the live stream
streamer.startStream(ip, port)
  7. Stop and release
streamer.stopStream()
streamer.disconnect()
streamer.stopPreview() // The StreamerSurfaceView will automatically stop the preview
streamer.release()

For more detailed explanation, check out the API documentation.

Permissions

You need to add the following permissions in your AndroidManifest.xml:

<manifest>
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.INTERNET" />
    <!-- Application requires android.permission.WRITE_EXTERNAL_STORAGE only for IFileStreamer implementations -->
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
</manifest>

Your application also has to request the following dangerous permissions at runtime: android.permission.RECORD_AUDIO, android.permission.CAMERA and android.permission.WRITE_EXTERNAL_STORAGE (the latter only for IFileStreamer implementations).
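As a minimal sketch of the runtime request: the helper below is illustrative only (not part of StreamPack), while the commented usage relies on the standard AndroidX permission APIs.

```kotlin
// Illustrative helper, not part of StreamPack: compute which of the
// dangerous permissions still have to be requested at runtime.
fun missingPermissions(
    required: List<String>,
    isGranted: (String) -> Boolean
): List<String> = required.filterNot(isGranted)

// In an Activity you would plug in the real check and request the rest:
//   val toRequest = missingPermissions(
//       listOf(Manifest.permission.RECORD_AUDIO, Manifest.permission.CAMERA)
//   ) { ContextCompat.checkSelfPermission(this, it) == PackageManager.PERMISSION_GRANTED }
//   if (toRequest.isNotEmpty()) {
//       ActivityCompat.requestPermissions(this, toRequest.toTypedArray(), 1)
//   }
```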

For the Play Store, your application should also declare the following in its AndroidManifest.xml:

<manifest>
    <uses-feature android:name="android.hardware.camera" android:required="true" />
    <uses-feature android:name="android.hardware.camera.autofocus" android:required="false" />
</manifest>

Tips

RTMP or SRT

RTMP and SRT are both live streaming protocols. SRT is a modern UDP-based protocol; it is reliable and ultra low latency. RTMP is a TCP-based protocol; it is also reliable but only low latency. There are already plenty of comparisons on the Internet, so here is a summary:

SRT:

  • Ultra low latency (< 1s)
  • HEVC support through MPEG-TS

RTMP:

  • Low latency (2-3s)
  • HEVC not officially supported (the specification has been abandoned by its creator)

So, the main question is: "which protocol to use?" It is easy: if your server has SRT support, use SRT otherwise use RTMP.
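If your app lets users paste a server URL, the choice can even be derived from the URL scheme. The helper below is illustrative only, not a StreamPack API:

```kotlin
// Illustrative only: infer the streaming protocol from a server URL
// so the matching streamer flavour can be picked.
fun protocolOf(url: String): String = when {
    url.startsWith("srt://") -> "SRT"
    url.startsWith("rtmps://") -> "RTMPS"
    url.startsWith("rtmp://") -> "RTMP"
    else -> throw IllegalArgumentException("Unsupported scheme in $url")
}
```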

Streamers

Let's start with some definitions! Streamers are classes that represent a live streaming pipeline: capture, encode, mux and send. They come in multiple flavours: different audio and video sources, different endpoints, different functionalities... 3 types of base streamers are available:

  • CameraStreamers: for streaming from camera
  • ScreenRecorderStreamers: for streaming from screen
  • AudioOnlyStreamers: for streaming audio only

You can find specific streamers for files or for live streaming. Currently, there are 2 main endpoints:

  • FileStreamer: for streaming to file
  • LiveStreamer: for streaming to a RTMP or a SRT live streaming server

For example, you can use AudioOnlyFlvFileStreamer to stream from the microphone only to an FLV file. As another example, you can use CameraRtmpLiveStreamer to stream from the camera to an RTMP server.
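The class names follow a {Source}{Format or protocol}{Endpoint}Streamer convention, which the illustrative helper below makes explicit (it is not a StreamPack API, just a demonstration of the naming scheme):

```kotlin
// Illustrative only: streamer class names are composed as
// {Source}{Format-or-protocol}{Endpoint}Streamer.
fun streamerClassName(source: String, transport: String, endpoint: String): String =
    "$source$transport${endpoint}Streamer"
```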

If a streamer is missing, you can of course create your own. You should definitely submit it in a pull request.

Get device capabilities

Have you ever wondered: "What are the supported resolutions of my cameras?" or "What are the supported sample rates of my audio codecs?" Helper classes are made for this. Every Streamer comes with a specific Helper object (I am starting to have the feeling I repeat myself):

val helper = streamer.helper

Get extended settings

If you are looking for more settings on a streamer, like the exposure compensation of your camera, have a look at the Settings class. All together: "Every Streamer comes with a specific Settings object":

streamer.settings

For example, if you want to change the exposure compensation of your camera, on a CameraStreamer you can do it like this:

streamer.settings.camera.exposure.compensation = value

Moreover, you can check the exposure range and step with:

streamer.settings.camera.exposure.availableCompensationRange
streamer.settings.camera.exposure.availableCompensationStep
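Before writing a compensation value, it is safer to clamp it to the advertised range. The clamping helper below is illustrative; the settings properties in the usage comment are the ones shown above, and the lower/upper accessors are assumed to behave like android.util.Range:

```kotlin
// Illustrative helper: keep the requested compensation index inside the
// device range before applying it.
fun clampCompensation(desired: Int, lower: Int, upper: Int): Int =
    desired.coerceIn(lower, upper)

// Usage on a CameraStreamer (assumption: availableCompensationRange
// exposes lower/upper bounds, as android.util.Range does):
//   val range = streamer.settings.camera.exposure.availableCompensationRange
//   streamer.settings.camera.exposure.compensation =
//       clampCompensation(desired, range.lower, range.upper)
```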

Screen recorder Service

To record the screen, you have to use one of the ScreenRecorderStreamers inside an Android Service. To simplify this integration, StreamPack provides several ScreenRecorderService classes. Extend one of these classes and override onNotification to customise the notification.
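A sketch of such a service might look like the commented skeleton below; the base class name and the exact onNotification signature are assumptions, so check the API documentation before copying it.

```kotlin
// Hedged sketch: the base class name and the onNotification signature are
// assumptions -- only the override point (onNotification) is documented above.
//
// class MyScreenRecorderService : ScreenRecorderRtmpLiveService() {
//     override fun onNotification(): Notification? {
//         // Build the foreground notification shown while streaming,
//         // e.g. with NotificationCompat.Builder(this, CHANNEL_ID)
//         //     .setContentTitle(notificationTitle(isLive = true))
//     }
// }

// Pure helper used above (illustrative): title for the foreground notification.
fun notificationTitle(isLive: Boolean): String =
    if (isLive) "Live: screen is streaming" else "Preparing screen stream..."
```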

Android SDK version

Even though the StreamPack SDK supports a minSdkVersion of 21, I strongly recommend setting the minSdkVersion of your application to a higher version (the higher, the better!) for better performance.

Licence

Copyright 2021 Thibault B.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

   http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.