Experiments with HLS video, multiple subtitles & audio tracks on iOS, Android with React Native
HTTP Live Streaming (HLS) - Apple Developer
HTTP Live Streaming: HLS Player for Android | Toptal
H264info.com | Downloads and Information for H.264 Movies
Some example HLS videos from Apple:
Examples - HTTP Live Streaming - Apple Developer
WWDC video (This video is also used in the app and demos):
What’s New in HTTP Live Streaming - WWDC 2016 - Videos - Apple Developer
Useful tools provided by Apple:
- Video tracks (video file or playlists of video file fragments)
- Audio tracks (audio file or playlists of audio file fragments)
- Text tracks (subtitle file or playlists of subtitle file fragments)
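As a hedged illustration of the "playlists of fragments" idea (all file names and durations hypothetical), a media playlist for one track is just an ordered list of segment files:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:9.9,
fileSequence0.ts
#EXTINF:9.9,
fileSequence1.ts
#EXT-X-ENDLIST
```

Audio-only and subtitle playlists follow the same structure, just pointing at audio or VTT fragments instead.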
Note: iOS only supports subtitles in WebVTT (VTT) format
- iOS - Natively supported
- Android - Supported using ExoPlayer
- Web - HLS.js seems to be popular. Check this DEMO
Note: The HLS examples from Apple didn't work with the web version.
Note: React Native Video already supports HLS on Android, as it uses ExoPlayer
- Multiple streams that “auto switch” based on network connectivity
- Multiple audio tracks supported
- Multiple subtitles supported
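A sketch of how a master playlist ties these together (group IDs, bandwidths, and URIs are hypothetical): `EXT-X-MEDIA` entries declare the alternate audio and subtitle renditions, and each `EXT-X-STREAM-INF` variant references them, which is what enables the bandwidth-based "auto switch" between video streams:

```
#EXTM3U
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="aud",NAME="English",LANGUAGE="en",DEFAULT=YES,URI="audio/en/prog.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="aud",NAME="Français",LANGUAGE="fr",URI="audio/fr/prog.m3u8"
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",NAME="English",LANGUAGE="en",URI="subs/en/prog.m3u8"
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",NAME="Deutsch",LANGUAGE="de",URI="subs/de/prog.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=800000,AUDIO="aud",SUBTITLES="subs"
video/low/prog.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,AUDIO="aud",SUBTITLES="subs"
video/high/prog.m3u8
```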
- Using React Native
- Works on Android and iOS
- Videos took much longer to load on Android (on the Emulator)
textTracks
- Array of subtitles: returns different lengths on Android and iOS. Android returns subtitle tracks of all types; iOS returns only VTT tracks.
- Properties of the textTracks array also differ between platforms: language was “English” on iOS but “en” on Android.
- On Android (at least on the Emulator), changing the subtitle at runtime didn’t update it immediately. As with the video, the change takes effect after some delay, probably network related?
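Given these platform differences, a small normalization step before rendering a track picker seems useful. Below is a hedged sketch in TypeScript: the track shape and helper names are assumptions (not the actual react-native-video API), and the language map only covers the cases observed; it filters out the non-VTT tracks Android reports and maps full language names to codes.

```typescript
// Rough shape of an entry in the textTracks array (assumed; real fields may vary).
type TextTrack = { index: number; language?: string; type?: string };

// Map the full names iOS reports ("English") to the codes Android reports ("en").
// Hypothetical, partial map for illustration only.
const LANGUAGE_CODES: Record<string, string> = {
  english: "en",
  german: "de",
  french: "fr",
};

function normalizeTracks(tracks: TextTrack[]): TextTrack[] {
  return tracks
    // Drop non-VTT tracks so Android matches what iOS exposes.
    .filter((t) => !t.type || t.type.toLowerCase().includes("vtt"))
    // Normalize language to a lowercase code on both platforms.
    .map((t) => {
      const lang = (t.language ?? "").toLowerCase();
      return { ...t, language: LANGUAGE_CODES[lang] ?? lang };
    });
}

// Example: iOS-style and Android-style lists normalize to the same shape.
const iosTracks = [{ index: 0, language: "English", type: "text/vtt" }];
const androidTracks = [
  { index: 0, language: "en", type: "text/vtt" },
  { index: 1, language: "en", type: "application/ttml+xml" },
];
console.log(normalizeTracks(iosTracks));     // [ { index: 0, language: 'en', type: 'text/vtt' } ]
console.log(normalizeTracks(androidTracks)); // [ { index: 0, language: 'en', type: 'text/vtt' } ]
```

With tracks normalized this way, the same selection UI can be driven on both platforms instead of branching per OS.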