Subtitles

Experiments with HLS video, multiple subtitles & audio tracks on iOS, Android with React Native

HTTP Live Streaming (HLS)

Resources:

HTTP Live Streaming (HLS) - Apple Developer

HTTP Live Streaming: HLS Player for Android | Toptal

H264info.com | Downloads and Information for H.264 Movies

Some example HLS videos from Apple:

Examples - HTTP Live Streaming - Apple Developer

WWDC video (This video is also used in the app and demos):

What’s New in HTTP Live Streaming - WWDC 2016 - Videos - Apple Developer

Useful tools provided by Apple:

Using HTTP Live Streaming

Briefly:

HLS uses “playlist” (.m3u8) files to describe the available media:

  • Video tracks (video file or playlists of video file fragments)
  • Audio tracks (audio file or playlists of audio file fragments)
  • Text tracks (subtitle file or playlists of subtitle file fragments)

Note: iOS only supports subtitles in WebVTT (.vtt) format
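As a sketch, a master playlist tying these tracks together could look like the following (all URIs, group IDs, and attribute values here are illustrative, not taken from the demo):

```m3u8
#EXTM3U
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="aud",LANGUAGE="en",NAME="English",DEFAULT=YES,AUTOSELECT=YES,URI="audio/en/prog_index.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="aud",LANGUAGE="fr",NAME="French",DEFAULT=NO,AUTOSELECT=YES,URI="audio/fr/prog_index.m3u8"
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",LANGUAGE="en",NAME="English",DEFAULT=NO,AUTOSELECT=YES,URI="subtitles/en/prog_index.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=2000000,RESOLUTION=1280x720,AUDIO="aud",SUBTITLES="subs"
video/720p/prog_index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360,AUDIO="aud",SUBTITLES="subs"
video/360p/prog_index.m3u8
```

Each `#EXT-X-STREAM-INF` variant references the audio and subtitle groups by ID, which is what lets the player auto-switch video quality while keeping the selected audio and text tracks.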

Platform Support:

Note: The HLS examples from Apple didn't work with the web version.

Note: React Native Video already supports HLS on Android, as it uses ExoPlayer under the hood

Advantages:

  • Multiple streams that “auto switch” based on network connectivity
  • Multiple audio tracks supported
  • Multiple subtitles supported

About this demo

  • Using React Native
  • Works on Android and iOS
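A minimal sketch of how such a player can be wired up with react-native-video; the stream URL is a placeholder, and the `selectedTextTrack` / `onLoad` shapes follow that library's documented API:

```javascript
import React from 'react';
import Video from 'react-native-video';

const Player = () => (
  <Video
    // Placeholder URL - point this at a real HLS master playlist
    source={{ uri: 'https://example.com/master.m3u8' }}
    style={{ width: '100%', aspectRatio: 16 / 9 }}
    // Select an embedded subtitle track by language code;
    // { type: 'disabled' } would turn subtitles off
    selectedTextTrack={{ type: 'language', value: 'en' }}
    // onLoad reports the tracks the native player actually exposes
    onLoad={({ textTracks, audioTracks }) =>
      console.log('text tracks:', textTracks, 'audio tracks:', audioTracks)
    }
  />
);

export default Player;
```

Inspecting the `onLoad` payload is the easiest way to see the platform differences described in the Gotchas below.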

Gotchas and Notes

  • Videos took much longer to load on Android (at least on the emulator)
  • textTracks (the array of subtitle tracks) has a different length per platform: Android returns tracks of all types, while iOS returns only VTT tracks
  • The properties of the textTracks entries also differ: language was “English” on iOS but “en” on Android
  • On Android (at least on the emulator), changing the subtitle at runtime didn’t update the subtitle immediately. As with the video itself, changes are reflected after some time, probably network related?
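Since iOS reports a display name (“English”) where Android reports a code (“en”), track-selection code needs to normalize the `language` field before comparing. A hedged sketch of such a helper; the name-to-code table is illustrative, not exhaustive:

```javascript
// Map a few display names to two-letter codes so track-selection logic
// can treat iOS ("English") and Android ("en") values uniformly.
// Illustrative table only - extend as needed for your content.
const NAME_TO_CODE = {
  english: 'en',
  french: 'fr',
  german: 'de',
  spanish: 'es',
  japanese: 'ja',
};

function normalizeLanguage(value) {
  if (!value) return undefined;
  const lower = value.toLowerCase();
  // Already looks like a code ("en", "en-US"): keep the primary subtag
  if (/^[a-z]{2}(-[a-z0-9]+)?$/i.test(value)) return lower.split('-')[0];
  return NAME_TO_CODE[lower] || lower;
}

console.log(normalizeLanguage('English')); // 'en'
console.log(normalizeLanguage('en-US'));   // 'en'
```

With this, `textTracks.find(t => normalizeLanguage(t.language) === 'en')` behaves the same on both platforms.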