Play HLS, DASH, and future HTTP streaming protocols with video.js, even where they're not natively supported.
Included in video.js 7 by default! See the video.js 7 blog post
Maintenance Status: Stable
Video.js Compatibility: 7.x, 8.x
- Installation
- Contributing
- Troubleshooting
- Talk to us
- Getting Started
- Compatibility
- Documentation
- Options
- How to use
- List
- withCredentials
- useCueTags
- parse708captions
- overrideNative
- playlistExclusionDuration
- maxPlaylistRetries
- bandwidth
- useBandwidthFromLocalStorage
- enableLowInitialPlaylist
- limitRenditionByPlayerDimensions
- useDevicePixelRatio
- customPixelRatio
- allowSeeksWithinUnsafeLiveWindow
- customTagParsers
- customTagMappers
- cacheEncryptionKeys
- handlePartialData
- liveRangeSafeTimeDelta
- useNetworkInformationApi
- useDtsForTimestampOffset
- useForcedSubtitles
- captionServices
- Runtime Properties
- Events
- VHS Usage Events
- In-Band Metadata
- Segment Metadata
- Object as Source
- Options
- Hosting Considerations
- Known Issues and Workarounds
- Testing
- Debugging
- Release History
- Building
- Development
In most cases it is not necessary to separately install http-streaming, as it has been included in the default build of Video.js since version 7.
Only install if you need a specific combination of video.js and http-streaming versions. If installing separately, use the "core" version of Video.js without the bundled version of http-streaming.
To install videojs-http-streaming with npm, run
npm install --save @videojs/http-streaming
Select a version of VHS from the CDN
Download a release of videojs-http-streaming
Download a copy of this git repository and then follow the steps in Building
See CONTRIBUTING.md
Drop by the Video.js slack.
This library is included in Video.js 7 by default.
Only if you need a specific combination of versions of Video.js and VHS should you get a copy of videojs-http-streaming and include it in your page along with video.js. In this case, you should use the "core" build of Video.js, without a bundled VHS:
<video-js id=vid1 width=600 height=300 class="vjs-default-skin" controls>
<source
src="https://example.com/index.m3u8"
type="application/x-mpegURL">
</video-js>
<!-- "core" version of Video.js -->
<script src="video.core.min.js"></script>
<script src="videojs-http-streaming.min.js"></script>
<script>
var player = videojs('vid1');
player.play();
</script>
It is recommended to use the <video-js> element or to load a source with player.src(sourceObject) in order to prevent the video element from playing the source natively where HLS is supported.
The Media Source Extensions API is required for http-streaming to play HLS or MPEG-DASH.
- Chrome
- Firefox
- Internet Explorer 11 (Windows 10 or 8.1)
These browsers have some level of native HLS support; however, by default the overrideNative option is set to true (except on Safari), so MSE playback is used:
- Chrome Android
- Firefox Android
- Edge
- Mac Safari
- iOS Safari
Mac and iPad Safari do have MSE support, but native HLS is recommended
DRM is supported through videojs-contrib-eme. In order to use DRM, include the videojs-contrib-eme plugin, initialize it, and add options to either the plugin or the source.
Detailed option information can be found in the videojs-contrib-eme README.
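As a rough sketch (the license server URL and source URL below are placeholders, and real key systems usually need additional options; see the videojs-contrib-eme README for the authoritative configuration), initializing the plugin and attaching DRM options to a source might look like:
// assumes video.js (7+ or with VHS included) and videojs-contrib-eme are loaded
var player = videojs('vid1');
// initialize the eme plugin before setting a DRM-protected source
player.eme();
player.src({
src: 'https://example.com/protected/index.m3u8', // placeholder URL
type: 'application/x-mpegURL',
keySystems: {
// placeholder license server URL for illustration only
'com.widevine.alpha': 'https://example.com/widevine-license'
}
});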
HTTP Live Streaming (HLS) has become a de-facto standard for streaming video on mobile devices thanks to its native support on iOS and Android. There are a number of reasons independent of platform to recommend the format, though:
- Supports (client-driven) adaptive bitrate selection
- Delivered over standard HTTP ports
- Simple, text-based manifest format
- No proprietary streaming servers required
Unfortunately, all the major desktop browsers except for Safari are missing HLS support. That leaves web developers in the unfortunate position of having to maintain alternate renditions of the same video and potentially having to forego HTML-based video entirely to provide the best desktop viewing experience.
This project addresses that situation by providing a polyfill for HLS on browsers that have support for Media Source Extensions. You can deploy a single HLS stream, code against the regular HTML5 video APIs, and create a fast, high-quality video experience across all the big web device categories.
Check out the full documentation for details on how HLS works and advanced configuration. A description of the adaptive switching behavior is available, too.
videojs-http-streaming supports a bunch of HLS features. Here are some highlights:
- video-on-demand and live playback modes
- backup or redundant streams
- mid-segment quality switching
- AES-128 segment encryption
- CEA-608 captions are automatically translated into standard HTML5 caption text tracks
- In-Manifest WebVTT subtitles are automatically translated into standard HTML5 subtitle tracks
- Timed ID3 Metadata is automatically translated into HTML5 metadata text tracks
- Highly customizable adaptive bitrate selection
- Automatic bandwidth tracking
- Cross-domain credentials support with CORS
- Tight integration with video.js and a philosophy of exposing as much as possible with standard HTML APIs
- Streams with multiple audio tracks and switching between those audio tracks (see the docs folder for info)
- Media content in fragmented MP4s instead of the MPEG2-TS container format.
For a more complete list of supported and missing features, refer to this doc.
You may pass in an options object to the hls source handler at player initialization. You can pass in options just like you would for other parts of video.js:
// html5 for html hls
videojs(video, {
html5: {
vhs: {
withCredentials: true
}
}
});
Some options, such as withCredentials, can be passed in to vhs during player.src:
var player = videojs('some-video-id');
player.src({
src: 'https://d2zihajmogu5jn.cloudfront.net/bipbop-advanced/bipbop_16x9_variant.m3u8',
type: 'application/x-mpegURL',
withCredentials: true
});
- Type: boolean
- can be used as a source option
- can be used as an initialization option
When the withCredentials property is set to true, all XHR requests for manifests and segments will have withCredentials set to true as well. This enables storing and passing cookies from the server that the manifests and segments live on. This has some implications for CORS because, when set, the Access-Control-Allow-Origin header cannot be set to *; the response headers also require the addition of the Access-Control-Allow-Credentials header, set to true. See html5rocks's article for more info.
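For example, a manifest/segment server that supports credentialed requests from a player hosted at https://player.example.com might respond with headers along these lines (illustrative values, not VHS configuration):
Access-Control-Allow-Origin: https://player.example.com
Access-Control-Allow-Credentials: true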
- Type: boolean
- can be used as an initialization option
When the useCueTags property is set to true, a text track is created with label 'ad-cues' and kind 'metadata'. The track is then added to player.textTracks(). Changes in the active cue may be tracked by following the Video.js cue points API for text tracks. For example:
let textTracks = player.textTracks();
let cuesTrack;
for (let i = 0; i < textTracks.length; i++) {
if (textTracks[i].label === 'ad-cues') {
cuesTrack = textTracks[i];
}
}
cuesTrack.addEventListener('cuechange', function() {
let activeCues = cuesTrack.activeCues;
for (let i = 0; i < activeCues.length; i++) {
let activeCue = activeCues[i];
console.log('Cue runs from ' + activeCue.startTime +
' to ' + activeCue.endTime);
}
});
- Type: boolean
- Default: true
- can be used as an initialization option
When set to false, 708 captions in the stream are not parsed and will not show up in text track lists or the captions menu.
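For instance, turning off 708 caption parsing entirely (a minimal sketch following the initialization-option pattern shown above):
videojs('vid1', {
html5: {
vhs: {
parse708captions: false
}
}
});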
- Type: boolean
- can be used as an initialization option
Try to use videojs-http-streaming even on platforms that provide some
level of HLS support natively. There are a number of platforms that
technically play back HLS content but aren't very reliable or are
missing features like CEA-608 captions support. When overrideNative
is true, if the platform supports Media Source Extensions
videojs-http-streaming will take over HLS playback to provide a more
consistent experience.
// via the constructor
var player = videojs('playerId', {
html5: {
vhs: {
overrideNative: true
},
nativeAudioTracks: false,
nativeVideoTracks: false
}
});
Since MSE playback may be desirable on all browsers with some native support other than Safari, overrideNative: !videojs.browser.IS_SAFARI could be used.
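A sketch of that approach, reusing the constructor example above:
// override native HLS everywhere except Safari
var player = videojs('playerId', {
html5: {
vhs: {
overrideNative: !videojs.browser.IS_SAFARI
},
// mirroring the example above; typically disabled when overriding native playback
nativeAudioTracks: false,
nativeVideoTracks: false
}
});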
- Type: number
- can be used as an initialization option
When the playlistExclusionDuration property is set to a time duration in seconds, a playlist that gets excluded will be excluded for that customized duration. This enables the exclusion duration to be configured by the user.
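For example, to exclude failing playlists for two minutes instead of the default duration (a minimal sketch; 120 is just an illustrative value):
videojs('vid1', {
html5: {
vhs: {
playlistExclusionDuration: 120 // seconds
}
}
});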
- Type: number
- Default: Infinity
- can be used as an initialization option
The max number of times that a playlist will retry loading following an error before being indefinitely excluded from the rendition selection algorithm. Note: the number of retry attempts needs to exceed this value before a playlist will be excluded.
- Type: number
- can be used as an initialization option
When the bandwidth property is set (bits per second), it will be used in the calculation for initial playlist selection, before more bandwidth information is seen by the player.
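For example, if you already know the viewer's connection is roughly 5 Mbps, you could seed the initial estimate (illustrative value):
videojs('vid1', {
html5: {
vhs: {
bandwidth: 5000000 // bits per second
}
}
});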
- Type: boolean
- can be used as an initialization option
If true, bandwidth and throughput values are stored in and retrieved from local storage on startup (for initial rendition selection). This setting is false by default.
- Type: boolean
- can be used as an initialization option
When enableLowInitialPlaylist is set to true, the lowest bitrate playlist will be selected initially. This helps to decrease playback start time. This setting is false by default.
- Type: boolean
- can be used as an initialization option
When limitRenditionByPlayerDimensions is set to true, rendition selection logic will take into account the player size and rendition resolutions when making a decision. This setting is true by default.
- Type: boolean
- can be used as an initialization option.
If true, the device pixel ratio will be taken into account when doing rendition switching. This means that if you have a player with a width of 540px on a high density display with a device pixel ratio of 2, a rendition of 1080p will be allowed. This setting is false by default.
- Type: number
- can be used as an initialization option.
If set, the initial player dimensions will be multiplied by this custom ratio when the player automatically selects renditions. This means that if you have a player whose dimension is 540p, with a custom pixel ratio of 2, a rendition of 1080p or the closest lower rendition will be chosen. Similarly, if you have a player whose dimension is 540p, with a custom pixel ratio of 0.5, a rendition of 270p or the closest lower rendition will be chosen. When the custom pixel ratio is 0, the lowest available rendition will be selected.
It is worth noting that if the player dimension multiplied by the custom pixel ratio is greater than any available rendition resolution, a rendition will be selected based on bandwidth, and the player dimension will be disregarded.
limitRenditionByPlayerDimensions must be true in order for this feature to be enabled (this is the default value). If useDevicePixelRatio is set to true, the custom pixel ratio will be prioritized and override any previous pixel ratio.
- Type: boolean
- can be used as a source option
When allowSeeksWithinUnsafeLiveWindow is set to true, if the active playlist is live and a seek is made to a time between the safe live point (end of manifest minus three times the target duration; see the HLS spec for details) and the end of the playlist, the seek is allowed rather than corrected to the safe live point.
This option can help in instances where the live stream's target duration is greater than the segment durations, playback ends up in the unsafe live window, and there are gaps in the content. In this case the player will attempt to seek past the gaps but end up seeking inside the unsafe range, leading to a correction and a seek back into previously played content.
The property defaults to false.
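Because this is a source option, it is passed with the source object (a sketch with a placeholder URL):
player.src({
src: 'https://example.com/live/index.m3u8',
type: 'application/x-mpegURL',
allowSeeksWithinUnsafeLiveWindow: true
});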
- Type: Array
- can be used as a source option
With customTagParsers you can pass an array of custom m3u8 tag parser objects. See https://github.com/videojs/m3u8-parser#custom-parsers
- Type: Array
- can be used as a source option
Similar to customTagParsers, with customTagMappers you can pass an array of custom m3u8 tag mapper objects. See https://github.com/videojs/m3u8-parser#custom-parsers
- Type: boolean
- can be used as a source option
- can be used as an initialization option
This option forces the player to cache AES-128 encryption keys internally instead of requesting the key alongside every segment request. This option defaults to false.
- Type: boolean
- Default: false
- Use partial appends in the transmuxer and segment loader
- Type: number
- Default: SAFE_TIME_DELTA
- Allows re-defining the length (in seconds) of the time delta used when comparing the current time and the end of the buffered range.
- Type: boolean
- Default: false
- Use window.networkInformation.downlink to estimate the network's bandwidth. Per MDN, the value is never greater than 10 Mbps, as a non-standard anti-fingerprinting measure. Given this, if bandwidth estimates from both the player and networkInfo are >= 10 Mbps, the player will use the larger of the two values as its bandwidth estimate.
- Type: boolean
- Default: false
- Use Decode Timestamp instead of Presentation Timestamp for the timestampOffset calculation. This option was introduced to align with DTS-based browsers. This option affects only transmuxed data (e.g. transport streams). For more info please check the following issue.
- Type: boolean
- Default: false
- can be used as a source option
- can be used as an initialization option
If true, this option allows the player to display forced subtitles. When available, forced subtitles help translate foreign-language dialogue or images containing foreign-language text.
- Type: object
- Default: undefined
- Provide extra information, like a label or a language, for instream (608 and 708) captions.
The captionServices options object has properties that map to the caption services. Each property is an object itself that includes several properties, like a label or language.
For 608 captions, the service names are CC1, CC2, CC3, and CC4. For 708 captions, the service names are SERVICEn where n is a digit between 1 and 63.
For 708 caption services, you may additionally provide an encoding value that will be used by the transmuxer to decode the captions using an instance of TextDecoder. This is to permit, and is required for, legacy multi-byte encodings. Please review the TextDecoder documentation for accepted encoding labels.
{
vhs: {
captionServices: {
[serviceName]: {
language: String, // optional
label: String, // optional
default: boolean, // optional,
encoding: String // optional, 708 services only
}
}
}
}
{
vhs: {
captionServices: {
CC1: {
language: 'en',
label: 'English'
},
SERVICE1: {
language: 'kr',
label: 'Korean',
encoding: 'euc-kr',
default: true
}
}
}
}
Runtime properties are attached to the tech object when HLS is in use. You can get a reference to the VHS source handler like this:
var vhs = player.tech().vhs;
If you were thinking about modifying runtime properties in a video.js plugin, we'd recommend you avoid it. Your plugin won't work with videos that don't use videojs-http-streaming and the best plugins work across all the media types that video.js supports. If you're deploying videojs-http-streaming on your own website and want to make a couple tweaks though, go for it!
Type: object
An object representing the parsed main playlist. If a media playlist is loaded directly, a main playlist with only one entry will be created.
Type: function
A function that can be used to retrieve or modify the currently active media playlist. The active media playlist is referred to when additional video data needs to be downloaded. Calling this function with no arguments returns the parsed playlist object for the active media playlist. Calling this function with a playlist object from the main playlist or a URI string as specified in the main playlist will kick off an asynchronous load of the specified media playlist. Once it has been retrieved, it will become the active media playlist.
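A sketch of both call forms, assuming this function is exposed at player.tech().vhs.playlists.media (the property path is an assumption, since the option headings are not shown here):
var vhs = player.tech().vhs;
// with no arguments: returns the parsed object for the currently active media playlist
var activePlaylist = vhs.playlists.media();
// with a playlist object (or a URI string) from the main playlist: starts an
// asynchronous load of that playlist, which becomes active once retrieved
vhs.playlists.media(vhs.playlists.main.playlists[0]);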
Type: number
systemBandwidth is a combination of two serial processes' bitrates. The first is the network bitrate provided by bandwidth and the second is the bitrate of the entire process after that (decryption, transmuxing, and appending) provided by throughput. This value is used by the default implementation of selectPlaylist to select an appropriate bitrate to play.
Since the two processes are serial, the overall system bandwidth is given by:
systemBandwidth = 1 / (1 / bandwidth + 1 / throughput)
Type: number
The number of bits downloaded per second in the last segment download.
Before the first video segment has been downloaded, it's hard to estimate bandwidth accurately. The HLS tech uses a starting value of 4194304 or 0.5 MB/s. If you have a more accurate source of bandwidth information, you can override this value as soon as the HLS tech has loaded to provide an initial bandwidth estimate.
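A sketch of overriding the estimate at runtime (for the very first selection, the bandwidth option described earlier is usually simpler; the event and the 10e6 figure here are illustrative):
player.one('loadedmetadata', function() {
// replace the current estimate with one from your own measurement
player.tech().vhs.bandwidth = 10e6; // bits per second
});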
Type: number
The number of bits decrypted, transmuxed, and appended per second as a cumulative average across active processing time.
Type: function
A function that returns the media playlist object to use to download the next segment. It is invoked by the tech immediately before a new segment is downloaded. You can override this function to provide your own adaptive streaming logic. You must, however, be sure to return a valid media playlist object that is present in player.tech().vhs.main.
Overriding this function with your own is very powerful but is overkill for many purposes. Most of the time, you should use the much simpler function below to selectively enable or disable a playlist from the adaptive streaming logic.
Type: function
It is recommended that you include the videojs-contrib-quality-levels plugin on your page so that videojs-http-streaming will automatically populate the QualityLevelList exposed on the player by the plugin. You can access this list by calling player.qualityLevels(). See the videojs-contrib-quality-levels project page for more information on how to use the api.
Example, only enabling representations with a width greater than or equal to 720:
var qualityLevels = player.qualityLevels();
for (var i = 0; i < qualityLevels.length; i++) {
var quality = qualityLevels[i];
if (quality.width >= 720) {
quality.enabled = true;
} else {
quality.enabled = false;
}
}
If including videojs-contrib-quality-levels is not an option, you can use the representations api. To get all of the available representations, call the representations() method on player.tech().vhs. This will return a list of plain objects, each with width, height, bandwidth, and id properties, and an enabled() method.
player.tech().vhs.representations();
To see whether the representation is enabled or disabled, call its enabled() method with no arguments. To set whether it is enabled/disabled, call its enabled() method and pass in a boolean value. Calling <representation>.enabled(true) will allow the adaptive bitrate algorithm to select the representation, while calling <representation>.enabled(false) will disallow any selection of that representation.
Example, only enabling representations with a width greater than or equal to 720:
player.tech().vhs.representations().forEach(function(rep) {
if (rep.width >= 720) {
rep.enabled(true);
} else {
rep.enabled(false);
}
});
Type: function
The xhr function that is used by VHS internally is exposed on the per-player vhs object. While it is possible, we do not recommend replacing the function with your own implementation. Instead, xhr provides the ability to specify onRequest and onResponse hooks, which each take a callback function as a parameter, as well as offRequest and offResponse functions, which can remove a callback function from the onRequest or onResponse set. An xhr-hooks-ready event is fired from a player when per-player hooks are ready to be added or removed. This will ensure player-specific hooks are set prior to any manifest or segment requests.
The onRequest(callback) function takes a callback function that will pass an xhr options Object to that callback. These callbacks are called synchronously, in the order registered, and act as pre-request hooks for modifying the xhr options Object prior to making a request.
Note: This callback MUST return an options Object, as the xhr wrapper and each onRequest hook receive the returned options as a parameter.
Example:
player.on('xhr-hooks-ready', () => {
const playerRequestHook = (options) => {
return {
uri: 'https://new.options.uri'
};
};
player.tech().vhs.xhr.onRequest(playerRequestHook);
});
If access to the xhr Object is required prior to the xhr.send call, an options.beforeSend callback can be set within an onRequest callback function; it will be passed the xhr Object as a parameter and will be called immediately prior to xhr.send.
Example:
player.on('xhr-hooks-ready', () => {
const playerXhrRequestHook = (options) => {
options.beforeSend = (xhr) => {
xhr.setRequestHeader('foo', 'bar');
};
return options;
};
player.tech().vhs.xhr.onRequest(playerXhrRequestHook);
});
The onResponse(callback) function takes a callback function that will pass the xhr request, error, and response Objects to that callback. These callbacks are called in the order registered and act as post-request hooks for gathering data from the xhr request, error, and response Objects. onResponse callbacks do not require a return value; the parameters are passed to each subsequent callback by reference.
Example:
player.on('xhr-hooks-ready', () => {
const playerResponseHook = (request, error, response) => {
const bar = response.headers.foo;
};
player.tech().vhs.xhr.onResponse(playerResponseHook);
});
The offRequest function takes a callback function and will remove that function from the collection of onRequest hooks if it exists.
Example:
player.on('xhr-hooks-ready', () => {
player.tech().vhs.xhr.offRequest(playerRequestHook);
});
The offResponse function takes a callback function and will remove that function from the collection of onResponse hooks if it exists.
Example:
player.on('xhr-hooks-ready', () => {
player.tech().vhs.xhr.offResponse(playerResponseHook);
});
The global videojs.Vhs also exposes an xhr property. Adding onRequest and/or onResponse hooks will allow you to intercept the request options and xhr Object as well as request, error, and response data for all requests in every player on a page. For consistency across browsers the video source should be set at runtime once the video player is ready.
Example:
// Global request callback, will affect every player.
const globalRequestHook = (options) => {
return {
uri: 'https://new.options.global.uri'
};
};
videojs.Vhs.xhr.onRequest(globalRequestHook);
// Global request callback defining beforeSend function, will affect every player.
const globalXhrRequestHook = (options) => {
options.beforeSend = (xhr) => {
xhr.setRequestHeader('foo', 'bar');
};
return options;
};
videojs.Vhs.xhr.onRequest(globalXhrRequestHook);
// Global response hook callback, will affect every player.
const globalResponseHook = (request, error, response) => {
const bar = response.headers.foo
};
videojs.Vhs.xhr.onResponse(globalResponseHook);
// Remove a global onRequest callback.
videojs.Vhs.xhr.offRequest(globalRequestHook);
// Remove a global onResponse callback.
videojs.Vhs.xhr.offResponse(globalResponseHook);
For information on the type of options that you can modify see the documentation at https://github.com/Raynos/xhr.
Type: object
This object contains a summary of HLS and player related stats.
Property Name | Type | Description |
---|---|---|
bandwidth | number | Rate of the last segment download in bits/second |
mediaRequests | number | Total number of media segment requests |
mediaRequestsAborted | number | Total number of aborted media segment requests |
mediaRequestsTimedout | number | Total number of timed-out media segment requests |
mediaRequestsErrored | number | Total number of errored media segment requests |
mediaTransferDuration | number | Total time spent downloading media segments in milliseconds |
mediaBytesTransferred | number | Total number of content bytes downloaded |
mediaSecondsLoaded | number | Total number of content seconds downloaded |
buffered | array | List of time ranges of content that are in the SourceBuffer |
currentTime | number | The current position of the player |
currentSource | object | The source object. Has the structure {src: 'url', type: 'mimetype'} |
currentTech | string | The name of the tech in use |
duration | number | Duration of the video in seconds |
main | object | The main playlist object |
playerDimensions | object | Contains the width and height of the player |
seekable | array | List of time ranges that the player can seek to |
timestamp | number | Timestamp of when vhs.stats was accessed |
videoPlaybackQuality | object | Media playback quality metrics as specified by the W3C's Media Playback Quality API |
Standard HTML video events are handled by video.js automatically and are triggered on the player object.
Fired after the first segment is downloaded for a playlist. This will not happen until playback if video.js's metadata setting is none.
Fired when the player xhr object is ready to set onRequest and onResponse hooks, as well as remove hooks with offRequest and offResponse.
Usage tracking events are fired when we detect a certain HLS feature, encoding setting, or API is used. These can be helpful for analytics, and to pinpoint the cause of HLS errors. For instance, if errors are being fired in tandem with a usage event indicating that the player was playing an AES encrypted stream, then we have a possible avenue to explore when debugging the error.
Note that although these usage events are listed below, they may change at any time without a major version change.
VHS usage events are triggered on the tech with the exception of the 3 vhs-reload-error events, which are triggered on the player.
To listen for usage events triggered on the tech, listen for the event type of 'usage'
:
player.on('ready', () => {
player.tech().on('usage', (e) => {
console.log(e.name);
});
});
Note that these events are triggered as soon as a case is encountered, and often only once. For example, the vhs-demuxed usage event will be triggered as soon as the main manifest is downloaded and parsed, and will not be triggered again.
Each of the following usage events is fired once per source if (and when) detected:
Name | Description |
---|---|
vhs-webvtt | main manifest has at least one segmented WebVTT playlist |
vhs-aes | a playlist is AES encrypted |
vhs-fmp4 | a playlist used fMP4 segments |
vhs-demuxed | audio and video are demuxed by default |
vhs-alternate-audio | alternate audio available in the main manifest |
vhs-playlist-cue-tags | a playlist used cue tags (see useCueTags for details) |
vhs-bandwidth-from-local-storage | starting bandwidth was retrieved from local storage (see useBandwidthFromLocalStorage for details) |
vhs-throughput-from-local-storage | starting throughput was retrieved from local storage (see useBandwidthFromLocalStorage for details) |
Each of the following usage events is fired per use:
Name | Description |
---|---|
vhs-gap-skip | player skipped a gap in the buffer |
vhs-player-access | player.vhs was accessed |
vhs-audio-change | a user selected an alternate audio stream |
vhs-rendition-disabled | a rendition was disabled |
vhs-rendition-enabled | a rendition was enabled |
vhs-rendition-excluded | a rendition was excluded |
vhs-timestamp-offset | a timestamp offset was set in HLS (can identify discontinuities) |
vhs-unknown-waiting | the player stopped for an unknown reason and we seeked to the current time to try to address it |
vhs-live-resync | playback fell off the back of a live playlist and we resynced to the live point |
vhs-video-underflow | we seeked to current time to address video underflow |
vhs-error-reload-initialized | the reloadSourceOnError plugin was initialized |
vhs-error-reload | the reloadSourceOnError plugin reloaded a source |
vhs-error-reload-canceled | an error occurred too soon after the last reload, so we didn't reload again (to prevent error loops) |
The HLS tech supports timed metadata embedded as ID3 tags. When a stream is encountered with embedded metadata, an in-band metadata text track will automatically be created and populated with cues as they are encountered in the stream. UTF-8 encoded TXXX and WXXX ID3 frames are mapped to cue points and their values set as the cue text. Cues are created for all other frame types and the data is attached to the generated cue:
cue.value.data
There are lots of guides and references to using text tracks around the web.
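A minimal sketch of reading those cues (the exact label of the generated in-band track is not guaranteed, so this simply skips the other metadata-kind tracks that VHS may create):
player.textTracks().addEventListener('addtrack', function(e) {
var track = e.track;
// skip the segment-metadata and ad-cues tracks that VHS may also create
if (track.kind !== 'metadata' ||
track.label === 'segment-metadata' ||
track.label === 'ad-cues') {
return;
}
track.addEventListener('cuechange', function() {
var cues = track.activeCues;
for (var i = 0; i < cues.length; i++) {
// TXXX/WXXX frames map to cue.text; other frames carry their data on cue.value.data
console.log(cues[i].text, cues[i].value && cues[i].value.data);
}
});
});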
You can get metadata about the segments currently in the buffer by using the segment-metadata text track. You can get the metadata of the currently rendered segment by looking at the track's activeCues array. The metadata will be attached to the cue.value property and will have this structure:
cue.value = {
byteLength, // The size of the segment in bytes
bandwidth, // The peak bitrate reported by the segment's playlist
resolution, // The resolution reported by the segment's playlist
codecs, // The codecs reported by the segment's playlist
uri, // The Segment uri
timeline, // Timeline of the segment for detecting discontinuities
playlist, // The Playlist uri
start, // Segment start time
end // Segment end time
};
Example: Detect when a change in quality is rendered on screen
let tracks = player.textTracks();
let segmentMetadataTrack;
for (let i = 0; i < tracks.length; i++) {
if (tracks[i].label === 'segment-metadata') {
segmentMetadataTrack = tracks[i];
}
}
let previousPlaylist;
if (segmentMetadataTrack) {
segmentMetadataTrack.on('cuechange', function() {
let activeCue = segmentMetadataTrack.activeCues[0];
if (activeCue) {
if (previousPlaylist !== activeCue.value.playlist) {
console.log('Switched from rendition ' + previousPlaylist +
' to rendition ' + activeCue.value.playlist);
}
previousPlaylist = activeCue.value.playlist;
}
});
}
Note that this is an advanced use-case, and may be more fragile for production environments, as the schema for a VHS object and how it's used internally are not set in stone and may change in future releases.
In normal use, VHS accepts a URL as the source of the video. But VHS also has the ability to accept a JSON object as the source.
Passing a JSON object as the source has many uses. A couple of examples include:
- The manifest has already been downloaded, so there's no need to make another request
- You want to change some aspect of the manifest, e.g., add a segment, without modifying the manifest itself
In order to pass a JSON object as the source, provide a parsed manifest object via a data URI, using the "vnd.videojs.vhs+json" media type when setting the source type. For instance:
var player = videojs('some-video-id');
const parser = new M3u8Parser();
parser.push(manifestString);
parser.end();
player.src({
src: `data:application/vnd.videojs.vhs+json,${JSON.stringify(parser.manifest)}`,
type: 'application/vnd.videojs.vhs+json'
});
The manifest object should follow the "VHS manifest object schema" (a somewhat flexible and informally documented structure) provided in the README of m3u8-parser and mpd-parser. This may be referred to in the project as vhs-json.
Unlike a native HLS implementation, the HLS tech has to comply with the browser's security policies. That means that all the files that make up the stream must be served from the same domain as the page hosting the video player or from a server that has appropriate CORS headers configured. Easy instructions are available for popular webservers and most CDNs should have no trouble turning CORS on for your account.
Issues that are currently known. If you want to help find a solution, that would be appreciated!
Edge has native support for HLS but only in the MPEG2-TS container. If you attempt to play an HLS stream with fragmented MP4 segments (without overriding native playback), Edge will stall. Fragmented MP4s are only supported on browsers that have Media Source Extensions available.
Some assets which have an audio-only rate, where the lowest-bandwidth audio+video rate isn't that low, get stuck in audio-only mode. This is because the initial bandwidth calculation thinks there's insufficient bandwidth for selecting the lowest-quality audio+video playlist, so it picks the audio-only one, which unfortunately locks it into being audio-only forever, preventing a switch to the audio+video playlist when it gets a better estimation of bandwidth.
Until we've implemented a full fix, it is recommended to set the enableLowInitialPlaylist option for any assets that include an audio-only rate; it should always select the lowest-bandwidth audio+video playlist for its first playlist.
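A sketch of that workaround:
videojs('vid1', {
html5: {
vhs: {
// start on the lowest-bandwidth audio+video playlist to avoid the
// audio-only lock-in described above
enableLowInitialPlaylist: true
}
}
});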
It's also worth mentioning that Apple no longer requires having an audio-only rate; instead, they require a 192kbps audio+video rate (see Apple's current HLS Authoring Specification). Removing the audio-only rate would of course eliminate this problem since there would be only audio+video playlists to choose from.
Follow progress on this in issue #175.
DASH assets which use $Time$ in a SegmentTemplate, and also have a SegmentTimeline where only the first S has a t and the rest only have a d, do not currently load.
There is currently no workaround for this, but you can track progress on this in issue #256.
For testing, run npm run test. You will need Chrome and Firefox for running the tests.
videojs-http-streaming uses BrowserStack for compatibility testing.
videojs-http-streaming makes use of videojs.log for debug logging. You can enable these logs by setting the log level to debug using videojs.log.level('debug'). You can access a complete history of the logs using videojs.log.history(). This history is maintained even when the log level is not set to debug.
vhs.stats can also be helpful when debugging. Accessing this object will give you a snapshot summary of various HLS and player stats. See vhs.stats for details about what this object contains.
NOTE: The debug level is only available in video.js v6.6.0+. With earlier versions of video.js, no debug messages will be logged to the console.
Check out the changelog for a summary of each release.
To build a copy of videojs-http-streaming, run the following commands:
git clone https://github.com/videojs/http-streaming
cd http-streaming
npm i
npm run build
videojs-http-streaming will have created all of the files for using it in a dist folder
- Download stream locally with the HLS Fetcher
- Simulate errors with Murphy
- Inspect content with Thumbcoil
All commands for development are listed in the package.json file and are run using npm run <command>.