Using the Media Player Library

The Google Cast Media Player Library, described in the Media Player Library API reference, makes it easy for you to build rich media experiences for your users. It provides JavaScript support for parsing manifests and playing HTTP Live Streaming (HLS), MPEG-DASH, and Smooth Streaming content. It also provides support for HLS AES encryption, PlayReady DRM, and Widevine DRM.

Your receiver app accesses the Media Player Library with the following reference:
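For the 1.0.0 release of the library, the reference is conventionally a protocol-relative script tag like the following; the exact version path is an assumption here and may differ for your application:

```html
<script type="text/javascript"
    src="//www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js">
</script>
```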


In production applications, you should not specify the protocol when sourcing the Media Player Library, so as to allow the resource to be loaded with the protocol that the host server uses, which must be HTTPS. While testing you may use HTTP. However, if you are implementing AES-128 encrypted HLS media streams, you must specify that the Media Player Library be sourced over HTTPS, even during testing.

Using Cross-Origin Resource Sharing (CORS)

Google Cast fully supports Cross-Origin Resource Sharing. Streaming protocols, unlike most file-based protocols, access content asynchronously using XMLHttpRequest. In a CORS world, these requests are guarded against inappropriate access by the CORS headers from the server where the resource originates. This means that your content's server has a say in where it may be included. Most modern browsers fully support CORS. iOS and Android devices access the content at a lower level and do not look at these headers. This is often the first issue that comes up when a developer wishes to use streaming content.

Many servers are configured to require CORS, and consequently every fetched asset needs CORS headers. The assets that may need CORS headers include manifests, secondary manifests, segments, and crypto keys.

To set up your CORS headers, give your media server all the addresses of the servers where you host your receiver application (development, staging, and external servers).
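As a sketch, an nginx server block granting CORS access to one receiver origin might look like the following; the origin value and the location path are placeholders for your own receiver URL and media directory:

```nginx
# Hypothetical nginx config: serve media (manifests, segments, keys) with
# CORS headers that allow access from the receiver application's origin.
location /media/ {
    add_header 'Access-Control-Allow-Origin' 'https://your-receiver.example.com';
    add_header 'Access-Control-Allow-Methods' 'GET, HEAD, OPTIONS';
    add_header 'Access-Control-Allow-Headers' 'Content-Type, Range';
    add_header 'Access-Control-Expose-Headers' 'Content-Length, Content-Range';
}
```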

For example, to provide headers for files stored on Google Cloud Storage you might use the following permissive cors.json file:

    "origin": ["*"],
    "responseHeader": ["Content-Type"],
    "method": ["GET", "HEAD"],
    "maxAgeSeconds": 86400

Note: a permissive CORS header like this is unlikely to be suitable for production. Typically, you would replace the '*' (in "origin") with the URL of your receiver.

To set the example header above, use the gsutil command:

$ gsutil cors set cors.json gs://location
Setting CORS on gs://location/...

CORS must be enabled on every website where you host media content, including your CDN. The website Enable-CORS provides examples for many other servers, including Apache.

Loading adaptive bitrate media streams

For the adaptive bitrate streaming protocols, where you must implement CORS, the recommended way to pass the URL is to include the MIME type in the contentType property of the LOAD command, as follows:

  • HLS: application/x-mpegurl or application/
  • DASH: application/dash+xml
  • Smooth Streaming: application/
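As a sketch, a sender-side helper could derive the contentType for the LOAD command from the stream URL. The extension-to-MIME mapping below is an assumption for illustration; the HLS and Smooth Streaming strings are the values commonly used with these protocols, so verify them against your own content:

```javascript
// Map a media URL to a contentType value for the LOAD command.
function contentTypeForUrl(url) {
  if (url.indexOf('.m3u8') !== -1) {
    return 'application/x-mpegurl';        // HLS
  } else if (url.indexOf('.mpd') !== -1) {
    return 'application/dash+xml';         // MPEG-DASH
  } else if (url.indexOf('.ism') !== -1) {
    return 'application/vnd.ms-sstr+xml';  // Smooth Streaming (commonly used)
  }
  return 'video/mp4';                      // fallback for plain files
}
```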

Demonstration code

The demonstration code provided below can be used as-is. You just need to do the following:

  • Host it on your server.
  • Enable CORS for your content on your server (manifests, sub-manifests, and segments).
  • Connect your sender application to your receiver.
  • Issue a Load command with the URL.

This code currently implements a trivial parser that determines the content type from the extension of the file. It is an example only; you should implement your own mechanism to do this.

  • HLS: .m3u8
  • DASH: .mpd
  • Smooth Streaming: .ism

You can, of course, modify the code, if you are serving something different.
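A slightly more defensive version of that parser, sketched below, strips query strings and fragments before testing the suffix. The returned string names the cast.player.api factory function the demonstration code would call; the mapping mirrors the demo and is an assumption for any other content:

```javascript
// Choose a streaming protocol factory name from the URL suffix.
// Returns null when the URL is not an adaptive stream, so the caller
// can fall back to default (non-adaptive) media playback.
function protocolFactoryName(url) {
  var path = url.split('?')[0].split('#')[0];  // ignore query and fragment
  if (path.lastIndexOf('.m3u8') >= 0) {
    return 'CreateHlsStreamingProtocol';       // HLS
  } else if (path.lastIndexOf('.mpd') >= 0) {
    return 'CreateDashStreamingProtocol';      // MPEG-DASH
  } else if (path.indexOf('.ism') >= 0) {
    return 'CreateSmoothStreamingProtocol';    // Smooth Streaming
  }
  return null;
}
```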

Host, player, and protocol

The cast.player.api.Player listens to the provided media element and media source events. A Player instance funnels the data from a Host source to a media element. The Host is a cast.player.api.Host object that you instantiate with a media element and a URL to the media. The Player instance can load a protocol of the type supported based on the manifest of the media content.

Creating a Host object

Create a default Host object with your URL and MediaElement as follows:

var host = new cast.player.api.Host({'mediaElement':mediaElement, 'url':url});

If your server requires CORS and cookie information in order to access the media, set the property withCredentials to true and set the header information as needed.

host.updateSegmentRequestInfo = function(requestInfo) {
  // example of setting CORS withCredentials
  requestInfo.withCredentials = true;
  // example of setting headers
  requestInfo.headers = {};
  requestInfo.headers['content-type'] = 'text/xml;charset=utf-8';
};

It is usually a good idea to have some error handling or reporting when there is a problem with the host. Thankfully, all you need to do is add callbacks to the host object. You can also override any desired methods by replacing them on the host.

host.onError = function(errorCode) {
  console.log("Fatal Error - " + errorCode);
  if (window.player) {
    window.player = null;
  }
};

Creating a Player object

A Player instance is then created by calling the constructor function and passing in the Host object.

// Create a Player
window.player = new cast.player.api.Player(host);

Before a Player can actually begin playback of the media it needs to know how the media data will be provided. This is accomplished by creating and providing the appropriate StreamingProtocol object when asking the Player instance to load the media data. There are three provided functions within the Cast Media Player Library for creating the appropriate adaptive bit rate protocol object.

Creating a Protocol object

Create the protocol handler as follows:

// HLS
protocol = cast.player.api.CreateHlsStreamingProtocol(host);

// MPEG-DASH
protocol = cast.player.api.CreateDashStreamingProtocol(host);

// Smooth Streaming
protocol = cast.player.api.CreateSmoothStreamingProtocol(host);

A mechanism for implementing your own streaming protocol is also planned.

Now, with the protocol in hand, you can tell the Player object that it should load the media using the provided protocol and start at a specific position (typically 0).

window.player.load(protocol, initStart);

Here's the full code example:

<video id='vid' />
<!-- Receiver SDK and Media Player Library; the version paths shown are
     typical for the v2 receiver / v1 MPL and may differ for your app. -->
<script type="text/javascript"
    src="//www.gstatic.com/cast/sdk/libs/receiver/2.0.0/cast_receiver.js"></script>
<script type="text/javascript"
    src="//www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js"></script>

<script type="text/javascript">
// If you set ?Debug=true in the URL (for example, while testing with a
// different App ID in the developer console), include debugging information.
if (window.location.href.indexOf('Debug=true') != -1) {
  cast.player.api.setLoggerLevel(cast.player.api.LoggerLevel.DEBUG);
  cast.receiver.logger.setLevelValue(cast.receiver.LoggerLevel.DEBUG);
}

var mediaElement = document.getElementById('vid');

// Create the media manager. This will handle all media messages by default.
window.mediaManager = new cast.receiver.MediaManager(mediaElement);

// Remember the default value for the Receiver onLoad, so this sample can play
// non-adaptive media as well.
window.defaultOnLoad = mediaManager.onLoad.bind(mediaManager);

// Be careful that 'this' points to the mediaManager object and not the window object.
// For example, if you're not using streaming media, the following line:
// window.defaultOnLoad = mediaManager.onLoad;
// Can return an exception similar to the following:
// Uncaught TypeError: Cannot read property 'load' of undefined

mediaManager.onLoad = function (event) {
  // The Media Player Library requires that you call player unload between
  // different invocations.
  if (window.player !== null) {
    player.unload();    // Must unload before starting again.
    window.player = null;
  }
  // This trivial parser is by no means best practice; it shows how to access
  // event data, and uses a string search of the suffix, rather than looking
  // at the MIME type, which would be better. In practice, you will know what
  // content you are serving while writing your player.
  if (event.data['media'] && event.data['media']['contentId']) {
    console.log('Starting media application');
    var url = event.data['media']['contentId'];
    // Create the Host - much of your interaction with the library uses the
    // Host and methods you provide to it.
    var host = new cast.player.api.Host(
        {'mediaElement':mediaElement, 'url':url});
    var ext = url.substring(url.lastIndexOf('.'), url.length);
    var initStart = event.data['media']['currentTime'] || 0;
    var autoplay = event.data['autoplay'] || true;
    var protocol = null;
    mediaElement.autoplay = autoplay;  // Make sure autoplay gets set.
    if (url.lastIndexOf('.m3u8') >= 0) {
      // HTTP Live Streaming
      protocol = cast.player.api.CreateHlsStreamingProtocol(host);
    } else if (url.lastIndexOf('.mpd') >= 0) {
      // MPEG-DASH
      protocol = cast.player.api.CreateDashStreamingProtocol(host);
    } else if (url.indexOf('.ism/') >= 0) {
      // Smooth Streaming
      protocol = cast.player.api.CreateSmoothStreamingProtocol(host);
    }
    // How to override a method in Host. I know that it's safe to just provide
    // this method.
    host.onError = function(errorCode) {
      console.log("Fatal Error - " + errorCode);
      if (window.player) {
        window.player = null;
      }
    };
    // If you need cookies, then set withCredentials = true and also set any
    // header information you need. If you don't need them, there can be some
    // unexpected effects by setting this value.
    //   host.updateSegmentRequestInfo = function(requestInfo) {
    //     requestInfo.withCredentials = true;
    //   };
    console.log("we have protocol " + ext);
    if (protocol !== null) {
      console.log("Starting Media Player Library");
      window.player = new cast.player.api.Player(host);
      window.player.load(protocol, initStart);
    } else {
      window.defaultOnLoad(event);    // do the default process
    }
  }
};

window.player = null;
console.log('Application is ready, starting system');
window.castReceiverManager = cast.receiver.CastReceiverManager.getInstance();
window.castReceiverManager.start();
</script>

See the Media Player API Reference for additional classes and methods.

Cycling through audio streams

Here's some code for cycling through audio streams.

window.changeLanguage = function() {
  var currentLanguage = null;
  var streamCount = protocol.getStreamCount();
  var streamInfo;
  // Find the currently enabled audio stream with a language.
  for (var i = 0; i < streamCount; i++) {
    if (protocol.isStreamEnabled(i)) {
      streamInfo = protocol.getStreamInfo(i);
      if (streamInfo.mimeType.indexOf('audio') === 0) {
        if (streamInfo.language) {
          currentLanguage = i;
          break;
        }
      }
    }
  }

  if (currentLanguage === null) {
    currentLanguage = 0;
  }

  // Walk forward (wrapping around) to the next audio stream.
  i = currentLanguage + 1;
  while (i !== currentLanguage) {
    if (i === streamCount) {
      i = 0;
      continue;  // re-check the loop condition after wrapping
    }
    streamInfo = protocol.getStreamInfo(i);
    if (streamInfo.mimeType.indexOf('audio') === 0) {
      protocol.enableStream(i, true);
      protocol.enableStream(currentLanguage, false);
      break;
    }
    i++;
  }

  if (i !== currentLanguage) {
    player.reload();  // apply the stream change
  }
};

Closed captioning

There are several ways to enable closed-captioning in your receiver.


TTML

Use TTML - Timed Text Markup Language - with cues covering the duration of the video.

To enable:

player.enableCaptions(true, 'ttml', url)

To disable:

player.enableCaptions(false, 'ttml')

By default, the specified TTML position is respected. To ignore the specified position (that is, to always center the captions regardless of what is specified), add the ignoreTtmlPositionInfo configuration property when you create the host, like this:

new cast.player.api.Host({'mediaElement': mediaElement, 'url': url,
                          'ignoreTtmlPositionInfo': true});


WebVTT

Use WebVTT - Web Video Text Tracks - with cues covering the duration of the video.

Add a track element to your HTML and set its src property to the location of your WebVTT file.
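For example, the markup might look like this; the file name and language are placeholders:

```html
<!-- Hypothetical: a WebVTT subtitle track attached to the receiver's video element. -->
<video id="vid">
  <track id="captions" kind="subtitles" srclang="en" src="captions_en.vtt">
</video>
```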

When the host parses WebVTT, by default it uses absolute cue timestamps. To use relative cue timestamps, add the useRelativeCueTimestamps configuration property when you create the host, like this:

new cast.player.api.Host({'mediaElement': mediaElement, 'url': url,
                          'useRelativeCueTimestamps': true});

Segmented TTML & WebVTT

Use Segmented TTML for Smooth Streaming and WebVTT - Web Video Text Tracks for HLS.

To enable:

protocol_.enableStream(streamIndex, true);

To disable:

protocol_.enableStream(streamIndex, false);

Cycling through caption streams

Here is some sample code to cycle through the caption streams with segmented TTML for Smooth Streaming or WebVTT for HLS.

sample.App.prototype.changeCaptions = function() {
  var current, next;
  var streamCount = this.protocol_.getStreamCount();
  var streamInfo;

  // Find the currently enabled text stream, if any.
  for (current = 0; current < streamCount; current++) {
    if (this.protocol_.isStreamEnabled(current)) {
      streamInfo = this.protocol_.getStreamInfo(current);
      if (streamInfo.mimeType.indexOf('text') === 0) {
        break;
      }
    }
  }

  if (current === streamCount) {
    next = 0;
  } else {
    next = current + 1;
  }

  // Walk forward (wrapping around) to the next text stream.
  while (next !== current) {
    if (next === streamCount) {
      next = 0;
      continue;  // re-check the loop condition after wrapping
    }
    streamInfo = this.protocol_.getStreamInfo(next);
    if (streamInfo.mimeType.indexOf('text') === 0) {
      break;
    }
    next++;
  }

  if (next !== current) {
    if (current !== streamCount) {
      this.protocol_.enableStream(current, false);
    }
    if (next !== streamCount) {
      this.protocol_.enableStream(next, true);
    }
    this.player_.reload();  // apply the stream change
  }
};
In-Band Caption Streams

Some caption streams are transmitted "in-band", that is, in the same stream as the audio and video content. To detect and respond to these captions, the callback function onCue (in the Host class) is triggered right before any caption is added to the screen. This enables apps to respond to the presence of in-band caption streams, such as CEA-608. The callback can signal when to display the closed-captioning UI in the sender or receiver. Then, when the user enables or disables closed captioning, the app can use enableCaptions to toggle closed captioning on or off accordingly.

More specifically, your code can do the following:

  1. Start with a default CSS set that hides closed captions.
  2. Load the audio/video stream.
  3. Enable in-band captions:
    player.enableCaptions(true, cast.player.api.CaptionsType.CEA608);
  4. Wait for onCue to be triggered, then disable closed captions in Player in the receiver, signal the presence of closed caption to senders, and remove the CSS hide rules.
  5. From this point forward, closed captions could be triggered from the sender side, just as they would be with separate TTML, WebVTT tracks.

The onCue callback can also be used with non-in-band tracks.

Styling Captions

Captions can be styled using video::cue { /** style here, applies to all cues **/ } in your CSS.

The media tracks are defined in the media load request. With that request, in the provided media info object, you specify the text track style; see the load request APIs of the sender SDKs for details.

Reference Receiver sample

The Google Cast open source Reference Receiver on GitHub demonstrates key features of the Media Player Library. The Reference Receiver supports MPEG-DASH, Smooth Streaming, and HLS adaptive streams. The receiver also supports closed captions and DRM.

Set video/audio bitrates

Override the getQualityLevel method:

  // videoStreamIndex/audioStreamIndex and the forced quality indexes
  // are values your own app tracks.
  host.getQualityLevel = function(streamIndex, qualityLevel) {
    if (streamIndex == videoStreamIndex && videoQualityIndex != -1) {
      return videoQualityIndex;
    } else if (streamIndex == audioStreamIndex &&
        audioQualityIndex != -1) {
      return audioQualityIndex;
    } else {
      return qualityLevel;
    }
  };

DRM (PlayReady/Widevine)

Set the license server URL:


host.licenseUrl = licenseUrl;

Set license custom data:


host.licenseCustomData = customData;

Frequently Asked Questions

No matter what URL I pass to the cast device, I don't get any video. When I play the videos on my local machine, they play.
If you're having problems playing streams on a Cast device, it may be an issue with CORS. Use a publicly available CORS proxy server to test your streams. (Please note that such third-party software as referenced here is not controlled by Google. Google cannot guarantee that third-party software will operate as intended. Please proceed with caution.)
How do I enable all logging?
In the Chrome Remote Debugger console, enable debug logging by entering the following:
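The call for the v1 Media Player Library is likely the following; verify it against the API reference:

```javascript
cast.player.api.setLoggerLevel(cast.player.api.LoggerLevel.DEBUG);
```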
How do I enable live streams?
Live streams should just work out of the box; simply provide the URL.
To start at "live", you can pass the Infinity value as the initialTime parameter to the player.load API call.
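For example:

```javascript
// Start playback at the live edge rather than the beginning of the stream.
window.player.load(protocol, Infinity);
```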
How do I create my own Adaptive Bit Rate (ABR) algorithm?
Using trackBandwidth for each stream, you should be able to build a model of the environment, and then in getQualityLevel return the track you want loaded. player.getBufferDuration can also be used to feed the ABR algorithm.
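A minimal sketch of such a decision function is shown below. The qualityBandwidths array (declared bandwidths in bits per second, ascending) and the 0.8 safety factor are assumptions for illustration; in a real app you would feed it your own bandwidth estimate and wire the result into a getQualityLevel override:

```javascript
// Pick the highest quality level whose declared bandwidth fits within
// a safety fraction of the measured bandwidth.
function pickQualityLevel(measuredBps, qualityBandwidths) {
  var chosen = 0;
  for (var i = 0; i < qualityBandwidths.length; i++) {
    if (qualityBandwidths[i] <= measuredBps * 0.8) {
      chosen = i;
    }
  }
  return chosen;
}

// Hypothetical wiring (currentEstimateBps and bandwidthsForStream are
// placeholders for your own app state):
// host.getQualityLevel = function(streamIndex, qualityLevel) {
//   return pickQualityLevel(currentEstimateBps, bandwidthsForStream[streamIndex]);
// };
```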
How do I seek within a stream?
Seek normally, by setting mediaElement.currentTime to the target time in seconds.
While debugging, how can I fix my stream bandwidth?
Your getQualityLevel override should always return the index of the playlist you want to use. You could also fix the bandwidth by changing the URL to always point at the correct bitrate at updateSegmentRequestInfo time.
I need cookies when I get my segments; how do I enable this?

For cross-domain requests, you need to override host.updateSegmentRequestInfo and set requestInfo.withCredentials to true. You will probably also need to set something in the headers or modify the URL to start the process. If you also need these for your manifests, then override host.updateManifestRequestInfo.

host.updateManifestRequestInfo = function(requestInfo) {
  if (!requestInfo.url) {
    requestInfo.url = this.url;
  }
  requestInfo.withCredentials = true;
};

host.updateLicenseRequestInfo = function(requestInfo) {
  requestInfo.withCredentials = true;
};

host.updateSegmentRequestInfo = function(requestInfo) {
  requestInfo.withCredentials = true;
};
Do you have any tricks or ideas on how I can get my media to play faster?
The most basic is to make sure that when you encode your streams, you "Optimize" the stream -- make sure that all unnecessary atoms are removed, and atoms and tags are sorted before the data segments. This needs to be done before you segment your content.
Some developers choose not to encrypt the start of their content. This way they can start playing before the license server has completed providing the keys. This is typically no more than 30 seconds of content.
The second movie I try to play doesn't play.
You need to make sure you call player.unload() before loading another movie.
I'm getting Media Element 'pause' events when I'm not expecting them - what's going on?
Your 'pause' event handler should check player.getState().underflow (a boolean); if it's true, bring up your 'buffering' UI, otherwise the paused UI.
How can I detect the buffering state?
When you are using MPL for adaptive streaming, you should check the MPL state for data underflow instead of checking for waiting/stalled events of the media element. MPL pauses playback to buffer enough media data in the source buffer to resume playback without stutter.

Here is some sample code:

  1. Listen to the pause and play events on the media element.
    videoElement.addEventListener('pause', onPause);
    videoElement.addEventListener('play', onPlay);
  2. In the onPause handler, call cast.player.api.Player.getState and check the underflow property of the return value. If it is true, assume that the receiver is buffering. If it is false, the video is paused for a different reason, for example the video was paused by the user.
    function onPause() {
      if (castPlayer.getState().underflow) {
        // Video is paused because of buffering;
        // handle the buffering event here.
      } else {
        // Video is paused for another reason.
      }
    }
  3. In the onPlay handler, you can assume that buffering has ended (and playback resumes).
    function onPlay() {
      // Buffering has ended.
    }
How can I play multiple streams without the receiver getting an additional onLoad(), such as for Ads?

When the receiver sees the mediaElement 'ended' event, it stops updating status. To fix this, you need to intercept the 'ended' message with your own method, then call the default implementation at the real end of playback. Note: you'll need to capture this before starting up the Media Player Library, or anything else that might add to the event.

window.defaultOnEnded = window.mediaManager.onEnded.bind(window.mediaManager);  // grab the default
window.mediaManager.onEnded = function() {
  // any action you need
  if (theLastVideoToPlay) {
    window.defaultOnEnded();  // call the default behavior
  }
};
How do I set the initial bandwidth?
To set the initial bandwidth used by the ABR quality manager (the default is 2 Mbps):
host.initialBandwidth = 3 * 1024 * 1024; // 3 Mbps