
Using the Media Player Library

The Google Cast Media Player Library, described in the Media Player Library API reference, makes it easy for you to build rich media experiences for your users. It provides JavaScript support for parsing manifests and playing HTTP Live Streaming (HLS), MPEG-DASH, and Smooth Streaming content. It also provides support for HLS AES encryption, PlayReady DRM, and Widevine DRM.

Your receiver app accesses the Media Player Library with the following reference:

//www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js

In production applications, you should not specify the protocol when sourcing the Media Player Library; this allows the resource to load with the same protocol as the host server, which must be HTTPS in production. While testing, you may use HTTP. However, if you are implementing AES-128 encrypted HLS media streams, you must source the Media Player Library over HTTPS, even during testing.
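For example, a receiver page might reference the library with a protocol-relative URL, as in this sketch (the version number is illustrative):

```html
<!-- Protocol-relative reference: loads over the same scheme as the page. -->
<script type="text/javascript"
    src="//www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js">
</script>
```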

Using Cross-Origin Resource Sharing (CORS)

Google Cast fully supports Cross-Origin Resource Sharing (CORS). Unlike most file-based protocols, streaming protocols access content asynchronously using XMLHttpRequest. In a CORS world, these requests are guarded against inappropriate access by the CORS headers from the server where the resource originates. This means that your content's server has a say in where it may be included. Most modern browsers fully support CORS; iOS and Android devices access the content at a lower level and do not look at these headers. This is often the first issue that comes up when a developer wants to use streaming content.

Many developers find that their servers are configured to need CORS; as they start to fix this, they progressively discover that all of their assets need CORS headers, including manifests, secondary manifests, segments, and crypto keys. You can choose whether to keep up with this based on the configuration of your server(s) and CDNs.

The way to set up your CORS headers is to give your media server the addresses of all the servers where you host your receiver application (development, staging, and external servers). For example, Google Cast stores sample files on Google Cloud Storage. To enable all receivers, no matter where they are hosted, Google Cast uses the following permissive cors.json file.

[
  {
    "origin": ["*"],
    "responseHeader": ["Content-Type"],
    "method": ["GET", "HEAD"],
    "maxAgeSeconds": 86400
  }
]

Note: a permissive CORS header like this is unlikely to be suitable for production. Typically, you would replace the '*' in "origin" with the URL of your receiver.
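For instance, a production cors.json might restrict the origin to the host serving your receiver; the hostname below is a placeholder:

```json
[
  {
    "origin": ["https://your-receiver.example.com"],
    "responseHeader": ["Content-Type"],
    "method": ["GET", "HEAD"],
    "maxAgeSeconds": 86400
  }
]
```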

This header is set with the gsutil command:

$ gsutil cors set cors.json gs://location
Setting CORS on gs://location/...

CORS must be enabled on any website(s) where you host media content, including your CDN. The Enable-CORS website provides examples for many other servers, including Apache.

Loading adaptive bitrate media streams

For the adaptive bitrate streaming protocols, where you must implement CORS, the recommended way to pass the URL is to include the MIME type in the contentType property of the LOAD command, as follows:

  • HLS: application/x-mpegurl or application/vnd.apple.mpegurl
  • DASH: application/dash+xml
  • Smooth Streaming: application/vnd.ms-sstr+xml
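As a sketch, this mapping can be captured in a small sender-side helper; the function name contentTypeForProtocol is hypothetical, and the HLS entry uses the x-mpegurl variant:

```javascript
// Hypothetical helper: map a streaming protocol name to the MIME type to
// place in the contentType property of the LOAD command.
function contentTypeForProtocol(protocolName) {
  var types = {
    'HLS': 'application/x-mpegurl',  // or 'application/vnd.apple.mpegurl'
    'DASH': 'application/dash+xml',
    'SMOOTH': 'application/vnd.ms-sstr+xml'
  };
  // Returns null for anything that is not an ABR protocol we recognize.
  return types[protocolName] || null;
}
```

For example, contentTypeForProtocol('DASH') returns 'application/dash+xml', which you would set on the media information before issuing the LOAD command.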

Demonstration code

The demonstration code provided below can be used as-is. You just need to do the following:

  • Host it on your server.
  • Enable CORS for your content on your server (manifests, sub-manifests, and segments).
  • Connect your sender application to your receiver.
  • Issue a Load command with the URL.

This code implements a trivial parser that determines the content type from the file's extension. It is for example only; you should implement your own mechanism for this.

  • HLS: .m3u8
  • DASH: .mpd
  • SmoothStreaming: .ism

You can, of course, modify the code, if you are serving something different.

Host, player, and protocol

The cast.player.api.Player listens to the provided media element and media source events. A Player instance funnels data from a Host source to a media element. The Host is a cast.player.api.Host object that you instantiate with a media element and a URL to the media. The Player instance loads a protocol of a supported type, determined by the manifest of the media content.

Creating a Host object

Create a default host object constructor with your URL and MediaElement as follows:

var host = new cast.player.api.Host({'mediaElement':mediaElement, 'url':url});

If your server requires CORS and cookie information in order to access the media, set the property withCredentials to true and set the header information as needed.

host.updateSegmentRequestInfo = function(requestInfo) {
  // example of setting CORS withCredentials
  requestInfo.withCredentials = true;
  // example of setting headers
  requestInfo.headers = {};
  requestInfo.headers['content-type'] = 'text/xml;charset=utf-8';
};

It is usually a good idea to have some error handling or reporting when there is a problem with the host. All you need to do is add callbacks to the host object; you can also override any desired methods by replacing them on the host.

host.onError = function(errorCode) {
  console.log("Fatal Error - " + errorCode);
  if (window.player) {
    window.player.unload();
    window.player = null;
  }
};

Creating a Player object

A Player instance is then created by calling the constructor function and passing in the Host object.

// Create a Player
window.player = new cast.player.api.Player(host);

Before a Player can begin playback of the media, it needs to know how the media data will be provided. This is accomplished by creating the appropriate StreamingProtocol object and providing it when asking the Player instance to load the media data. The Cast Media Player Library provides three functions for creating the appropriate adaptive bitrate protocol object.

Creating a Protocol object

Create the protocol handler as follows:

protocol = cast.player.api.CreateHlsStreamingProtocol(host);

or

protocol = cast.player.api.CreateDashStreamingProtocol(host);

or

protocol = cast.player.api.CreateSmoothStreamingProtocol(host);

A mechanism for implementing your own streaming protocol is also planned.

Now, with the protocol in hand, you can tell the Player object that it should load the media using the provided protocol and start at a specific position (typically 0).

window.player.load(protocol, initStart);

Here's the full code example:

<body>
<video id='vid' />
<script type="text/javascript"
    src="//www.gstatic.com/cast/sdk/libs/receiver/<version_number>/cast_receiver.js">
</script>
<script type="text/javascript"
    src="//www.gstatic.com/cast/sdk/libs/mediaplayer/<version_number>/media_player.js">
</script>

<script type="text/javascript">
// If ?Debug=true is set in the URL (for example, when testing with a
// different App ID from the developer console), include debugging information.
if (window.location.href.indexOf('Debug=true') != -1) {
  cast.receiver.logger.setLevelValue(cast.receiver.LoggerLevel.DEBUG);
  cast.player.api.setLoggerLevel(cast.player.api.LoggerLevel.DEBUG);
}

var mediaElement = document.getElementById('vid');

// Create the media manager. This will handle all media messages by default.
window.mediaManager = new cast.receiver.MediaManager(mediaElement);

// Remember the default value for the Receiver onLoad, so this sample can play
// non-adaptive media as well.
window.defaultOnLoad = mediaManager.onLoad;
mediaManager.onLoad = function (event) {
// The Media Player Library requires that you call player unload between
// different invocations.
  if (window.player !== null) {
    window.player.unload();    // Must unload before starting again.
    window.player = null;
  }
// This trivial parser is by no means best practice; it shows how to access
// event data, and uses a string search of the file suffix rather than looking
// at the MIME type, which would be better. In practice, you will know what
// content you are serving while writing your player.
  if (event.data['media'] && event.data['media']['contentId']) {
    console.log('Starting media application');
    var url = event.data['media']['contentId'];
// Create the Host - much of your interaction with the library uses the Host and
// methods you provide to it.
    window.host = new cast.player.api.Host(
      {'mediaElement':mediaElement, 'url':url});
    var ext = url.substring(url.lastIndexOf('.'), url.length);
    var initStart = event.data['media']['currentTime'] || 0;
    var autoplay = (event.data['autoplay'] !== false);  // default to autoplay
    var protocol = null;
    mediaElement.autoplay = autoplay;  // Make sure autoplay gets set
    if (url.lastIndexOf('.m3u8') >= 0) {
// HTTP Live Streaming
      protocol = cast.player.api.CreateHlsStreamingProtocol(host);
    } else if (url.lastIndexOf('.mpd') >= 0) {
// MPEG-DASH
      protocol = cast.player.api.CreateDashStreamingProtocol(host);
    } else if (url.indexOf('.ism/') >= 0) {
// Smooth Streaming
      protocol = cast.player.api.CreateSmoothStreamingProtocol(host);
    }
// How to override a method in Host. I know that it's safe to just provide this
// method.
    host.onError = function(errorCode) {
      console.log("Fatal Error - " + errorCode);
      if (window.player) {
        window.player.unload();
        window.player = null;
      }
    };
// If you need cookies, set withCredentials = true and also set any header
// information you need. If you don't need cookies, setting this value can
// have some unexpected effects.
//      host.updateSegmentRequestInfo = function(requestInfo) {
//        requestInfo.withCredentials = true;
//      };
    console.log("we have protocol " + ext);
    if (protocol !== null) {
      console.log("Starting Media Player Library");
      window.player = new cast.player.api.Player(host);
      window.player.load(protocol, initStart);
    }
    else {
      window.defaultOnLoad(event);    // do the default process
    }
  }
};
window.player = null;
console.log('Application is ready, starting system');
window.castReceiverManager = cast.receiver.CastReceiverManager.getInstance();
castReceiverManager.start();
</script>
</body>

See the Media Player API Reference for additional classes and methods.

Cycling through audio streams

Here's some code for cycling through audio streams.

window.changeLanguage = function() {
 var currentLanguage = null;
 var streamCount = protocol.getStreamCount();
 var streamInfo;
 for (var i = 0; i < streamCount; i++) {
   if (protocol.isStreamEnabled(i)) {
     streamInfo = protocol.getStreamInfo(i);
     if (streamInfo.mimeType.indexOf('audio') === 0) {
       if (streamInfo.language) {
         currentLanguage = i;
         break;
       }
     }
   }
 }

 if (currentLanguage === null) {
   currentLanguage = 0;
 }

 i = currentLanguage + 1;
 while (i !== currentLanguage) {
   if (i === streamCount) {
     i = 0;
   }

   streamInfo = protocol.getStreamInfo(i);
   if (streamInfo.mimeType.indexOf('audio') === 0) {
     protocol.enableStream(i, true);
     protocol.enableStream(currentLanguage, false);
     break;
   }

   i++;
 }

 if (i !== currentLanguage) {
   player.reload();
 }
};

Closed captioning

There are several ways to enable closed-captioning in your receiver.

TTML

Use TTML (Timed Text Markup Language) with cues covering the duration of the video.

To enable:

player.enableCaptions(true, 'ttml', url)

To disable:

player.enableCaptions(false, 'ttml')

WebVTT

Use WebVTT (Web Video Text Tracks) with cues covering the duration of the video.

Add a track element to your HTML and set its src property to the WebVTT file.
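A minimal sketch of such a track element (the caption file name is a placeholder):

```html
<video id='vid'>
  <track kind='captions' src='captions_en.vtt' srclang='en'
      label='English' default>
</video>
```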

Segmented TTML & WebVTT

Use segmented TTML for Smooth Streaming and WebVTT (Web Video Text Tracks) for HLS.

To enable:

protocol_.enableStream(streamIndex, true);
player_.enableCaptions(true);

To disable:

protocol_.enableStream(streamIndex, false);
player_.enableCaptions(false);

Cycling through caption streams

Here is some sample code to cycle through the caption streams with segmented TTML for Smooth Streaming or WebVTT for HLS.

sample.App.prototype.changeCaptions = function() {
  var current, next;
  var streamCount = this.protocol_.getStreamCount();
  var streamInfo;
  for (current = 0; current < streamCount; current++) {
    if (this.protocol_.isStreamEnabled(current)) {
      streamInfo = this.protocol_.getStreamInfo(current);
      if (streamInfo.mimeType.indexOf('text') === 0) {
        break;
      }
    }
  }

  if (current === streamCount) {
    next = 0;
  } else {
    next = current + 1;
  }

  while (next !== current) {
    if (next === streamCount) {
      next = 0;
    }

    streamInfo = this.protocol_.getStreamInfo(next);
    if (streamInfo.mimeType.indexOf('text') === 0) {
      break;
    }

    next++;
  }

  if (next !== current) {
    if (current !== streamCount) {
      this.protocol_.enableStream(current, false);
      this.player_.enableCaptions(false);
    }

    if (next !== streamCount) {
      this.protocol_.enableStream(next, true);
      this.player_.enableCaptions(true);
    }
  }
};

Styling WebVTT Captions

WebVTT captions can be styled in your CSS using video::cue { /* style here; applies to all cues */ }.
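For example, the following rule (the specific styles are arbitrary) applies to all cues:

```css
video::cue {
  color: yellow;
  background-color: rgba(0, 0, 0, 0.6);
  font-size: 1.2em;
}
```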

Cast Media Player Streaming DRM sample

The Google Cast open source Streaming DRM sample app on GitHub demonstrates key features of the Media Player Library and covers both the Cast sender and receiver aspects. It uses a custom media channel for communication between sender and receiver to exercise these key features.

The sender code (index.html and sender.js) uses the Chrome Sender API and provides access to a choice of streams in MPEG-DASH and HTTP Live Streaming (HLS) formats, as well as custom media sources.

The receiver app (mpl.html and mpl.js) features a media element overlaid with a debugging UI. You can use the interface to test and debug the MPL features that interest you. Both the media element and debugging info layers can be turned on and off from the sender side to make debugging easier. The following code snippets show features such as closed captions, adaptive bitrate streams, and DRM/license servers.

Closed captions

The following code snippets show how to display external captions in either WebVTT or TTML format:

Sender:

sendMessage({'type':'WebVTT'});

Receiver:

mediaPlayer.enableCaptions(true,'webvtt','captions.vtt');

Here, the sender's sendMessage method passes a message via a custom message channel, as follows:

/**
 * Sends a message to the receiver over the custom message channel.
 * @param {!Object|string} message The message to send.
 */
function sendMessage(message) {
  if (session != null) {
    session.sendMessage(MESSAGE_NAMESPACE, message, onSuccess, onError);
  }
}

For in-stream captions, you can send a message with track number:

Sender:

  message = {
    type: 'ENABLE_CC',
    trackNumber: trackNumber
  };
  session.sendMessage(MESSAGE_NAMESPACE, message, onSuccess, onError);

Receiver:

protocol.enableStream(trackNumber, true);
mediaPlayer.enableCaptions(true);

Set video/audio bitrates

Sender:

/**
 * send a custom message to receiver
 * @param {string} qualityIndex A string representing quality index
 * @param {string} mediaType A media type string
 */
function setQualityLevel(qualityIndex, mediaType) {
  sendMessage({'type': 'qualityIndex', 'value': qualityIndex,
    'mediaType': mediaType});
}

Receiver:

Override the getQualityLevel method:
  mediaHost.getQualityLevel = function(streamIndex, qualityLevel) {
    if (streamIndex == videoStreamIndex && videoQualityIndex != -1) {
      return videoQualityIndex;
    } else if (streamIndex == audioStreamIndex &&
        audioQualityIndex != -1) {
      return audioQualityIndex;
    } else {
      return qualityLevel;
    }
  };

DRM (PlayReady/Widevine)

Set the license server URL.

Sender:

sendMessage({'type': 'license', 'value': licenseUrl});

Receiver:

mediaHost.licenseUrl = licenseUrl;

Set license custom data.

Sender:

sendMessage({'type': 'customData','value': customData});

Receiver:

mediaHost.licenseCustomData = customData;

Frequently Asked Questions

No matter what URL I pass to the Cast device, I don't get any video, even though the videos play on my local machine.
If you're having problems playing streams on a Cast device, it may be an issue with CORS. Use a publicly available CORS proxy server to test your streams. (Please note that third-party software such as that referenced here is not controlled by Google. Google cannot guarantee that third-party software will operate as intended. Please proceed with caution.)
How do I enable all logging?
In the Chrome Remote Debugger console, enable debug logging by entering the following:
cast.player.api.setLoggerLevel(cast.player.api.LoggerLevel.DEBUG);
How do I enable live streams?
Live streams should work out of the box; just provide the URL.
To start at the live position, you can specify Infinity as the initialTime parameter of the player.load API call.
How do I create my own Adaptive Bit Rate (ABR) algorithm?
Using trackBandwidth for each stream, you can build a model of the environment, and then return the track you want loaded from getQualityLevel. player.getBufferDuration can also be used to feed the ABR algorithm.
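As a rough sketch (not the library's own algorithm), you could wrap a bandwidth estimate in a picker function and assign its result to the host's getQualityLevel; makeQualityPicker and the bitrate table are hypothetical:

```javascript
// Hypothetical sketch of a custom ABR policy. qualityBitrates lists the
// bandwidth (bits/sec) required by each quality level, lowest first.
function makeQualityPicker(estimatedBandwidth, qualityBitrates) {
  return function(streamIndex, defaultQualityLevel) {
    var level = 0;
    // Pick the highest level the estimated bandwidth can sustain.
    for (var i = 0; i < qualityBitrates.length; i++) {
      if (qualityBitrates[i] <= estimatedBandwidth) {
        level = i;
      }
    }
    return level;
  };
}

// Example wiring (host is your cast.player.api.Host instance):
// host.getQualityLevel = makeQualityPicker(3e6, [1e6, 2e6, 4e6]);
```

In a real implementation you would update the estimate from trackBandwidth measurements and player.getBufferDuration rather than fixing it at construction time.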
How do I seek within a stream?
Seek normally, using mediaElement.currentTime = timeInSeconds;
While debugging, how can I fix my stream bandwidth?
getQualityLevel should always return the index of the playlist you want to use. You could also fix it by changing the URL to always point at the correct bitrate in updateSegmentRequestInfo.
I need cookies when I get my segments, how do I enable this?

For cross-domain requests, you need to override host.updateSegmentRequestInfo and set requestInfo.withCredentials to true. You will probably also need to set something in the headers or modify the URL to start the process. If you also need these for your manifests, then override host.updateManifestRequestInfo.

host.updateManifestRequestInfo = function(requestInfo) {
  if (!requestInfo.url) {
    requestInfo.url = this.url;
  }
  requestInfo.withCredentials = true;
};

host.updateLicenseRequestInfo = function(requestInfo) {
  requestInfo.withCredentials = true;
};

host.updateSegmentRequestInfo = function(requestInfo) {
  requestInfo.withCredentials = true;
};
Do you have any tricks or ideas on how I can get my media to play faster?
The most basic is to make sure that when you encode your streams, you "Optimize" the stream -- make sure that all unnecessary atoms are removed, and atoms and tags are sorted before the data segments. This needs to be done before you segment your content.
Some developers choose not to encrypt the start of their content. This way they can start playing before the license server has completed providing the keys. This is typically no more than 30 seconds of content.
The second movie I try to play doesn't play.
Make sure you call player.unload() before loading another movie.
I'm getting Media Element 'pause' events when I'm not expecting them. What's going on?
Your 'pause' event handler should check player.getState().underflow (a boolean); if it's true, bring up your buffering UI, otherwise your paused UI.
How can I detect the buffering state?
When you are using MPL for adaptive streaming, you should check the MPL state for data underflow instead of checking for waiting/stalled events of the media element. MPL pauses playback to buffer enough media data in the source buffer to resume playback without stutter.

Here is some sample code:

  1. Listen to the pause and play events on the media element.
    videoElement.addEventListener('pause', onPause);
    videoElement.addEventListener('play', onPlay);
    
  2. In the onPause handler, call cast.player.api.Player.getState and check the underflow property of the return value. If it is true, assume that the receiver is buffering. If it is false, the video is paused for a different reason; for example, the video was paused by the user.
    function onPause() {
      if (castPlayer.getState().underflow) {
        // Video is paused because of buffering;
        // handle the buffering event here.
      } else {
        // Video is paused for another reason.
      }
    }
    
  3. In the onPlay handler, you can assume that buffering has ended and playback has resumed.
    function onPlay() {
      // Buffering has ended; playback resumes.
    }
    
How can I play multiple streams without the receiver getting an additional onLoad(), such as for Ads?

When the receiver sees the mediaElement 'ended' event, it stops updating status. To fix this, intercept the 'ended' handling with your own method, then call the default implementation at the real end of playback. Note: you'll need to capture the default before starting the Media Player Library, or anything else that might attach to the event.

window.defaultOnEnded = window.mediaManager.onEnded;  // grab the default
window.mediaManager.onEnded = function() {
  // any action you need
  if(theLastVideoToPlay) {
    window.defaultOnEnded();  // call the default behavior
  }
};
How do I set the initial bandwidth?
To set the initial bandwidth used by the ABR quality manager (the default is 2 Mbps):
host.initialBandwidth = 3 * 1024 * 1024; // 3 Mbps