Custom Receiver Application

This document provides an overview of building a custom Google Cast receiver application. A Cast receiver is an application created using HTML, JavaScript and CSS. It is loaded onto a Cast device (for example, a Chromecast) through a URL that is accessible over the network to which the Cast device is connected.

Receiver Diagram

Google Cast Receiver SDK

Your receiver app accesses the Receiver API with the following reference:

<script src="//www.gstatic.com/cast/sdk/libs/receiver/2.0.0/cast_receiver.js"></script>
Do not self-host the cast_receiver.js resource. Updates will be applied periodically to address bug fixes and new features. Self-hosting will prevent a receiver app from benefiting from these changes.

Do not include the protocol in the URL when sourcing the cast_receiver.js resource. By not specifying http or https, the resource is fetched using the same protocol as the server hosting the receiver application. This is advantageous because it means that switching from http to https (receiver apps should be hosted on TLS-capable servers) is transparent and requires no code change.


Before developing a custom receiver application, you will need to register your app with the Google Cast SDK Developer Console. See Registration for more information. All receiver applications require sender applications to provide an app ID with the command messages they send to the receiver through the sender API. When you register your application, you will receive the app ID to include in your sender's API calls.

Device capabilities

The CAST-DEVICE-CAPABILITIES HTTP header, provided by the Cast device during application launch, describes which aspects of a media stream the device supports, so receiver apps can optimize for them. For example, an audio-only device sends a request with this header, and the server hosting the receiver application can redirect the request to a version of the receiver optimized for Google Cast for audio devices.

The header contains a JSON-encoded string with an object describing the device capabilities as key/value pairs. For example, here's a header describing "display_supported":false for Google Cast for audio devices:

GET /player_playlist_receiver.html HTTP/1.1
Connection: keep-alive
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
User-Agent: Mozilla/5.0 (X11; Linux armv7l) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.51 Safari/537.36 CrKey/1.12.26924
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.8
CAST-DEVICE-CAPABILITIES: {"display_supported":false}
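On the server hosting the receiver, this header can drive the redirect. A minimal sketch, assuming hypothetical page names and helper name:

```javascript
// Pick a receiver page based on the CAST-DEVICE-CAPABILITIES header value.
// The function and page names are illustrative, not part of the Cast API.
function pickReceiverPage(headerValue) {
  var capabilities = {};
  try {
    capabilities = JSON.parse(headerValue || '{}');
  } catch (e) {
    // Malformed or missing header: fall back to the default receiver page.
  }
  if (capabilities['display_supported'] === false) {
    // Audio-only device: serve the audio-optimized receiver.
    return 'audio_receiver.html';
  }
  return 'player_playlist_receiver.html';
}
```

A web server would call this with the raw header value and issue a 302 redirect to the returned page.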

This same information is available in the ApplicationData object, and once the app is loaded you can call CastReceiverManager.getDeviceCapabilities() to retrieve the device capabilities object directly.

Here is an example of checking if the receiver device is capable of playing HDR and DolbyVision (DV):

const castReceiverManager = cast.receiver.CastReceiverManager.getInstance();
castReceiverManager.onReady = (event) => {
  const deviceCapabilities = castReceiverManager.getDeviceCapabilities();
  if (deviceCapabilities && deviceCapabilities['is_hdr_supported']) {
    // HDR supported
  }
  if (deviceCapabilities && deviceCapabilities['is_dv_supported']) {
    // Dolby Vision supported
  }
};

Application life cycle

The receiver application life cycle starts when the receiver is loaded onto the Cast device and proceeds to the point at which the application is torn down and the Cast device reverts to its default state.

Receiver Life Cycle

Over the life cycle of a receiver application, messages are exchanged between the receiver and any connected sender applications. A sender application will send an initial message to a Google Cast device requesting a session be created using a specific application ID. This kicks off the life cycle of the receiver as the Google Cast device will attempt to load the receiver application. Assuming there are no network issues, the receiver application will be downloaded from the network using the resolved URL associated with the application ID. Once loaded, the receiver application will perform its setup operations and indicate that it is ready to process messages from any connected sender applications.

A receiver application may tear down (end its current life cycle) under the following conditions:

  • The receiver application gets a message from a connected sender to end the application session.
  • The receiver application has been idle for a defined period of time without any connected senders and decides to end the application session.
  • The receiver encounters a fatal error during its normal life cycle (rare but possible).
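The idle-timeout condition above can be sketched as a small predicate; the threshold and function name are illustrative, not part of the Cast API:

```javascript
// Hypothetical idle threshold: tear down after five minutes with no senders.
var IDLE_TIMEOUT_MS = 5 * 60 * 1000;

// Returns true when the receiver should end its session: no connected
// senders and the idle period has fully elapsed.
function shouldTearDown(connectedSenderCount, idleMs) {
  return connectedSenderCount === 0 && idleMs >= IDLE_TIMEOUT_MS;
}
```

On a real receiver this check would run on a timer, with the sender count taken from CastReceiverManager.getSenders().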

Initialization and readiness

When a Google Cast receiver application starts up, there is a minimum set of tasks that should be completed:

  • Get and hold the provided instance of the CastReceiverManager.
  • Override/provide any event listeners on the CastReceiverManager.
  • For media applications, identify the primary media element and connect it to a MediaManager.
  • For media applications, override/provide any event listeners on the MediaManager.

    See Media, below, for more information on the default event listeners.

  • For custom applications or media applications with custom protocol extensions (namespaces), call getCastMessageBus for the namespaces to be used.
  • Call start on the CastReceiverManager to indicate the receiver application is ready to receive messages.

Here is an example of a simple custom media application.

<html>
<head>
  <title>Example minimum receiver</title>
  <script src="//www.gstatic.com/cast/sdk/libs/receiver/2.0.0/cast_receiver.js"></script>
</head>
<body>
  <video id='media'/>
  <script>
    window.mediaElement = document.getElementById('media');
    window.mediaManager = new cast.receiver.MediaManager(window.mediaElement);
    window.castReceiverManager = cast.receiver.CastReceiverManager.getInstance();
    window.castReceiverManager.start();
  </script>
</body>
</html>

Session management

A receiver application may be interested in tracking when a sender connects or disconnects from it. This is accomplished using the CastReceiverManager singleton instance whereby the receiver application registers listener functions on the onSenderConnected and onSenderDisconnected properties.

When a receiver application is started through a sender requesting a session, and subsequently, when sender applications connect to an active receiver, the receiver is notified of these connections. Similarly, for the disconnection of senders, the receiver application is notified when a sender disconnects from its active session.

The disconnect event provides a reason field that can be used to determine why the connection to the receiver was dropped. The receiver app will be able to distinguish between explicit disconnect events when the user clicks on the disconnect button in the Cast menu and other disconnects caused by WiFi issues. The design checklist recommends that if the user explicitly disconnects from the receiver, the receiver should close if no other senders are connected. This logic can be implemented in JavaScript in the receiver:

window.castReceiverManager.onSenderDisconnected = function(event) {
  if (window.castReceiverManager.getSenders().length == 0 &&
      event.reason == cast.receiver.system.DisconnectReason.REQUESTED_BY_SENDER) {
    window.close();
  }
};

When the user selects to explicitly disconnect, the sender app should close the connection to the receiver and not stop the receiver application.

Application configuration

When a receiver application calls the CastReceiverManager.start function, it may provide optional configuration data.

/**
 * Application config.
 */
var appConfig = new cast.receiver.CastReceiverManager.Config();

/**
 * Text that represents the application status. It should meet
 * internationalization rules as it may be displayed by the sender application.
 * @type {string|undefined}
 */
appConfig.statusText = 'Ready to play';

/**
 * Maximum time in seconds before closing an idle
 * sender connection. Setting this value enables a heartbeat message to keep
 * the connection alive. Used to detect unresponsive senders faster than
 * typical TCP timeouts. The minimum value is 5 seconds; there is no upper
 * bound enforced, but practically it's minutes before platform TCP timeouts
 * come into play. Default value is 10 seconds.
 * @type {number|undefined}
 */
// 100 minutes for testing; use the default 10 seconds in production by not setting this value.
appConfig.maxInactivity = 6000;

/**
 * Initializes the system manager. The application should call this method when
 * it is ready to start receiving messages, typically after registering
 * to listen for the events it is interested in.
 */
castReceiverManager.start(appConfig);

If the receiver API detects that a sender is disconnected it will raise the senderdisconnected event. If the receiver has not been able to communicate with the sender for maxInactivity seconds, it will also raise the senderdisconnected event.

During development it is a good idea to set maxInactivity to a high value. For a published receiver application it is better to not set this value and instead rely on the default value.


Messages

Message exchange is the key interaction method for receiver applications. Sender applications can command and control a receiver application using messages. Similarly, receiver applications can keep senders informed about the state of the receiver by sending messages to connected senders.

A sender issues messages to a receiver using the sender APIs for the platform the sender is running (Android, iOS, Chrome). A receiver application can also send messages using the Google Cast Receiver APIs. A receiver can send messages to an individual sender, either in response to a received message or due to an application state change. Beyond point-to-point messaging, a receiver may also broadcast messages to all connected senders.
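The point-to-point versus broadcast distinction can be illustrated with a stubbed message bus; the stub below stands in for the object a real receiver obtains from castReceiverManager.getCastMessageBus, so the snippet runs outside a Cast device:

```javascript
// Record of delivered messages, for illustration only.
var sent = [];

// Stub with the same send/broadcast shape as cast.receiver.CastMessageBus.
var messageBus = {
  send: function(senderId, message) {   // point-to-point: one sender
    sent.push({to: senderId, message: message});
  },
  broadcast: function(message) {        // all connected senders
    sent.push({to: '*', message: message});
  }
};

// Reply to one sender, then notify every connected sender of a state change.
messageBus.send('sender-1', JSON.stringify({type: 'ACK'}));
messageBus.broadcast(JSON.stringify({type: 'STATE', playing: true}));
```

The sender ID used with send() comes from the message event (or the connect event) delivered by the Receiver API.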

The event object (which is the manifestation of a message) that is passed to the event listeners has a data element (event.data) where the data takes on the properties of the specific event type. For example, in the case of the onSystemVolumeChanged event, the data is made up of all the properties of the system volume data type. So, to get the volume level, the application can query the event.data.level value.

Namespace and protocols

A namespace is a labeled protocol. That is, messages that are exchanged throughout the Google Cast ecosystem utilize namespaces to identify the protocol of the message being sent. Messages are string-based, but the encoding is specific to the namespace. A common encoding mechanism is JSON and it is used for example by the Cast media protocol (media namespace).

Namespaces are a powerful mechanism for standardizing protocols to be used by multiple sender applications. For example an application developer may decide to implement and define a library for senders that implements their own custom protocol. In this way a community of developers can create applications that know how to communicate with a common receiver application. This is the reason the cast media namespace is standardized, so a generic remote control can be created.

Receiver applications use the cast.receiver.CastMessageBus to manage messaging within a specific namespace. As well, a receiver may communicate directly with a connected sender by asking the CastMessageBus for a cast.receiver.CastChannel which will provide a context for messages to be exchanged within the namespace defined for the CastMessageBus.

A receiver application may choose to listen for messages on a specified namespace. By virtue of doing so, the receiver application is said to support that namespace protocol. It is then up to any connected senders wishing to communicate on that namespace to use the appropriate protocol.

var customMessageBus = castReceiverManager.getCastMessageBus('urn:x-cast:super.awesome.example');
customMessageBus.onMessage = function(event) {
   // Handle message
};

All namespaces are defined by a string and must begin with urn:x-cast: followed by any string. For example, urn:x-cast:com.example.cast.mynamespace.
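This naming rule can be captured in a small validation helper (the function name is illustrative):

```javascript
// Returns true when ns is a syntactically valid custom Cast namespace:
// the required urn:x-cast: prefix followed by a non-empty string.
function isValidCastNamespace(ns) {
  var prefix = 'urn:x-cast:';
  return typeof ns === 'string' &&
      ns.indexOf(prefix) === 0 &&
      ns.length > prefix.length;
}
```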

Media messages sent using the Google Cast APIs on both the sender and receiver use the media namespace protocol by convention. See Media messages, below.


Media

Most receiver applications will utilize media in some form or another. The Google Cast APIs have built-in support for basic media operations such as load, play, pause and stop. For a complete list of supported operations, refer to the Media Playback Messages. The Google Cast API object MediaManager provides automatic message handling for media namespace messages. The MediaManager object is a first-responder for media events that are issued on the media element it manages:

var mediaElement = document.getElementById('media');  // eg. <video id='media'/>
var mediaManager = new cast.receiver.MediaManager(mediaElement);

This allows Google Cast sender applications to utilize their respective platform APIs to send media messages that will be processed automatically by the receiver’s instantiated MediaManager object. This approach is appropriate for simple media such as mp4, png, mp3, etc.

You can determine if a particular codec is supported with a call to the DOM media element method, canPlayType(). For example:

canPlayType('audio/mp4; codecs="aac51"')
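canPlayType() returns the strings 'probably', 'maybe', or '' rather than a boolean, so a small wrapper can make the result easier to branch on. The helper and the stub element below are illustrative (the stub stands in for a real video/audio element so the snippet runs anywhere):

```javascript
// Treat both 'probably' and 'maybe' as playable; only '' means unsupported.
function canProbablyPlay(mediaElement, mimeType) {
  var verdict = mediaElement.canPlayType(mimeType);
  return verdict === 'probably' || verdict === 'maybe';
}

// Stub standing in for a real <video>/<audio> element.
var stubElement = {
  canPlayType: function(mimeType) {
    return mimeType.indexOf('audio/mp4') === 0 ? 'probably' : '';
  }
};
```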

If a receiver application wants to listen in and intercept media events it may do so by registering event listeners on the MediaManager instance.

In most cases (for example, to style your UI), it is recommended to register for events from the HTML5 media element itself. Instead of registering for the Cast PLAY command, listen for the HTML5 video element's playing event. This way the receiver code is more reusable with a web implementation. The goal of the MediaManager is not to wrap the HTML5 media element, but to provide enough hooks for the management of Cast communications/messages.

mediaManager['origOnLoad'] = mediaManager.onLoad;
mediaManager.onLoad = function (event) {
  // ... application-specific processing of the event ...
  mediaManager['origOnLoad'](event);  // call the default handler
};

Usually, where a receiver application overrides a default event handler, it will need to call the default handler after doing some processing of the event. The simple pattern of storing the original function as a property of the object and calling it later works well in this situation.

mediaManager.onPlay = (function() {
    mediaManager.origOnPlay = mediaManager.onPlay;
    return function(event) {
        // … do whatever is needed for the receiver application logic
        mediaManager.origOnPlay(event);  // call the default handler
    };
})();

If your receiver needs to load and play media that utilizes Adaptive Streaming, DRM or other advanced media transmission, you can utilize the Google Cast Media Player Library to build in this functionality. See Using the Media Player Library.

Media messages

Google Cast media control messages use the urn:x-cast:com.google.cast.media namespace protocol. The MediaManager is a first-responder to media control messages. You may add event listeners as detailed earlier.

See the Media Playback Messages reference for details on all messages.

A simplified perspective on the life cycle of media control goes like this:

  • A media load message is sent from a connected sender to the receiver (see the Chrome sender's loadMedia method, for example).
  • The receiver processes the message through the MediaManager which sets the media element’s src attribute.
  • Ignoring metadata loads and so on, the MediaManager sends a media status update to the connected senders indicating that the media is ready. From there the sender can issue a play media message (see the Chrome sender's sendMessage method for example) to tell the receiver to begin playback.
  • The receiver’s MediaManager will tell the media element (for example, <video ...>) to play and broadcast the media status to all connected senders. Senders can then issue pause, stop, seek and volume messages to the receiver which will likewise be routed to the media element. For each change in media state there are corresponding broadcast messages that the receiver will issue to connected senders.

When providing event listeners to the instantiated MediaManager, remember to either store and call on the default handler function or implement equivalent logic.

If the receiver needs to include custom information in the media status messages returned to senders, provide a customizedStatusCallback function, which allows the receiver application to customize the status message returned.
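A minimal sketch of a function in the shape customizedStatusCallback expects: take the outgoing status, attach customData, and return it. The customData fields here are hypothetical:

```javascript
// Attach application-specific data to an outgoing media status message.
// The appVersion field is a hypothetical example, not a Cast API field.
function customizeStatus(mediaStatus) {
  mediaStatus.customData = mediaStatus.customData || {};
  mediaStatus.customData.appVersion = '1.2.3';
  return mediaStatus;
}

// On a real receiver:
// mediaManager.customizedStatusCallback = customizeStatus;
```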

Media events

The MediaManager receives media command messages from the sender. Based on those commands, the MediaManager interacts with the video/audio HTML5 media element. If there are state changes, the MediaManager will send status messages to the senders connected to the media session, so they can display a second screen UI.

When updating the media, use media related events fired by <audio>/<video> elements, like timeupdate, pause and waiting. Avoid using networking related events like progress, suspend and stalled, as these tend to be platform dependent.

The states sent via the Cast media protocol provide the state view that the sender needs to be aware of so it can properly display a second screen UI (for example, a progress scroll bar). The following rules apply:

  • If the player has no media loaded, has an error, or the media has ended, it will be in the IDLE state. At this point there is no media session.
  • If the player is paused, as returned by the paused property of the media element, the state is PAUSED.
  • If the player has not changed the currentTime property for one second, the state is BUFFERING.
  • Otherwise the state is PLAYING.
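The four rules above can be sketched as a pure function; the input flags mirror media element state (loaded, error, ended, paused) plus a stall flag the app would compute by watching currentTime, and the names are illustrative:

```javascript
// Derive the sender-facing state from media element style flags,
// applying the rules in order of precedence.
function deriveSenderState(m) {
  if (!m.loaded || m.error || m.ended) return 'IDLE';
  if (m.paused) return 'PAUSED';
  if (m.stalledForOneSecond) return 'BUFFERING';
  return 'PLAYING';
}
```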

Receiver applications can tweak this logic by applying their own heuristics and changing the state in the customizedStatusCallback API. This API will provide the current state, and the application can return a modified state.

The receiver app can also choose not to go to IDLE after ended or error, by overriding onEnded/onError, if this makes sense in a particular scenario. Also, the receiver UI typically should not use these sender-oriented states, as it has full access to the media element events and states, which provide more granularity (for example, buffering).

The basic media receiver state view provides visibility of the internal LOADING state that is not exposed to senders (from the sender perspective the receiver is BUFFERING during LOAD). The following diagram is only important if the application wants to override onLoad as it will need to be sure that the loadedmetadata event is triggered by the media element (this event is what the MediaManager uses to transition to PLAYING/PAUSED, depending on the value of autoplay).

Receiver State Machine

Further note that if you override the onLoad() function, you must be aware of the following:

  • The transition from LOADING to PLAYING/PAUSED is decided by the MediaManager receiving the loadedmetadata event from the video/audio element. An application can override onLoad() but then it must guarantee that the loadedmetadata event will be raised (by calling load() on the video/audio element). If the loadedmetadata event is not raised (due to the internal design of the application player) then the application must call sendLoadComplete() or sendLoadError().
  • If an error happens during LOAD, and there is no error event on the video/audio element (because the error is caused by application logic), the application must take the player to the IDLE state by calling sendLoadError().

When testing your player, be sure that if you override onLoad(), the onMetadataLoaded and onLoadMetadataError callbacks are still properly called. This is the best way to guarantee that your player will behave as expected by the sender.


The sender view of the receiver state is very simple, only four states. The goal of this simplified view is to have the sender display a second screen UI, including a progress bar. If needed, applications can always customize these states by using customizedStatusCallback and include sub-states in the customData field.

Sender State Machine


Display

Layouts for TV have some unique requirements due to the evolution of TV standards and the desire to always present a full screen picture to viewers. TV devices may clip the outside edge of an app's layout in order to ensure that the entire display is filled. This behavior is generally referred to as overscan. Avoid having screen elements clipped due to overscan by incorporating a 10% margin on all sides of your layout.
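One way to express the 10% safe area is with percentage-based sizing on a full-screen container; the class name below is hypothetical:

```css
/* Hypothetical container that keeps content inside the overscan-safe area:
   a 10% margin on all sides leaves the central 80% for the UI. */
.overscan-safe {
  position: absolute;
  top: 10%;
  left: 10%;
  width: 80%;
  height: 80%;
}
```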


Autoplay and queueing

On the receiver side, you implement the autoplay UI and queueing APIs as described in the following sections. The code samples here refer to the autoplay logic as "preview" or "PreviewMode".

Respond to the PRELOAD event

As described in Create and load media queue items, if you are using adaptive content with the Media Player Library (MPL), items added to a queue can specify an optional preload time value. On the receiver, the MediaManager onPreload callback may be invoked for the PRELOAD event when the next item in the queue has a preload time value:

this.mediaManager_.onPreload = this.onPreload_.bind(this);

sampleplayer.CastPlayer.prototype.onPreload_ = function(event) {
  var loadRequestData =
      /** @type {!cast.receiver.MediaManager.LoadRequestData} */ (event.data);
  return this.preload(loadRequestData.media);
};

As noted earlier, preloading video data is only supported for adaptive streams (video, not audio) loaded by MPL. For other media formats like MP4 files, the autoplay UI can still be displayed, but the video data won't actually be preloaded. In the example onPreload callback above, there is a call to a preload function that calls canDisplayPreview_ to handle content that cannot be preloaded. Adaptive streams supported by MPL are identified by the supportsPreload_ function. A preloaded player (preloadPlayer_) is initialized by calling preloadVideo_.

sampleplayer.CastPlayer.prototype.preload = function(mediaInformation) {
  // For media that cannot be preloaded, display preload UI.
  if (sampleplayer.canDisplayPreview_(mediaInformation || {})) {
    this.displayPreviewMode_ = true;
    return true;
  }
  if (!sampleplayer.supportsPreload_(mediaInformation || {})) {
    this.log_('preload: no supportsPreload_');
    return false;
  }
  if (this.preloadPlayer_) {
    this.preloadPlayer_ = null;
  }
  // Only videos are supported.
  var couldPreload = this.preloadVideo_(mediaInformation);
  if (couldPreload) {
    this.displayPreviewMode_ = true;
  }
  this.log_('preload: couldPreload=' + couldPreload);
  return couldPreload;
};

sampleplayer.canDisplayPreview_ = function(media) {
  var contentId = media.contentId || '';
  var contentUrlPath = sampleplayer.getPath_(contentId);
  if (sampleplayer.getExtension_(contentUrlPath) === 'mp4') {
    return true;
  } else if (sampleplayer.getExtension_(contentUrlPath) === 'ogv') {
    return true;
  } else if (sampleplayer.getExtension_(contentUrlPath) === 'webm') {
    return true;
  }
  return false;
};

sampleplayer.supportsPreload_ = function(media) {
  return sampleplayer.getProtocolFunction_(media) != null;
};

Create the video player instance

Create a video player instance for the preloaded video as follows:

sampleplayer.CastPlayer.prototype.preloadVideo_ = function(mediaInformation) {
  var self = this;
  var url = mediaInformation.contentId;
  var protocolFunc = sampleplayer.getProtocolFunction_(mediaInformation);
  if (!protocolFunc) {
    this.log_('No protocol found for preload');
    return false;
  }
  var host = new cast.player.api.Host({
    'url': url,
    'mediaElement': self.mediaElement_
  });
  host.onError = function() {
    self.preloadPlayer_ = null;
    self.displayPreviewMode_ = false;
    self.log_('Error during preload');
  };
  self.preloadPlayer_ = new cast.player.api.Player(host);
  self.preloadPlayer_.preload(protocolFunc());
  return true;
};

Load the next video

When loading the next video for playback, check whether a preloaded player already exists, as in the following example:

sampleplayer.CastPlayer.prototype.loadVideo_ = function(info) {
  var self = this;
  var url = info.message.media.contentId;
  var protocolFunc = sampleplayer.getProtocolFunction_(info.message.media);
  var wasPreloaded = false;

  if (!protocolFunc) {
    this.log_('loadVideo_: using MediaElement');
    this.mediaElement_.addEventListener('stalled', this.bufferingHandler_,
        false);
    this.mediaElement_.addEventListener('waiting', this.bufferingHandler_,
        false);
  } else {
    this.log_('loadVideo_: using Media Player Library');
    // When MPL is used, buffering status should be detected by
    // getState()['underflow']
    this.mediaElement_.removeEventListener('stalled', this.bufferingHandler_);
    this.mediaElement_.removeEventListener('waiting', this.bufferingHandler_);

    // If we have not preloaded or the content preloaded does not match the
    // content that needs to be loaded, perform a full load
    var loadErrorCallback = function() {
      // Unload the player and trigger an error event on the media element
      if (self.player_) {
        self.player_.unload();
        self.player_ = null;
        self.mediaElement_.dispatchEvent(new Event('error'));
      }
    };
    if (!this.preloadPlayer_ || (this.preloadPlayer_.getHost &&
        this.preloadPlayer_.getHost().url != url)) {
      if (this.preloadPlayer_) {
        this.preloadPlayer_.unload();
        this.preloadPlayer_ = null;
      }
      this.log_('Regular video load');
      var host = new cast.player.api.Host({
        'url': url,
        'mediaElement': this.mediaElement_
      });
      host.onError = loadErrorCallback;
      this.player_ = new cast.player.api.Player(host);
      this.player_.load(protocolFunc());
    } else {
      this.log_('Preloaded video load');
      this.player_ = this.preloadPlayer_;
      this.preloadPlayer_ = null;
      // Replace the "preload" error callback with the "load" error callback
      this.player_.getHost().onError = loadErrorCallback;
      this.player_.load();
      wasPreloaded = true;
    }
  }
  this.loadMediaManagerInfo_(info, !!protocolFunc);
  return wasPreloaded;
};

Persist the metadata

The autoplay UI displays the next track's metadata similarly to the current media metadata. The metadata is kept in its own DOM elements because, during autoplay, the user can pause or seek the current stream, so the metadata of both the current track and the next track needs to be persisted, as in the following example:

sampleplayer.CastPlayer.prototype.showPreviewModeMetadata = function(show) {
  this.element_.setAttribute('preview-mode', show.toString());
};

sampleplayer.CastPlayer.prototype.loadPreviewModeMetadata_ = function(media) {
  if (!sampleplayer.isCastForAudioDevice_()) {
    var metadata = media.metadata || {};
    var titleElement = this.element_.querySelector('.preview-mode-title');
    sampleplayer.setInnerText_(titleElement, metadata.title);

    var subtitleElement = this.element_.querySelector('.preview-mode-subtitle');
    sampleplayer.setInnerText_(subtitleElement, metadata.subtitle);

    var artwork = sampleplayer.getMediaImageUrl_(media);
    if (artwork) {
      var artworkElement = this.element_.querySelector('.preview-mode-artwork');
      sampleplayer.setBackgroundImage_(artworkElement, artwork);
    }
    this.showPreviewModeMetadata(true);
  }
};

Hide the UI

The Autoplay UI needs to be hidden when the current track ends, for example:

sampleplayer.CastPlayer.prototype.onEnded_ = function() {
  this.setState_(sampleplayer.State.DONE, true);
  this.hidePreviewMode_();
};

sampleplayer.CastPlayer.prototype.onAbort_ = function() {
  this.setState_(sampleplayer.State.IDLE, true);
  this.hidePreviewMode_();
};

sampleplayer.CastPlayer.prototype.hidePreviewMode_ = function() {
  this.showPreviewModeMetadata(false);
  this.displayPreviewMode_ = false;
};

Display a countdown timer

To display the autoplay countdown timer, the progress indicator logic is modified, as follows:

sampleplayer.CastPlayer.prototype.updateProgress_ = function() {
  // Update the time and the progress bar
  if (!sampleplayer.isCastForAudioDevice_()) {
    var curTime = this.mediaElement_.currentTime;
    var totalTime = this.mediaElement_.duration;
    if (!isNaN(curTime) && !isNaN(totalTime)) {
      var pct = 100 * (curTime / totalTime);
      this.curTimeElement_.innerText = sampleplayer.formatDuration_(curTime);
      this.totalTimeElement_.innerText = sampleplayer.formatDuration_(totalTime);
      this.progressBarInnerElement_.style.width = pct + '%';
      this.progressBarThumbElement_.style.left = pct + '%';
      // Handle preview mode
      if (this.displayPreviewMode_) {
        this.previewModeTimerElement_.innerText =
            '' + Math.round(totalTime - curTime);
      }
    }
  }
};

UI elements

Use the following HTML elements for the autoplay UI:

<div class="preview-mode-info">
  <div class="preview-mode-artwork"></div>
    <div class="preview-mode-text">
     <div class="preview-mode-timer">
        <div class="preview-mode-timer-starts">Up next in </div>
        <div class="preview-mode-timer-countdown"></div>
        <div class="preview-mode-timer-sec"> secs...</div>
    <div class="preview-mode-title"></div>
    <div class="preview-mode-subtitle"></div>

Working with a media queue on the receiver

The only way to create a media queue is as described in Create and load media queue items, above. There is no way to create a media queue from the receiver.

To work with a media queue on the receiver, use the queue-related methods that the MediaManager provides; see the MediaManager API reference for the full list.


See Debugging.