Remote Display

The Google Cast remote display API builds upon existing screen mirroring technologies that Cast already supports. With the remote display APIs, your sender app can render directly to any Cast receiver, such as a Chromecast. The remote display on the receiver can present information that is entirely different from the information presented in the display on the sender device.

This capability allows you to run native games on the sender device, as well as native presentation applications (for example, a slide show application), and cast them to a receiver device. With the remote display API, graphics-intensive, high-bandwidth, CPU-gobbling native apps can be cast to the big screen.

Compatibility

Supported devices and systems are as follows:

  • Android version 4.4 (API level 19) and later
  • iOS version 8 and later

Devices running these versions have been tested to work best with Google Cast. Other Android devices might not have the necessary power to provide an optimized Google Cast experience.

Remote display is not currently supported in Chrome.

Considerations

  • For Android, audio does not stream while the app is in the background.
  • For iOS, there is no way to initiate a background service for remote display, and an application may request at most 3 minutes of processing time while in the background. As a result, an iOS app cannot drive a remote display while it is in the background.

Android development

First, follow the Android Sender App Development guide to get started with building your app for remote display. Complete the procedures in that document up to, but not including, Launching the receiver, then continue with the procedures in this guide.

Implementing the remote display APIs follows the pattern for implementing a Presentation in Android. The Context of a Presentation is different from the context of its containing Activity or Service. It is important to inflate the layout of a Presentation and load other resources using the Presentation's own context to ensure that assets of the correct size and density for the target display are loaded.
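For example, the following sketch inflates a layout using the Presentation's own context rather than the containing Activity's (R.layout.presentation_layout is a placeholder name for your own layout resource):

```java
// Sketch: inside a Presentation (or CastPresentation) subclass.
@Override
protected void onCreate(Bundle savedInstanceState) {
  super.onCreate(savedInstanceState);
  // getContext() returns the Presentation's context, themed for the
  // target Display, so inflated resources match its size and density.
  LayoutInflater inflater = LayoutInflater.from(getContext());
  setContentView(inflater.inflate(R.layout.presentation_layout, null));
}
```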

To view the sample Java source code, see the CastRemoteDisplay-android sample.

Create the connection

Add the Cast button to the Activity using the MediaRouteActionProvider or MediaRouteButton. This is described in Adding the Cast button.
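For reference, wiring the button through the options menu typically looks like the following sketch (R.menu.main and R.id.media_route_menu_item are placeholder names for your own menu resource, which declares a MediaRouteActionProvider on the menu item):

```java
// Sketch: attach the route selector (built as described below) to a
// MediaRouteActionProvider declared in a menu resource.
@Override
public boolean onCreateOptionsMenu(Menu menu) {
  super.onCreateOptionsMenu(menu);
  getMenuInflater().inflate(R.menu.main, menu);
  MenuItem mediaRouteMenuItem = menu.findItem(R.id.media_route_menu_item);
  MediaRouteActionProvider mediaRouteActionProvider = (MediaRouteActionProvider)
      MenuItemCompat.getActionProvider(mediaRouteMenuItem);
  mediaRouteActionProvider.setRouteSelector(mMediaRouteSelector);
  return true;
}
```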

Use the MediaRouteSelector to find the route for your remote display application ID. You obtain the application ID when you register your application for remote display. See Registration for information about registering your app. Here is an example of how to select the route for your application ID:

MediaRouteSelector mMediaRouteSelector = new MediaRouteSelector.Builder()
    .addControlCategory(CastMediaControlIntent.categoryForCast(REMOTE_DISPLAY_APP_ID))
    .build();
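The app must also register a MediaRouter.Callback for this selector so that route selection events are delivered. A minimal sketch, assuming an mMediaRouter field obtained from MediaRouter.getInstance(this) and a mSelectedDevice field as used in the service example later in this guide:

```java
// Sketch: receive route selection events for the selector above.
private final MediaRouter.Callback mMediaRouterCallback =
        new MediaRouter.Callback() {
          @Override
          public void onRouteSelected(MediaRouter router, MediaRouter.RouteInfo info) {
            mSelectedDevice = CastDevice.getFromBundle(info.getExtras());
            // Start the remote display service here.
          }

          @Override
          public void onRouteUnselected(MediaRouter router, MediaRouter.RouteInfo info) {
            mSelectedDevice = null;
          }
        };

// Typically registered in onStart or onResume:
mMediaRouter.addCallback(mMediaRouteSelector, mMediaRouterCallback,
        MediaRouter.CALLBACK_FLAG_REQUEST_DISCOVERY);
```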

Run the remote display service

The Cast remote display session must be initiated by a focused activity.

After the user has selected a route and the MediaRouter onRouteSelected callback is invoked, start an instance of CastRemoteDisplayLocalService as follows:

Intent intent = new Intent(CastRemoteDisplayActivity.this,
        CastRemoteDisplayActivity.class);
intent.setFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP | Intent.FLAG_ACTIVITY_SINGLE_TOP);
PendingIntent notificationPendingIntent = PendingIntent.getActivity(
        CastRemoteDisplayActivity.this, 0, intent, 0);

CastRemoteDisplayLocalService.NotificationSettings settings =
        new CastRemoteDisplayLocalService.NotificationSettings.Builder()
                .setNotificationPendingIntent(notificationPendingIntent).build();

CastRemoteDisplayLocalService.startService(
        getApplicationContext(),
        PresentationService.class, REMOTE_DISPLAY_APP_ID,
        mSelectedDevice, settings,
        new CastRemoteDisplayLocalService.Callbacks() {
          @Override
          public void onServiceCreated(CastRemoteDisplayLocalService service) {
            Log.d(TAG, "onServiceCreated");
          }

          @Override
          public void onRemoteDisplaySessionStarted(CastRemoteDisplayLocalService service) {
            // Initialize the sender UI.
          }

          @Override
          public void onRemoteDisplaySessionError(Status errorReason) {
            initError();
          }
        });

This service keeps the remote display running even when the app goes into the background.

The PresentationService class extends CastRemoteDisplayLocalService. Declare your service in the app manifest as follows:

<service android:name=".PresentationService" android:exported="false" />

Remember to terminate the CastRemoteDisplayLocalService in an onDestroy call when you're finished with it.
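For example, in the Activity that started the service (a minimal sketch):

```java
@Override
protected void onDestroy() {
  // Stop the remote display service when the sender Activity is destroyed.
  CastRemoteDisplayLocalService.stopService();
  super.onDestroy();
}
```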

Create the presentation

The CastRemoteDisplayLocalService uses the CastRemoteDisplayApi to create an instance of android.app.Presentation on the selected route's Display, and renders the remote display as the content view of the Presentation instance. In your class that extends CastRemoteDisplayLocalService, implement the onCreatePresentation and onDismissPresentation methods to create and dismiss your presentation.

@Override
public void onCreatePresentation(Display display) {
  createPresentation(display);
}

@Override
public void onDismissPresentation() {
  dismissPresentation();
}

Extend the CastPresentation class:

private static final class FirstScreenPresentation extends CastPresentation {

  private GLSurfaceView mFirstScreenSurfaceView;

  public FirstScreenPresentation(Context context,
                                 Display display) {
    super(context, display);
  }

  @Override
  protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.first_screen_layout);
    mFirstScreenSurfaceView = (GLSurfaceView)
          findViewById(R.id.surface_view);
    mFirstScreenSurfaceView.setRenderer(new CubeRenderer(false));
  }
}

The CastRemoteDisplayLocalService can then invoke your overridden methods to create and dismiss your CastPresentation instance:

private void dismissPresentation() {
  if (mPresentation != null) {
    mPresentation.dismiss();
    mPresentation = null;
  }
}

private void createPresentation(Display display) {
  dismissPresentation();
  mPresentation = new FirstScreenPresentation(this, display);
  try {
    mPresentation.show();
  } catch (WindowManager.InvalidDisplayException ex) {
    Log.e(TAG, "Unable to show presentation, display was " +
               "removed.", ex);
    dismissPresentation();
  }
}

Audio is captured at the system level using REMOTE_SUBMIX.

When the user disconnects from the route, stop the remote display in the onRouteUnselected callback, as follows:

@Override
public void onRouteUnselected(MediaRouter router, RouteInfo info) {
  ...
  CastRemoteDisplayLocalService.stopService();
  ...
}

iOS development

First, follow the iOS Sender App Development guide to get started with building your app for remote display. Complete the procedures in that document up to, but not including, Launch application, then continue with the procedures in this guide.

Connect your app

As described in Launch application, once your sender app is connected to the receiver, you can launch your application, passing in the application ID that you received when you registered your application. The following example launches the application:

class RemoteDisplayManager: NSObject,
                            GCKDeviceScannerListener,
                            GCKDeviceManagerDelegate,
                            GCKRemoteDisplayChannelDelegate {
  let appId = "YOUR APP ID"

  var deviceManager: GCKDeviceManager!
  var sessionId: String!
  var channel: GCKRemoteDisplayChannel!

  func connect(device: GCKDevice) {
    deviceManager = GCKDeviceManager(
        device: device,
        clientPackageName: NSBundle.mainBundle().bundleIdentifier,
        ignoreAppStateNotifications: true)
    deviceManager.delegate = self
    deviceManager.connect()
  }

  func disconnect() {
    deviceManager.stopApplicationWithSessionID(sessionId)
    deviceManager.disconnectWithLeave(true)
  }

  func deviceManagerDidConnect(deviceManager: GCKDeviceManager!) {
    let options = GCKLaunchOptions(relaunchIfRunning: true)
    deviceManager.launchApplication(appId, withLaunchOptions: options)
  }

  ...
}

Start a remote display session

Use GCKRemoteDisplayChannel to begin a new remote display session, as in the following example. You can also customize the configuration (resolution, frame rate, etc.) here.

class RemoteDisplayManager: NSObject,
                            GCKDeviceScannerListener,
                            GCKDeviceManagerDelegate,
                            GCKRemoteDisplayChannelDelegate {
  ...

  func deviceManager(deviceManager: GCKDeviceManager!,
                     didConnectToCastApplication applicationMetadata:
                     GCKApplicationMetadata!,
                     sessionID: String!,
                     launchedApplication: Bool) {
    self.sessionId = sessionID
    channel = GCKRemoteDisplayChannel()
    channel.delegate = self
    deviceManager.addChannel(channel)
  }

  func remoteDisplayChannelDidConnect(channel: GCKRemoteDisplayChannel) {
    let configuration = GCKRemoteDisplayConfiguration()
    // Customize the configuration as needed.
    channel.beginSessionWithConfiguration(configuration, error: nil)
  }

  func remoteDisplayChannel(channel: GCKRemoteDisplayChannel,
      didBeginSession session: GCKRemoteDisplaySession) {
    // Use the session.
  }
}

Rendering to the remote display session

You can redirect your app's audio stream by installing a tap on your audio engine's final mix and setting the samples to zero after the session copies them. Ideally, integrate remote display audio in a way that leverages the Core Audio high-priority thread.

Use frame inputs and the session to send video frames. Create a frame input object that matches your rendering API. Frame inputs are adapters that make it easy to integrate your rendering code with remote display. The general flow is to render to a texture instead of a system surface; once a frame has been rendered to the texture, hand the texture to the frame input, which encodes it and sends it to the session.

You do not have to send video or audio frames when they are not needed, which saves power and bandwidth. For example, if a game is paused with no animations, submit the paused frame once and then stop submitting frames until the game resumes.

The following code example demonstrates the basic scaffolding for integrating remote display with a renderer:

class RemoteDisplayRenderer {
  let device = { MTLCreateSystemDefaultDevice() }()

  var frameInput: GCKMetalVideoFrameInput!
  var remoteFrame: MTLTexture!
  var session: GCKRemoteDisplaySession! {
    didSet {
      frameInput = GCKMetalVideoFrameInput(session: session)
      frameInput.device = device
      loadRemoteDisplay()
    }
  }

  func loadRemoteDisplay() {
    let width = Int(frameInput.width)
    let height = Int(frameInput.height)
    let texDesc = MTLTextureDescriptor.texture2DDescriptorWithPixelFormat(
        .BGRA8Unorm, width: width, height: height, mipmapped: false)
    remoteFrame = device.newTextureWithDescriptor(texDesc)
  }

  func renderRemoteDisplay(commandBuffer: MTLCommandBuffer) {
    let renderPassDesc = MTLRenderPassDescriptor()
    renderPassDesc.colorAttachments[0].texture = remoteFrame
    let renderEncoder = commandBuffer.renderCommandEncoderWithDescriptor(renderPassDesc)!
    // ...
    renderEncoder.endEncoding()
    frameInput.encodeFrame(remoteFrame, commandBuffer: commandBuffer)
  }
}

Registration

You must register your app to support Google Cast remote display, using the Google Cast SDK Developer Console. See Registration for more information about registering your app.

In the Google Cast SDK Developer Console, you add a new application, selecting Remote Display Receiver for the application type.

Your application will be assigned an application ID, which you provide to the API to start the remote display.