<GCKRemoteDisplaySession> Protocol Reference

Overview

Represents an established session and provides methods for sending frames.

A Remote Display session manages all the encoding and networking operations necessary to send frames to the Google Cast receiver. Sessions are created via an instance of GCKRemoteDisplayChannel, which handles session parameter negotiation with the Cast receiver. The GCKRemoteDisplayChannel documentation describes this process in detail.
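
As a sketch of how a session typically arrives, assuming the GCKRemoteDisplayChannel delegate callback shown below; the callback name is an assumption based on the Remote Display sender API, so consult the GCKRemoteDisplayChannel documentation for the authoritative negotiation flow:

    // Sketch: store the session handed over by the channel delegate.
    // The callback name is an assumption; verify it against the
    // GCKRemoteDisplayChannel documentation.
    - (void)remoteDisplayChannel:(GCKRemoteDisplayChannel *)channel
                 didBeginSession:(id<GCKRemoteDisplaySession>)session {
      self.session = session;  // active immediately; start sending frames soon
    }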

Remote Display sessions are active as soon as they are created. Cast receivers wait a short time after session establishment for frames to start arriving; if none arrive within that window, they terminate the session.

Sending video frames

Video frames, represented by CVPixelBuffer objects, are enqueued for encoding and transmission by calling enqueueVideoFrame:pts:. Only '420v' and 'y420' pixel formats are supported. The session retains each video frame buffer as long as needed, typically until the frame has been compressed.

The hardware video encoder can allocate video frame buffers with special affinity, eliminating video copies and format conversions. Whenever possible, use createPixelBuffer to acquire such a buffer and write the video frame directly into it.
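
A minimal sketch of sending one frame, assuming `session` holds an established session and that createPixelBuffer follows the Core Foundation Create rule (returns a +1 reference; omit the release if your SDK annotates it otherwise):

    #import <CoreVideo/CoreVideo.h>
    #import <QuartzCore/QuartzCore.h>

    static void SendOneFrame(id<GCKRemoteDisplaySession> session) {
      CVPixelBufferRef frame = [session createPixelBuffer];
      if (!frame) return;  // no encoder buffer available; skip this frame

      CVPixelBufferLockBaseAddress(frame, 0);
      // Write '420v' (bi-planar Y'CbCr) data: plane 0 is luma, plane 1 chroma.
      // memcpy(CVPixelBufferGetBaseAddressOfPlane(frame, 0), luma, lumaSize);
      CVPixelBufferUnlockBaseAddress(frame, 0);

      // pts must be host time in seconds; CACurrentMediaTime() qualifies.
      [session enqueueVideoFrame:frame pts:CACurrentMediaTime()];
      CVPixelBufferRelease(frame);  // the session retains it until encoded
    }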

Using frame inputs

To facilitate the integration of Remote Display with the major iOS rendering frameworks, the SDK provides video frame inputs. These adapters provide optimized frame-processing pipelines from the rendering framework to a Remote Display session. See the GCKVideoFrameInput documentation for details.

See also
GCKMetalVideoFrameInput
GCKOpenGLESVideoFrameInput
GCKViewVideoFrameInput
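
As a sketch, mirroring a UIView through GCKViewVideoFrameInput might look like the following; the initWithSession: initializer and view property are assumptions based on the input's role as an adapter, so consult the GCKVideoFrameInput documentation for the authoritative interface:

    // Sketch: route a view's contents to the session via a frame input.
    // initWithSession: and the view property are assumptions; verify them
    // against the GCKViewVideoFrameInput documentation.
    GCKViewVideoFrameInput *input =
        [[GCKViewVideoFrameInput alloc] initWithSession:session];
    input.view = gameView;  // frames are now captured and enqueued for you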

Sending audio frames

Audio frames, represented by an AudioBufferList structure, are enqueued for encoding and transmission by calling enqueueAudioBuffer:frames:pts:. The session copies the contents of each audio buffer into one or more internal buffers for encoding and transmission. Samples that do not fit into the available buffering memory are dropped, so avoid enqueuing too many samples at once.

The audio buffer must contain floating-point, deinterleaved samples matching the session's channel count and sample rate. If this is not the case, the behavior is undefined (the app may crash, audio may be garbled, and so on). Use GCKRemoteDisplayAudioInput to enqueue audio buffers in a different format.

See also
GCKRemoteDisplayAudioInput
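
A minimal sketch of enqueuing deinterleaved stereo float samples, assuming `session` was negotiated for two channels at the sample rate of the data (hypothetical values; match your session's actual parameters):

    #import <AudioToolbox/AudioToolbox.h>
    #include <mach/mach_time.h>

    static void SendAudio(id<GCKRemoteDisplaySession> session,
                          const float *left, const float *right,
                          uint32_t frames) {
      // One mono AudioBuffer per channel = deinterleaved stereo. The extra
      // AudioBuffer extends the list's variable-length mBuffers array.
      struct { AudioBufferList list; AudioBuffer extra; } mem;
      AudioBufferList *abl = &mem.list;
      abl->mNumberBuffers = 2;
      abl->mBuffers[0].mNumberChannels = 1;
      abl->mBuffers[0].mDataByteSize = frames * sizeof(float);
      abl->mBuffers[0].mData = (void *)left;
      abl->mBuffers[1].mNumberChannels = 1;
      abl->mBuffers[1].mDataByteSize = frames * sizeof(float);
      abl->mBuffers[1].mData = (void *)right;

      AudioTimeStamp ts = {0};
      ts.mHostTime = mach_absolute_time();  // a valid mHostTime is required
      ts.mFlags = kAudioTimeStampHostTimeValid;

      [session enqueueAudioBuffer:abl frames:frames pts:&ts];
    }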

AV synchronization

Audio and video frames are synchronized on the receiver using their presentation timestamps. Remote Display sessions use the mach_absolute_time clock (also known as host time).
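
For reference, a sketch converting raw mach_absolute_time() ticks to seconds on the host clock (CACurrentMediaTime() returns this value directly):

    #import <Foundation/Foundation.h>
    #include <mach/mach_time.h>

    // Host time in seconds, suitable for enqueueVideoFrame:pts:.
    static CFTimeInterval HostTimeSeconds(uint64_t machTicks) {
      static mach_timebase_info_data_t timebase;
      if (timebase.denom == 0) mach_timebase_info(&timebase);
      uint64_t nanos = machTicks * timebase.numer / timebase.denom;
      return (CFTimeInterval)nanos / NSEC_PER_SEC;
    }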

Teardown

Remote Display sessions are connectionless and therefore cannot detect on their own that the remote end is gone. A GCKDeviceManager must be used to detect when the Remote Display receiver app exits so that the session can be torn down gracefully.
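
A sketch of such detection, assuming a GCKDeviceManager with its delegate set; the callback below is taken from the v2 sender API, and the teardown itself is left to the app:

    // Sketch: tear the session down when the receiver app disconnects.
    - (void)deviceManager:(GCKDeviceManager *)deviceManager
        didDisconnectFromApplicationWithError:(NSError *)error {
      // The session cannot observe this itself (it is connectionless).
      self.session = nil;  // drop our reference and stop enqueuing frames
    }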

Backgrounding

Because a Remote Display session is connectionless, it may be resumed after being interrupted by app backgrounding.

See also
Resuming after app backgrounding

Inherits <NSObject>.

Instance Method Summary

(nullable CVPixelBufferRef) - createPixelBuffer
 Create a '420v' buffer with encoder affinity. More...
 
(nullable id<GCKRemoteDisplayAudioBuffer>) - newAudioBuffer
 Create a standard format audio buffer with session affinity. More...
 
(void) - enqueueVideoFrame:pts:
 Enqueue a video frame for encoding and transmission. More...
 
(void) - enqueueAudioBuffer:frames:pts:
 Enqueue an audio buffer for encoding and transmission. More...
 
(void) - enqueueAudioBuffer:pts:
 Enqueue an audio buffer for encoding and transmission. More...
 

Properties

GCKRemoteDisplayConfiguration configuration
 The configuration object that was used to initialize the session. More...
 
GCKRemoteDisplayTargetDelay targetDelay
 Changes, on the fly, the presentation delay applied to video and audio, giving enough buffering time to play both without pausing. More...
 

Method Detail

- (nullable CVPixelBufferRef) createPixelBuffer

Create a '420v' buffer with encoder affinity.

Returns NULL if no buffer is currently available.

- (nullable id<GCKRemoteDisplayAudioBuffer>) newAudioBuffer

Create a standard format audio buffer with session affinity.

If buffers obtained via this method are not eventually enqueued, the session will run out of buffers and no audio will be transmitted.

- (void) enqueueVideoFrame:(CVPixelBufferRef)frame pts:(CFTimeInterval)pts

Enqueue a video frame for encoding and transmission.

The frame is retained until it has been encoded. The frame must use the '420v' or 'y420' format. The timestamp must be in host time (CACurrentMediaTime, CADisplayLink.timestamp, CMClockGetHostTimeClock, etc.).

- (void) enqueueAudioBuffer:(const AudioBufferList *)abl frames:(uint32_t)frames pts:(const struct AudioTimeStamp *)pts

Enqueue an audio buffer for encoding and transmission.

The audio samples are copied synchronously to internal buffers. The timestamp must have a valid mHostTime field.

The audio buffer must contain floating-point, deinterleaved samples matching the session's channel count and sample rate. If this is not the case, the behavior is undefined (the app may crash, audio may be garbled, and so on). Use GCKRemoteDisplayAudioInput to enqueue audio buffers in a different format.

See also
GCKRemoteDisplayAudioInput

- (void) enqueueAudioBuffer:(id<GCKRemoteDisplayAudioBuffer>)buffer pts:(const struct AudioTimeStamp *)pts

Enqueue an audio buffer for encoding and transmission.

If the audio buffer was created via -newAudioBuffer, the buffer's internal memory is "stolen" rather than copied, and the buffer no longer contains any samples after the method returns. Otherwise, the audio samples are copied synchronously to internal buffers. Any object that conforms to the GCKRemoteDisplayAudioBuffer protocol at runtime will work, including AVAudioPCMBuffer.

The timestamp must have a valid mHostTime field.

The audio buffer must contain floating-point, deinterleaved samples matching the session's channel count and sample rate. If this is not the case, the behavior is undefined (the app may crash, audio may be garbled, and so on). Use GCKRemoteDisplayAudioInput to enqueue audio buffers in a different format.

See also
GCKRemoteDisplayAudioInput
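
A sketch using an AVAudioPCMBuffer with this method, assuming a session negotiated for two-channel float audio at 48 kHz (hypothetical values; use the session's actual parameters):

    #import <AVFoundation/AVFoundation.h>
    #include <mach/mach_time.h>

    static void SendPCMBuffer(id<GCKRemoteDisplaySession> session) {
      // The "standard" format is deinterleaved float32, as required.
      AVAudioFormat *format =
          [[AVAudioFormat alloc] initStandardFormatWithSampleRate:48000.0
                                                         channels:2];
      AVAudioPCMBuffer *pcm =
          [[AVAudioPCMBuffer alloc] initWithPCMFormat:format
                                        frameCapacity:512];
      pcm.frameLength = 512;
      // pcm.floatChannelData[0] / [1] hold the deinterleaved samples.

      AudioTimeStamp ts = {0};
      ts.mHostTime = mach_absolute_time();
      ts.mFlags = kAudioTimeStampHostTimeValid;

      // Per the docs, AVAudioPCMBuffer can conform at runtime to
      // GCKRemoteDisplayAudioBuffer; the cast makes that explicit.
      [session enqueueAudioBuffer:(id<GCKRemoteDisplayAudioBuffer>)pcm
                              pts:&ts];
    }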

Property Documentation

- (GCKRemoteDisplayConfiguration *) configuration
read, nonatomic, assign

The configuration object that was used to initialize the session.

- (GCKRemoteDisplayTargetDelay) targetDelay
readwrite, nonatomic, assign

Changes, on the fly, the presentation delay applied to video and audio, giving enough buffering time to play both without pausing.

Useful for temporarily increasing the delay when presenting non-interactive content.

Defaults to the Remote Display configuration's target delay.

See also
GCKRemoteDisplayConfiguration.targetDelay
