AI-generated Key Takeaways
- The Google VR iOS API provides classes and protocols for building VR experiences on iOS devices.
- Key components include classes for audio, rendering, head tracking, and panoramic photo and video playback.
- Developers can use GVRCardboardView to render VR graphics, and GVRPanoramaView or GVRVideoView for immersive media experiences (see the sketch after this list).
- Delegate protocols such as GVRCardboardViewDelegate and GVRVideoViewDelegate offer control over rendering and video playback events.
- GVRWidgetView and its delegate provide a base for custom interactive elements in VR environments.
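As a quick orientation, here is a minimal sketch of how the two media widget views might be embedded in a view controller. This is not official sample code: the selectors (loadImage:ofType:, loadFromUrl:), the kGVRPanoramaImageTypeMono constant, the enableCardboardButton property, the asset name, and the sample URL are assumptions recalled from the SDK headers and should be verified against GVRPanoramaView.h and GVRVideoView.h in your installed SDK version.

```objc
#import <UIKit/UIKit.h>
#import "GVRPanoramaView.h"  // Assumed header names from the GVR iOS SDK.
#import "GVRVideoView.h"

@interface MediaViewController : UIViewController
@end

@implementation MediaViewController

- (void)viewDidLoad {
  [super viewDidLoad];

  // 360-degree photo: GVRPanoramaView loads a mono or stereo over/under image.
  GVRPanoramaView *panoView =
      [[GVRPanoramaView alloc] initWithFrame:self.view.bounds];
  panoView.enableCardboardButton = YES;  // Assumed property: shows the VR-mode button.
  [panoView loadImage:[UIImage imageNamed:@"panorama.jpg"]  // Hypothetical asset name.
               ofType:kGVRPanoramaImageTypeMono];
  [self.view addSubview:panoView];
}

// Alternatively, GVRVideoView plays a spherical video from a local or remote URL.
- (void)showVideo {
  GVRVideoView *videoView =
      [[GVRVideoView alloc] initWithFrame:self.view.bounds];
  NSURL *url = [NSURL URLWithString:@"https://example.com/sample-360.mp4"];  // Placeholder URL.
  [videoView loadFromUrl:url];
  [self.view addSubview:videoView];
}

@end
```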
Class List
The classes and protocols in the Google VR iOS API Reference:
| Class/Protocol | Description |
| --- | --- |
| GVRAudioEngine | High-level Google VR Audio Engine |
| GVRCardboardView | Defines a view responsible for rendering graphics in VR mode |
| <GVRCardboardViewDelegate> | Defines a delegate protocol for GVRCardboardView |
| GVRFieldOfView | Defines a struct that holds half field-of-view angles, in degrees, for a GVREye |
| GVRHeadRotation | Contains yaw and pitch angles corresponding to where the user is looking |
| GVRHeadTransform | Defines a class that represents the head transformation for a render frame |
| GVRPanoramaView | Defines a view that can load and display 360-degree panoramic photos |
| GVRVideoView | Defines a player view that renders a 360-degree video using OpenGL |
| <GVRVideoViewDelegate> | Defines a protocol to notify delegates of changes in video playback |
| GVRWidgetView | Defines a base class for all widget views that encapsulates common functionality |
| <GVRWidgetViewDelegate> | Defines a delegate for GVRWidgetView and its subclasses |
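Assuming GVRVideoViewDelegate inherits from GVRWidgetViewDelegate, as the SDK headers suggest, a single delegate object can observe both content loading and playback progress. The sketch below illustrates that pattern; the delegate selectors (widgetView:didLoadContent:, widgetView:didFailToLoadContent:withErrorMessage:, videoView:didUpdatePosition:) and the duration/seekTo:/play members are recalled from memory and should be checked against the protocol definitions in your SDK version.

```objc
#import <Foundation/Foundation.h>
#import "GVRWidgetView.h"  // Assumed header names from the GVR iOS SDK.
#import "GVRVideoView.h"   // Declares GVRVideoView and its delegate protocol.

// Observes both widget-level load events and video playback progress.
@interface VideoPlaybackObserver : NSObject <GVRVideoViewDelegate>
@end

@implementation VideoPlaybackObserver

// From GVRWidgetViewDelegate (assumed selector): content finished loading.
- (void)widgetView:(GVRWidgetView *)widgetView didLoadContent:(id)content {
  NSLog(@"Loaded content: %@", content);
}

// From GVRWidgetViewDelegate (assumed selector): loading failed.
- (void)widgetView:(GVRWidgetView *)widgetView
    didFailToLoadContent:(id)content
        withErrorMessage:(NSString *)errorMessage {
  NSLog(@"Failed to load content: %@", errorMessage);
}

// From GVRVideoViewDelegate (assumed selector): playback position changed, in seconds.
- (void)videoView:(GVRVideoView *)videoView
    didUpdatePosition:(NSTimeInterval)position {
  if (videoView.duration > 0 && position >= videoView.duration) {
    // Loop the video once it reaches the end (seekTo:/play are assumed APIs).
    [videoView seekTo:0];
    [videoView play];
  }
}

@end
```

Assign an instance of this observer to the video view's delegate property (inherited from GVRWidgetView) before calling loadFromUrl:, so the load callbacks are not missed.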