Get Started

This document describes how to use the Google VR SDK for iOS (iOS SDK) to create your own Virtual Reality (VR) experiences.

You can use a VR viewer, such as Google Cardboard, to turn your smartphone into a VR platform. Your phone can display 3D scenes with binocular rendering, track and react to head movements, and let users interact with apps through the viewer's trigger input.

The iOS SDK contains tools for spatial audio that go far beyond simple left side/right side audio cues to offer 360 degrees of sound. You can also control the tonal quality of the sound. For example, you can make a conversation in a small spaceship sound drastically different from one in a large, underground (and still virtual) cave.

The demo app used in this tutorial, "Treasure Hunt," is a basic game that demonstrates the core features of the Google VR SDK. In the game, users look around a virtual world to find and collect objects. The app shows how to set up the trigger input, detect whether the user is looking at an object, set up spatial audio, and render stereo images by providing a different view for each eye. It also covers basics such as lighting, movement in space, and coloring.

Before you begin

To build the demo app, you must have the following:

  • Xcode 7.1 or higher.

  • CocoaPods. To download and install, go to cocoapods.org.

  • A physical iPhone running iOS 7 or higher.

Download and build the app

  1. Clone the demo app from the GitHub repository by running the following command:

    git clone https://github.com/googlevr/gvr-ios-sdk.git

  2. In Terminal, navigate to the TreasureHunt folder, and then run the following command:

    pod update

    The CardboardSDK CocoaPod is added to the TreasureHunt project.

  3. In Xcode, open the TreasureHunt.xcworkspace project and then click the Run button.


You're ready to use the Google VR SDK for iOS!

Play the game

The goal of the game is to find the cubes in 3D space and collect them. To enjoy the benefits of spatial audio, be sure to wear headphones.

To find and collect a cube

  1. Move your head in any direction until a cube enters your field of view.
  2. Look directly at the cube. This causes it to turn orange.
  3. Activate the trigger input. This collects the cube.

Overview of the code

The TreasureHunt app renders an OpenGL scene for each eye in VR mode. The sections below provide details about the following tasks:

  • Implementing a UIViewController to host GVRCardboardView

  • Defining a renderer that implements the GVRCardboardViewDelegate protocol

  • Driving rendering with a render loop

  • Handling trigger input

Implement a UIViewController to host GVRCardboardView

The TreasureHunt app implements a UIViewController subclass, the TreasureHuntViewController class, that hosts an instance of the GVRCardboardView class. An instance of the TreasureHuntRenderer class is created and set as the GVRCardboardViewDelegate of the GVRCardboardView. In addition, the app provides a render loop, the TreasureHuntRenderLoop class, that drives the -render method of the GVRCardboardView.

- (void)loadView {
  _treasureHuntRenderer = [[TreasureHuntRenderer alloc] init];
  _treasureHuntRenderer.delegate = self;

  _cardboardView = [[GVRCardboardView alloc] initWithFrame:CGRectZero];
  _cardboardView.delegate = _treasureHuntRenderer;
  ...
  _cardboardView.vrModeEnabled = YES;
  ...
  self.view = _cardboardView;
}

Define a renderer to implement the GVRCardboardViewDelegate protocol

GVRCardboardView provides a drawing surface for your rendering. It coordinates the drawing with your rendering code through the GVRCardboardViewDelegate protocol. To achieve this, the TreasureHuntRenderer class implements GVRCardboardViewDelegate:

#import "GVRCardboardView.h"

/** TreasureHunt renderer. */
@interface TreasureHuntRenderer : NSObject<GVRCardboardViewDelegate>

@end

Implement the GVRCardboardViewDelegate protocol

To draw the GL content onto GVRCardboardView, TreasureHuntRenderer implements the GVRCardboardViewDelegate protocol:

@protocol GVRCardboardViewDelegate<NSObject>

- (void)cardboardView:(GVRCardboardView *)cardboardView
         didFireEvent:(GVRUserEvent)event;

- (void)cardboardView:(GVRCardboardView *)cardboardView
     willStartDrawing:(GVRHeadTransform *)headTransform;

- (void)cardboardView:(GVRCardboardView *)cardboardView
     prepareDrawFrame:(GVRHeadTransform *)headTransform;

- (void)cardboardView:(GVRCardboardView *)cardboardView
              drawEye:(GVREye)eye
    withHeadTransform:(GVRHeadTransform *)headTransform;

- (void)cardboardView:(GVRCardboardView *)cardboardView
   shouldPauseDrawing:(BOOL)pause;

@end

Implementations for the willStartDrawing, prepareDrawFrame, and drawEye methods are described below.

Implement willStartDrawing

To perform one-time GL state initialization, implement -cardboardView:willStartDrawing:. Use this opportunity to load shaders, initialize scene geometry, and bind GL attributes. The TreasureHunt app also initializes an instance of the GVRCardboardAudioEngine class here:

- (void)cardboardView:(GVRCardboardView *)cardboardView
     willStartDrawing:(GVRHeadTransform *)headTransform {
  // Load shaders and bind GL attributes.
  // Load mesh and model geometry.
  // Initialize GVRCardboardAudio engine.
  _cardboard_audio_engine = [[GVRCardboardAudioEngine alloc]
      initWithRenderingMode:kRenderingModeBinauralHighQuality];
  [_cardboard_audio_engine preloadSoundFile:kSampleFilename];
  [_cardboard_audio_engine start];
  ...
  [self spawnCube];
}
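Once the audio engine has started, a preloaded sound can be attached to a sound object and positioned in the scene so that it appears to come from a particular direction. The snippet below is a minimal sketch of that flow; it assumes the engine exposes createSoundObject:, playSound:loopingEnabled:, and setSoundObjectPosition:x:y:z: methods (as Google VR audio engines do on other platforms), and the _sound_object_id variable and position values are illustrative:

```objectivec
// Sketch: play the preloaded sound at the cube's position.
// _sound_object_id and the coordinates are assumptions for illustration.
_sound_object_id = [_cardboard_audio_engine createSoundObject:kSampleFilename];
[_cardboard_audio_engine playSound:_sound_object_id loopingEnabled:YES];
[_cardboard_audio_engine setSoundObjectPosition:_sound_object_id
                                              x:0.0f
                                              y:0.0f
                                              z:-1.0f];
```

Because the listener's head rotation is updated every frame in -cardboardView:prepareDrawFrame: (shown below), the sound stays anchored in the world as the user looks around.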

Implement prepareDrawFrame

To set up rendering logic before the individual eyes are rendered, implement -cardboardView:prepareDrawFrame:. Any per-frame operations that are not specific to a single eye should happen here. This is a good place to update your model and clear the GL state for drawing. The TreasureHunt app computes the head orientation, updates the audio engine's listener, and clears the GL viewport.

- (void)cardboardView:(GVRCardboardView *)cardboardView
     prepareDrawFrame:(GVRHeadTransform *)headTransform {
  GLKMatrix4 head_from_start_matrix = [headTransform headPoseInStartSpace];
  // Update the audio listener's head rotation.
  const GLKQuaternion head_rotation =
      GLKQuaternionMakeWithMatrix4(
          GLKMatrix4Transpose(head_from_start_matrix));
  [_cardboard_audio_engine setHeadRotation:head_rotation.q[0]
                                         y:head_rotation.q[1]
                                         z:head_rotation.q[2]
                                         w:head_rotation.q[3]];
  // Update the audio engine.
  [_cardboard_audio_engine update];

  // Clear the GL viewport.
  glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
  glEnable(GL_DEPTH_TEST);
  glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
  glEnable(GL_SCISSOR_TEST);
}

Implement drawEye

This is the core of the rendering code, and it is very similar to building a regular OpenGL ES application. The following snippet shows how to implement drawEye to get the view transformation matrix for each eye and the perspective transformation matrix. Note that this method is called once for each eye. If the GVRCardboardView does not have VR mode enabled, eye is set to the center eye, which produces a single "mono" rendering that provides a non-VR view of the 3D scene.

- (void)cardboardView:(GVRCardboardView *)cardboardView
              drawEye:(GVREye)eye
    withHeadTransform:(GVRHeadTransform *)headTransform {
  // Set the viewport.
  CGRect viewport = [headTransform viewportForEye:eye];
  glViewport(viewport.origin.x, viewport.origin.y, viewport.size.width,
      viewport.size.height);
  glScissor(viewport.origin.x, viewport.origin.y, viewport.size.width,
      viewport.size.height);

  // Get the head matrix.
  const GLKMatrix4 head_from_start_matrix =
      [headTransform headPoseInStartSpace];

  // Get this eye's matrices.
  GLKMatrix4 projection_matrix = [headTransform
      projectionMatrixForEye:eye near:0.1f far:100.0f];
  GLKMatrix4 eye_from_head_matrix =
      [headTransform eyeFromHeadMatrix:eye];

  // Compute the model view projection matrix.
  GLKMatrix4 model_view_projection_matrix =
      GLKMatrix4Multiply(projection_matrix,
      GLKMatrix4Multiply(eye_from_head_matrix, head_from_start_matrix));

  // Render from this eye.
  [self renderWithModelViewProjectionMatrix:model_view_projection_matrix.m];
}

After returning from this call, GVRCardboardView renders the scene to the display.

The rendering must be driven by a render loop using CADisplayLink. The TreasureHunt app provides a sample render loop, the TreasureHuntRenderLoop class, which calls the -render method of the GVRCardboardView class. The loop is created and torn down in the -viewWillAppear: and -viewDidDisappear: methods of the TreasureHuntViewController class:

- (void)viewWillAppear:(BOOL)animated {
  [super viewWillAppear:animated];

  _renderLoop = [[TreasureHuntRenderLoop alloc]
   initWithRenderTarget:_cardboardView selector:@selector(render)];
}

- (void)viewDidDisappear:(BOOL)animated {
  [super viewDidDisappear:animated];

  [_renderLoop invalidate];
  _renderLoop = nil;
}
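The TreasureHuntRenderLoop implementation itself is not shown above. A minimal sketch of such a CADisplayLink-based loop, assuming the initWithRenderTarget:selector: interface used in the sample (the method names below, other than CADisplayLink's own API, are illustrative), might look like this:

```objectivec
#import <QuartzCore/QuartzCore.h>

@interface TreasureHuntRenderLoop : NSObject
- (instancetype)initWithRenderTarget:(id)target selector:(SEL)selector;
- (void)invalidate;
@end

@implementation TreasureHuntRenderLoop {
  CADisplayLink *_displayLink;
  __weak id _renderTarget;
  SEL _renderSelector;
}

- (instancetype)initWithRenderTarget:(id)target selector:(SEL)selector {
  if (self = [super init]) {
    _renderTarget = target;
    _renderSelector = selector;
    // Fire once per display refresh on the main run loop.
    _displayLink =
        [CADisplayLink displayLinkWithTarget:self
                                    selector:@selector(displayLinkDidFire:)];
    [_displayLink addToRunLoop:[NSRunLoop mainRunLoop]
                       forMode:NSRunLoopCommonModes];
  }
  return self;
}

- (void)displayLinkDidFire:(CADisplayLink *)displayLink {
  // Forward to the render target, e.g. -[GVRCardboardView render].
  id target = _renderTarget;
  if (target) {
    ((void (*)(id, SEL))[target methodForSelector:_renderSelector])(
        target, _renderSelector);
  }
}

- (void)invalidate {
  [_displayLink invalidate];
  _displayLink = nil;
}

@end
```

Calling -invalidate in -viewDidDisappear:, as the sample does, stops the display link so the view stops drawing while it is off screen.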

Handling inputs

The Google VR SDK detects events that are fired when the user activates the trigger input or the back button. To provide custom behavior when these events occur, implement the -cardboardView:didFireEvent: delegate method.

- (void)cardboardView:(GVRCardboardView *)cardboardView
         didFireEvent:(GVRUserEvent)event {
  switch (event) {
    case kGVRUserEventBackButton:
      // If the view controller is in a navigation stack or presented over
      // another view controller, pop or dismiss it here.
      break;
    case kGVRUserEventTrigger:
      NSLog(@"User performed trigger action");
      // Check whether the user is looking at a cube.
      if (_is_cube_focused) {
        // Vibrate the device on success.
        AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
        // Generate the next cube.
        [self spawnCube];
      }
      break;
  }
}
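The _is_cube_focused flag checked above is maintained by the renderer's own gaze-detection logic; the SDK leaves this to the app. One way to sketch it, assuming a hypothetical _cube_position instance variable holding the cube's location in start space, is to transform that position into head space (using the same headPoseInStartSpace matrix as the drawing code) and test how close it is to the view axis:

```objectivec
// Sketch: gaze detection. _cube_position is a hypothetical GLKVector3
// holding the cube's location in start space.
- (BOOL)isLookingAtCube:(GVRHeadTransform *)headTransform {
  const GLKMatrix4 head_from_start_matrix = [headTransform headPoseInStartSpace];
  // Transform the cube's position into head space.
  const GLKVector3 cube_in_head_space = GLKMatrix4MultiplyVector3WithTranslation(
      head_from_start_matrix, _cube_position);
  const GLKVector3 direction = GLKVector3Normalize(cube_in_head_space);
  // The camera looks down the negative z axis in head space; treat the cube
  // as focused when it lies within roughly 12 degrees of the view axis
  // (cos 12 degrees is about 0.978).
  return -direction.z > 0.978f;
}
```

A check like this would run once per frame, for example from -cardboardView:prepareDrawFrame:, so that the flag is current when a trigger event arrives.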