Understanding the Treasure Hunt sample game

This guide assumes that you have already built and run the Treasure Hunt sample game in our getting started guide. If you haven't, go do that and then come back here for an explanation of the code itself.

Overview of the Treasure Hunt code

Manifest file

A typical Google VR SDK app should use some variant of the following manifest tags:

<manifest ...
    <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
    ...
    <uses-sdk android:minSdkVersion="19" android:targetSdkVersion="24"/>
    ...
    <uses-feature android:glEsVersion="0x00020000" android:required="true" />
    <uses-feature android:name="android.software.vr.mode" android:required="false"/>
    <uses-feature android:name="android.software.vr.high_performance" android:required="false"/>
    ...
    <application
            ...
        <activity
                android:name=".TreasureHuntActivity"
                android:screenOrientation="landscape"
                android:configChanges="orientation|keyboardHidden|screenSize"
                android:enableVrMode="@string/gvr_vr_mode_component"
                android:theme="@style/VrActivityTheme"
                android:resizeableActivity="false">
                ...

            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
                <category android:name="com.google.intent.category.CARDBOARD" />
                <category android:name="com.google.intent.category.DAYDREAM" />
            </intent-filter>
        </activity>
    </application>
</manifest>

Note the following:

  • <uses-sdk android:minSdkVersion="19"/> indicates that the device must be running API Level 19 (KitKat) or higher.

  • <uses-sdk android:targetSdkVersion="24"/> indicates our app is targeting API Level 24 (Nougat).

  • <uses-feature android:glEsVersion="0x00020000" android:required="true" /> indicates that the device must support OpenGL ES 2.0 to run the sample app.

  • <uses-feature android:name="android.software.vr.mode" android:required="false" /> indicates (optional) use of Android N's VR mode, when available.

  • <uses-feature android:name="android.software.vr.high_performance" android:required="false" /> indicates (optional) use of Daydream-ready device features, when available.

  • android:screenOrientation="landscape" specifies the activity's required screen orientation. This is the orientation you must set for VR apps: GvrView, the view used by the Google VR SDK, renders only in fullscreen landscape mode.

  • The setting android:configChanges="orientation|keyboardHidden|screenSize" is also recommended, but not mandatory.

  • Use of Android N's VR mode is signaled with android:enableVrMode="@string/gvr_vr_mode_component". This attribute has no effect on older versions of Android.

  • The Google VR SDK-provided VR theme, specified with android:theme="@style/VrActivityTheme", should always be used to ensure a consistent fullscreen experience, particularly when transitioning between Daydream-compatible apps.

  • VR Activities should always be fullscreen, so resizing support should be disabled via android:resizeableActivity="false".

  • The android.permission.READ_EXTERNAL_STORAGE permission is required by the Google VR SDK to pair the user's phone to their VR viewer. Note that this permission is only strictly necessary when Google VR Services is unavailable.

  • The intent-filter categories com.google.intent.category.CARDBOARD and com.google.intent.category.DAYDREAM state that this activity is compatible with both Cardboard- and Daydream-compatible VR viewers. These categories are also used for app compatibility filtering: the CARDBOARD category is used by the Cardboard app, while the DAYDREAM category is used by the Daydream app. Note that Daydream apps must also meet app quality requirements to be listed.

Extend GvrActivity

GvrActivity is the starting point for coding an app using the Google VR SDK. GvrActivity is the base activity that provides easy integration with Google VR devices. It exposes events to interact with the VR environment and handles many of the details commonly required when creating an activity for VR rendering.

Note that GvrActivity uses sticky immersive mode, in which the system UI is hidden, and the content takes up the whole screen. This is a requirement for a VR app, since GvrView will only render when the activity is in fullscreen mode. See Using Immersive Full-Screen Mode for more discussion of this feature.
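
GvrActivity enables this mode for you, so you do not need to do anything when extending it. For reference only, here is a minimal sketch of how an ordinary Activity would enable sticky immersive mode itself, using the standard android.view.View flags:

@Override
public void onWindowFocusChanged(boolean hasFocus) {
    super.onWindowFocusChanged(hasFocus);
    if (hasFocus) {
        // Hide the system bars and let the content use the whole screen.
        // GvrActivity performs equivalent setup for you; this is illustrative.
        getWindow().getDecorView().setSystemUiVisibility(
                View.SYSTEM_UI_FLAG_LAYOUT_STABLE
                | View.SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION
                | View.SYSTEM_UI_FLAG_LAYOUT_FULLSCREEN
                | View.SYSTEM_UI_FLAG_HIDE_NAVIGATION
                | View.SYSTEM_UI_FLAG_FULLSCREEN
                | View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY);
    }
}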

The sample app's TreasureHuntActivity extends GvrActivity. TreasureHuntActivity implements the following interface:

  • GvrView.StereoRenderer: Interface for renderers that delegate all stereoscopic rendering details to the view. Implementors simply render a view as they normally would, using the provided transformation parameters. All stereoscopic rendering and distortion correction details are abstracted from the renderer and managed internally by the view. A skeleton of the interface is sketched after this list.
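
To make the interface concrete, here is a minimal skeleton of a GvrView.StereoRenderer implementation. The method set matches the SDK's interface (types come from com.google.vr.sdk.base, and EGLConfig from javax.microedition.khronos.egl); the bodies are placeholders:

public class TreasureHuntActivity extends GvrActivity implements GvrView.StereoRenderer {

    @Override
    public void onNewFrame(HeadTransform headTransform) {
        // Per-frame logic that is not specific to a single eye.
    }

    @Override
    public void onDrawEye(Eye eye) {
        // Render the scene from this eye's point of view.
    }

    @Override
    public void onFinishFrame(Viewport viewport) {
        // Called after both eyes have been rendered.
    }

    @Override
    public void onSurfaceChanged(int width, int height) {
        // React to render-surface size changes.
    }

    @Override
    public void onSurfaceCreated(EGLConfig config) {
        // Create OpenGL resources (shaders, buffers, textures).
    }

    @Override
    public void onRendererShutdown() {
        // Release OpenGL resources.
    }
}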

Define a GvrView

All user interface elements in an Android app are built using views. The Google VR SDK for Android provides its own view, GvrView, which can be used for VR rendering. GvrView renders content in stereo. You can see how the sample app defines a GvrView in its activity layout XML file:

<com.google.vr.sdk.base.GvrView
    android:id="@+id/gvr_view"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:layout_alignParentTop="true"
    android:layout_alignParentLeft="true" />

Then, in the main activity class, the app initializes the GvrView in its onCreate() method:

/**
 * Sets the view to our GvrView and initializes the transformation matrices
 * we will use to render our scene.
 */
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.common_ui);
    GvrView gvrView = (GvrView) findViewById(R.id.gvr_view);
    // Associate a GvrView.StereoRenderer with gvrView.
    gvrView.setRenderer(this);
    // Associate the gvrView with this activity.
    setGvrView(gvrView);

    // Initialize other objects here.
    ...
}
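
Beyond setRenderer() and setGvrView(), GvrView offers optional settings you may also want to apply in onCreate(). The following is a hedged sketch of the kinds of options available; the calls shown (setEGLConfigChooser, setTransitionViewEnabled, setAsyncReprojectionEnabled, and AndroidCompat.setSustainedPerformanceMode from com.google.vr.ndk.base) exist in the Google VR SDK, but whether you need them depends on your app:

// Request an RGBA8888 surface with depth and stencil buffers.
gvrView.setEGLConfigChooser(8, 8, 8, 8, 16, 8);

// Show the SDK's transition view when entering VR.
gvrView.setTransitionViewEnabled(true);

// On Daydream-ready devices, async reprojection keeps head tracking
// smooth even if the app occasionally misses a frame.
if (gvrView.setAsyncReprojectionEnabled(true)) {
    // Sustained performance mode helps avoid thermal throttling.
    AndroidCompat.setSustainedPerformanceMode(this, true);
}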

Render the view

Once you have the GvrView, you associate it with a renderer and then associate the GvrView with the activity. Google VR supports two kinds of renderers, GvrView.Renderer and GvrView.StereoRenderer, but the quickest way to get started is GvrView.StereoRenderer, which is what the sample app uses.

GvrView.StereoRenderer includes these key methods:

  • onNewFrame(), called every time the app renders.

  • onDrawEye(), called for each eye with different eye parameters.

Implementing these is similar to what you would normally do for an OpenGL application. These methods are discussed in more detail in the following sections.

Implement onNewFrame

Use the onNewFrame() method to encode rendering logic before the individual eyes are rendered. Any per-frame operations not specific to a single view should happen here. This is a good place to update your model. In this snippet, the variable mHeadView receives the head transformation matrix. This value is saved so that we can later tell whether the user is looking at a treasure, as sketched after the snippet:

/**
 * Prepares OpenGL ES before we draw a frame.
 * @param headTransform The head transformation in the new frame.
 */
@Override
public void onNewFrame(HeadTransform headTransform) {
    ...
    headTransform.getHeadView(mHeadView, 0);
    ...
}
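
The look-detection check might look like the following sketch: it uses the saved mHeadView to transform the cube into eye space, then tests whether the cube lies within a small pitch/yaw cone in front of the user. The mModelView and mModelCube matrices are assumed from the rendering code, and the PITCH_LIMIT/YAW_LIMIT thresholds are illustrative values:

private static final float YAW_LIMIT = 0.12f;
private static final float PITCH_LIMIT = 0.12f;

/**
 * Checks whether the user is looking at the treasure, using the
 * mHeadView matrix saved in onNewFrame().
 */
private boolean isLookingAtObject() {
    float[] initVec = {0, 0, 0, 1.0f};
    float[] objPositionVec = new float[4];

    // Convert object space to camera space using the head transform.
    Matrix.multiplyMM(mModelView, 0, mHeadView, 0, mModelCube, 0);
    Matrix.multiplyMV(objPositionVec, 0, mModelView, 0, initVec, 0);

    float pitch = (float) Math.atan2(objPositionVec[1], -objPositionVec[2]);
    float yaw = (float) Math.atan2(objPositionVec[0], -objPositionVec[2]);

    return Math.abs(pitch) < PITCH_LIMIT && Math.abs(yaw) < YAW_LIMIT;
}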

Implement onDrawEye

Implement onDrawEye() to perform per-eye configuration.

This is the meat of the rendering code, and is very similar to building a regular OpenGL ES 2.0 application. The following snippet shows how to get the view transformation matrix and the perspective transformation matrix. You need to make sure that you render with low latency. The Eye object contains the transformation and projection matrices for the eye.

/**
 * Draws a frame for an eye.
 *
 * @param eye The eye to render. Includes all required transformations.
 */
@Override
public void onDrawEye(Eye eye) {
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    ...
    // Apply the eye transformation to the camera.
    Matrix.multiplyMM(mView, 0, eye.getEyeView(), 0, mCamera, 0);

    // Set the position of the light
    Matrix.multiplyMV(mLightPosInEyeSpace, 0, mView, 0, LIGHT_POS_IN_WORLD_SPACE, 0);

    // Build the ModelView and ModelViewProjection matrices
    // for calculating cube position and light.
    float[] perspective = eye.getPerspective(Z_NEAR, Z_FAR);
    Matrix.multiplyMM(mModelView, 0, mView, 0, mModelCube, 0);
    Matrix.multiplyMM(mModelViewProjection, 0, perspective, 0, mModelView, 0);
    drawCube();

    // Draw the rest of the scene.
    ...
}

This is the sequence of events:

  • The treasure comes into eye space.

  • We apply the projection matrix. This provides the scene rendered for the specified eye.

  • The Google VR SDK applies distortion automatically, to render the final scene.
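
The drawCube() call in onDrawEye() hands the matrices built above to the shader program. Here is a hedged sketch of what it might look like; the program and attribute/uniform handles (cubeProgram, cubeModelViewProjectionParam, cubePositionParam) and the cubeVertices buffer are assumed to have been created in onSurfaceCreated():

public void drawCube() {
    GLES20.glUseProgram(cubeProgram);

    // Pass the combined matrix computed in onDrawEye() to the shader.
    GLES20.glUniformMatrix4fv(cubeModelViewProjectionParam, 1, false,
        mModelViewProjection, 0);

    // Feed the cube's vertex positions to the vertex shader.
    GLES20.glEnableVertexAttribArray(cubePositionParam);
    GLES20.glVertexAttribPointer(cubePositionParam, 3, GLES20.GL_FLOAT,
        false, 0, cubeVertices);

    // A cube is 12 triangles, i.e. 36 vertices.
    GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, 36);
    GLES20.glDisableVertexAttribArray(cubePositionParam);
}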

Rendering spatial audio

The onCreate() method initializes the 3D audio engine. The second parameter in the GvrAudioEngine constructor lets you specify a rendering mode, which defines the spatialization fidelity.

gvrAudioEngine =
    new GvrAudioEngine(this, GvrAudioEngine.RenderingMode.BINAURAL_HIGH_QUALITY);

To disable audio when the user pauses the app, and enable it again when they resume, call gvrAudioEngine.pause() and gvrAudioEngine.resume() in the onPause() and onResume() methods respectively; a sketch of this wiring appears after the preloading snippet below. Sound files can be streamed during playback or preloaded into memory before playback. This preloading should be performed on a separate thread to avoid blocking the main thread.

new Thread(
        new Runnable() {
          @Override
          public void run() {
            gvrAudioEngine.preloadSoundFile(SOUND_FILE);
          }
        })
    .start();
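
The pause/resume wiring mentioned above is a minimal sketch, assuming the activity keeps gvrAudioEngine as a field:

@Override
public void onPause() {
    // Stop audio processing while the app is in the background.
    gvrAudioEngine.pause();
    super.onPause();
}

@Override
public void onResume() {
    super.onResume();
    // Resume audio processing when the app returns to the foreground.
    gvrAudioEngine.resume();
}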

You can create, position, and play back sound objects at any time using createSoundObject(). Any number of sound objects can be created from the same preloaded sound file. Note that if sounds have not previously been preloaded, the sound file will be streamed from disk on playback.

// Start spatial audio playback of SOUND_FILE at the model position. The returned
// sourceId handle allows for repositioning the sound object whenever the cube
// position changes.
sourceId = gvrAudioEngine.createSoundObject(SOUND_FILE);
gvrAudioEngine.setSoundObjectPosition(
    sourceId, modelPosition[0], modelPosition[1], modelPosition[2]);
gvrAudioEngine.playSound(sourceId, true /* looped playback */);

The sourceId handle can be used to reposition the sound at runtime.

// Update the sound location to match it with the new cube position.
if (sourceId != GvrAudioEngine.INVALID_ID) {
  gvrAudioEngine.setSoundObjectPosition(
      sourceId, modelPosition[0], modelPosition[1], modelPosition[2]);
}

In the onNewFrame() method, we get a quaternion representing the latest rotation of the user's head and pass it to setHeadRotation() to update the gvrAudioEngine.

// Update the 3d audio engine with the most recent head rotation.
headTransform.getQuaternion(headRotation, 0);
gvrAudioEngine.setHeadRotation(
    headRotation[0], headRotation[1], headRotation[2], headRotation[3]);

Calls to gvrAudioEngine.update() should be made once per frame, so that the engine picks up the latest head rotation and source positions.
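
A natural place for this is the end of onNewFrame(), after the head rotation has been set; a minimal sketch:

@Override
public void onNewFrame(HeadTransform headTransform) {
    ...
    // Update the 3d audio engine with the most recent head rotation.
    headTransform.getQuaternion(headRotation, 0);
    gvrAudioEngine.setHeadRotation(
        headRotation[0], headRotation[1], headRotation[2], headRotation[3]);
    // Commit this frame's audio state changes.
    gvrAudioEngine.update();
}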

Handling inputs

Cardboard viewers include a trigger button. When you pull the trigger, the viewer touches your phone's screen, and these trigger events are detected for you by the Google VR SDK.

To provide custom behavior when the user pulls the trigger, override GvrActivity.onCardboardTrigger() in your app's activity. In the Treasure Hunt app, for example, when you find a treasure and pull the trigger, the cube moves to a new place (see the hideObject() sketch after the snippet below).

/**
 * Called when the Cardboard trigger is pulled.
 */
@Override
public void onCardboardTrigger() {
    if (isLookingAtObject()) {
        hideObject();
    }

    // Always give user feedback
    mVibrator.vibrate(50);
}
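
The hideObject() helper is what actually moves the treasure. Here is a hedged sketch of the idea: pick a new random direction and distance for the cube, rebuild its model matrix, and keep the spatialized sound attached to it. The modelPosition, mModelCube, sourceId, and gvrAudioEngine fields are assumed from the earlier snippets, and the distance range is illustrative:

/**
 * Moves the cube to a new random position and keeps its sound attached.
 */
private void hideObject() {
    // Pick a random yaw angle and distance in front of the user.
    float angle = (float) (Math.random() * 2 * Math.PI);
    float distance = 5 + (float) Math.random() * 5;

    modelPosition[0] = (float) Math.sin(angle) * distance;
    modelPosition[1] = 0;
    modelPosition[2] = -(float) Math.cos(angle) * distance;

    // Rebuild the cube's model matrix at the new position.
    Matrix.setIdentityM(mModelCube, 0);
    Matrix.translateM(mModelCube, 0,
        modelPosition[0], modelPosition[1], modelPosition[2]);

    // Move the looping sound object along with the cube.
    if (sourceId != GvrAudioEngine.INVALID_ID) {
        gvrAudioEngine.setSoundObjectPosition(
            sourceId, modelPosition[0], modelPosition[1], modelPosition[2]);
    }
}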