This guide assumes that you have already built and run the Treasure Hunt sample game in our getting started guide. If you haven't, go do that and then come back here for an explanation of the code itself.
Overview of the Treasure Hunt code
Manifest file
All Google VR SDK apps, both Cardboard and Daydream, require a specific set of manifest attributes. These requirements are described in detail in our Android manifest reference.
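As a quick orientation before reading the full reference, the VR-specific pieces of a Cardboard activity declaration typically look like the sketch below. This is a partial, illustrative fragment, not a complete manifest; the activity name is hypothetical, and the authoritative list of required attributes is in the Android manifest reference.

```xml
<!-- Partial sketch: only the VR-relevant attributes are shown. -->
<uses-feature android:glEsVersion="0x00020000" android:required="true" />

<application>
  <activity
      android:name=".MainActivity"
      android:screenOrientation="landscape"
      android:configChanges="orientation|keyboardHidden|screenSize">
    <intent-filter>
      <action android:name="android.intent.action.MAIN" />
      <category android:name="android.intent.category.LAUNCHER" />
      <!-- Marks this activity as a Cardboard app. -->
      <category android:name="com.google.intent.category.CARDBOARD" />
    </intent-filter>
  </activity>
</application>
```

The landscape orientation and `configChanges` entries keep the activity from being recreated while the phone sits in the viewer.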
Extend GvrActivity
GvrActivity is the starting point for coding an app using the Google VR SDK. GvrActivity is the base activity that provides easy integration with Google VR devices. It exposes events to interact with the VR environment and handles many of the details commonly required when creating an activity for VR rendering.
Note that GvrActivity uses sticky immersive mode, in which the system UI is hidden and the content takes up the whole screen. This is a requirement for a VR app, since GvrView will only render when the activity is in fullscreen mode. See Using Immersive Full-Screen Mode for more discussion of this feature.
The sample app's MainActivity extends GvrActivity. MainActivity implements the following interface:

GvrView.StereoRenderer: Interface for renderers that delegate all stereoscopic rendering details to the view. Implementors should simply render a view as they would normally do using the provided transformation parameters. All stereoscopic rendering and distortion correction details are abstracted from the renderer and managed internally by the view.
Define a GvrView
All user interface elements in an Android app are built using views. The Google VR SDK for Android provides its own view, GvrView, which can be used for VR rendering. GvrView renders content in stereo. You can see how the sample app defines a GvrView in its activity layout XML file in the following way:
<com.google.vr.sdk.base.GvrView
    android:id="@+id/gvr_view"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:layout_alignParentTop="true"
    android:layout_alignParentLeft="true" />
Then in the main activity class it initializes the GvrView in the onCreate() method:
/**
 * Sets the view to our GvrView and initializes the transformation matrices
 * we will use to render our scene.
 */
@Override
public void onCreate(Bundle savedInstanceState) {
  super.onCreate(savedInstanceState);

  setContentView(R.layout.common_ui);
  GvrView gvrView = (GvrView) findViewById(R.id.gvr_view);

  // Associate a GvrView.StereoRenderer with gvrView.
  gvrView.setRenderer(this);

  // Associate the gvrView with this activity.
  setGvrView(gvrView);

  // Initialize other objects here.
  ...
}
Render the view
Once you get the GvrView, you associate it with a renderer, and then you associate the GvrView with the activity. Google VR supports two kinds of renderers, but the quickest way to get started is to use GvrView.StereoRenderer, which is what the sample app uses.

GvrView.StereoRenderer includes these key methods:

- onNewFrame(), called every time the app renders.
- onDrawEye(), called for each eye with different eye parameters.

Implementing these is similar to what you would normally do for an OpenGL application. These methods are discussed in more detail in the following sections.
Implement onNewFrame
Use the onNewFrame() method to encode rendering logic before the individual eyes are rendered. Any per-frame operations not specific to a single view should happen here. This is a good place to update your model. In this snippet, the variable mHeadView contains the position of the head. This value needs to be saved for later use, to tell whether the user is looking at a treasure:
/**
 * Prepares OpenGL ES before we draw a frame.
 * @param headTransform The head transformation in the new frame.
 */
@Override
public void onNewFrame(HeadTransform headTransform) {
  ...
  headTransform.getHeadView(mHeadView, 0);
  ...
}
Implement onDrawEye
Implement onDrawEye() to perform per-eye configuration.

This is the meat of the rendering code, and very similar to building a regular OpenGL ES 2 application. The following snippet shows how to get the view transformation matrix and the perspective transformation matrix. You need to make sure that you render with low latency. The Eye object contains the transformation and projection matrices for the eye.
/**
 * Draws a frame for an eye.
 *
 * @param eye The eye to render. Includes all required transformations.
 */
@Override
public void onDrawEye(Eye eye) {
  GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
  ...
  // Apply the eye transformation to the camera.
  Matrix.multiplyMM(mView, 0, eye.getEyeView(), 0, mCamera, 0);

  // Set the position of the light.
  Matrix.multiplyMV(mLightPosInEyeSpace, 0, mView, 0, LIGHT_POS_IN_WORLD_SPACE, 0);

  // Build the ModelView and ModelViewProjection matrices
  // for calculating cube position and light.
  float[] perspective = eye.getPerspective(Z_NEAR, Z_FAR);
  Matrix.multiplyMM(mModelView, 0, mView, 0, mModelCube, 0);
  Matrix.multiplyMM(mModelViewProjection, 0, perspective, 0, mModelView, 0);
  drawCube();

  // Draw the rest of the scene.
  ...
}
This is the sequence of events:

- The treasure comes into eye space.
- We apply the projection matrix. This provides the scene rendered for the specified eye.
- The Google VR SDK applies distortion automatically, to render the final scene.
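The matrix plumbing above can be illustrated without any OpenGL. The sketch below is plain Java with a hand-rolled, column-major 4x4 multiply (the same float[16] layout android.opengl.Matrix uses); the model and view matrices are made-up values standing in for mModelCube and eye.getEyeView(), not taken from the sample.

```java
// Illustrates ModelView composition with column-major float[16] matrices,
// matching the layout used by android.opengl.Matrix.
public class MvpComposition {

  // r = a * b; element (row, col) lives at index col * 4 + row.
  static float[] multiplyMM(float[] a, float[] b) {
    float[] r = new float[16];
    for (int col = 0; col < 4; col++) {
      for (int row = 0; row < 4; row++) {
        float sum = 0;
        for (int k = 0; k < 4; k++) {
          sum += a[k * 4 + row] * b[col * 4 + k];
        }
        r[col * 4 + row] = sum;
      }
    }
    return r;
  }

  // r = m * v for a 4-component vector.
  static float[] multiplyMV(float[] m, float[] v) {
    float[] r = new float[4];
    for (int row = 0; row < 4; row++) {
      for (int k = 0; k < 4; k++) {
        r[row] += m[k * 4 + row] * v[k];
      }
    }
    return r;
  }

  public static void main(String[] args) {
    // Hypothetical model matrix: translate the cube to z = -2.
    float[] model = {1,0,0,0,  0,1,0,0,  0,0,1,0,  0,0,-2,1};
    // Hypothetical eye view matrix: a 1-unit offset on X.
    float[] view  = {1,0,0,0,  0,1,0,0,  0,0,1,0,  1,0,0,1};
    float[] vertex = {0, 0, 0, 1};  // cube-local origin

    // Composing first and then applying equals applying model, then view.
    float[] modelView = multiplyMM(view, model);
    float[] direct = multiplyMV(modelView, vertex);
    float[] staged = multiplyMV(view, multiplyMV(model, vertex));

    // Both paths place the vertex at (1, 0, -2) in eye space.
    System.out.println(direct[0] + " " + direct[1] + " " + direct[2]);
    System.out.println(staged[0] + " " + staged[1] + " " + staged[2]);
  }
}
```

In the sample, the same composition happens once per eye: eye.getEyeView() supplies the view half, and a final multiply by eye.getPerspective(Z_NEAR, Z_FAR) yields mModelViewProjection.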
Rendering spatial audio
The onCreate() method initializes the 3D audio engine. The second parameter in the constructor of GvrAudioEngine allows the user to specify a rendering mode defining the spatialization fidelity.
gvrAudioEngine =
    new GvrAudioEngine(this, GvrAudioEngine.RenderingMode.BINAURAL_HIGH_QUALITY);
To disable audio when the user pauses the app, and enable it again when they resume, call gvrAudioEngine.pause() and gvrAudioEngine.resume() in the onPause() and onResume() methods, respectively. Sound files can be streamed during playback or preloaded into memory before playback. Preloading should be performed on a separate thread in order to avoid blocking the main thread.
new Thread(
        new Runnable() {
          @Override
          public void run() {
            gvrAudioEngine.preloadSoundFile(SOUND_FILE);
          }
        })
    .start();
You can create, position, and play back sound objects at any time using createSoundObject(). Any number of sound objects can be created from the same preloaded sound file. Note that if a sound has not previously been preloaded, the sound file will be streamed from disk on playback.
// Start spatial audio playback of SOUND_FILE at the model position. The
// returned sourceId handle allows for repositioning the sound object whenever
// the cube position changes.
sourceId = gvrAudioEngine.createSoundObject(SOUND_FILE);
gvrAudioEngine.setSoundObjectPosition(
    sourceId, modelPosition[0], modelPosition[1], modelPosition[2]);
gvrAudioEngine.playSound(sourceId, true /* looped playback */);
The sourceId handle can be used to reposition the sound at run time.
// Update the sound location to match the new cube position.
if (sourceId != GvrAudioEngine.INVALID_ID) {
  gvrAudioEngine.setSoundObjectPosition(
      sourceId, modelPosition[0], modelPosition[1], modelPosition[2]);
}
In the onNewFrame() method, we get a quaternion representing the latest orientation of the user's head, and pass that to setHeadRotation() to update the gvrAudioEngine.
// Update the 3D audio engine with the most recent head rotation.
headTransform.getQuaternion(headRotation, 0);
gvrAudioEngine.setHeadRotation(
    headRotation[0], headRotation[1], headRotation[2], headRotation[3]);
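Both getQuaternion() and setHeadRotation() use [x, y, z, w] component order, so the components can be passed through positionally as shown. As a sanity check on that convention, the plain-Java sketch below (independent of the sample code) rotates the forward vector (0, 0, -1) by a 90-degree yaw quaternion:

```java
// Rotates a 3D vector by a unit quaternion given in [x, y, z, w] order,
// the same order used by HeadTransform.getQuaternion() and setHeadRotation().
public class QuaternionDemo {

  // v' = v + 2 * qv x (qv x v + w * v), a standard quaternion rotation identity.
  static float[] rotate(float[] q, float[] v) {
    float[] t = cross(q, v);
    t[0] += q[3] * v[0];
    t[1] += q[3] * v[1];
    t[2] += q[3] * v[2];
    float[] u = cross(q, t);
    return new float[] {v[0] + 2 * u[0], v[1] + 2 * u[1], v[2] + 2 * u[2]};
  }

  static float[] cross(float[] a, float[] b) {
    return new float[] {
        a[1] * b[2] - a[2] * b[1],
        a[2] * b[0] - a[0] * b[2],
        a[0] * b[1] - a[1] * b[0]};
  }

  public static void main(String[] args) {
    // 90-degree rotation about +Y (yaw left): q = (0, sin 45deg, 0, cos 45deg).
    float s = (float) Math.sin(Math.PI / 4);
    float c = (float) Math.cos(Math.PI / 4);
    float[] yaw90 = {0, s, 0, c};
    float[] forward = {0, 0, -1};
    float[] rotated = rotate(yaw90, forward);
    // Looking 90 degrees to the left turns -Z into -X: approximately (-1, 0, 0).
    System.out.printf("%.3f %.3f %.3f%n", rotated[0], rotated[1], rotated[2]);
  }
}
```

If spatialized sounds ever seem to come from the wrong direction, a common cause is passing quaternion components in the wrong order; keeping the [x, y, z, w] convention explicit avoids that.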
Calls to gvrAudioEngine.update() should be made once per frame.
Handling inputs
Cardboard viewers include a trigger button that, when pulled, touches your phone's screen. These trigger events are detected for you by the Google VR SDK.
To provide custom behavior when the user pulls the trigger, override
GvrActivity.onCardboardTrigger()
in your app's activity. In the Treasure Hunt
app, for example, when you find a treasure and pull the trigger, the cube moves
to a new place.
/**
 * Called when the Cardboard trigger is pulled.
 */
@Override
public void onCardboardTrigger() {
  if (isLookingAtObject()) {
    hideObject();
  }

  // Always give user feedback.
  mVibrator.vibrate(50);
}
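The isLookingAtObject() check used above can be approximated in plain Java: transform the object's position by the head-view matrix saved in onNewFrame(), then test the pitch and yaw angles to the result. The sketch below hand-rolls a column-major matrix-vector multiply in place of android.opengl.Matrix, and the angle limits are made-up values, not the sample's constants.

```java
// Simplified gaze test: transforms an object's position into eye space with a
// head-view matrix, then checks pitch and yaw angles against small limits.
public class GazeCheck {
  static final float PITCH_LIMIT = 0.12f;  // radians; hypothetical threshold
  static final float YAW_LIMIT = 0.12f;    // radians; hypothetical threshold

  // Column-major 4x4 matrix times vec4, matching android.opengl.Matrix layout.
  static float[] multiplyMV(float[] m, float[] v) {
    float[] r = new float[4];
    for (int i = 0; i < 4; i++) {
      r[i] = m[i] * v[0] + m[4 + i] * v[1] + m[8 + i] * v[2] + m[12 + i] * v[3];
    }
    return r;
  }

  // True if the head-view matrix places the object near the center of view.
  static boolean isLookingAt(float[] headView, float[] objectPosition) {
    float[] pos = multiplyMV(headView,
        new float[] {objectPosition[0], objectPosition[1], objectPosition[2], 1f});
    // In eye space the user looks down -Z, so angles are measured against it.
    float pitch = (float) Math.atan2(pos[1], -pos[2]);
    float yaw = (float) Math.atan2(pos[0], -pos[2]);
    return Math.abs(pitch) < PITCH_LIMIT && Math.abs(yaw) < YAW_LIMIT;
  }

  public static void main(String[] args) {
    // Identity head view: the user looks straight down -Z.
    float[] identity = {1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1};
    System.out.println(isLookingAt(identity, new float[] {0f, 0f, -5f}));  // straight ahead
    System.out.println(isLookingAt(identity, new float[] {5f, 0f, -1f}));  // far off to the side
  }
}
```

The sample's version works on mModelCube and mHeadView instead of the hypothetical arguments here, but the geometry is the same.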