Google VR SDK for Unity

The Google VR SDK for Unity includes scripts and prefabs that make it easier to develop Daydream and Cardboard apps. It requires Unity 5.6 or newer.

This guide describes how to use the Google VR SDK, as well as some things to consider while creating a VR application in Unity.

Enable stereo rendering

First, make sure that Android is the selected build platform. Next, go to Player Settings and enable Virtual Reality Supported. In the new Virtual Reality SDKs section, select the + icon then select Daydream and/or Cardboard depending on the app you're making.
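These settings can also be applied from an editor script, which is useful in automated builds. The sketch below assumes the Unity 5.6-era editor APIs (`PlayerSettings.virtualRealitySupported` and `VREditor.SetVREnabledDevicesOnTargetGroup`); the strings "daydream" and "cardboard" are the device identifiers Unity uses for these SDKs:

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEditorInternal.VR;

public static class EnableVrSdks
{
    // Adds a menu item that mirrors the manual Player Settings steps.
    [MenuItem("Tools/Enable Daydream and Cardboard")]
    public static void Enable()
    {
        // Equivalent to checking "Virtual Reality Supported".
        PlayerSettings.virtualRealitySupported = true;

        // Equivalent to adding entries in the Virtual Reality SDKs list.
        // Order matters: the first available SDK in the list is used.
        VREditor.SetVREnabledDevicesOnTargetGroup(
            BuildTargetGroup.Android,
            new[] { "daydream", "cardboard" });
    }
}
#endif
```

Listing both SDKs with "daydream" first lets the app fall back to Cardboard on phones that are not Daydream-ready.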

If you are targeting iOS, or using an older version of Unity without native VR support, use the Google VR SDK for Unity to enable stereo rendering. Add the GvrViewerMain prefab to your scene, or attach the GvrViewer script to an existing object.

Simulated head tracking in the Unity Editor

Add the GvrEditorEmulator prefab to your scene. In Play mode, you can then use the mouse to move the camera the same way head tracking would. To simulate turning your head, press and hold the Alt key and move the mouse. To simulate tilting your head, press and hold the Control key and move the mouse.

Simulated controller in the Unity Editor

If you're building a Daydream application, you can use the Controller Emulator to emulate the Daydream controller in Play mode.

Alternatively, you can use the following keyboard and mouse controls to simulate the Daydream controller:

  • Shift + Move Mouse = Change Orientation
  • Shift + Left Mouse Button = ClickButton
  • Shift + Right Mouse Button = AppButton
  • Shift + Middle Mouse Button = HomeButton/Recenter
  • Shift + Ctrl = IsTouching
  • Shift + Ctrl + Move Mouse = Change TouchPos

If you're building a Cardboard application, click the mouse in Play mode to simulate the trigger of a Cardboard viewer.

Stereo rendering in the Unity editor

Stereo rendering cannot be previewed in the Unity editor. However, you can test stereo rendering without deploying to a device by using Instant Preview.

Daydream controller support

Daydream applications use the Daydream controller for input. Add the GvrControllerMain prefab to your scene and implement the Controller API to use the controller. If you use the controller like a laser pointer, the SDK has tools to help you. Read on to learn about the Google VR Pointer system.
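As a sketch of the Controller API, the script below polls the controller each frame. The property names are those exposed by the static GvrControllerInput class in recent SDK versions; older SDK versions expose the same data on a GvrController class:

```csharp
using UnityEngine;

// Polls Daydream controller state once per frame.
// Requires the GvrControllerMain prefab to be present in the scene.
public class ControllerExample : MonoBehaviour
{
    void Update()
    {
        // True only on the frame the touchpad is clicked down.
        if (GvrControllerInput.ClickButtonDown)
        {
            Debug.Log("Touchpad clicked at " + GvrControllerInput.TouchPos);
        }

        // The app button is free for app-specific actions.
        if (GvrControllerInput.AppButtonDown)
        {
            Debug.Log("App button pressed");
        }

        // Controller orientation as a quaternion, e.g. to aim a pointer.
        Quaternion orientation = GvrControllerInput.Orientation;
    }
}
```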

User interaction

Users interact with VR applications differently than with standard phone apps. When the phone is in a VR viewer, users can't tap specific parts of the screen, and UI elements must be rendered in stereo and placed within the virtual space to be legible to users. With head tracking, UI and other interactions can be placed all around the user, not just directly in front of them.

The Google VR Pointer system

The Google VR SDK provides a Pointer system for implementing consistent user interaction in Daydream and Cardboard. The Pointer system consists of an Input Module, raycasters, and UI prefabs. To enable it, add the GvrEventSystem prefab to your scene.

Input module

The GvrPointerInputModule script is an Input Module for Unity's Event System which selects GameObjects based on the user's gaze or controller.

Raycasters

The Pointer system includes two scripts, GvrPointerGraphicRaycaster and GvrPointerPhysicsRaycaster, which are replacements for the standard Unity Graphic Raycaster and Physics Raycaster respectively. They are located in the Assets/GoogleVR/Scripts/EventSystem directory.

Use GvrPointerGraphicRaycaster to interact with UI elements by attaching it to a Canvas, like a normal Graphic Raycaster.
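In VR, such a Canvas must use World Space rendering so it has a position in the scene. The following sketch (the swap-if-needed logic is illustrative, not an SDK requirement) configures a Canvas at startup:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Attach to a Canvas to prepare it for GVR pointer interaction:
// render in world space and use GvrPointerGraphicRaycaster instead
// of the standard GraphicRaycaster.
public class VrCanvasSetup : MonoBehaviour
{
    void Awake()
    {
        Canvas canvas = GetComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;

        // Remove a standard raycaster if one is present, but keep an
        // existing GVR raycaster (it derives from GraphicRaycaster).
        GraphicRaycaster standard = GetComponent<GraphicRaycaster>();
        if (standard != null && !(standard is GvrPointerGraphicRaycaster))
        {
            Destroy(standard);
        }
        if (GetComponent<GvrPointerGraphicRaycaster>() == null)
        {
            gameObject.AddComponent<GvrPointerGraphicRaycaster>();
        }
    }
}
```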

Use GvrPointerPhysicsRaycaster to interact with 3D objects in the scene. This script should be attached to the main camera. You must also have a script on each interactive object to respond to the generated events. You could use an EventTrigger or implement some of the standard Unity Event interfaces in your scripts.
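For example, a minimal script implementing the standard Unity EventSystems interfaces might look like this (the highlight behavior is purely illustrative):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// A 3D object that responds to events raised by GvrPointerPhysicsRaycaster
// through the Unity Event System. The GameObject also needs a Collider
// for the raycast to hit it.
public class GazeTarget : MonoBehaviour,
    IPointerEnterHandler, IPointerExitHandler, IPointerClickHandler
{
    private Renderer objectRenderer;

    void Awake()
    {
        objectRenderer = GetComponent<Renderer>();
    }

    public void OnPointerEnter(PointerEventData eventData)
    {
        // Highlight while the pointer or gaze is over the object.
        objectRenderer.material.color = Color.green;
    }

    public void OnPointerExit(PointerEventData eventData)
    {
        objectRenderer.material.color = Color.white;
    }

    public void OnPointerClick(PointerEventData eventData)
    {
        Debug.Log("Object clicked: " + name);
    }
}
```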

Pointer prefabs

The Pointer system includes prefabs that provide a consistent interface for selecting things in VR. Daydream apps should use the GvrControllerPointer prefab, while Cardboard apps should use the GvrReticlePointer prefab. If you add one to your scene, the Pointer system automatically connects it to any GVR raycasters in the scene, and it will interact with the Unity Event System through the GvrPointerInputModule.

GvrControllerPointer

The GvrControllerPointer prefab implements a "laser pointer" controller that complies with the Daydream app quality design requirements: it includes an arm model, renders the controller in the scene, and displays a laser ray. To use it, add the prefab as a sibling of the main camera. You may need to create an empty GameObject and make both the main camera and the GvrControllerPointer prefab its children.

The prefab is highly customizable. You can change the look of the controller and laser, or use tooltips to explain how to use the controller with your app.

GvrReticlePointer

The GvrReticlePointer prefab adds a reticle for gaze-based interaction. It draws a dot that expands into a circle whenever you gaze at something the GVR raycasters you've added can hit. To use it, add the prefab as a child of the main camera.

Spatial Audio in Unity

The Google VR SDK lets you add spatial audio to your application. In Unity, this is done with a few prefabs and components, such as GvrAudioListener and GvrAudioSource. For more information, see our guide on spatial audio in Unity.
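As a minimal sketch, the script below attaches a spatialized sound to an object. It assumes the GvrAudioSource component bundled with the SDK, whose main properties mirror Unity's built-in AudioSource; a GvrAudioListener must also be present on the main camera for spatialization to work:

```csharp
using UnityEngine;

// Plays a looping, spatialized clip from this object's position.
// Assumes the GVR SDK's GvrAudioSource component; the "hum" field
// name is illustrative and should be assigned in the Inspector.
public class SpatialSoundEmitter : MonoBehaviour
{
    public AudioClip hum;

    void Start()
    {
        GvrAudioSource source = gameObject.AddComponent<GvrAudioSource>();
        source.clip = hum;
        source.loop = true;
        source.Play();
    }
}
```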