November 6, 2019 update:
  • There's a new open source Cardboard SDK for iOS and Android NDK that offers a streamlined API, improved device compatibility, and built-in viewer profile QR code scanning. A corresponding Unity package (SDK) is planned for a future release. We recommend that all developers actively building for Google Cardboard migrate (iOS, Android NDK) to the new Cardboard SDK.
October 15, 2019 update:
  • The Daydream View VR headset is no longer available for purchase. However, you can continue to use the existing Google VR SDK to update and distribute your apps to the Google Play Store, and make them available to users in the Daydream app.

Unreal Motion Controller Support

Daydream offers motion controller support for Unity and Unreal. These features include:

  • Controller visualization: A 3D model of the Daydream controller that displays which button the user is currently pressing and where the user is currently touching the Daydream controller's touchpad.

  • Laser and reticle visualization: Displays a laser and reticle so the user can easily interact with the VR environment.

  • Arm model: A mathematical model to make the 3D controller model in VR approximate the physical location of the Daydream controller.

  • Input System: A standard and extensible framework for raycasting from the controller model. The input system integrates with the laser and reticle visualization to make it easy to interact with the UI and other objects in VR.

All visualization elements are optional and reskinnable.

Controller support in Unreal

Currently, this functionality is available in Unreal only through Google VR.

Motion Controller with visualization support

  1. Enable the Google VR Motion Controller plugin (see the plugin's setup instructions).
  2. Open the Blueprint for the Player Pawn.
  3. Add the GoogleVRMotionController to the Components list at the same level as the VR Camera root.
  4. Adjust the GoogleVRMotionController Component's properties as needed.
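The steps above can also be done in C++. The sketch below shows a Pawn that creates the component as a sibling of the VR camera; the `UGoogleVRMotionControllerComponent` class and header names are assumptions based on the plugin's layout, so check the GoogleVRController plugin source for the exact spellings.

```cpp
// MyVRPawn.h -- sketch of a Pawn that attaches the Google VR motion
// controller at the same level as the VR camera root (step 3 above).
#include "GameFramework/Pawn.h"
#include "Camera/CameraComponent.h"
#include "GoogleVRMotionControllerComponent.h" // assumed header name
#include "MyVRPawn.generated.h"

UCLASS()
class AMyVRPawn : public APawn
{
    GENERATED_BODY()

public:
    AMyVRPawn()
    {
        // Shared root for the camera and the controller.
        VRRoot = CreateDefaultSubobject<USceneComponent>(TEXT("VRRoot"));
        SetRootComponent(VRRoot);

        Camera = CreateDefaultSubobject<UCameraComponent>(TEXT("Camera"));
        Camera->SetupAttachment(VRRoot);

        // Sibling of the camera, matching the Blueprint setup above.
        MotionController =
            CreateDefaultSubobject<UGoogleVRMotionControllerComponent>(
                TEXT("GoogleVRMotionController"));
        MotionController->SetupAttachment(VRRoot);
    }

private:
    UPROPERTY() USceneComponent* VRRoot;
    UPROPERTY() UCameraComponent* Camera;
    UPROPERTY() UGoogleVRMotionControllerComponent* MotionController;
};
```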

Cardboard apps should use UGoogleVRGazeReticleComponent instead, for a gaze-based reticle.

Motion Controller without visualization support

Use the official Unreal MotionControllerComponent.
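For reference, attaching the stock engine component looks like the following fragment (from a Pawn constructor; the surrounding Pawn class is assumed). `UMotionControllerComponent` is the standard Unreal class and provides tracking without any Google VR visualization.

```cpp
// Fragment of a Pawn constructor: tracking only, no controller model,
// laser, or reticle.
#include "MotionControllerComponent.h"

MotionController =
    CreateDefaultSubobject<UMotionControllerComponent>(TEXT("MotionController"));
MotionController->SetupAttachment(VRRoot); // same parent as the VR camera
```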

Input system

  1. Open the Blueprint for the Player Pawn.
  2. Add the GoogleVRPointerInput Component to the Blueprint.
  3. Use the GoogleVRPointerInput Component's API to listen and react to events triggered by the pointer.
  4. If desired, subclass the GoogleVRPointerInput Component in C++ to add additional events, to add custom processing of the raycast, or to override the raycast implementation.
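In C++, steps 1 and 2 amount to creating the component on the Pawn. A minimal sketch, assuming the plugin exposes a `UGoogleVRPointerInputComponent` class with a matching header name:

```cpp
// Fragment of a Pawn constructor: the pointer input component is an
// ActorComponent, so it needs no scene attachment. To customize raycast
// processing (step 4), subclass UGoogleVRPointerInputComponent and
// override its virtual functions -- see the plugin header for which
// functions are available.
#include "GoogleVRPointerInputComponent.h" // assumed header name

PointerInput = CreateDefaultSubobject<UGoogleVRPointerInputComponent>(
    TEXT("GoogleVRPointerInput"));
```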

The GoogleVRPointerInput Component works with both GoogleVRGazeReticle and GoogleVRMotionController. It is also integrated with UE4 Widgets, allowing you to interact with the standard UE4 UI with the pointer.

To respond to events generated by the GoogleVRPointerInput Component, use the interfaces IGoogleVRActorPointerResponder and IGoogleVRComponentPointerResponder in either C++ or Blueprint.
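As a sketch, a component that reacts when the pointer hits it would implement the responder interface like this. The class shape below is an assumption: the interface's pointer-event functions (enter, exit, hover, click, and so on) must be overridden with the exact signatures declared in the `IGoogleVRComponentPointerResponder` header.

```cpp
// Sketch: a mesh component that responds to pointer events.
#include "Components/StaticMeshComponent.h"
#include "GoogleVRComponentPointerResponder.h" // assumed header name
#include "PointerHighlightMesh.generated.h"

UCLASS()
class UPointerHighlightMesh : public UStaticMeshComponent,
                              public IGoogleVRComponentPointerResponder
{
    GENERATED_BODY()

    // Override the interface's pointer-event functions here to react
    // when the user's laser or reticle enters, hovers over, or clicks
    // this mesh (e.g. swap the material to a highlight color).
};
```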

Adjusting the Arm Model

Blueprint:

  1. Open your Player Pawn Blueprint.
  2. Create a node, and search for the term "ArmModel" to see what tuning parameters are available.
  3. Attach the node to the BeginPlay event.

C++

  1. Add #include "GoogleVRControllerFunctionLibrary.h" to your code.
  2. Include GoogleVRController as a dependency in your Build.cs file.
  3. Call tuning functions, for example:

     UGoogleVRControllerFunctionLibrary::SetArmModelPointerTiltAngle(20.0f);
    
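Put together, the tuning call typically lives in `BeginPlay`. The sketch below uses only the `SetArmModelPointerTiltAngle` function shown above; for other tuning functions, check `GoogleVRControllerFunctionLibrary.h`. (Step 2's Build.cs change means adding `"GoogleVRController"` to your module's dependency list.)

```cpp
// Fragment of a Pawn: tune the arm model once at startup.
#include "GoogleVRControllerFunctionLibrary.h"

void AMyVRPawn::BeginPlay()
{
    Super::BeginPlay();

    // Tilt the pointer ray downward by 20 degrees relative to the
    // controller, as in the example above.
    UGoogleVRControllerFunctionLibrary::SetArmModelPointerTiltAngle(20.0f);
}
```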

Disabling the Arm Model

You can disable or enable the arm model by calling the function SetArmModelEnabled, either in a Blueprint or in code, as described in the “Adjusting the Arm Model” section of this document. When disabled, the MotionControllerComponent behaves as it did in previous versions of Unreal: its orientation changes based only on the controller's orientation.
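In C++, this is a one-line call. The sketch assumes `SetArmModelEnabled` is a static function on `UGoogleVRControllerFunctionLibrary`, like the tuning functions described above.

```cpp
// Fragment: turn the arm model off so the controller component uses
// raw controller orientation only.
#include "GoogleVRControllerFunctionLibrary.h"

UGoogleVRControllerFunctionLibrary::SetArmModelEnabled(false);
```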