Explore the Object Manipulation sample app code

For examples of recommended UX practices for manipulating objects, dive into the ARCore SDK for Unity Object Manipulation sample app code.

The operations covered in the sample include:

  • Selecting objects
  • Moving objects around the scene
  • Rotating objects
  • Changing the size of an object
  • Elevating an object

As you explore this sample, install and run the ARCore Elements demo app on your Android phone. This app showcases UX principles and design patterns for AR developers and designers. It presents these topics visually as four islands:

  • User environment
  • User interface
  • User movement
  • Object manipulation

The ARCore Elements sample app is also available in the Play Store.

Prerequisites

This guide assumes that you have downloaded and installed the ARCore SDK for Unity. If you are new to the SDK, see the ARCore SDK for Unity quickstart guide first. We also strongly recommend reviewing the HelloAR sample app.

Tour the sample app

Take a look at the SDK components in the Object Manipulation sample scene.

  1. In the Unity Project window, navigate to Assets > GoogleARCore > Examples > ObjectManipulation > Scenes > ObjectManipulation.
  2. Double-click the ObjectManipulation scene to open it.
  3. Use the Hierarchy window to start navigating the sample scene. You'll find many of the same game objects that are present in HelloAR, but also the following ARCore game objects:
  • Manipulation System: Prefab that performs gesture detection and notifies all manipulators in the scene about detected gestures. It is a singleton: an application that uses Object Manipulation must include exactly one instance in the scene.
  • Andy Generator: Game object with a manipulator script that places Andy objects in the AR scene. The manipulator script has references to the following game objects:
    • First Person Camera
    • Prefabs for dynamically creating game objects:
      • Manipulator: Prefab that implements the desired behavior for responding to user gestures.
      • Andy: Prefab that users can place in the AR scene.
  • Example Controller: Game object with a controller script that manages the AR scene. The controller script manages the application lifecycle much like the HelloAR Example Controller; the difference is that the Andy Generator now detects user taps and handles object placement.

Explore the code

Now that you've got an idea of the main ARCore game objects in the sample scene, step through the code that makes them work together.

The AndyPlacementManipulator script implements the placement of objects in the AR scene. AndyPlacementManipulator extends the Manipulator class, implementing the desired response to a user gesture: in this case, tapping a plane places an Andy at that position.

Access the code

  1. In the ObjectManipulation scene, click the Andy Generator game object.
  2. In the Inspector window, double-click the AndyPlacementManipulator script to open it in the editor.

Step through the code

Take a look at the code. In the AndyPlacementManipulator script, the class inherits from the Manipulator base class, which receives callbacks from the ManipulationSystem when user gestures are detected. By overriding methods of the base class, a Manipulator can decide whether a given gesture can start, what to do while it progresses, and what happens when it ends.
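This callback pattern can be sketched as a minimal outline. The sketch below is not part of the sample; it shows only the TapGesture overloads that AndyPlacementManipulator itself exercises, and the using directive assumes the sample scripts' example namespace is in scope.

```csharp
using UnityEngine;
// Namespace of the Object Manipulation example scripts (assumption based
// on the SDK's example layout).
using GoogleARCore.Examples.ObjectManipulation;

// Outline of a custom manipulator. A subclass claims a gesture type by
// overriding the matching CanStartManipulationForGesture overload, then
// reacts in the lifecycle callbacks.
public class MyTapManipulator : Manipulator
{
    // Return true to claim the gesture for this manipulator.
    protected override bool CanStartManipulationForGesture(TapGesture gesture)
    {
        return gesture.TargetObject == null;
    }

    // React when the gesture ends. The base class also exposes
    // OnStartManipulation and OnContinueManipulation callbacks; the
    // exact overloads vary by gesture type.
    protected override void OnEndManipulation(TapGesture gesture)
    {
        if (gesture.WasCancelled)
        {
            return;
        }

        // Respond to the completed tap here.
    }
}
```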

In this case, we want to place an Andy when the user taps on a plane:

Decide whether a manipulation can be started for a given gesture

To start the manipulation only when the user taps an area that does not contain other objects, override the CanStartManipulationForGesture(TapGesture gesture) method:

protected override bool CanStartManipulationForGesture(TapGesture gesture)
{
    if (gesture.TargetObject == null)
    {
        return true;
    }

    return false;
}

To avoid placing an object when the user taps an existing object (which means the user wants to select it), the manipulation is allowed to start only when the gesture's target object is null.

Implement behavior during the gesture lifecycle

Besides the callback to decide if the manipulation can be started for a given gesture type, the Manipulator base class provides callbacks to implement different behaviors during each step of the gesture's lifecycle: OnStartManipulation(), OnContinueManipulation(), and OnEndManipulation(). In this case, an object is instantiated from the prefab when the TapGesture ends:

protected override void OnEndManipulation(TapGesture gesture)
{
    if (gesture.WasCancelled)
    {
        return;
    }

    // If gesture is targeting an existing object, we are done.
    if (gesture.TargetObject != null)
    {
        return;
    }

    // Raycast against the location the player touched to search for planes.
    TrackableHit hit;
    TrackableHitFlags raycastFilter = TrackableHitFlags.PlaneWithinPolygon;

    if (Frame.Raycast(gesture.StartPosition.x, gesture.StartPosition.y, raycastFilter, out hit))
    {
        // Use the hit pose and camera pose to check whether the hit is on
        // the back of the plane. If so, there is no need to create an anchor.
        if ((hit.Trackable is DetectedPlane) &&
            Vector3.Dot(FirstPersonCamera.transform.position - hit.Pose.position,
                hit.Pose.rotation * Vector3.up) < 0)
        {
            Debug.Log("Hit at back of the current DetectedPlane");
        }
        else
        {
            // Instantiate Andy model at the hit pose.
            var andyObject = Instantiate(AndyPrefab, hit.Pose.position, hit.Pose.rotation);

            // Instantiate manipulator.
            var manipulator = Instantiate(ManipulatorPrefab, hit.Pose.position, hit.Pose.rotation);

            // Make Andy model a child of the manipulator.
            andyObject.transform.parent = manipulator.transform;

            // Create an anchor so that ARCore keeps tracking the hit point as
            // its understanding of the physical world evolves.
            var anchor = hit.Trackable.CreateAnchor(hit.Pose);

            // Make manipulator a child of the anchor.
            manipulator.transform.parent = anchor.transform;

            // Select the placed object.
            manipulator.GetComponent<Manipulator>().Select();
        }
    }
}

The AndyPrefab is instantiated as a child of the ManipulatorPrefab. This prefab contains several manipulator scripts that extend the Manipulator base class, overriding the callbacks described above for different gesture types to implement the behaviors Andy is expected to support: SelectionManipulator, TranslationManipulator, ScaleManipulator, RotationManipulator, and ElevationManipulator. You can customize some parameters of these scripts in the Unity Inspector window: in the Unity Project window, navigate to Assets > GoogleARCore > Examples > ObjectManipulation > Prefabs and open the Manipulator prefab.
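To illustrate how the prefab could be extended with an additional behavior, here is a hypothetical manipulator, not part of the SDK, that removes a placed object when the user taps it while it is selected. It relies only on callbacks shown in this guide plus standard Unity APIs; the using directive assumes the sample scripts' example namespace.

```csharp
using UnityEngine;
// Namespace of the Object Manipulation example scripts (assumption based
// on the SDK's example layout).
using GoogleARCore.Examples.ObjectManipulation;

// Hypothetical extra manipulator: removes a placed object, together with
// its anchor, when the user taps it while it is selected.
public class RemoveOnTapManipulator : Manipulator
{
    private bool m_IsSelected;

    protected override void OnSelected()
    {
        m_IsSelected = true;
    }

    protected override void OnDeselected()
    {
        m_IsSelected = false;
    }

    protected override bool CanStartManipulationForGesture(TapGesture gesture)
    {
        // Only claim taps that hit this manipulator's own hierarchy
        // while the object is selected.
        return m_IsSelected
            && gesture.TargetObject != null
            && gesture.TargetObject.transform.IsChildOf(transform);
    }

    protected override void OnEndManipulation(TapGesture gesture)
    {
        if (gesture.WasCancelled)
        {
            return;
        }

        // In this sample the manipulator is parented to an anchor, so
        // destroying the parent removes the anchor as well.
        Destroy(transform.parent != null ? transform.parent.gameObject : gameObject);
    }
}
```

Like the built-in manipulator scripts, any serialized fields added to such a script would appear as customizable parameters on the Manipulator prefab in the Inspector.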

The newly instantiated object is set as selected so that it responds to further gestures, such as pinching to scale it. At most one instance of the Manipulator prefab can be selected at a time, and the selected instance is the one affected by the user's gestures to scale, rotate, or elevate the object.

Visualize the different manipulation states

  1. In the Unity Project window, navigate to Assets > GoogleARCore > Examples > ObjectManipulation > Prefabs.
  2. Click the Manipulator prefab to open it in the Unity Inspector window.

     This prefab includes a Manipulator game object, which contains all the manipulator script components for the different behaviors, plus two child game objects: Selection Visualization and Elevation Visualization. These are referenced by the SelectionManipulator and ElevationManipulator script components, respectively, in the Manipulator game object. Selection Visualization is enabled while the object is selected, and Elevation Visualization is enabled while the elevation manipulation is active.

Besides the gesture lifecycle callbacks, Manipulator scripts receive callbacks to indicate when the object becomes selected or deselected. The SelectionManipulator script shows an example of this, where these callbacks are used to enable and disable the Selection Visualization:

protected override void OnSelected()
{
    SelectionVisualization.SetActive(true);
}

protected override void OnDeselected()
{
    SelectionVisualization.SetActive(false);
}
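The same callbacks can drive any kind of selection feedback. As a hypothetical sketch (not part of the sample), a manipulator could tint the object's renderer instead of toggling a separate visualization object; TargetRenderer here is an assumed field that would be assigned in the Inspector.

```csharp
using UnityEngine;
// Namespace of the Object Manipulation example scripts (assumption based
// on the SDK's example layout).
using GoogleARCore.Examples.ObjectManipulation;

// Hypothetical sketch: use the selection callbacks to tint a renderer
// instead of enabling and disabling a visualization game object.
public class TintOnSelectManipulator : Manipulator
{
    // Renderer of the placed object; assigned in the Inspector.
    public Renderer TargetRenderer;

    private Color m_OriginalColor;

    protected override void OnSelected()
    {
        m_OriginalColor = TargetRenderer.material.color;
        TargetRenderer.material.color = Color.yellow;
    }

    protected override void OnDeselected()
    {
        TargetRenderer.material.color = m_OriginalColor;
    }
}
```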
