Depth API developer guide for AR Foundation for Android

Learn how to use the Depth API in your own apps.

Depth API-supported devices

Apps that require depth should be discoverable in the Google Play Store only on devices that support the Depth API. Restrict discovery to depth-supported devices when:

  • A core part of the experience relies on depth
  • There is no graceful fallback for the parts of the app that use depth

To restrict distribution of your app in the Google Play Store to devices that support the Depth API, add the following line to your AndroidManifest.xml:

    <uses-feature android:name="com.google.ar.core.depth" />
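For context, this declaration sits alongside the other ARCore entries in the manifest. A minimal sketch of where it belongs, assuming an AR-required app (the `com.google.ar.core` meta-data entry is the standard ARCore requirement declaration; other elements are omitted):

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <!-- Limit Play Store visibility to devices that support the Depth API. -->
  <uses-feature android:name="com.google.ar.core.depth" />

  <application>
    <!-- Standard declaration for an AR-required ARCore app. -->
    <meta-data android:name="com.google.ar.core" android:value="required" />
  </application>
</manifest>
```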

Check if Depth API is supported

In a running AR session, check whether the user's device supports the Depth API.

    // The AROcclusionManager is typically a component on the AR Camera GameObject.
    var occlusionManager = GetComponent<AROcclusionManager>();

    // Check whether the user's device supports the Depth API.
    // descriptor may be null, so compare the nullable result against true.
    if (occlusionManager.descriptor?.supportsEnvironmentDepthImage == true)
    {
        // If the Depth API is available on the user's device, perform
        // the steps you want here.
    }
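Checking for support is not enough on its own: environment depth must also be requested before depth images are produced. A short sketch using `AROcclusionManager.requestedEnvironmentDepthMode` (the mode names come from the `EnvironmentDepthMode` enum in `UnityEngine.XR.ARSubsystems`):

```csharp
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

if (occlusionManager.descriptor?.supportsEnvironmentDepthImage == true)
{
    // Request environment depth; Fastest, Medium, and Best trade
    // depth quality against performance.
    occlusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Medium;
}
else
{
    // Fall back gracefully: disable depth-dependent features.
    occlusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Disabled;
}
```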

Retrieve depth maps

Get the latest environment depth image from the AROcclusionManager.

    // The AROcclusionManager is typically a component on the AR Camera GameObject.
    var occlusionManager = GetComponent<AROcclusionManager>();

    if (occlusionManager.TryAcquireEnvironmentDepthCpuImage(out XRCpuImage image))
    {
        // XRCpuImage holds a native resource; dispose it when you are done.
        using (image)
        {
            // Use the depth image here.
        }
    }
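To read individual depth values, you can access the image's raw plane data. A sketch assuming a single-plane `DepthFloat32` image with values in meters (depth can also be delivered as `DepthUint16` in millimeters, so check `image.format` before interpreting the bytes):

```csharp
if (occlusionManager.TryAcquireEnvironmentDepthCpuImage(out XRCpuImage image))
{
    using (image)
    {
        if (image.format == XRCpuImage.Format.DepthFloat32)
        {
            // Depth images have a single plane of float values in meters.
            var plane = image.GetPlane(0);
            var depths = plane.data.Reinterpret<float>(sizeof(byte));

            // Sample the depth at the center pixel of the image.
            int x = image.width / 2;
            int y = image.height / 2;
            float depthInMeters = depths[y * (plane.rowStride / sizeof(float)) + x];
        }
    }
}
```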

You can convert the raw CPU image into a texture and display it in a RawImage for greater flexibility. An example of how to do this can be found in Unity's ARFoundation samples.
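One common approach is to convert the CPU image into a `Texture2D` that a UI RawImage can display. A sketch assuming a `DepthFloat32` image and AR Foundation's `XRCpuImage.Convert` overload that writes into a caller-supplied buffer (`rawImage` is a hypothetical reference to a RawImage in your scene):

```csharp
// Convert the single-channel float depth image into an RFloat texture.
var conversionParams = new XRCpuImage.ConversionParams(image, TextureFormat.RFloat);
var texture = new Texture2D(image.width, image.height, TextureFormat.RFloat, false);

// Write the converted pixels directly into the texture's buffer.
image.Convert(conversionParams, texture.GetRawTextureData<byte>());
texture.Apply();

// Display the depth texture on a UI RawImage.
rawImage.texture = texture;
```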

Occlude virtual objects and visualize depth data

Check out Unity's blog post for a high-level overview of depth data and how it can be used to occlude virtual objects. Additionally, Unity's ARFoundation samples demonstrate occluding virtual objects and visualizing depth data.

Alternative uses of the Depth API

Other uses for the Depth API include:

  • Collisions: virtual objects bouncing off of real world objects
  • Distance measurement
  • Re-lighting a scene
  • Re-texturing existing objects: turning a floor into lava
  • Depth-of-field effects: blurring out the background or foreground
  • Environmental effects: fog, rain, and snow

For more detail and best practices on applying occlusion in shaders, check out Unity's ARFoundation samples and the AROcclusionManager.environmentDepthTexture property.

Understanding depth values

Given a point A on the observed real-world geometry, a 2D point a representing the same point in the depth image, and the camera origin C, the value given by the Depth API at a is equal to the length of CA projected onto the principal axis. This can also be referred to as the z-coordinate of A relative to C. When working with the Depth API, it is important to understand that the depth values are not the length of the ray CA itself, but its projection.
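If you need the Euclidean distance along the ray CA (for example, for distance measurement), you can recover it from the depth value and the camera intrinsics. A sketch with hypothetical parameter names (fx, fy are the focal lengths in pixels; cx, cy the principal point):

```csharp
using UnityEngine;

// Convert a z-depth value at pixel (u, v) into the Euclidean distance
// from the camera origin C to the real-world point A.
float DepthToRayLength(float depth, float u, float v,
                       float fx, float fy, float cx, float cy)
{
    // Direction of the ray through pixel (u, v) in camera space, with z = 1.
    float dx = (u - cx) / fx;
    float dy = (v - cy) / fy;

    // The ray length is the z-depth scaled by the norm of that direction.
    return depth * Mathf.Sqrt(dx * dx + dy * dy + 1f);
}
```

At the principal point (u = cx, v = cy) the ray coincides with the principal axis and the two values agree; toward the edges of the image the ray length grows larger than the reported depth.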