Java Support Library Tutorial

This page describes how to use features of the Tango Support Library in Java. The Tango Support Library Java reference pages describe the available functions in more detail.

Installation instructions


If you followed our instructions in Getting Started with the Tango Java API, then you already have the Java Support Library on your machine.

If not, follow the instructions in that guide to add the Tango Support Library for Java to your project. Make sure to include support-base as a dependency.

If you use your own build system, download and include the Java Support Library package in your project.

Support library tutorial

The Support Library contains functions to help you manage and use data from the Tango API. Current features of the library include plane fitting to depth data.

Initializing the Support Library

Before you can use the Support Library, you must initialize it by calling TangoSupport.initialize().

Depth perception plane fitting

A common use case in augmented reality (AR) applications is placing a virtual object on real world surfaces like walls, floors, tabletops, etc. This uses a combination of Depth Perception and Motion Tracking. The TangoSupport.fitPlaneModelNearClick helper function fits a plane model to the depth data near a specified screen location, and returns the plane equation and the intersection point with the camera ray.

Initialize the Support Library:

TangoSupport.initialize();

This example fits a plane to the depth data located at the center of the screen:

IntersectionPointPlaneModelPair pair =
  TangoSupport.fitPlaneModelNearClick(pointCloud, pointCloudTranslation,
                                      pointCloudOrientation, 0.5f, 0.5f, displayRotation,
                                      colorCameraTranslation, colorCameraOrientation);

The colorCameraTranslation and colorCameraOrientation inputs are the transform from the desired output frame to the color camera frame at the time the uv coordinates are chosen. The pointCloudTranslation and pointCloudOrientation inputs are the transform from the desired output frame to the point cloud frame at the time the point cloud was acquired. The returned IntersectionPointPlaneModelPair uses this desired output frame. The PlaneFittingExample on GitHub demonstrates how to generate the inputs and transform the outputs into world coordinates.
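To illustrate what these translation and orientation inputs do, here is a minimal, self-contained sketch of applying a translation (a double[3]) and a unit orientation quaternion (a double[4] in {x, y, z, w} order, a common convention but an assumption here, not something taken from the Tango reference) to express a point measured in one frame in another frame. This is not Tango API code; it only shows the underlying math.

```java
public class FrameTransform {
    // Rotate point p by unit quaternion q = {x, y, z, w}, then translate by t.
    // Uses the expansion p' = p + w*u + cross(q.xyz, u), where u = 2*cross(q.xyz, p).
    static double[] transformPoint(double[] t, double[] q, double[] p) {
        double qx = q[0], qy = q[1], qz = q[2], qw = q[3];
        double ux = 2.0 * (qy * p[2] - qz * p[1]);
        double uy = 2.0 * (qz * p[0] - qx * p[2]);
        double uz = 2.0 * (qx * p[1] - qy * p[0]);
        return new double[] {
            p[0] + qw * ux + (qy * uz - qz * uy) + t[0],
            p[1] + qw * uy + (qz * ux - qx * uz) + t[1],
            p[2] + qw * uz + (qx * uy - qy * ux) + t[2]
        };
    }

    public static void main(String[] args) {
        // 90-degree rotation about Z maps (1, 0, 0) to (0, 1, 0); then shift by (0, 0, 1).
        double s = Math.sqrt(0.5);
        double[] q = {0.0, 0.0, s, s};
        double[] t = {0.0, 0.0, 1.0};
        double[] out = transformPoint(t, q, new double[] {1.0, 0.0, 0.0});
        System.out.printf("%.3f %.3f %.3f%n", out[0], out[1], out[2]); // 0.000 1.000 1.000
    }
}
```

In a real application the translation and quaternion would come from the Tango pose data, as shown in the PlaneFittingExample.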

See the sample and the reference documents for further details on providing the depth data, intrinsics, and pose parameters.
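For intuition about the returned plane model, the sketch below assumes the plane is represented as the coefficient vector {a, b, c, d} of the equation a*x + b*y + c*z + d = 0 (a common convention; check the reference documentation for the exact field layout of IntersectionPointPlaneModelPair). It computes the signed distance from a point to such a plane, which is a handy sanity check that the returned intersection point actually lies on the fitted plane.

```java
public class PlaneCheck {
    // Signed distance from a point to the plane {a, b, c, d}; zero means
    // the point satisfies a*x + b*y + c*z + d = 0, i.e. lies on the plane.
    static double planeDistance(double[] plane, double[] point) {
        double norm = Math.sqrt(plane[0] * plane[0] + plane[1] * plane[1] + plane[2] * plane[2]);
        return (plane[0] * point[0] + plane[1] * point[1] + plane[2] * point[2] + plane[3]) / norm;
    }

    public static void main(String[] args) {
        double[] floor = {0.0, 1.0, 0.0, -1.5};   // horizontal plane y = 1.5
        double[] point = {0.3, 1.5, 2.0};         // a point on that plane
        System.out.println(Math.abs(planeDistance(floor, point)) < 1e-9); // prints "true"
    }
}
```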

Depth data with interpolation

There are sometimes gaps in the depth data coming from the sensor. The Support Library for Java provides an interpolation algorithm to try to fill these gaps when querying for depth at a particular point in the image.
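Conceptually, a nearest-neighbor lookup finds the point cloud sample closest to the queried image location and reports its depth, giving up when nothing is close enough. The sketch below is not the library's implementation; it assumes samples have already been projected into normalized image coordinates as {u, v, depth} triples, and returns null when no sample is within maxDistance of the query.

```java
public class NearestDepth {
    // Each sample is {u, v, depth} in normalized image coordinates.
    // Returns the depth of the closest sample within maxDistance, or null.
    static Double depthNearest(double[][] samples, double u, double v, double maxDistance) {
        Double best = null;
        double bestDist = maxDistance;
        for (double[] s : samples) {
            double du = s[0] - u, dv = s[1] - v;
            double dist = Math.sqrt(du * du + dv * dv);
            if (dist <= bestDist) {
                bestDist = dist;
                best = s[2];
            }
        }
        return best;
    }

    public static void main(String[] args) {
        double[][] samples = {{0.48, 0.52, 1.2}, {0.9, 0.1, 3.4}};
        System.out.println(depthNearest(samples, 0.5, 0.5, 0.1)); // prints "1.2"
        System.out.println(depthNearest(samples, 0.0, 0.0, 0.1)); // prints "null"
    }
}
```

The null return mirrors the "no point cloud point is sufficiently close" case handled in the example that follows.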

Nearest neighbor interpolation - single point

This example attempts to find a depth reading near the center of the camera image:

float u = 0.5f;
float v = 0.5f;
float[] colorCameraPoint = null;

try {
    colorCameraPoint =
        TangoSupport.getDepthAtPointNearestNeighbor(pointCloud, pointCloudTranslation,
                                                    pointCloudOrientation, u, v,
                                                    displayRotation, colorCameraTranslation,
                                                    colorCameraOrientation);
} catch (TangoInvalidException exception) {
    Log.e(TAG, "Error finding depth at the point.", exception);
}

if (colorCameraPoint == null) {
    Log.w(TAG, "No point cloud point is sufficiently close.");
}