Environment

Physical environments

Design one experience for many different spaces

Give users a clear understanding of how much space they’ll need for your app. Can you use it on your lap, at a kitchen table, or in a football stadium? Show them the ideal conditions for using it: you can include preview graphics in the Play Store, or instructions in the app itself.

It’s also helpful to consider all the places your app might be used, from a small apartment to a vast field. Prepare for large and small spaces, for real-world obstacles like furniture or traffic, and for physical challenges.

Public spaces pose their own set of challenges for AR. Tracking and occlusion become harder as the number of objects and people nearby increases, and the phone movement and immersion that AR demands can be distracting or even dangerous.

Virtual environments

An augmented environment combines a real-world image captured from a device’s camera with virtual content, such as digital objects or information.

As your phone moves, ARCore tracks its position relative to the world around it. This process is called concurrent odometry and mapping, or COM.

ARCore looks at a camera image and detects visually distinct features, called feature points, then uses these points to compute its change in position. This visual information is combined with inertial measurements from the device to estimate the pose (position and orientation) of the camera relative to the world over time.
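As a rough illustration of this visual-inertial fusion (not ARCore's actual algorithm, which is far more sophisticated), a complementary filter blends a position estimate derived from feature points with one predicted from inertial measurements. The `ALPHA` weight below is invented for the example:

```python
# Illustrative sketch only: blending a visual (feature-point) position
# estimate with an inertially predicted one. ARCore's real COM pipeline
# is internal and considerably more complex.

ALPHA = 0.8  # invented weight given to the visual estimate


def fuse(visual_position, inertial_position, alpha=ALPHA):
    """Blend two 1D position estimates (metres) into one pose estimate."""
    return alpha * visual_position + (1 - alpha) * inertial_position


# A visual estimate of 1.0 m and an inertial estimate of 1.2 m
# combine into a value between the two, weighted toward the camera.
fused = fuse(1.0, 1.2)
```

The weighting reflects the usual trade-off: visual tracking is accurate when features are plentiful, while inertial data bridges the gaps between camera frames.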

By aligning the pose of the virtual camera that renders 3D content with the pose of the device's camera, ARCore renders virtual content from the correct perspective. That virtual image is overlaid on top of a live camera image, making the virtual content appear as part of the real world.

Continuous discovery

ARCore is constantly improving its understanding of the real-world environment.

It builds a model of your space, adding detail as the phone moves around and the camera observes new parts of the space. ARCore recognizes and clusters feature points that appear to lie on common horizontal and angled surfaces, and makes these surfaces available to your app as planes.
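A toy version of the clustering idea, assuming feature points are already in world coordinates with y as height; the `HEIGHT_TOLERANCE` threshold is invented, and real plane fitting is far more involved than grouping by height:

```python
# Illustrative sketch only: grouping 3D feature points whose heights
# agree into candidate horizontal planes. Not ARCore's algorithm.

HEIGHT_TOLERANCE = 0.05  # metres; invented threshold for the example


def detect_horizontal_planes(points, tol=HEIGHT_TOLERANCE):
    """Cluster (x, y, z) feature points by similar y (height) values."""
    planes = []  # each plane is a list of points sharing a height band
    for p in sorted(points, key=lambda p: p[1]):
        if planes and abs(p[1] - planes[-1][-1][1]) <= tol:
            planes[-1].append(p)  # close enough: extend current plane
        else:
            planes.append([p])    # height jump: start a new plane
    return planes


# Points near y = 0 (a floor) and y = 0.75 (a tabletop)
# cluster into two separate candidate planes.
points = [(0.0, 0.00, 0.0), (1.0, 0.02, 1.0),
          (0.0, 0.75, 0.0), (1.0, 0.76, 1.0)]
planes = detect_horizontal_planes(points)
```

As more of the space is observed, newly detected points extend existing clusters or seed new ones, mirroring how ARCore's plane estimates grow over time.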

Environmental limitations

For now, conditions that can hinder accurate understanding of surfaces include:

  • Flat surfaces without texture, such as a white desk
  • Environments with dim lighting
  • Extremely bright environments
  • Transparent or reflective surfaces like glass
  • Dynamic or moving surfaces, such as blades of grass or ripples in water

When users encounter these environmental limitations, tell them what went wrong and point them in the right direction.
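One way an app might decide which guidance to show, sketched with invented thresholds (ARCore's real tracking-quality heuristics are internal and differ):

```python
# Illustrative sketch only: flagging camera frames that are too dark,
# too bright, or too texture-poor for reliable feature tracking.
# The numeric thresholds are invented for the example.


def frame_quality(pixels):
    """Classify a frame given a flat list of grayscale values in [0, 255]."""
    mean = sum(pixels) / len(pixels)
    variance = sum((p - mean) ** 2 for p in pixels) / len(pixels)
    if mean < 40:
        return "too dark"      # dim lighting: suggest adding light
    if mean > 220:
        return "too bright"    # glare: suggest moving away from the source
    if variance < 100:
        return "low texture"   # e.g. a white desk: suggest a detailed surface
    return "ok"


# A uniformly lit, featureless surface is well exposed but untrackable.
status = frame_quality([150] * 100)
```

Mapping each failure mode to a specific, actionable message ("Try adding more light") is more helpful than a generic tracking error.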