Using ARCore to light models in a scene

The Lighting Estimation API analyzes a given image for discrete visual cues and provides detailed information about the lighting in the scene. You can then use this information when rendering virtual objects to light them under the same conditions as the scene they're placed in, making these objects feel more realistic and enhancing the immersive experience for users.

Lighting cues and concepts

Humans unconsciously perceive subtle cues regarding how objects or living things are lit in their environment. When a virtual object is missing a shadow or has a shiny material that doesn't reflect the surrounding space, users can sense the object doesn't quite fit into a particular scene even if they can't explain why. This is why rendering AR objects to match the lighting in a scene is crucial for immersive and more realistic experiences.

Lighting Estimation does most of the work for you by providing detailed data that lets you mimic various lighting cues when rendering virtual objects. These cues are shadows, ambient light, shading, specular highlights, and reflections.

We can describe these visual cues like this:

  • Ambient light. Ambient light is the overall diffuse light that comes in from around the environment, lighting everything.

  • Shadows. Shadows are often directional and tell viewers where light sources are coming from.

  • Shading. Shading is the intensity of the light in different areas of a given image. For example, different parts of the same object can have different levels of shading in the same scene, depending on its angle relative to the viewer and its proximity to a light source.

  • Specular highlights. These are the shiny bits of surfaces that reflect a light source directly. Highlights on an object change relative to the position of a viewer in a scene.

  • Reflection. Light bounces off surfaces differently depending on whether the surface has specular (that is, highly reflective) or diffuse (not reflective) properties. For example, a metallic ball will be highly specular and reflect its environment, while another ball painted a dull matte grey will be diffuse. Most real-world objects have a combination of these properties; think of a scuffed-up bowling ball or a well-used credit card.

    Reflective surfaces also pick up colors from the ambient environment. The coloring of an object can be directly affected by the coloring of its environment. For example, a white ball in a blue room will take on a bluish hue.

Using Lighting Estimation modes to enhance realism

The ArLightEstimationMode API has modes that estimate lighting in the environment with different degrees of granularity and realism. You select one of these modes when you configure a session, as shown in the sketch after this list.

  • Environmental HDR mode (AR_LIGHT_ESTIMATION_MODE_ENVIRONMENTAL_HDR). This mode consists of separate APIs that allow for granular and realistic lighting estimation for directional lighting, shadows, specular highlights, and reflections.

  • Ambient Intensity mode (AR_LIGHT_ESTIMATION_MODE_AMBIENT_INTENSITY). This mode determines the average pixel intensity and the color of the lighting for a given image. It's a coarse setting designed for use cases in which precise lighting is not critical, such as objects that have baked-in lighting.

  • Disabled mode (AR_LIGHT_ESTIMATION_MODE_DISABLED). Disable lighting estimation when matching the lighting of a given environment is not relevant for a scene or an object.
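
For reference, here is a minimal sketch of selecting a mode with the ARCore C API. The helper function is illustrative; it assumes session is a valid, already-created ArSession and omits error handling.

    #include "arcore_c_api.h"

    // Illustrative helper: switch an existing session to Environmental HDR
    // lighting estimation. Assumes `session` is a valid ArSession.
    void enable_environmental_hdr(ArSession *session) {
      ArConfig *config = NULL;
      ArConfig_create(session, &config);

      // Start from the session's current configuration so other settings
      // are preserved, then change only the lighting estimation mode.
      ArSession_getConfig(session, config);
      ArConfig_setLightEstimationMode(
          session, config, AR_LIGHT_ESTIMATION_MODE_ENVIRONMENTAL_HDR);

      // ArSession_configure returns an ArStatus; a real app should check it.
      ArSession_configure(session, config);
      ArConfig_destroy(config);
    }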

Using Environmental HDR mode

Environmental HDR mode uses machine learning to analyze the input camera image and synthesize environmental lighting for rendering a virtual object.

This mode has three APIs that provide information about different kinds of lighting in the environment: a main directional light, ambient spherical harmonics, and an HDR cubemap. You can use these APIs in different combinations, but they're designed to be used together for the most realistic effect.

Main directional light

The main directional light API calculates the direction and intensity of a given scene's main light source, which you can obtain using ArLightEstimate_getEnvironmentalHdrMainLightDirection() and ArLightEstimate_getEnvironmentalHdrMainLightIntensity().

This information allows virtual objects in your scene to show reasonably positioned specular highlights, and to cast shadows in a direction consistent with other visible real objects.
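
As a rough sketch, a per-frame query with the ARCore C API might look like the following. The helper function is illustrative; it assumes Environmental HDR mode is enabled and that session and frame are valid for the current frame.

    #include "arcore_c_api.h"

    // Illustrative helper: read the main directional light for one frame.
    void query_main_light(const ArSession *session, const ArFrame *frame) {
      ArLightEstimate *light_estimate = NULL;
      ArLightEstimate_create(session, &light_estimate);
      ArFrame_getLightEstimate(session, frame, light_estimate);

      // Only use the estimate if ARCore reports it as valid for this frame.
      ArLightEstimateState state = AR_LIGHT_ESTIMATE_STATE_NOT_VALID;
      ArLightEstimate_getState(session, light_estimate, &state);
      if (state == AR_LIGHT_ESTIMATE_STATE_VALID) {
        float direction[3];  // Direction of the main light source.
        float intensity[3];  // Its linear RGB intensity.
        ArLightEstimate_getEnvironmentalHdrMainLightDirection(
            session, light_estimate, direction);
        ArLightEstimate_getEnvironmentalHdrMainLightIntensity(
            session, light_estimate, intensity);
        // Feed these values to the renderer's directional light and
        // shadow pass.
      }
      ArLightEstimate_destroy(light_estimate);
    }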

To see how this works, consider these two images of the same virtual rocket. In the image on the left, there's a shadow under the rocket, but its direction doesn't match the other shadows in the scene. In the image on the right, the rocket's shadow points in the correct direction. It's a subtle but important difference that grounds the rocket in the scene, because the shadow's direction and intensity better match the other shadows there.

When the main light source or a lit object is in motion, the specular highlight on the object adjusts its position in real time relative to the light source.

Directional shadows also adjust their length and direction relative to the position of the main light source, just as they do in the real world. To illustrate this effect, consider these two mannequins, one virtual and the other real.

(The mannequin on the left is the virtual one.)

Ambient spherical harmonics

Using ArLightEstimate_getEnvironmentalHdrAmbientSphericalHarmonics(), you can get a realistic representation of the overall ambient light coming in from all directions in a scene. You can then use this information during rendering to add subtle cues that bring out the definition of virtual objects.
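
As a minimal sketch, the coefficients can be fetched like this; the helper function is illustrative and assumes a valid light_estimate obtained as in the main directional light example above.

    #include "arcore_c_api.h"

    // Illustrative helper: fetch the ambient spherical harmonics. ARCore
    // writes 27 floats: 9 coefficients for each of the R, G, and B
    // channels, which a renderer can plug into its ambient (irradiance)
    // shading.
    void query_ambient_sh(const ArSession *session,
                          const ArLightEstimate *light_estimate,
                          float out_coefficients[27]) {
      ArLightEstimate_getEnvironmentalHdrAmbientSphericalHarmonics(
          session, light_estimate, out_coefficients);
    }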

To illustrate this effect, consider these two images of the same rocket model. The rocket on the left is rendered using lighting estimation information detected by the main directional light API. The rocket on the right is rendered using information detected by both the main directional light and ambient spherical harmonics APIs. The second rocket clearly has more visual definition and blends more seamlessly into the scene.

To achieve ideal lighting conditions, use this API with the main directional light and the HDR cubemap.

HDR cubemap

The HDR cubemap API provides lighting data that helps you render realistic reflections on virtual objects in a scene.

When you use this API, ArLightEstimate_acquireEnvironmentalHdrCubemap() gets a cubemap that captures the environment lighting surrounding the virtual object. During rendering, the cubemap provides the reflections for materials with medium to high glossiness.
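
Here is a minimal sketch of acquiring and releasing the six cubemap faces; the helper function is illustrative and again assumes a valid light_estimate.

    #include "arcore_c_api.h"

    // Illustrative helper: acquire the six HDR cubemap faces. Each face
    // is an ArImage that must be released once its pixels have been
    // uploaded to the renderer's cubemap texture.
    void query_hdr_cubemap(const ArSession *session,
                           const ArLightEstimate *light_estimate) {
      ArImageCubemap cubemap = {NULL};  // Six ArImage* faces.
      ArLightEstimate_acquireEnvironmentalHdrCubemap(
          session, light_estimate, cubemap);
      for (int face = 0; face < 6; ++face) {
        if (cubemap[face] != NULL) {
          // Upload cubemap[face] to the GPU here, then release it.
          ArImage_release(cubemap[face]);
        }
      }
    }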

In this image, all three of the Environmental HDR APIs have been enabled prior to rendering the object. The HDR cubemap enables the reflective cues and further highlighting that ground the object fully in the scene.

Using the cubemap adds a minimal increase in processing. Whether the material of a given surface is specular or diffuse determines how it reflects its surroundings, and therefore whether the HDR cubemap is worth using. Because our virtual rocket's material is metallic, it has a strong specular component that directly reflects the environment around it, so it benefits from the cubemap. On the other hand, a virtual object with a dull grey matte material doesn't have a specular component at all; its color depends primarily on the diffuse component, so it wouldn't benefit from a cubemap.

The cubemap also affects the shading and appearance of the object. For example, the material of a specular object surrounded by a blue environment will reflect blue hues.

Enabling the HDR cubemap automatically enables all of the APIs in AR_LIGHT_ESTIMATION_MODE_ENVIRONMENTAL_HDR mode.

Other examples of AR_LIGHT_ESTIMATION_MODE_ENVIRONMENTAL_HDR in action

Here is the same rocket model in differently lit environments. All of these scenes were rendered using information from all three APIs, with directional shadows applied.

Using AR_LIGHT_ESTIMATION_MODE_AMBIENT_INTENSITY mode

AR_LIGHT_ESTIMATION_MODE_AMBIENT_INTENSITY mode determines the average pixel intensity and the color correction scalars for a given image. It's a coarse setting designed for use cases in which precise lighting is not critical, such as objects that have baked-in lighting.

  • Pixel intensity. When you use this API, ArLightEstimate_getPixelIntensity() captures the average pixel intensity of the lighting in a scene. You can apply this lighting to a whole virtual object.

  • Color. When you use this API, ArLightEstimate_getColorCorrection() detects the white balance for each individual frame. You can then color correct a virtual object so that it integrates more smoothly into the overall coloring of a scene. You can read both values as shown in the sketch after this list.
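
A minimal sketch of reading both values with an illustrative helper, assuming Ambient Intensity mode is enabled and light_estimate is a valid estimate for the current frame:

    #include "arcore_c_api.h"

    // Illustrative helper: read the Ambient Intensity values for one frame.
    void query_ambient_intensity(const ArSession *session,
                                 const ArLightEstimate *light_estimate) {
      // Average pixel intensity of the scene; usable as a global scale
      // for the lighting applied to a whole virtual object.
      float pixel_intensity = 0.0f;
      ArLightEstimate_getPixelIntensity(
          session, light_estimate, &pixel_intensity);

      // Four scalars: RGB color correction values plus an intensity
      // component, for white-balancing virtual objects against the
      // camera image.
      float color_correction[4] = {0.0f};
      ArLightEstimate_getColorCorrection(
          session, light_estimate, color_correction);
    }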

Next steps