The Lighting Estimation API analyzes a given image for discrete visual cues and provides detailed information about the lighting in a given scene. You can then use this information when rendering virtual objects to light them under the same conditions as the scene they're placed in, making these objects feel more realistic and enhancing the immersive experience for users.
Lighting cues and concepts
Humans unconsciously perceive subtle cues regarding how objects or living things are lit in their environment. When a virtual object is missing a shadow or has a shiny material that doesn't reflect the surrounding space, users can sense the object doesn't quite fit into a particular scene even if they can't explain why. This is why rendering AR objects to match the lighting in a scene is crucial for immersive and more realistic experiences.
Lighting Estimation does most of the work for you by providing detailed data that lets you mimic various lighting cues when rendering virtual objects. These cues are shadows, ambient light, shading, specular highlights, and reflections.
We can describe these visual cues like this:
- Ambient light. Ambient light is the overall diffuse light that comes in from around the environment, lighting everything.
- Shadows. Shadows are often directional and tell viewers where light sources are coming from.
- Shading. Shading is the intensity of the light in different areas of a given image. For example, different parts of the same object can have different levels of shading in the same scene, depending on their angle relative to the viewer and their proximity to a light source.
- Specular highlights. These are the shiny bits of surfaces that reflect a light source directly. Highlights on an object change relative to the position of a viewer in a scene.
- Reflection. Light bounces off of surfaces differently depending on whether the surface has specular (that is, highly reflective) or diffuse (not reflective) properties. For example, a metallic ball will be highly specular and reflect its environment, while a ball painted a dull matte grey will be diffuse. Most real-world objects combine these properties -- think of a scuffed-up bowling ball or a well-used credit card.
Reflective surfaces also pick up colors from the ambient environment. The coloring of an object can be directly affected by the coloring of its environment. For example, a white ball in a blue room will take on a bluish hue.
Using Lighting Estimation modes to enhance realism
ARCore provides three distinct Lighting Estimation modes that offer different levels of realism and runtime performance. Use ArLightEstimationMode to select the mode that best fits your use case:
- Environmental HDR (AR_LIGHT_ESTIMATION_MODE_ENVIRONMENTAL_HDR) mode. This mode consists of separate APIs that allow for granular and realistic lighting estimation for directional lighting, shadows, specular highlights, and reflections.
- Ambient Intensity (AR_LIGHT_ESTIMATION_MODE_AMBIENT_INTENSITY) mode. This mode determines the average pixel intensity and the color of the lighting for a given image. It's a coarse setting designed for use cases in which precise lighting is not critical, such as objects that have baked-in lighting.
- Disabled (AR_LIGHT_ESTIMATION_MODE_DISABLED) mode. Disable Lighting Estimation if matching virtual object lighting to the scene's environment is not critical, or when runtime performance is critical.
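As a sketch, the mode is selected when configuring the session. This configuration fragment uses the ARCore NDK C API and assumes a valid `ArSession*`; error handling is omitted for brevity:

```c
#include "arcore_c_api.h"

// Sketch: enable Environmental HDR lighting estimation on an existing
// ARCore session. Assumes `session` is valid; error handling omitted.
void enable_environmental_hdr(ArSession *session) {
  ArConfig *config = NULL;
  ArConfig_create(session, &config);
  ArConfig_setLightEstimationMode(session, config,
                                  AR_LIGHT_ESTIMATION_MODE_ENVIRONMENTAL_HDR);
  ArSession_configure(session, config);
  ArConfig_destroy(config);
}
```

Passing AR_LIGHT_ESTIMATION_MODE_AMBIENT_INTENSITY or AR_LIGHT_ESTIMATION_MODE_DISABLED instead selects the other two modes.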
When Lighting Estimation is enabled, only significant changes to lighting or positioning of the device trigger updated effects.
Environmental HDR mode
Environmental HDR mode uses machine learning to analyze the camera images in real time and synthesize environmental lighting to support realistic rendering of virtual objects.
This lighting estimation mode provides:
- Main directional light. Represents the main light source. Can be used to cast shadows.
- Ambient spherical harmonics. Represents the remaining ambient light energy in the scene.
- HDR cubemap. Can be used to render reflections in shiny metallic objects.
You can use these APIs in different combinations, but they're designed to be used together for the most realistic effect.
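A per-frame retrieval flow might look like the following sketch (ARCore NDK C API; assumes a valid session and frame, and that the estimate is only used when its state is valid):

```c
#include "arcore_c_api.h"

// Sketch: read the Environmental HDR estimate for the current frame.
// Assumes `session` and `frame` are valid.
void read_environmental_hdr(ArSession *session, ArFrame *frame) {
  ArLightEstimate *estimate = NULL;
  ArLightEstimate_create(session, &estimate);
  ArFrame_getLightEstimate(session, frame, estimate);

  ArLightEstimateState state = AR_LIGHT_ESTIMATE_STATE_NOT_VALID;
  ArLightEstimate_getState(session, estimate, &state);
  if (state == AR_LIGHT_ESTIMATE_STATE_VALID) {
    float direction[3];  // Direction of the main light source.
    float intensity[3];  // RGB intensity of the main light.
    float sh[27];        // 9 spherical harmonics coefficients per channel.
    ArLightEstimate_getEnvironmentalHdrMainLightDirection(session, estimate,
                                                          direction);
    ArLightEstimate_getEnvironmentalHdrMainLightIntensity(session, estimate,
                                                          intensity);
    ArLightEstimate_getEnvironmentalHdrAmbientSphericalHarmonics(
        session, estimate, sh);
    // Feed direction, intensity, and sh into your renderer's lighting.
  }
  ArLightEstimate_destroy(estimate);
}
```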
Main directional light
The main directional light API calculates the direction (ArLightEstimate_getEnvironmentalHdrMainLightDirection()) and intensity (ArLightEstimate_getEnvironmentalHdrMainLightIntensity()) of the scene's main light source.
This information allows virtual objects in your scene to show reasonably positioned specular highlights, and to cast shadows in a direction consistent with other visible real objects.
To see how this works, consider these two images of the same virtual rocket. In the image on the left, there's a shadow under the rocket but its direction doesn't match the other shadows in the scene. In the rocket on the right, the shadow points in the correct direction. It's a subtle but important difference, and it grounds the rocket in the scene because the direction and intensity of the shadow better match other shadows in the scene.
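The geometry behind this effect can be sketched with a small self-contained calculation (illustrative only, not part of the ARCore API): projecting a point on the object along the main light direction onto the ground plane gives the location where that point's shadow falls.

```c
// Illustration only: project a point along a direction pointing from the
// light toward the scene, onto the ground plane y = 0. Assumes the
// direction has a nonzero downward (negative y) component.
static void project_shadow(const float point[3], const float light_dir[3],
                           float out[3]) {
  // Parametrize point + t * light_dir and solve for the y = 0 crossing.
  float t = -point[1] / light_dir[1];
  out[0] = point[0] + t * light_dir[0];
  out[1] = 0.0f;
  out[2] = point[2] + t * light_dir[2];
}
```

With light coming straight down, the shadow sits directly under the point; as the light tilts, the shadow slides away from the light and lengthens, matching the behavior described above.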
When the main light source or a lit object is in motion, the specular highlight on the object adjusts its position in real time relative to the light source.
Directional shadows also adjust their length and direction relative to the position of the main light source, just as they do in the real world. To illustrate this effect, consider these two mannequins, one virtual and the other real.
(The mannequin on the left is the virtual one.)
Ambient spherical harmonics
In addition to the light energy in the main directional light, ARCore provides
the overall ambient light coming in from all directions in the scene. Use
this information during rendering to add subtle cues that bring out the
definition of virtual objects.
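ARCore reports the ambient term as 27 floats: nine spherical harmonics coefficients for each of the three color channels. One common way to consume them, shown here as a sketch, is the standard diffuse irradiance formula of Ramamoorthi and Hanrahan; this is renderer-side math, not an ARCore call, and coefficient ordering conventions vary, so confirm the layout against your renderer.

```c
// Sketch: evaluate diffuse irradiance from 9 SH coefficients for one
// color channel, using the standard irradiance convolution constants.
// `L` is assumed to hold coefficients in the order
// (L00, L1-1, L10, L11, L2-2, L2-1, L20, L21, L22); `n` is a unit normal.
static float sh_irradiance(const float L[9], const float n[3]) {
  const float c1 = 0.429043f, c2 = 0.511664f, c3 = 0.743125f,
              c4 = 0.886227f, c5 = 0.247708f;
  float x = n[0], y = n[1], z = n[2];
  return c1 * L[8] * (x * x - y * y) + c3 * L[6] * z * z + c4 * L[0] -
         c5 * L[6] +
         2.0f * c1 * (L[4] * x * y + L[7] * x * z + L[5] * y * z) +
         2.0f * c2 * (L[3] * x + L[1] * y + L[2] * z);
}
```

Evaluating this per channel for each surface normal yields the soft, direction-dependent ambient shading that gives virtual objects their extra definition.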
To illustrate this effect, consider these two images of the same rocket model. The rocket on the left is rendered using lighting estimation information detected by the main directional light API. The rocket on the right is rendered using information detected by both the main directional light and ambient spherical harmonics APIs. The second rocket clearly has more visual definition, and blends more seamlessly into the scene.
To achieve ideal lighting conditions, use this API with the main directional light and the HDR cubemap.
HDR cubemap
To render realistic reflections on virtual objects with medium to high glossiness, such as shiny metallic surfaces, use the HDR cubemap (ArLightEstimate_acquireEnvironmentalHdrCubemap()).
The HDR cubemap also affects the shading and appearance of objects. For example, the material of a specular object surrounded by a blue environment will reflect blue hues.
Calculating the HDR cubemap requires a small amount of additional CPU computation.
Whether the material of a given surface is specular or diffuse determines how it reflects its surroundings, and therefore whether the HDR cubemap is worth using. Because our virtual rocket's material is metallic, it has a strong specular component that directly reflects the environment around it and therefore benefits from the cubemap. On the other hand, a virtual object with a dull grey matte material doesn't have a specular component at all; its color primarily depends on the diffuse component and it wouldn't benefit from a cubemap.
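For the specular component, the reflected view direction is what a renderer would use to sample the cubemap. As a self-contained sketch (the standard reflection formula, not an ARCore call):

```c
// Sketch: reflect an incident direction `d` about a unit normal `n`:
// r = d - 2 (d . n) n. The result is the direction used to sample an
// environment cubemap for mirror-like reflections.
static void reflect3(const float d[3], const float n[3], float r[3]) {
  float dn = d[0] * n[0] + d[1] * n[1] + d[2] * n[2];
  for (int i = 0; i < 3; ++i) {
    r[i] = d[i] - 2.0f * dn * n[i];
  }
}
```

A highly specular material weights the cubemap sample heavily; a diffuse material ignores it and relies on the ambient terms instead.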
All three Environmental HDR APIs were used to render the rocket below. The HDR cubemap enables the reflective cues and further highlighting that ground the object fully in the scene.
Enabling the HDR cubemap automatically enables all of the APIs in Environmental HDR mode.
Other examples of Environmental HDR mode in action
Here is the same rocket model in differently lit environments. All of these scenes were rendered using information from all three APIs, with directional shadows applied.
Ambient Intensity mode
AR_LIGHT_ESTIMATION_MODE_AMBIENT_INTENSITY mode determines the average pixel intensity and the color correction scalars for a given image. It's a coarse setting designed for use cases in which precise lighting is not critical, such as objects that have baked-in lighting. This mode provides:
- Pixel intensity (ArLightEstimate_getPixelIntensity()). Captures the average pixel intensity of the lighting in a scene. You can apply this lighting to a whole virtual object.
- Color correction (ArLightEstimate_getColorCorrection()). Detects the white balance for each individual frame. You can then color correct a virtual object so that it integrates more smoothly into the overall coloring of the scene.
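A minimal sketch of applying these values during shading (the combination shown is one reasonable choice, not a prescribed formula): scale the material's base color by the per-channel correction and the average pixel intensity.

```c
// Sketch: tint a base color by color correction scalars (as returned by
// ArLightEstimate_getColorCorrection()) and scale by the average pixel
// intensity (as returned by ArLightEstimate_getPixelIntensity()).
// How these factors combine with the rest of the lighting model is up
// to your renderer.
static void apply_ambient_estimate(const float base_rgb[3],
                                   const float correction_rgb[3],
                                   float pixel_intensity, float out_rgb[3]) {
  for (int i = 0; i < 3; ++i) {
    out_rgb[i] = base_rgb[i] * correction_rgb[i] * pixel_intensity;
  }
}
```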
- Use the developer guide to implement lighting on your models.