Buffering camera frames

Added in ARCore 1.17.0

Most apps only need to buffer a single camera frame for rendering. However, ARCore also supports buffering multiple camera frames in a fixed-size round-robin texture queue.

Apps with a multithreaded rendering pipeline can use buffering to help improve rendering performance. Buffering can also help apps render frames at a more consistent frame rate, which can reduce visual stutter caused by slow UI rendering.
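One way to picture the benefit: a fixed-size frame queue lets a capture thread run ahead of a slower render thread, smoothing out momentary stalls. The sketch below is plain Java, not the ARCore API; the queue capacity and integer frame IDs are illustrative stand-ins for buffered camera textures:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class FrameRing {
    // Pushes frameCount frame IDs through a fixed-size queue on a capture
    // thread while a render thread drains them, returning the order in
    // which the render side saw the frames. The queue capacity stands in
    // for the number of buffered camera textures (illustrative only).
    static String renderAll(int capacity, int frameCount) {
        BlockingQueue<Integer> frames = new ArrayBlockingQueue<>(capacity);
        StringBuilder rendered = new StringBuilder();

        Thread capture = new Thread(() -> {
            for (int frameId = 0; frameId < frameCount; frameId++) {
                try {
                    frames.put(frameId); // blocks only while the queue is full
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        });

        Thread render = new Thread(() -> {
            for (int i = 0; i < frameCount; i++) {
                try {
                    rendered.append(frames.take()); // blocks only while empty
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        });

        capture.start();
        render.start();
        try {
            capture.join();
            render.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return rendered.toString();
    }

    public static void main(String[] args) {
        // Frames arrive on the render thread in capture order.
        System.out.println(renderAll(3, 10)); // prints 0123456789
    }
}
```

With a capacity of one, the two threads run in lockstep; a larger capacity lets a brief slowdown on either side be absorbed by the queue instead of stalling the other thread.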

Preconditions

Camera frame buffering is enabled under any of the following conditions:

  1. When the app enables it by calling Session.setCameraTextureNames(ids) with multiple texture IDs.

  2. When one or more ARCore features that require internal buffering are enabled. Currently, these features are:

    • Augmented Images
    • Augmented Faces
  3. On certain ARCore-supported devices that require internal buffering to operate correctly.

Determine whether frame delay is enabled

To determine whether a given device and AR session have frame buffering (also known as frame delay) enabled, use the adb logcat output:

# Camera frame buffering is turned off when frame delay is zero.
adb logcat | grep 'Update Frame Delay'
… I native  : session.cc:3141 Update Frame Delay to 0 frames.
# Camera frame buffering is turned on when frame count is non-zero.
# Note: The size of the buffer can vary over time.
adb logcat | grep 'Update Frame Delay'
… I native  : session.cc:3141 Update Frame Delay to 6 frames.
… I native  : session.cc:3141 Update Frame Delay to 4 frames.
… I native  : session.cc:3141 Update Frame Delay to 2 frames.

Memory

Each additional buffered camera frame increases memory utilization. For example, a 1080p texture consumes approximately 6 MB of memory (a 1920 x 1080 resolution multiplied by three bytes of RGB data per pixel).
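That arithmetic can be checked directly. This is a standalone sketch; the three-bytes-per-pixel figure is the RGB estimate from above, and actual texture memory can differ by format and padding:

```java
public class FrameMemory {
    // Approximate memory for one buffered camera frame, assuming
    // 3 bytes (RGB) per pixel as in the estimate above.
    static long bytesPerFrame(int width, int height) {
        return (long) width * height * 3;
    }

    public static void main(String[] args) {
        long perFrame = bytesPerFrame(1920, 1080); // 6,220,800 bytes ≈ 6 MB
        long queueOfSix = 6 * perFrame;            // ≈ 36 MB for a 6-frame queue
        System.out.println(perFrame + " bytes per frame, "
                + queueOfSix + " bytes for six frames");
    }
}
```

A six-frame queue like the one in the logcat example above would therefore cost roughly 36 MB of texture memory at 1080p.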

Performance considerations

Buffering multiple camera frames is not guaranteed to reduce frame rate variability, and a multithreaded rendering pipeline is not guaranteed to perform better in all circumstances. The following factors affect real-world performance:

  • The app rendering pipeline

  • The app threading model

  • The device CPU architecture

  • The operating system's scheduler

If your app is unable to take advantage of additional camera frames, there is no performance advantage in using more than one texture.

Enable buffering

To tell ARCore which textures to use when buffering incoming camera frames, call Session.setCameraTextureNames(ids) with an array of one or more texture IDs. Call this method only after session creation, typically just once.

During each call to Session.update(), ARCore overwrites the next texture in the queue in a round-robin sequence. If your app sets only a single texture ID, the same texture will be overwritten each time.

Use Frame.getCameraTextureName() to determine the texture ID associated with the current frame.
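The round-robin rotation can be mimicked in plain Java to see which texture each frame lands in. This is a standalone sketch, not ARCore internals: the texture IDs and the modulo rotation are illustrative assumptions, and in a real app you should always read the current texture from Frame.getCameraTextureName() rather than computing the index yourself:

```java
public class TextureQueue {
    private final int[] textureIds;
    private int next = 0;

    TextureQueue(int[] textureIds) {
        this.textureIds = textureIds;
    }

    // Conceptually mirrors one Session.update(): the next texture in the
    // queue is overwritten, wrapping around to the start when exhausted.
    int update() {
        int id = textureIds[next];
        next = (next + 1) % textureIds.length;
        return id;
    }

    public static void main(String[] args) {
        TextureQueue queue = new TextureQueue(new int[] {10, 11, 12});
        StringBuilder order = new StringBuilder();
        for (int frame = 0; frame < 5; frame++) {
            order.append(queue.update()).append(' ');
        }
        System.out.println(order.toString().trim()); // 10 11 12 10 11

        // With a single texture ID, the same texture is overwritten
        // on every frame.
        TextureQueue single = new TextureQueue(new int[] {42});
        System.out.println(single.update() + " " + single.update()); // 42 42
    }
}
```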