
Support for buffering multiple camera frames

Added in ARCore 1.17.0

Most apps need to buffer only a single camera frame for rendering. ARCore also supports buffering sequential camera frames in a fixed-size texture queue, which supports the creation of multithreaded rendering pipelines.

Buffering multiple camera frames can help your app render at a more consistent frame rate, which can reduce visual stutter caused by slow UI rendering. Each additional camera frame increases memory usage. For example, a 1080p texture consumes approximately 6 MB of memory. (In this example, 6 MB is obtained by multiplying a resolution of 1920 x 1080 by three bytes of RGB data per pixel.)
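The memory estimate above can be reproduced with a short calculation. This is a plain-Java sketch; the class and method names are illustrative, not part of the ARCore API:

```java
public class FrameMemory {
    // Approximate memory for one uncompressed RGB camera frame:
    // width * height pixels, bytesPerPixel bytes each.
    static long frameBytes(int width, int height, int bytesPerPixel) {
        return (long) width * height * bytesPerPixel;
    }

    public static void main(String[] args) {
        // 1920 x 1080 x 3 bytes = 6,220,800 bytes, roughly 6 MB per frame.
        System.out.println(frameBytes(1920, 1080, 3));
    }
}
```

Each extra texture in the queue adds about this much memory, so a three-frame queue at 1080p costs roughly 18 MB.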

If your app cannot take advantage of additional camera frames, using more than one texture provides no performance benefit.

Performance considerations

Using multiple camera frames is not guaranteed to decrease variability of frame rates, and using a multithreaded rendering pipeline is not guaranteed to provide better performance in all circumstances. The following factors affect real-world performance:

  • The app rendering pipeline

  • The app threading model

  • The device CPU architecture

  • The operating system's scheduler

Use ARCore to buffer camera frames

To specify which textures ARCore should use to buffer incoming camera frames, call Session.setCameraTextureNames(ids) with an array of one or more texture IDs. This method can only be called after session creation, and is typically called only once.

During each call to Session.update(), ARCore overwrites the next texture in the queue in a round-robin sequence. If your app sets only a single texture ID, the same texture will be overwritten each time.
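The round-robin behavior can be illustrated with a plain-Java simulation. The TextureQueue class below is a hypothetical stand-in for ARCore's internal bookkeeping, not a real ARCore type; it only shows the order in which registered texture IDs would be overwritten:

```java
import java.util.Arrays;

// Simulates how ARCore cycles through the IDs passed to
// Session.setCameraTextureNames(): each update() overwrites the
// next texture in the array, wrapping back to the first.
public class TextureQueue {
    private final int[] textureIds;
    private int next = 0;

    public TextureQueue(int[] textureIds) {
        this.textureIds = Arrays.copyOf(textureIds, textureIds.length);
    }

    // Returns the texture ID that would be overwritten by the next update.
    public int nextTexture() {
        int id = textureIds[next];
        next = (next + 1) % textureIds.length;
        return id;
    }
}
```

With IDs {10, 11, 12}, successive updates would target 10, 11, 12, 10, and so on; with a single ID, every update targets that same texture.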

Use Frame.getCameraTextureName() to determine the texture ID associated with the current frame.
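Putting the two calls together, a minimal sketch of the setup and per-frame flow might look like the following. It assumes an Android GL thread with a configured ARCore Session; the two-texture queue size and the GLES parameter choices are illustrative assumptions, not requirements:

```java
// One-time setup, after session creation, on the GL thread.
int[] textureIds = new int[2];  // two frames of buffering (illustrative)
GLES20.glGenTextures(textureIds.length, textureIds, 0);
for (int id : textureIds) {
  // ARCore camera images are external OES textures.
  GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, id);
  GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
      GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
  GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
      GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
}
session.setCameraTextureNames(textureIds);

// Per frame, on the same GL thread:
Frame frame = session.update();  // ARCore fills the next texture in the queue
int currentTexture = frame.getCameraTextureName();
// Bind currentTexture when drawing the camera background for this frame.
```

Because each Frame records which texture it was written to, a multithreaded pipeline can hand a frame off for rendering while ARCore fills the next texture in the queue.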