To determine whether frame buffering (also known as frame delay) is enabled
for a given device and AR session, use the `adb` logcat output:
    # Camera frame buffering is turned off when frame delay is zero.
    adb logcat | grep 'Update Frame Delay'
    ... I native : session.cc:3141 Update Frame Delay to 0 frames.

    # Camera frame buffering is turned on when frame count is non-zero.
    # Note: The size of the buffer can vary over time.
    adb logcat | grep 'Update Frame Delay'
    ... I native : session.cc:3141 Update Frame Delay to 6 frames.
    ...
    ... I native : session.cc:3141 Update Frame Delay to 4 frames.
    ...
    ... I native : session.cc:3141 Update Frame Delay to 2 frames.
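If you want to check the buffering state programmatically rather than by eye, the current frame delay can be extracted from lines like the sample output above. The following is a minimal sketch; it assumes the log message format shown here, and the class and method names are illustrative only:

```java
// Sketch: extract the most recent "Update Frame Delay to N frames." value
// from captured logcat lines. Assumes the message format shown above.
class FrameDelayParser {
    private static final String MARKER = "Update Frame Delay to ";

    /** Returns the last reported frame delay, or -1 if no line matched. */
    static int latestFrameDelay(String[] logcatLines) {
        int delay = -1;
        for (String line : logcatLines) {
            int at = line.indexOf(MARKER);
            if (at < 0) continue;
            // Read the run of digits that follows the marker.
            int start = at + MARKER.length();
            int end = start;
            while (end < line.length() && Character.isDigit(line.charAt(end))) {
                end++;
            }
            if (end > start) {
                delay = Integer.parseInt(line.substring(start, end));
            }
        }
        return delay;
    }
}
```

A result of `0` means buffering is off; any positive value is the current buffer depth, which, as noted above, can vary over time.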
Memory
------

Each additional buffered camera frame increases memory utilization. For
example, a 1080p texture consumes approximately 6 MB of memory (obtained by
multiplying a resolution of 1920 x 1080 by three bytes of RGB data per pixel).
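The arithmetic above generalizes to any resolution and buffer depth. A small helper (illustrative only; the class and method names are not part of the ARCore API) makes the cost of a deeper queue concrete:

```java
// Sketch: estimate camera-frame buffer memory from resolution and queue
// depth, using the 3-bytes-per-pixel RGB assumption from the example above.
class BufferMemory {
    static final int BYTES_PER_PIXEL = 3; // RGB

    /** Bytes consumed by one buffered frame at the given resolution. */
    static long bytesPerFrame(int width, int height) {
        return (long) width * height * BYTES_PER_PIXEL;
    }

    /** Total bytes for a round-robin queue of `frames` buffered frames. */
    static long totalBytes(int width, int height, int frames) {
        return bytesPerFrame(width, height) * frames;
    }
}
```

For 1080p, `bytesPerFrame(1920, 1080)` is 6,220,800 bytes, roughly the 6 MB quoted above; a queue of six such frames would therefore consume about 36 MB.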
Last updated (UTC): 2025-07-26.

Performance considerations
--------------------------

| **Caution:** If you decide to use a multithreaded pipeline, make sure to test production builds (not debug builds of your app) on the same devices that your users will use.

Using multiple camera frames is not guaranteed to decrease variability of
frame rates, and using a multithreaded rendering pipeline is not guaranteed to
provide better performance in all circumstances. The following factors affect
real-world performance:

- The app rendering pipeline

- The app threading model

- The device CPU architecture

- The operating system scheduling system

If your app is unable to take advantage of additional camera frames, there is
no performance advantage in using more than one texture.

Enable buffering
----------------

To instruct ARCore which textures to use to buffer incoming camera frames, use
[`Session.setCameraTextureNames(ids)`](/ar/reference/java/com/google/ar/core/Session#setCameraTextureNames-textureIds) to provide an array of one or more texture IDs.
This function is called only after session creation, and is usually called only
once.

During each call to [`Session.update()`](/ar/reference/java/com/google/ar/core/Session#update-), ARCore overwrites the next texture in
the queue in a round-robin sequence. If your app sets only a single texture ID,
the same texture will be overwritten each time.

Use [`Frame.getCameraTextureName()`](/ar/reference/java/com/google/ar/core/Frame#getCameraTextureName-) to determine the texture ID associated
with the current frame.
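The round-robin overwrite behavior described above can be sketched without ARCore itself. In this minimal model, plain integers stand in for OpenGL texture names, and calling `nextTexture()` plays the role of ARCore choosing the next queue slot on each `Session.update()`; the class and method names are hypothetical:

```java
// Sketch: a fixed-size round-robin texture queue, mimicking how ARCore
// cycles through the IDs passed to Session.setCameraTextureNames(ids).
// Plain ints stand in for GL texture names; names here are hypothetical.
class RoundRobinTextureQueue {
    private final int[] textureIds;
    private int next = 0;

    RoundRobinTextureQueue(int[] textureIds) {
        this.textureIds = textureIds.clone();
    }

    /** Returns the texture ID that the next frame would overwrite. */
    int nextTexture() {
        int id = textureIds[next];
        next = (next + 1) % textureIds.length;
        return id;
    }
}
```

With a single ID, `nextTexture()` returns the same texture every time, matching the single-texture behavior described above; with several IDs, the queue wraps around once every ID has been used.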