Use Raw Depth in your AR Foundation Android app
The Raw Depth API provides depth data for a camera image that has higher accuracy than full Depth API data, but does not always cover every pixel. Raw depth images, along with their matching confidence images, can also be further processed, allowing apps to use only the depth data that has sufficient accuracy for their individual use case.
Device compatibility
Raw Depth is available on all devices that support the Depth API. Like the full Depth API, the Raw Depth API does not require a supported hardware depth sensor, such as a time-of-flight (ToF) sensor. However, both the Raw Depth API and the full Depth API make use of any supported hardware sensors that a device may have.
Raw Depth API vs. full Depth API
The Raw Depth API provides depth estimates with higher accuracy, but raw depth images may not include depth estimates for all pixels in the camera image. In contrast, the full Depth API provides an estimated depth for every pixel, but the per-pixel depth data may be less accurate because the depth estimates are smoothed and interpolated. The format and size of the depth images are the same across both APIs; only the content differs.
The following table illustrates the differences between the Raw Depth API and the full Depth API, using an image of a chair and a table in a kitchen.
| API | Returns | Camera image | Depth image | Confidence image |
| --- | --- | --- | --- | --- |
| Raw Depth API | A raw depth image that contains a very accurate depth estimate for some, but not all, pixels in the camera image. A confidence image that gives the confidence for every raw depth image pixel; camera image pixels that do not have a depth estimate have a confidence of zero. | (example image) | (example image) | (example image) |
| Full Depth API | A single "smoothed" depth image that contains a depth estimate for every pixel. No confidence image is provided with this API. | (example image) | (example image) | N/A |
Confidence images
In confidence images returned by the Raw Depth API, lighter pixels have higher confidence values, with white pixels representing full confidence and black pixels representing no confidence. In general, regions of the camera image that have more texture, such as a tree, will have higher raw depth confidence than regions that don't, such as a blank wall. Surfaces with no texture usually yield a confidence of zero.
If the target device has a supported hardware depth sensor, confidence in areas of the image close enough to the camera will likely be higher, even on textureless surfaces.
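As an illustration of how an app might keep only depth data that is accurate enough for its use case, the sketch below masks raw depth values against a confidence threshold. It is illustrative only: it assumes the raw depth and confidence images have already been acquired as XRCpuImages (as shown later on this page), that the depth plane holds tightly packed 16-bit values in millimeters, that the confidence plane holds tightly packed 8-bit values from 0 to 255, and the threshold value is arbitrary.
// Sketch: keep only raw depth values whose confidence meets a minimum threshold.
// Assumes depthImage and confidenceImage are XRCpuImages of the same resolution,
// with 16-bit depth (millimeters) and 8-bit confidence, and no row padding.
const byte minConfidence = 128; // Example threshold; tune for your use case.

var depthData = depthImage.GetPlane(0).data.Reinterpret<ushort>(1);
var confidenceData = confidenceImage.GetPlane(0).data;

var filteredDepthMm = new ushort[depthData.Length];
for (int i = 0; i < depthData.Length; i++)
{
    // A confidence of zero means no depth estimate; low values mean an unreliable one.
    filteredDepthMm[i] = confidenceData[i] >= minConfidence ? depthData[i] : (ushort)0;
}
Filtered pixels are set to zero here, mirroring the convention that a confidence of zero means no depth estimate.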
Compute cost
The compute cost of the Raw Depth API is about half the compute cost of the full Depth API.
Use cases
With the Raw Depth API, you can obtain depth images that provide a more detailed representation of the geometry of the objects in the scene. Raw depth data can be useful when creating AR experiences where increased depth accuracy and detail are needed for geometry-understanding tasks. Some use cases include:
- 3D reconstruction
- Measurement
- Shape detection
Prerequisites
Make sure that you understand fundamental AR concepts and how to configure an ARCore session before proceeding.
Enable Depth
In a new ARCore session, check whether the user's device supports Depth. Not all ARCore-compatible devices support the Depth API due to processing power constraints. To save resources, depth is disabled by default on ARCore. Enable depth mode to have your app use the Depth API.
var occlusionManager = // Typically acquired from the Camera game object.
// Check whether the user's device supports the Depth API.
if (occlusionManager.descriptor?.supportsEnvironmentDepthImage)
{
// If depth mode is available on the user's device, perform
// the steps you want here.
}
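The check above only verifies support; depth itself is requested through the occlusion manager. As a minimal sketch, assuming the same occlusionManager reference as above, setting requestedEnvironmentDepthMode to any mode other than Disabled requests environment depth; Best is used here purely as an example.
// Sketch: request environment depth once support has been confirmed.
// EnvironmentDepthMode lives in UnityEngine.XR.ARSubsystems; Best is an example choice.
occlusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Best;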
Acquire the latest raw depth image
Call AROcclusionManager.TryAcquireEnvironmentDepthCpuImage()
and use AROcclusionManager.environmentDepthTemporalSmoothingRequested
to acquire the latest raw depth image on the CPU.
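A minimal sketch of those calls, assuming the occlusionManager reference from the earlier snippet, might look like the following; temporal smoothing is turned off so that the unsmoothed (raw) depth data is returned.
// Request unsmoothed (raw) depth rather than temporally smoothed depth.
occlusionManager.environmentDepthTemporalSmoothingRequested = false;

// Attempt to get the latest raw environment depth image on the CPU.
if (occlusionManager.TryAcquireEnvironmentDepthCpuImage(out XRCpuImage depthImage))
{
    using (depthImage)
    {
        // Use the raw depth image here, for example by copying it into a texture or buffer.
    }
}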
Acquire the latest raw depth confidence image
Call AROcclusionManager.TryAcquireEnvironmentDepthConfidenceCpuImage()
and use AROcclusionManager.environmentDepthTemporalSmoothingRequested
to acquire the confidence image on the CPU.
// Attempt to get the latest environment depth confidence image.
if (occlusionManager && occlusionManager.TryAcquireEnvironmentDepthConfidenceCpuImage(out XRCpuImage image))
{
using (image)
{
UpdateRawImage(m_RawEnvironmentDepthConfidenceImage, image);
}
}
else
{
m_RawEnvironmentDepthConfidenceImage.enabled = false;
}
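The UpdateRawImage helper referenced above is not part of the API; one possible implementation, sketched here, converts the single-channel confidence image into a Texture2D and displays it in a UI RawImage. The R8 texture format and the MirrorY transformation are assumptions made for display purposes, and the pointer-based conversion requires enabling "Allow unsafe code" in the project's player settings. The field and method are assumed to live in the same MonoBehaviour that owns m_RawEnvironmentDepthConfidenceImage.
// Required using directives at the top of the file:
// using System;
// using Unity.Collections.LowLevel.Unsafe;
// using UnityEngine;
// using UnityEngine.UI;
// using UnityEngine.XR.ARSubsystems;

Texture2D m_ConfidenceTexture;

void UpdateRawImage(RawImage rawImage, XRCpuImage cpuImage)
{
    // (Re)create the texture if the incoming image size has changed.
    if (m_ConfidenceTexture == null ||
        m_ConfidenceTexture.width != cpuImage.width ||
        m_ConfidenceTexture.height != cpuImage.height)
    {
        m_ConfidenceTexture = new Texture2D(cpuImage.width, cpuImage.height, TextureFormat.R8, false);
    }

    // Convert the CPU image directly into the texture's raw buffer,
    // mirroring vertically so it displays right side up.
    var conversionParams = new XRCpuImage.ConversionParams(cpuImage, TextureFormat.R8, XRCpuImage.Transformation.MirrorY);
    var rawTextureData = m_ConfidenceTexture.GetRawTextureData<byte>();
    unsafe
    {
        cpuImage.Convert(conversionParams, new IntPtr(rawTextureData.GetUnsafePtr()), rawTextureData.Length);
    }
    m_ConfidenceTexture.Apply();

    // Show the converted confidence image in the UI.
    rawImage.texture = m_ConfidenceTexture;
    rawImage.enabled = true;
}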