Use Raw Depth in your AR Foundation Android app
The Raw Depth API provides depth data for a camera image that has higher accuracy than full Depth API data, but does not always cover every pixel. Raw depth images, along with their matching confidence images, can also be further processed, allowing apps to use only the depth data that has sufficient accuracy for their individual use case.
Device compatibility
Raw Depth is available on all devices that support the Depth API. Like the full Depth API, the Raw Depth API does not require a supported hardware depth sensor, such as a time-of-flight (ToF) sensor. However, both the Raw Depth API and the full Depth API make use of any supported hardware sensors that a device may have.
Raw Depth API vs full Depth API
The Raw Depth API provides depth estimates with higher accuracy, but raw depth images may not include depth estimates for all pixels in the camera image. In contrast, the full Depth API provides estimated depth for every pixel, but per-pixel depth data may be less accurate due to smoothing and interpolation of depth estimates. The format and size of depth images are the same across both APIs; only the content differs.
The following table illustrates the differences between the Raw Depth API and the full Depth API using an image of a chair and a table in a kitchen.
| API | Returns | Camera image | Depth image | Confidence image |
|---|---|---|---|---|
| Raw Depth API | A raw depth image that contains very accurate depth estimates for some, but not all, pixels in the camera image. A confidence image that gives a confidence value for every raw depth image pixel; camera image pixels that do not have a depth estimate have a confidence of zero. | (image) | (image) | (image) |
| Full Depth API | A single "smoothed" depth image that contains a depth estimate for every pixel. No confidence image is provided with this API. | (image) | (image) | N/A |
Confidence images
In the confidence images returned by the Raw Depth API, lighter pixels have higher confidence values: white pixels represent full confidence and black pixels represent no confidence. In general, regions of the camera image that have more texture, such as a tree, have higher raw depth confidence than regions with less texture, such as a blank wall. Surfaces with no texture usually yield a confidence of zero.
If the target device has a supported hardware depth sensor, confidence in areas of the image close enough to the camera is likely to be higher, even on textureless surfaces.
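As an illustration of using the confidence image to filter raw depth, the sketch below zeroes out depth values whose confidence falls below a threshold. It assumes a 16-bit depth image in millimeters (XRCpuImage.Format.DepthUint16), an 8-bit confidence image (0-255) of the same resolution with no row padding, and an illustrative helper name; check XRCpuImage.format and XRCpuImage.Plane.rowStride in a real app.

// Requires: using Unity.Collections; using UnityEngine.XR.ARSubsystems;
// Hypothetical helper: returns a copy of the raw depth values (millimeters)
// with low-confidence pixels set to zero ("no usable depth estimate").
static ushort[] FilterDepthByConfidence(
    XRCpuImage depthImage, XRCpuImage confidenceImage, byte minConfidence)
{
    // Reinterpret the 8-bit plane buffer as 16-bit depth values.
    NativeArray<ushort> depth =
        depthImage.GetPlane(0).data.Reinterpret<ushort>(sizeof(byte));
    NativeArray<byte> confidence = confidenceImage.GetPlane(0).data;

    var filtered = new ushort[depth.Length];
    for (int i = 0; i < depth.Length; i++)
    {
        filtered[i] = confidence[i] >= minConfidence ? depth[i] : (ushort)0;
    }
    return filtered;
}

The appropriate threshold is application-specific: higher thresholds keep fewer, but more reliable, depth values.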
Compute cost
The compute cost of the Raw Depth API is about half the compute cost of the full Depth API.
Use cases
With the Raw Depth API, you can obtain depth images that provide a more detailed representation of the geometry of the objects in the scene. Raw depth data can be useful when creating AR experiences where increased depth accuracy and detail are needed for geometry-understanding tasks. Some use cases include:
- 3D reconstruction
- Measurement
- Shape detection
Prerequisites
Make sure that you understand fundamental AR concepts and how to configure an ARCore session before proceeding.
Enable Depth
In a new ARCore session, check whether a user's device supports Depth. Not all ARCore-compatible devices support the Depth API due to processing power constraints. To save resources, depth is disabled by default on ARCore. Enable depth mode to have your app use the Depth API.
// The AROcclusionManager is typically attached to the AR Camera game object.
var occlusionManager = FindObjectOfType<AROcclusionManager>();

// Check whether the user's device supports the Depth API.
if (occlusionManager.descriptor?.supportsEnvironmentDepthImage == true)
{
    // If depth mode is available on the user's device, perform
    // the steps you want here.
}
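If the check succeeds, the occlusion manager can then be configured to provide depth. The snippet below is a minimal sketch, assuming AR Foundation 4.2's AROcclusionManager properties and a using directive for UnityEngine.XR.ARSubsystems; requesting that temporal smoothing be disabled is what yields raw (non-smoothed) depth.

// Request environment depth, and request raw (non-smoothed) depth by
// disabling temporal smoothing.
occlusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Best;
occlusionManager.environmentDepthTemporalSmoothingRequested = false;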
Acquire the latest raw depth image
Call AROcclusionManager.TryAcquireEnvironmentDepthCpuImage() and use AROcclusionManager.environmentDepthTemporalSmoothingRequested to acquire the latest raw depth image on the CPU.
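For example, a minimal sketch that reuses the occlusionManager reference from the support check above:

// Assumes environmentDepthTemporalSmoothingRequested has been set to false,
// so the environment depth image returned here is the raw depth image.
if (occlusionManager.TryAcquireEnvironmentDepthCpuImage(out XRCpuImage depthImage))
{
    using (depthImage)
    {
        // The XRCpuImage is only valid inside this scope; convert or copy the
        // raw depth data here (for example, into a Texture2D) before disposal.
    }
}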
Acquire the latest raw depth confidence image
Call AROcclusionManager.TryAcquireEnvironmentDepthConfidenceCpuImage() and use AROcclusionManager.environmentDepthTemporalSmoothingRequested to acquire the confidence image on the CPU.
// Attempt to get the latest environment depth confidence image.
// UpdateRawImage and m_RawEnvironmentDepthConfidenceImage are app-specific:
// a helper that copies the CPU image into a displayable texture, and the UI
// element that shows it.
if (occlusionManager && occlusionManager.TryAcquireEnvironmentDepthConfidenceCpuImage(out XRCpuImage image))
{
    using (image)
    {
        UpdateRawImage(m_RawEnvironmentDepthConfidenceImage, image);
    }
}
else
{
    m_RawEnvironmentDepthConfidenceImage.enabled = false;
}
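UpdateRawImage is not part of AR Foundation; it is an application-side helper. The following is a minimal sketch of what such a helper might look like, assuming the confidence image is converted to a single-channel R8 texture and displayed in a UnityEngine.UI.RawImage. The names and conversion details are assumptions, and the raw pointer copy requires the "Allow 'unsafe' Code" player setting.

// Requires: using System; using Unity.Collections.LowLevel.Unsafe;
// using UnityEngine; using UnityEngine.UI; using UnityEngine.XR.ARSubsystems;
static Texture2D s_ConfidenceTexture;

// Copies a single-channel confidence XRCpuImage into a Texture2D and shows it
// in a UI RawImage.
static unsafe void UpdateRawImage(RawImage rawImage, XRCpuImage cpuImage)
{
    // (Re)create the texture if the CPU image dimensions have changed.
    if (s_ConfidenceTexture == null ||
        s_ConfidenceTexture.width != cpuImage.width ||
        s_ConfidenceTexture.height != cpuImage.height)
    {
        s_ConfidenceTexture = new Texture2D(cpuImage.width, cpuImage.height, TextureFormat.R8, false);
    }

    // Convert the confidence image directly into the texture's backing buffer.
    var conversionParams = new XRCpuImage.ConversionParams(cpuImage, TextureFormat.R8);
    var rawTextureData = s_ConfidenceTexture.GetRawTextureData<byte>();
    cpuImage.Convert(conversionParams, new IntPtr(rawTextureData.GetUnsafePtr()), rawTextureData.Length);
    s_ConfidenceTexture.Apply();

    rawImage.texture = s_ConfidenceTexture;
    rawImage.enabled = true;
}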