Augmented Faces developer guide for Android
Learn how to use the Augmented Faces feature in your own apps.
Prerequisites
Make sure that you understand fundamental AR concepts and how to configure an ARCore session before proceeding.
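The steps below assume a running ARCore Session. As a reminder, here is a minimal Kotlin sketch; it assumes that availability checks and the CAMERA permission are handled elsewhere (both covered in the session configuration guide), and the helper name createArSession is illustrative.

Kotlin
import android.app.Activity
import com.google.ar.core.ArCoreApk
import com.google.ar.core.Session

// Illustrative helper: create a Session once ARCore is known to be
// installed and up to date. Permission handling is assumed elsewhere.
fun createArSession(activity: Activity): Session? {
  val availability = ArCoreApk.getInstance().checkAvailability(activity)
  if (availability != ArCoreApk.Availability.SUPPORTED_INSTALLED) {
    return null // ARCore is missing, outdated, or unsupported on this device.
  }
  return Session(activity)
}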
Using Augmented Faces in Android
- Configure the ARCore session
- Get access to the detected face
Configure the ARCore session
To start using Augmented Faces, select the front camera in an existing ARCore session. Note that selecting the front camera causes a number of changes in ARCore behavior.
Java
// Set a camera configuration that uses the front-facing camera.
CameraConfigFilter filter =
    new CameraConfigFilter(session).setFacingDirection(CameraConfig.FacingDirection.FRONT);
CameraConfig cameraConfig = session.getSupportedCameraConfigs(filter).get(0);
session.setCameraConfig(cameraConfig);
Kotlin
// Set a camera configuration that uses the front-facing camera.
val filter = CameraConfigFilter(session).setFacingDirection(CameraConfig.FacingDirection.FRONT)
val cameraConfig = session.getSupportedCameraConfigs(filter)[0]
session.cameraConfig = cameraConfig
Enable AugmentedFaceMode:
Java
Config config = new Config(session);
config.setAugmentedFaceMode(Config.AugmentedFaceMode.MESH3D);
session.configure(config);
Kotlin
val config = Config(session)
config.augmentedFaceMode = Config.AugmentedFaceMode.MESH3D
session.configure(config)
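The two configuration steps can also be combined into one helper. The following Kotlin sketch (the name configureForAugmentedFaces is illustrative) additionally guards against an empty camera config list:

Kotlin
import com.google.ar.core.CameraConfig
import com.google.ar.core.CameraConfigFilter
import com.google.ar.core.Config
import com.google.ar.core.Session

// Illustrative helper: select the front camera and enable MESH3D together.
fun configureForAugmentedFaces(session: Session) {
  val filter = CameraConfigFilter(session)
      .setFacingDirection(CameraConfig.FacingDirection.FRONT)
  val configs = session.getSupportedCameraConfigs(filter)
  check(configs.isNotEmpty()) { "No front-facing camera config available." }
  session.cameraConfig = configs[0]

  val config = Config(session)
  config.augmentedFaceMode = Config.AugmentedFaceMode.MESH3D
  session.configure(config)
}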
Face mesh orientation
Note the orientation of the face mesh:
(Figure: orientation of the face mesh)
Access the detected face
Get a Trackable for each frame. A Trackable is something that ARCore can track and that Anchors can be attached to, as sketched after the snippets below.
Java
// ARCore's face detection works best on upright faces, relative to gravity.
Collection<AugmentedFace> faces = session.getAllTrackables(AugmentedFace.class);
Kotlin
// ARCore's face detection works best on upright faces, relative to gravity.
val faces = session.getAllTrackables(AugmentedFace::class.java)
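Because each AugmentedFace is a Trackable, an Anchor can be attached to it. A minimal Kotlin sketch (the helper name anchorOnFirstFace is illustrative) that anchors content at the center of the first tracked face:

Kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.AugmentedFace
import com.google.ar.core.TrackingState

// Illustrative helper: attach an anchor to the first currently tracked face.
fun anchorOnFirstFace(faces: Collection<AugmentedFace>): Anchor? {
  val face = faces.firstOrNull { it.trackingState == TrackingState.TRACKING } ?: return null
  // createAnchor comes from the Trackable interface; the anchor follows the face.
  return face.createAnchor(face.centerPose)
}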
Get the TrackingState for each Trackable. If it is TRACKING, then its pose is currently known by ARCore.
Java
for (AugmentedFace face : faces) {
  if (face.getTrackingState() == TrackingState.TRACKING) {
    // UVs and indices can be cached as they do not change during the session.
    FloatBuffer uvs = face.getMeshTextureCoordinates();
    ShortBuffer indices = face.getMeshTriangleIndices();
    // Center and region poses, mesh vertices, and normals are updated each frame.
    Pose facePose = face.getCenterPose();
    FloatBuffer faceVertices = face.getMeshVertices();
    FloatBuffer faceNormals = face.getMeshNormals();
    // Render the face using these values with OpenGL.
  }
}
Kotlin
faces.forEach { face ->
  if (face.trackingState == TrackingState.TRACKING) {
    // UVs and indices can be cached as they do not change during the session.
    val uvs = face.meshTextureCoordinates
    val indices = face.meshTriangleIndices
    // Center and region poses, mesh vertices, and normals are updated each frame.
    val facePose = face.centerPose
    val faceVertices = face.meshVertices
    val faceNormals = face.meshNormals
    // Render the face using these values with OpenGL.
  }
}
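In addition to the center pose, AugmentedFace exposes per-frame poses for a few named regions, which is useful for placing content on specific parts of the face. A short Kotlin sketch (the helper name regionPoses is illustrative):

Kotlin
import com.google.ar.core.AugmentedFace
import com.google.ar.core.Pose

// Illustrative helper: query the per-frame poses of the named face regions.
fun regionPoses(face: AugmentedFace): Triple<Pose, Pose, Pose> {
  val noseTip = face.getRegionPose(AugmentedFace.RegionType.NOSE_TIP)
  val foreheadLeft = face.getRegionPose(AugmentedFace.RegionType.FOREHEAD_LEFT)
  val foreheadRight = face.getRegionPose(AugmentedFace.RegionType.FOREHEAD_RIGHT)
  return Triple(noseTip, foreheadLeft, foreheadRight)
}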