# Augmented Faces developer guide for AR Foundation

| **Note:** This API does not require ARCore Extensions for AR Foundation.

Learn how to use [Augmented Faces](/ar/develop/unity-arf/augmented-faces/introduction) to render assets on top of human faces in your own app.

Prerequisites
-------------

Make sure that you understand [fundamental AR concepts](/ar/develop/fundamentals) and how to [configure an ARCore session](/ar/develop/unity-arf/session-config) before proceeding.

Detect faces
------------

Faces are represented by [`ARFace`](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFace.html) objects that are created, updated, and removed by the [`ARFaceManager`](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFaceManager.html). Once per frame, the [`ARFaceManager`](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFaceManager.html) invokes a [`facesChanged`](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFaceManager.html#UnityEngine_XR_ARFoundation_ARFaceManager_facesChanged) event containing [three lists](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFacesChangedEventArgs.html#properties): the faces that have been added, updated, and removed since the last frame. When the [`ARFaceManager`](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFaceManager.html) detects a face in the scene, it instantiates a Prefab with an [`ARFace`](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFace.html) component attached to track the face. The Prefab can be left `null`.
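To react as faces appear, move, and disappear, a component in your scene can subscribe to `facesChanged`. The following is a minimal sketch (the `FaceChangeLogger` class name is illustrative; it assumes AR Foundation 4.2 and that the script is attached to the same GameObject as the `ARFaceManager`):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Minimal sketch: logs face lifecycle events reported by the ARFaceManager
// on the same GameObject.
[RequireComponent(typeof(ARFaceManager))]
public class FaceChangeLogger : MonoBehaviour
{
    ARFaceManager faceManager;

    void OnEnable()
    {
        faceManager = GetComponent<ARFaceManager>();
        faceManager.facesChanged += OnFacesChanged;
    }

    void OnDisable()
    {
        faceManager.facesChanged -= OnFacesChanged;
    }

    void OnFacesChanged(ARFacesChangedEventArgs args)
    {
        foreach (ARFace face in args.added)
            Debug.Log($"Face added: {face.trackableId}");

        foreach (ARFace face in args.updated)
            Debug.Log($"Face updated: {face.trackableId}, tracking state: {face.trackingState}");

        foreach (ARFace face in args.removed)
            Debug.Log($"Face removed: {face.trackableId}");
    }
}
```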
To set up the [`ARFaceManager`](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFaceManager.html), create a new game object and add the [`ARFaceManager`](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFaceManager.html) component to it.

**Face Prefab** is the Prefab instantiated at the face's center pose. **Maximum Face Count** is the maximum number of faces that can be tracked.
| **Note:** Currently, AR Foundation targeting Android devices can only track a single face at a time.

Access detected faces
---------------------

Access detected faces through the [`ARFace`](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFace.html) component, which is attached to the Face Prefab. [`ARFace`](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFace.html) provides vertices, indices, vertex normals, and texture coordinates.

Parts of a detected face
------------------------

The Augmented Faces API provides a center pose, three region poses, and a 3D face mesh.

### Center pose

The center pose, which marks the center of a user's head, is the origin point of the Prefab instantiated by the [`ARFaceManager`](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFaceManager.html). It is located inside the skull, behind the nose.

The axes of the center pose are as follows:

- The positive X-axis (X+) points toward the left ear
- The positive Y-axis (Y+) points upward out of the face
- The positive Z-axis (Z+) points into the center of the head

### Region poses

Located on the left forehead, right forehead, and tip of the nose, region poses mark important parts of a user's face. They follow the same axis orientation as the center pose.

To use the region poses, downcast the [`ARFaceManager`](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFaceManager.html)'s subsystem to [`ARCoreFaceSubsystem`](https://docs.unity3d.com/Packages/com.unity.xr.arcore@4.2/api/UnityEngine.XR.ARCore.ARCoreFaceSubsystem.html) and call [`subsystem.GetRegionPoses()`](https://docs.unity3d.com/Packages/com.unity.xr.arcore@4.2/api/UnityEngine.XR.ARCore.ARCoreFaceSubsystem.html#UnityEngine_XR_ARCore_ARCoreFaceSubsystem_GetRegionPoses_TrackableId_Allocator_NativeArray_UnityEngine_XR_ARCore_ARCoreFaceRegionData___) to obtain pose information for each region. For a complete example, see Unity's [usage sample](https://github.com/Unity-Technologies/arfoundation-samples/blob/main/Assets/Scripts/ARCoreFaceRegionManager.cs#:%7E:text=subsystem.GetRegionPoses) on GitHub.
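The following is a minimal sketch of that pattern (the `FaceRegionLogger` class name and serialized field are illustrative; it assumes the ARCore XR Plugin is installed and an `ARFaceManager` reference is assigned in the Inspector):

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARCore;
using UnityEngine.XR.ARFoundation;

// Minimal sketch: queries the forehead and nose-tip region poses for every
// tracked face each frame. Android-only; the downcast fails on other platforms.
public class FaceRegionLogger : MonoBehaviour
{
    [SerializeField] ARFaceManager faceManager;

    NativeArray<ARCoreFaceRegionData> regions;

    void Update()
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        // Downcast the generic face subsystem to the ARCore-specific one.
        if (faceManager == null || !(faceManager.subsystem is ARCoreFaceSubsystem subsystem))
            return;

        foreach (ARFace face in faceManager.trackables)
        {
            subsystem.GetRegionPoses(face.trackableId, Allocator.Persistent, ref regions);
            for (int i = 0; i < regions.Length; i++)
            {
                // Each entry identifies a region (left forehead, right forehead,
                // or nose tip) and carries its pose.
                Debug.Log($"{regions[i].region}: {regions[i].pose.position}");
            }
        }
#endif
    }

    void OnDestroy()
    {
        if (regions.IsCreated)
            regions.Dispose();
    }
}
```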
3D face mesh
------------

The face mesh consists of 468 points that make up a human face. It is also defined relative to the center pose.

To visualize the face mesh, attach an [`ARFaceMeshVisualizer`](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFaceMeshVisualizer.html) to the **Face Prefab**. The [`ARFaceMeshVisualizer`](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFaceMeshVisualizer.html) generates a [`Mesh`](https://docs.unity3d.com/ScriptReference/Mesh.html) that corresponds to the detected face and sets it as the mesh in the attached [`MeshFilter`](https://docs.unity3d.com/ScriptReference/MeshFilter.html) and [`MeshCollider`](https://docs.unity3d.com/ScriptReference/MeshCollider.html) components. Use a [`MeshRenderer`](https://docs.unity3d.com/ScriptReference/MeshRenderer.html) to set the [`Material`](https://docs.unity3d.com/Manual/Materials.html) used to render the face.

The **AR Default Face Prefab** renders a default material on detected face meshes.

Follow these steps to start using the AR Default Face:

1. Set up an [`ARFaceManager`](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFaceManager.html).
2. In the **Hierarchy** tab, use **+** > **XR** > **AR Default Face** to create a new face object. This object is temporary and can be deleted once you create the Face Prefab.
3. Access the **AR Default Face** in the Inspector.
4. Drag the newly created AR Default Face from the **Hierarchy** tab into the **Project Assets** window to [create a Prefab](https://docs.unity3d.com/Manual/CreatingPrefabs.html).
5. Set the newly created Prefab as the Face Prefab in the [`ARFaceManager`](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFaceManager.html)'s **Face Prefab** field.
6. In the **Hierarchy** tab, delete the face object, as it's no longer needed.

### Access individual vertices of the face mesh

Use [`face.vertices`](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFace.html#UnityEngine_XR_ARFoundation_ARFace_vertices) to access the positions of the face mesh's vertices. Use [`face.normals`](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/api/UnityEngine.XR.ARFoundation.ARFace.html#UnityEngine_XR_ARFoundation_ARFace_normals) to access the corresponding vertex normals.

### Visualize individual vertices of the face mesh

You can use [Blender](https://www.blender.org/download/) to view the index numbers that correspond to the face mesh's vertices:

1. Open Blender and import [`canonical_face_mesh.fbx`](https://github.com/google-ar/arcore-android-sdk/blob/master/assets/canonical_face_mesh.fbx) from GitHub.
2. Navigate to **Edit** > **Preferences** > **Interface**.
3. Under the **Display** menu, select **Developer Extras**.
4. Select the face by clicking it in the 3D viewport, and press Tab to enter Edit Mode.
5. Open the drop-down menu next to the **Overlays** viewport and select **Indices**.
6. Highlight the vertex whose index number you would like to determine. To highlight all vertices, use **Select** > **All**.
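Once you have identified an index in Blender, you can read that vertex at runtime through the `ARFace` component described above. The following is a minimal sketch (the `FaceVertexReader` class name and the serialized index field are illustrative; attach it to the Face Prefab next to `ARFace`):

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Minimal sketch: reads one vertex of the face mesh each frame, using an
// index found with the Blender workflow above.
[RequireComponent(typeof(ARFace))]
public class FaceVertexReader : MonoBehaviour
{
    [SerializeField] int vertexIndex = 0; // e.g. an index read off the canonical mesh in Blender

    ARFace face;

    void Awake()
    {
        face = GetComponent<ARFace>();
    }

    void Update()
    {
        NativeArray<Vector3> vertices = face.vertices;
        NativeArray<Vector3> normals = face.normals;

        if (!vertices.IsCreated || !normals.IsCreated || vertexIndex >= vertices.Length)
            return;

        // Vertex data is defined relative to the face's center pose; convert
        // to world space through the face's transform if needed.
        Vector3 localPosition = vertices[vertexIndex];
        Vector3 localNormal = normals[vertexIndex];
        Vector3 worldPosition = face.transform.TransformPoint(localPosition);

        Debug.Log($"Vertex {vertexIndex}: local {localPosition}, world {worldPosition}, normal {localNormal}");
    }
}
```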