The sample app provides a class that converts your Augmented Face into an `SCNGeometry` object. You can attach this geometry to a SceneKit node and place that node at the Augmented Face's center transform.
let faceNode = SCNNode()

// Gets the most recent frame's face, if one has been detected
guard let face = faceSession.currentFrame?.face else { return }

// The converter is instantiated once, not with every frame
let faceGeometryConverter = FaceMeshGeometryConverter()

// Converts the Augmented Face to an SCNGeometry object
let faceMesh = faceGeometryConverter.geometryFromFace(face)

// Assigns the geometry to the node and sets its pose
faceNode.geometry = faceMesh
faceNode.simdTransform = face.centerTransform
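The mesh can then be textured by loading a 2D image as a `UIImage` and setting it as the diffuse contents of a material on the face mesh geometry. A minimal sketch continuing from the snippet above, assuming a texture named `face.png` in the app bundle (the asset name is illustrative) and that the converter returns an optional `SCNGeometry`:

// Loads the 2D face texture and attaches it to the mesh as its diffuse material
let faceTextureMaterial = SCNMaterial()
faceTextureMaterial.diffuse.contents = UIImage(named: "face.png")
faceMesh?.firstMaterial = faceTextureMaterial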
The `GARAugmentedFace` returned by the session provides three region transforms you can use to attach content to a face: the nose, the left side of the forehead, and the right side of the forehead, all in world space. Here, the nose transform is used to attach a sphere to the nose.

// Create a node and add it to the scene
let node = SCNNode(geometry: SCNSphere(radius: 0.02))
sceneView.scene?.rootNode.addChildNode(node)

// Every frame, update the node's position from the nose region transform
if let face = faceSession.currentFrame?.face {
  node.simdWorldTransform = face.transform(for: .nose)
}
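In practice, both the face mesh and any attached nodes need to be refreshed as new frames arrive. One way to do this is from SceneKit's render loop; the following is a sketch, assuming `faceSession`, `faceGeometryConverter`, `faceNode`, and `noseNode` are properties of a hypothetical `FaceViewController` (the sample app may instead drive updates from the session's delegate callback):

import SceneKit
// Also requires the ARCore Augmented Faces SDK (GARAugmentedFaceSession and related types)

extension FaceViewController: SCNSceneRendererDelegate {
  func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    // Skip the update until a face has been detected in the current frame
    guard let face = faceSession.currentFrame?.face else { return }

    // Rebuild the mesh geometry and reposition the face node at the face center
    faceNode.geometry = faceGeometryConverter.geometryFromFace(face)
    faceNode.simdTransform = face.centerTransform

    // Keep the attached sphere glued to the nose region
    noseNode.simdWorldTransform = face.transform(for: .nose)
  }
}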
[null,null,["最后更新时间 (UTC):2025-07-26。"],[[["\u003cp\u003eThis guide provides instructions for building an iOS app that augments faces using ARCore's Augmented Faces API.\u003c/p\u003e\n"],["\u003cp\u003eThe API detects a face, provides a face mesh, and offers anchor points for attaching 3D objects or applying textures.\u003c/p\u003e\n"],["\u003cp\u003eDevelopers can import their own 3D models and textures to create custom augmented face effects.\u003c/p\u003e\n"],["\u003cp\u003eThe app requires an ARKit-compatible device running iOS 12.0 or later and utilizes the SceneKit framework for rendering.\u003c/p\u003e\n"],["\u003cp\u003eDetailed steps for setup, implementation, and asset integration are provided, including code snippets and best practices.\u003c/p\u003e\n"]]],["To use Augmented Faces in apps, initialize a `GARAugmentedFaceSession` with the camera's field of view and set a delegate. Capture camera images using `AVCaptureSession` and pass them to the session using `faceSession.update`, which triggers a delegate callback with a `GARAugmentedFaceFrame`. This frame provides the detected face, captured image, and timestamp. Use the face object to attach 2D textures or 3D objects. Import `.scn` assets into Xcode, and use `face.transform` to attach 3D objects at specific face regions like the nose or forehead.\n"],null,["# Augmented Faces developer guide for iOS\n\nLearn how to use Augmented Faces in your own apps. \n\nPrerequisites\n-------------\n\n- [Xcode](https://developer.apple.com/xcode/) version 13.0 or later\n- [Cocoapods](https://cocoapods.org/) 1.4.0 or later if using Cocoapods\n- An ARKit-compatible Apple device running iOS 12.0 or later (deployment target of iOS 12.0 or later required)\n\n| **Note:** Beginning with ARCore 1.12, all ARKit-compatible devices are supported.\n\nBuild and run the sample app\n----------------------------\n\nSee the [Quickstart](/ar/develop/ios/augmented-faces/quickstart) for detailed steps.\n\n1. Clone or download the [ARCore SDK for iOS](https://github.com/google-ar/arcore-ios-sdk/releases) from GitHub to obtain the sample app code.\n2. Open a Terminal window and run `pod install` from the folder where the Xcode project exists.\n3. Open the sample app in Xcode version 10.3 or greater and connect the device to your development machine via USB. To avoid build errors, make sure you are building from the `.xcworkspace` file and not the `.xcodeproj` file.\n4. Press Cmd+R or click **Run**. Use a physical device, not the simulator, to work with Augmented Faces.\n5. Tap \"OK\" to give the camera access to the sample app. The app should open the front camera and immediately track your face in the camera feed. It should place images of fox ears over both sides of your forehead, and place a fox nose over your own nose.\n\nOverview of implementing Augmented Faces in your app\n----------------------------------------------------\n\n### Import `*.scn` files into Xcode\n\nTo add your own assets such as textures and 3D models to a detected face in your app, drag the `*.scn` asset into Xcode.\n\n### Initialize an Augmented Faces session\n\nTo use the Augmented Faces API from your app, initialize an Augmented Faces session. This session is responsible for taking in camera images at 60 fps, and will asynchronously return face updates to a delegate method. When initializing, simply pass the capture device's field of view, and make sure you set the delegate. \n\n // Session takes a float for field of view\n let faceSession = try? 
GARAugmentedFaceSession(fieldOfView: cameraFieldOfView)\n faceSession?.delegate = self\n\n### Pass camera images to the session\n\nNow that your session is initialized and configured properly, your app can start sending camera images to the session. The sample app gets camera images by creating an `AVCaptureSession` with video frames from the front camera.\n\nThe following code sample shows an implementation of `AVFoundation`'s capture output delegate method, which passes the image, a timestamp, and a recognition rotation to your face session. \n\n func captureOutput(_ output: AVCaptureOutput,\n didOutput sampleBuffer: CMSampleBuffer,\n from connection: AVCaptureConnection) {\n\n faceSession.update(with: imageBuffer,\n timestamp: frameTime,\n recognitionRotation: rotationDegrees)\n }\n\nAfter the image is processed, the Augmented Faces API sends a delegate callback that returns a `GARAugmentedFaceFrame`. It contains an Augmented Face object which helps you attach effects to the face. It also contains the image buffer and the timestamp that you passed into the update method. This is useful for synchronizing the face effects to the images. This object also gives you a display transform and a projection matrix to make sure you can set up the 3D world and 2D views in a way that makes it easy to render your face effects that appear attached to the detected face. \n\n var face: GARAugmentedFace? { get }\n var capturedImage: CVPixelBuffer { get }\n var timestamp: TimeInterval { get }\n\n### Face mesh orientation\n\nNote the orientation of the face mesh for iOS:\n\n### Apply a 2D texture to face\n\nThe [Sample app](https://github.com/google-ar/arcore-ios-sdk) provides a class to convert your Augmented Face to an `SCNGeometry` object. You can use this geometry to easily attach to a SceneKit node, which you will place at the Augmented Face's Center transform. \n\n let faceNode = SCNNode()\n\n // Gets the most recent frame's face\n let face = faceSession.currentFrame?.face\n\n // This is instantiated once, not with every frame\n let faceGeometryConverter = FaceMeshGeometryConverter()\n\n // Converts Augmented Face to SCNGeometry object\n let faceMesh = faceGeometryConverter.geometryFromFace(face)\n\n // Assigns geometry to node and sets the pose\n faceNode.geometry = faceMesh\n faceNode.simdTransform = face.centerTransform\n\nThe 2D face texture is loaded as a `UIImage` and set to a material that is attached to the geometry of the face mesh. \n\n faceTextureMaterial = SCNMaterial()\n faceTextureMaterial.diffuse.contents = UIImage(named:@\"face.png\")\n\n faceMesh?.firstMaterial = faceTextureMaterial\n\n### Attach 3D objects to the face\n\nThe `GARAugmentedFace` received from the delegate callback, provides 3 different regions, or transforms, you can use for attaching content to a face. These transforms allow you to get the nose, left of the forehead, and right of the forehead in world space. Here, a nose transform is used to attach a sphere to the nose. \n\n // Create node and add to scene\n let node = SCNNode(geometry: SCNSphere(radius: .02))\n sceneView.rootNode.addChild(node)\n\n // Every frame updates the node's position\n node.simdWorldTransform = session.currentFrame.face.transform(for: .nose)\n\nImport your own assets into Xcode\n---------------------------------\n\nTo add [assets](/ar/develop/ios/augmented-faces/create-assets) such as textures and 3D models to a detected face in your app, first import the assets into Xcode.\n\n1. Export a `*.dae` (3D model) file.\n2. 
Drag the `*.dae` file into the Xcode project.\n3. Convert the file into `.scn` format in Xcode by going to **Editor \\\u003e Convert to SceneKit scene file format (.scn)**.\n\n| **Note:** Depending on the 3D modelling software used to export the `*.dae` file, the exported model may incur an additional rotation that will be need to be accounted for at design time in the 3D software, or at runtime in the app."]]