Augmented Faces developer guide for Android

Learn how to use the Augmented Faces feature in your own apps.

Using Augmented Faces in Android

  1. Configure the ARCore session
  2. Get access to the detected face

Configure the ARCore session

Initialize the session with FRONT_CAMERA. Note that selecting the front camera will cause a number of changes in ARCore behavior.

Java

session = new Session(this, EnumSet.of(Session.Feature.FRONT_CAMERA));

Kotlin

session = Session(this, EnumSet.of(Session.Feature.FRONT_CAMERA))

Enable AugmentedFaceMode:

Java

Config config = new Config(session);
config.setAugmentedFaceMode(AugmentedFaceMode.MESH3D);
session.configure(config);

Kotlin

val config = Config(session)
config.augmentedFaceMode = AugmentedFaceMode.MESH3D
session.configure(config)
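
Putting these two steps together, a minimal setup sketch might look like the following. It assumes that ARCore availability and camera permission checks are handled elsewhere in your app; the helper name createFaceTrackingSession and the blanket handling of UnavailableException are illustrative, not part of the ARCore API.

Kotlin

import android.content.Context
import com.google.ar.core.Config
import com.google.ar.core.Session
import com.google.ar.core.exceptions.UnavailableException
import java.util.EnumSet

// Illustrative helper: creates a front-camera session with the 3D face mesh
// enabled, or returns null if ARCore is unavailable on this device.
fun createFaceTrackingSession(context: Context): Session? =
    try {
      val session = Session(context, EnumSet.of(Session.Feature.FRONT_CAMERA))
      val config = Config(session).apply {
        augmentedFaceMode = Config.AugmentedFaceMode.MESH3D
      }
      session.configure(config)
      session
    } catch (e: UnavailableException) {
      // ARCore is not installed, is too old, or the device is unsupported.
      null
    }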

Face mesh orientation

Note the orientation of the face mesh: mesh vertices are defined relative to the center pose of the face.

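To inspect that orientation at runtime, a sketch like the one below can log the center pose and the predefined region poses (NOSE_TIP, FOREHEAD_LEFT, FOREHEAD_RIGHT) of a tracked face. The helper name logFacePoses is illustrative only.

Kotlin

import android.util.Log
import com.google.ar.core.AugmentedFace

// Illustrative helper: logs the center pose and each named region pose
// (NOSE_TIP, FOREHEAD_LEFT, FOREHEAD_RIGHT) of a tracked face.
fun logFacePoses(face: AugmentedFace) {
  Log.d("FaceMesh", "center pose: ${face.centerPose}")
  for (region in AugmentedFace.RegionType.values()) {
    Log.d("FaceMesh", "$region pose: ${face.getRegionPose(region)}")
  }
}
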
Access the detected face

For each frame, get the detected faces as Trackables. A Trackable is something that ARCore can track and that Anchors can be attached to (see the anchoring sketch at the end of this section).

Java

// ARCore's face detection works best on upright faces, relative to gravity.
Collection<AugmentedFace> faces = session.getAllTrackables(AugmentedFace.class);

Kotlin

// ARCore's face detection works best on upright faces, relative to gravity.
val faces = session.getAllTrackables(AugmentedFace::class.java)

Get the TrackingState for each Trackable. If it is TRACKING, then its pose is currently known by ARCore.

Java

for (AugmentedFace face : faces) {
  if (face.getTrackingState() == TrackingState.TRACKING) {
    // UVs and indices can be cached as they do not change during the session.
    FloatBuffer uvs = face.getMeshTextureCoordinates();
    ShortBuffer indices = face.getMeshTriangleIndices();
    // Center and region poses, mesh vertices, and normals are updated each frame.
    Pose facePose = face.getCenterPose();
    FloatBuffer faceVertices = face.getMeshVertices();
    FloatBuffer faceNormals = face.getMeshNormals();
    // Render the face using these values with OpenGL
  }
}

Kotlin

for (face in faces) {
  if (face.trackingState == TrackingState.TRACKING) {
    // UVs and indices can be cached as they do not change during the session.
    val uvs = face.meshTextureCoordinates
    val indices = face.meshTriangleIndices
    // Center and region poses, mesh vertices, and normals are updated each frame.
    val facePose = face.centerPose
    val faceVertices = face.meshVertices
    val faceNormals = face.meshNormals
    // Render the face using these values with OpenGL
  }
}
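
Because an AugmentedFace is a Trackable, you can also attach Anchors to it, for example to keep a renderable fixed to the nose tip. The sketch below is illustrative only (the helper name anchorToNoseTip is not part of the API) and assumes the face is in the TRACKING state.

Kotlin

import com.google.ar.core.Anchor
import com.google.ar.core.AugmentedFace

// Illustrative helper: attaches an Anchor at the nose tip of a tracked face.
// createAnchor() comes from the Trackable interface; ARCore updates the
// anchor's pose as face tracking continues.
fun anchorToNoseTip(face: AugmentedFace): Anchor =
    face.createAnchor(face.getRegionPose(AugmentedFace.RegionType.NOSE_TIP))

Call detach() on anchors that are no longer needed.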