Augmented Faces developer guide for Unreal


  • This guide assumes you have already installed and configured Unreal Engine 4.21 with the GoogleARCore plugin 1.7+. If not, see the Quickstart for Unreal for installation and setup steps.

  • Make sure you have assets for overlaying on a face.

Build and run the sample

Download the arcore-unreal-sdk to get the Augmented Faces sample project.

For instructions on building and running the sample project, see the Quickstart for Unreal.

The AugmentedFaces sample app on GitHub overlays the facial features of a fox onto the user's face using both a 3D model asset and a texture.

Overview of implementing Augmented Faces in your app

Implementing Augmented Faces in your app involves these steps:

  1. Configure your app to support Augmented Faces

  2. Get access to the augmented face mesh

  3. Attach a model to a region on the face

  4. Attach a texture to the face mesh

Configuring your app to support Augmented Faces

Configure your app by adding Augmented Faces to your session and setting specific options.

After you configure your app to support Augmented Faces, ARCore automatically creates an augmented face mesh from the first face it detects.

  1. Create a new session configuration for the ARCore app: right-click in the Content Browser, then choose Miscellaneous > Data Asset.
  2. Choose GoogleARCoreSessionConfig as the Data Asset class.
  3. Click Select. This creates a new session configuration asset.
  4. Set Camera Facing to Front.
  5. Set Augmented Face Mode to Pose and Mesh.
  6. Save the session configuration asset.
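The same configuration can also be sketched in C++ instead of a Data Asset. This is an illustrative sketch only: the class and enum names (UGoogleARCoreSessionConfig, EGoogleARCoreCameraFacing, EGoogleARCoreAugmentedFaceMode) are assumed from the GoogleARCore plugin 1.7 headers, so verify them against your installed plugin version:

```cpp
// Sketch: create a session config equivalent to the Data Asset steps above.
// Assumes the GoogleARCore plugin's session config type and properties;
// check the plugin headers for the exact names in your version.
#include "GoogleARCoreSessionConfig.h"

UGoogleARCoreSessionConfig* Config = NewObject<UGoogleARCoreSessionConfig>();

// Augmented Faces requires the front-facing (selfie) camera.
Config->CameraFacing = EGoogleARCoreCameraFacing::Front;

// Request both the face region poses and the face mesh.
Config->AugmentedFaceMode = EGoogleARCoreAugmentedFaceMode::PoseAndMesh;
```

You would then pass this configuration when starting the AR session, just as the saved Data Asset would be.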

Face mesh orientation

Note the orientation of the face mesh for Unreal.

Attach a model to a region on the face

  1. Before you do this, make sure you've imported a face attachment model and created an animation blueprint to drive it.
  2. Create an Actor blueprint.
  3. Add a component of type SkeletalMesh.
  4. In the Details panel of the new SkeletalMesh component:
    1. Set Animation Mode to use an animation blueprint, and select the animation blueprint you created.
    2. For Skeletal Mesh, select the skeletal mesh of the face attachment model you imported.
  5. Use the Get All Augmented Faces blueprint function to get a list of all detected augmented faces.
  6. Select the first augmented face, and use the Get Local to World Transform Of Region function to get the region's pose.
  7. Set the animation blueprint variable using the region pose (see the sample project for an example).
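Steps 5 through 7 above can also be sketched in C++. This is an illustrative sketch only: the function and enum names are inferred from the Blueprint nodes named above (Get All Augmented Faces, Get Local to World Transform Of Region) and from the GoogleARCore plugin 1.7 headers, and may differ slightly in your plugin version:

```cpp
// Sketch: fetch the detected faces and read a region pose,
// mirroring the Blueprint steps above. Names are assumptions;
// verify them against the GoogleARCore plugin headers.
#include "GoogleARCoreFunctionLibrary.h"
#include "GoogleARCoreAugmentedFace.h"

TArray<UGoogleARCoreAugmentedFace*> Faces;
UGoogleARCoreFrameFunctionLibrary::GetAllAugmentedFaces(Faces);

if (Faces.Num() > 0)
{
    UGoogleARCoreAugmentedFace* Face = Faces[0];

    // Get the world-space transform of one face region
    // (here the nose tip, as an example region).
    FTransform RegionTransform = Face->GetLocalToWorldTransformOfRegion(
        EGoogleARCoreAugmentedFaceRegion::NoseTip);

    // Feed RegionTransform to the variable on your animation
    // blueprint instance that positions the attached model.
}
```

The per-frame update would typically run in an Actor's Tick so the attached model tracks the face as it moves.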