• Sceneform SDK for Android was open sourced and archived (github.com/google-ar/sceneform-android-sdk) with version 1.16.0.
  • This site (developers.google.com/sceneform) serves as the documentation archive for the previous version, Sceneform SDK for Android 1.15.0.
  • Do not use version 1.17.0 of the Sceneform Maven artifacts.
  • The 1.17.1 Maven artifacts can be used. Other than the version, however, the 1.17.1 artifacts are identical to the 1.15.0 artifacts.
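Pulling Sceneform from Maven, the dependency would look something like this (a sketch; the `sceneform-ux` artifact coordinates are the standard ones for the UX package that provides ArFragment and AugmentedFaceNode, but verify the artifact you actually need for your setup):

```groovy
dependencies {
    // Per the note above, use 1.15.0 or 1.17.1 -- never 1.17.0.
    implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.17.1'
}
```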

Augmented Faces developer guide for Sceneform

Learn how to use the Augmented Faces feature in your own apps.

Build and run the sample app

To build and run the AugmentedFaces Java app:

  1. Open Android Studio version 3.1 or greater. Use a physical device (not the Android Emulator) to work with Augmented Faces, and connect the device to the development machine via USB. See the Android quickstart for detailed steps.

  2. Import the AugmentedFaces Java sample into your project.

  3. In Android Studio, click Run. Then choose your device as the deployment target and click OK to launch the sample app on your device.

  4. Click Approve to give the sample app access to the camera.

    The app should open the front camera and immediately track your face in the camera feed. It should place images of fox ears over both sides of your forehead, and place a fox nose over your own nose.

Using Augmented Faces in Sceneform

  1. Import assets into Sceneform

  2. Configure the ARCore session

  3. Get access to the detected face

  4. Render the effect on the detected face

Import assets into Sceneform

Make sure that assets you use for Augmented Faces are scaled and positioned correctly. For tips and practices, refer to Creating Assets for Augmented Faces.

To apply assets such as textures and 3D models to an augmented face mesh in Sceneform, first import the assets.

At runtime, use ModelRenderable.Builder to load the *.sfb models and Texture.Builder to load a texture for the face.

// Load the face regions renderable.
// To ensure that the asset doesn't cast or receive shadows in the scene,
// ensure that setShadowCaster and setShadowReceiver are both set to false.
ModelRenderable.builder()
    .setSource(this, R.raw.fox_face)
    .build()
    .thenAccept(
        modelRenderable -> {
          faceRegionsRenderable = modelRenderable;
          modelRenderable.setShadowCaster(false);
          modelRenderable.setShadowReceiver(false);
        });

// Load the face mesh texture.
Texture.builder()
    .setSource(this, R.drawable.fox_face_mesh_texture)
    .build()
    .thenAccept(texture -> faceMeshTexture = texture);

Face mesh orientation

Note the orientation of the face mesh for Sceneform.

Configure the ARCore session

Augmented Faces requires the ARCore session to be configured to use the front-facing (selfie) camera and to enable face mesh support. To do this in Sceneform, extend the ArFragment class and override the session configuration:

@Override
protected Set<Session.Feature> getSessionFeatures() {
  return EnumSet.of(Session.Feature.FRONT_CAMERA);
}

@Override
protected Config getSessionConfiguration(Session session) {
  Config config = new Config(session);
  config.setAugmentedFaceMode(Config.AugmentedFaceMode.MESH3D);
  return config;
}
Then reference this subclassed ArFragment class in your activity layout.

Get access to the detected face

The AugmentedFace class extends the Trackable class. In your app's activity, get access to the detected faces from within a scene update listener registered via addOnUpdateListener().

// Get list of detected faces.
Collection<AugmentedFace> faceList = session.getAllTrackables(AugmentedFace.class);
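For example, the faces can be queried once per frame from an update listener registered on the Sceneform scene (a minimal sketch; `arFragment` is assumed to be an instance of the subclassed ArFragment from the previous section):

```java
// Register an update listener that runs once per frame.
arFragment.getArSceneView().getScene().addOnUpdateListener(
    frameTime -> {
      Session session = arFragment.getArSceneView().getSession();
      if (session == null) {
        return;
      }
      // Get list of detected faces.
      Collection<AugmentedFace> faceList =
          session.getAllTrackables(AugmentedFace.class);
      for (AugmentedFace face : faceList) {
        if (face.getTrackingState() == TrackingState.TRACKING) {
          // Attach or update the effect for this face (next section).
        }
      }
    });
```

Checking getTrackingState() filters out faces that are paused or no longer tracked before attaching any effects.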

Render the effect for the face

Rendering the effect involves these steps: create an AugmentedFaceNode for the face and add it to the scene, overlay the 3D assets on the face regions, and overlay a texture on the face mesh.

for (AugmentedFace face : faceList) {
  // Create a face node and add it to the scene.
  AugmentedFaceNode faceNode = new AugmentedFaceNode(face);
  faceNode.setParent(scene);

  // Overlay the 3D assets on the face.
  faceNode.setFaceRegionsRenderable(faceRegionsRenderable);

  // Overlay a texture on the face.
  faceNode.setFaceMeshTexture(faceMeshTexture);
}