When Sceneform was discontinued and archived back in 2020, creating ARCore apps on Android became more difficult. Inspired by the original Sceneform APIs and based on the ARCore samples, I created an AugmentedFaceFragment and an AugmentedFaceListener interface to make it easy to build ARCore Augmented Faces features on Android.

I will be releasing three articles, each building a different Augmented Faces feature using the AugmentedFaceFragment and AugmentedFaceListener interface.

What you will build

Part 1 of the series covers the basics of AugmentedFaceFragment and AugmentedFaceListener, as well as an overview of the helper classes. By the end of this article, you will have built a simple demo with just a few lines of code.

Project setup

Download the Starter Project

Clone the repository:

git clone https://github.com/droid-girl/arfaces_labs.git

What’s our starting point?

Our starting point is a modified version of the ARCore SDK sample for Augmented Faces. The code has been restructured so that we can easily add different textures and 3D objects to a detected face.

Project structure

The additional files in the repository are the helper classes covered in this article: AugmentedFaceFragment, AugmentedFaceListener and AugmentedFaceNode.

MainActivity setup

<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <androidx.fragment.app.FragmentContainerView
        android:id="@+id/face_view"
        android:name="com.ar.arfaces.arface.AugmentedFaceFragment"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:layout_gravity="top" />

</LinearLayout>

class MainActivity : AppCompatActivity(), AugmentedFaceListener {

    private lateinit var binding: ActivityMainBinding

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        binding = ActivityMainBinding.inflate(layoutInflater)
        // Register this activity as the listener for face events coming from AugmentedFaceFragment
        binding.faceView.getFragment<AugmentedFaceFragment>().setAugmentedFaceListener(this)
        setContentView(binding.root)
    }

    // Called when ARCore detects a new face
    override fun onFaceAdded(face: AugmentedFaceNode) {}

    // Called on every frame update for each tracked face
    override fun onFaceUpdate(face: AugmentedFaceNode) {}

}

In activity_main.xml we add AugmentedFaceFragment as the main view. To receive events from this fragment, MainActivity implements AugmentedFaceListener and sets itself as the listener on the fragment.

The onFaceAdded method is called when ARCore detects a new face, and onFaceUpdate is called on each frame update.
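For reference, the listener contract boils down to these two callbacks. The snippet below is only a minimal sketch inferred from the methods MainActivity overrides; the actual interface ships with the starter project:

interface AugmentedFaceListener {
    // Invoked once when ARCore starts tracking a new face
    fun onFaceAdded(face: AugmentedFaceNode)

    // Invoked on every frame for each face that is still being tracked
    fun onFaceUpdate(face: AugmentedFaceNode)
}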

Let’s add a face texture as our next step.

Add face texture

We will use the ARCore sample assets for our first face mask. You can find the assets/models folder in your project. Let's add freckles.png as the face texture:

In MainActivity.kt modify onFaceAdded method as follows:

override fun onFaceAdded(face: AugmentedFaceNode) {
    // Apply the freckles texture from assets/models to the whole face mesh
    face.setFaceMeshTexture("models/freckles.png")
}

Add 3D models

Introduction to AugmentedFaceNode

The AugmentedFace class uses the face mesh and center pose to identify face regions. These regions are the left forehead, the right forehead, and the nose tip.
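For comparison, this is roughly how those regions are queried directly from ARCore's own AugmentedFace API. This is only a sketch of the underlying calls (getRegionPose and RegionType); the helper classes in this project manage the session and these calls for you:

// Sketch: querying face regions straight from ARCore's AugmentedFace API.
// 'session' is the active com.google.ar.core.Session.
val faces = session.getAllTrackables(AugmentedFace::class.java)
for (face in faces) {
    val centerPose = face.centerPose // pose at the center of the face mesh
    val noseTip = face.getRegionPose(AugmentedFace.RegionType.NOSE_TIP)
    val foreheadLeft = face.getRegionPose(AugmentedFace.RegionType.FOREHEAD_LEFT)
    val foreheadRight = face.getRegionPose(AugmentedFace.RegionType.FOREHEAD_RIGHT)
}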

This project includes the AugmentedFaceNode class. Inspired by Sceneform, it is a node that renders visual effects on a face detected by ARCore. AugmentedFaceNode defines the same face regions as AugmentedFace in a companion object:

companion object {
    enum class FaceLandmark {
        FOREHEAD_RIGHT,
        FOREHEAD_LEFT,
        NOSE_TIP
    }
}

Later in this tutorial, we will extend the FaceLandmark enum class and add our own face regions.
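To give an idea of what that will look like, "extending" the enum simply means adding new entries for the regions we want to track. The extra names below are hypothetical placeholders, not part of the starter project:

companion object {
    enum class FaceLandmark {
        FOREHEAD_RIGHT,
        FOREHEAD_LEFT,
        NOSE_TIP,
        // Hypothetical custom regions to be added later
        EYE_LEFT,
        EYE_RIGHT
    }
}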

AugmentedFaceNode also includes a faceLandmarks HashMap that connects each FaceLandmark with a 3D model.
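As a rough sketch of how that map might be used, assuming each FaceLandmark entry can be mapped to a renderable model (the loadModel helper and the model path below are hypothetical and stand in for the project's own model loading):

override fun onFaceAdded(face: AugmentedFaceNode) {
    face.setFaceMeshTexture("models/freckles.png")
    // Associate the nose tip landmark with a 3D model via the faceLandmarks map
    // (FaceLandmark is the enum from AugmentedFaceNode's companion object; loadModel is hypothetical)
    face.faceLandmarks[FaceLandmark.NOSE_TIP] = loadModel("models/nose.obj")
}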

The source code for this project can be found here.