Imagine being on a blind date: You can’t really tell whether this person is interested in hearing about your day or itching to get the cheque. So you pull out your phone and point your camera at their face.

Soon, we’ll be able to use technology to gauge what others are feeling, thanks in part to Tadas Baltrusaitis, a postdoctoral researcher at the Carnegie Mellon University Language Technologies Institute who specializes in facial analysis.

Along with researchers who focus on artificial intelligence and communication, Baltrusaitis has developed open-source facial-recognition software called OpenFace.

He says it may one day help doctors better diagnose patients, educators better understand students, and all of us better understand our dates. He spoke to The Globe and Mail over the phone from New York.

How would facial-recognition software help humans better relate to one another? What’s the advantage for us?

One thing we’re interested in is facial-expression analysis in behavioural settings. Some people who have sensory-processing disorders – such as autism or attention-deficit hyperactivity disorder – might not be able to read others’ expressions. They might understand what it means for someone to be angry, happy or sad, but they might not necessarily recognize the facial cues that signal those emotions.

With technology like OpenFace – say, smart glasses with a camera attached – that information could be fed to the wearer, who could then understand that the other person is perhaps angry and react accordingly.
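For a sense of how that could be wired together: OpenFace writes its analysis to a spreadsheet-style CSV file of facial “action unit” intensities, and a small script could turn those numbers into a plain-language cue. The sketch below is illustrative only – the column names (AU06_r, AU12_r, AU04_r) follow the toolkit’s documented output format but can vary by version, and the file name, thresholds and labels are hypothetical, not part of Baltrusaitis’s software.

    # Illustrative sketch only: turn OpenFace action-unit intensities into a
    # plain-language cue. Column names follow OpenFace's documented CSV output
    # (AU06_r, AU12_r, AU04_r); the file name, thresholds and labels are hypothetical.
    import csv

    def cue(row):
        au06 = float(row.get("AU06_r", 0) or 0)  # cheek raiser (often part of a genuine smile)
        au12 = float(row.get("AU12_r", 0) or 0)  # lip-corner puller (smiling)
        au04 = float(row.get("AU04_r", 0) or 0)  # brow lowerer (frowning)
        if au12 > 1.5 and au06 > 1.0:
            return "they seem happy"
        if au04 > 1.5:
            return "they may be annoyed or confused"
        return "hard to tell"

    with open("openface_output.csv", newline="") as f:
        for row in csv.DictReader(f):
            row = {k.strip(): v for k, v in row.items()}  # headers can carry stray spaces
            print(row.get("frame", "?"), cue(row))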

Researcher Tadas Baltrusaitis says facial-recognition software could be used in all kinds of situations, from helping doctors better diagnose patients to giving educators a better understanding of their students. (iStock)

We’re also looking at applications within medical scenarios. For example, it could help clinicians diagnose someone with psychosis by looking at their facial movements to determine whether they’re suffering from delusions or hallucinations. Facial-recognition software could provide doctors with objective measurements.

At the moment, most measurements are subjective and dependent on the clinician’s experience.

Education is another big area where this could be applied. If you’re learning through an online course, a professor could be notified as to whether you’re bored or confused.

That would then help them adapt the lesson to make it more challenging or discover where the lecture needs to be revised.

What would you say are the biggest drawbacks? What about privacy concerns?

That’s always the ethical question. It would depend on the user giving permission, sort of like the privacy settings on Facebook.

Monitoring everyone obviously creates a lot of ethical concerns. Today, people are much more open to sharing emotions and the like via apps and social media. But it’s hard to say what the public’s reaction would be if it were rolled out on a bigger scale.

Are you working with any companies or on larger projects to roll it out in a big way?

We’re using it primarily in smaller-scale projects, small studies and the like.

Since it’s open source, we don’t always know who’s using it for what purpose. We’ve mostly seen interest from those in the education and medical sectors.

We haven’t had any large-scale studies yet. There are people designing mobile apps, too, who just want to track faces. There are also some video-game designers who want to make avatars that can reflect the player’s expression.

Baltrusaitis’ software, OpenFace, is currently capable of tracking only Western facial expressions, but other researchers are working with other cultural groups. (iStock)

That brings up another question: How do you account for cultural differences and reactions?

That’s a very good point. There are certain cultural universals – basic emotions that are recognized, to an extent, around the world. Everyone smiles when they’re happy. But a lot of expressions are culturally specific, and you’d need to build systems that can track those different expressions. Currently, we’re only able to track Western facial expressions, but there are groups working with other cultures as well.

I know there are reports about cameras that don’t pick up darker skin colours. Would a facial expression from a darker-skinned person not be read correctly?

I remember one of the laptop companies had that issue. The algorithms these days are much more robust and don’t work the same way. They learn from training data, and we now work with much larger data collections.

We’re having fewer and fewer of those issues now, because we have more diversity within our collected data.

This interview has been edited and condensed.