
Emotion AI in Education

Emotion AI is a new layer of personalized learning

Take a look at learning and you’ll find innovation everywhere. Classrooms are flipped and smart. Learning is accelerated, personalized, and adaptive. Innovative tools for remediation, student progress tracking, and parent communication are widely available and used by school systems and education providers across the globe.

Yet a huge gap in the education technology landscape remains. Ask any teacher the role that human emotion plays in learning and they will fill you in on the importance of grit, confidence, communication, and social skills.

Biman Liyanage, Chatterize’s artificial intelligence advisor, is a researcher, inventor, and serial entrepreneur specializing in Emotion AI (affective computing) applications. In this interview, he explains what Emotion AI is and discusses its potential impact on learning.

1. What is Emotion AI?

Artificial intelligence (AI) has been around for a long time. Emotion AI (or affective computing) started a while back, with the purpose of understanding emotions in context. For example, if I say “I’m hungry,” the natural language processing (NLP) engine can understand this as a request. If it also understands the emotions linked with this request, it can understand the urgency. Does “I’m hungry” mean “I want to eat now” or “I’m thinking about what I’m going to eat in an hour”? The context gives a different meaning to what is being said.
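As a rough illustration of how an emotion signal can disambiguate the same words, here is a minimal sketch (not Chatterize’s actual system). It assumes a hypothetical NLP model has already labeled the utterance with an intent and an emotion model has produced an arousal score, and it simply maps that pair to an urgency.

```python
# Minimal sketch: combine an intent label with an emotion-derived arousal score.
# Both inputs are assumed outputs of hypothetical upstream models.

def interpret_request(intent: str, arousal: float) -> str:
    """Map an intent plus an arousal score (0.0-1.0) to an urgency level."""
    if intent != "express_hunger":
        return "unhandled_intent"
    # High vocal arousal suggests "I want to eat now";
    # low arousal suggests the speaker is only planning ahead.
    return "eat_now" if arousal >= 0.6 else "plan_meal_later"

print(interpret_request("express_hunger", arousal=0.8))  # eat_now
print(interpret_request("express_hunger", arousal=0.2))  # plan_meal_later
```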

Emotion AI has two parts. The first part is emotion recognition, and the other is emotional response. Emotion recognition is the ability of machines to detect emotions from a variety of inputs. We use facial and voice emotion recognition, brainwaves, and other signals to figure out what people are feeling. The response part of affective computing is applying the emotions. For example, in a voice chatbot, we can make the chatbot sound more human, like what you see in Google Duplex’s chatbots.

2. What are some of the challenges of this technology?

The first challenge is that you need experts who actually understand emotions to come up with the basic framework for tagging the dataset. Emotion experts, who have the ability to detect micro-feature changes in emotions, look at video clips and tag the data. This way we can get a very high-quality dataset. It’s an expensive process. Once that is done, novel techniques of unsupervised learning can also be used, but you still need experts to identify error rates. The average human detects emotions correctly less than 30% of the time. Even on standard benchmark datasets like IEMOCAP, human performance is around 64%. But our algorithms reach up to 88% on speech emotion detection, so machines are becoming better at detecting voice emotions than humans.

The second challenge is how you come up with the features and relate them to emotions. For example, you need to understand which vocal features are related to which emotion. It’s very challenging. You need very domain-specific expertise to look at speech signals and understand how those features relate to changes in emotion.
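For readers wondering what “vocal features” look like in practice, here is a small sketch using the open-source librosa library to extract pitch, energy, and spectral-shape statistics of the kind researchers commonly relate to emotion. The specific features are illustrative assumptions, not the ones Chatterize uses.

```python
# Illustrative vocal-feature extraction with librosa (assumption: a mono WAV file).
import numpy as np
import librosa

def vocal_features(wav_path: str) -> dict:
    y, sr = librosa.load(wav_path, sr=16000)
    f0 = librosa.yin(y, fmin=65, fmax=400, sr=sr)       # pitch contour (Hz)
    rms = librosa.feature.rms(y=y)[0]                    # loudness / energy per frame
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # spectral shape
    return {
        "pitch_mean": float(np.mean(f0)),    # raised pitch often tracks arousal
        "pitch_std": float(np.std(f0)),      # pitch variability
        "energy_mean": float(rms.mean()),    # louder speech, higher arousal
        "mfcc_means": mfcc.mean(axis=1).tolist(),
    }
```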

Third, emotions change all the time. An emotion lasts for a very short period of time, and our human brain cannot capture it very well. It’s also a multimodal problem: a person expresses emotions with their face, eyes, breath, words, and voice. If I really want to understand what someone is feeling right now, I would need to be an expert in EEG, voice, face, and breath just to have a slight understanding of what that person is feeling.
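One common way to handle the multimodal problem is late fusion: each modality model outputs a probability for every emotion, and the scores are combined. The sketch below assumes hypothetical face and voice models and illustrative weights; it is not a description of any particular production system.

```python
# Minimal late-fusion sketch: weighted average of per-modality emotion probabilities.
import numpy as np

EMOTIONS = ["neutral", "happy", "sad", "angry", "frustrated"]

def fuse(modality_probs: dict, weights: dict) -> str:
    """Combine per-modality probability vectors and return the top emotion."""
    total = np.zeros(len(EMOTIONS))
    for name, probs in modality_probs.items():
        total += weights.get(name, 0.0) * np.asarray(probs)
    return EMOTIONS[int(total.argmax())]

prediction = fuse(
    {"face": [0.1, 0.1, 0.1, 0.2, 0.5], "voice": [0.2, 0.0, 0.1, 0.2, 0.5]},
    weights={"face": 0.6, "voice": 0.4},
)
print(prediction)  # frustrated
```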

3. How can we use Emotion AI in education? 

In education, there are a couple of applications. The first is understanding the focus level of the student in an online or offline classroom. Using Emotion AI, you can actually see in which part of the content the student is getting lost or confused. For example, if the student is not understanding that two hydrogen molecules and one oxygen molecule create water with the current medium, the teacher can try a different one. They can try video, VR, or other media until they see that the student has actually understood the content. This enables personalized learning. And we measure all of this by simply looking at the face.
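As a toy illustration of the “where do students get lost?” idea, the sketch below aggregates per-frame confusion scores (assumed to come from a hypothetical facial-expression model) by content segment and flags the segments whose average confusion crosses a threshold.

```python
# Toy sketch: flag content segments where a (hypothetical) confusion score runs high.
from collections import defaultdict
from statistics import mean

def confusing_segments(frames, threshold=0.5):
    """frames: iterable of (segment_id, confusion_score) pairs."""
    by_segment = defaultdict(list)
    for segment_id, score in frames:
        by_segment[segment_id].append(score)
    return [seg for seg, scores in by_segment.items() if mean(scores) > threshold]

frames = [("intro", 0.2), ("intro", 0.1),
          ("water_molecules", 0.7), ("water_molecules", 0.8),
          ("recap", 0.3)]
print(confusing_segments(frames))  # ['water_molecules']
```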

The second application is that through voice, we can understand whether students are stressed or not. For example, you can track their normal daily engagement and frustration levels. We need to have these mental health indexes to teach students how to deal with strong emotions. A student with a calm and clear mind can learn much better.
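One simple way to picture such an index is to compare today’s voice-based stress score against the student’s own recent baseline rather than an absolute cutoff. The sketch below is a hypothetical illustration of that idea; the scores are assumed outputs of an upstream model.

```python
# Hypothetical per-student stress index: flag days far above the personal baseline.
from statistics import mean, stdev

def flag_unusual_stress(history, today, z_threshold=2.0):
    """history: recent daily stress scores (0-1); today: today's score."""
    if len(history) < 5:
        return False  # not enough data for a personal baseline
    baseline, spread = mean(history), stdev(history)
    return spread > 0 and (today - baseline) / spread > z_threshold

print(flag_unusual_stress([0.20, 0.30, 0.25, 0.20, 0.30, 0.22], today=0.80))  # True
```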

Lastly, Emotion AI can help online teaching platforms better match students and teachers, by analyzing teachers during recruitment interviews or by observing student and teacher interactions.

4. There is some fear about AI replacing teachers in the future. What do you think about this statement? 

AI will never replace a teacher. The most it can do is replace a teaching assistant. The goal of AI is to automate clerical tasks, like preparing homework, taking attendance, and so on. This way, teachers can focus on creating novel content, new methods of teaching, and new logic models that AI can then repeat at scale. For example, in a lesson about how gravity works, the teacher can create just one sample question, and the AI can figure out ten other problems about gravity. This saves a lot of time for the teacher.
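To make the “one sample question, ten generated problems” idea concrete, here is a toy sketch in which a teacher-written template plus the formula F = m × g produces new variants. It is a hypothetical illustration, not Chatterize’s tooling.

```python
# Toy sketch: generate variants of a teacher-written gravity problem.
import random

TEMPLATE = ("An object of mass {m} kg is dropped near Earth's surface. "
            "What gravitational force acts on it? (g = 9.8 m/s^2)")

def generate_gravity_problems(n=10, seed=0):
    rng = random.Random(seed)
    problems = []
    for _ in range(n):
        m = rng.randint(1, 50)                    # new mass for each variant
        answer = round(m * 9.8, 1)                # F = m * g
        problems.append((TEMPLATE.format(m=m), f"{answer} N"))
    return problems

for question, answer in generate_gravity_problems(3):
    print(question, "->", answer)
```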

The teacher should be spending more time understanding the students, understanding the classroom. AI can definitely replace the best teaching assistant, but it will never replace the teacher. The teacher and student relationship is much more about the emotional connection.

5. Any other comments about what people should keep in mind when using Emotion AI in education?

I am a huge believer in conscious engineering. This means that engineers have to understand and bear responsibility for the things they build. They need to take PR and marketing out of the equation and really look at the business and social value of their creation, and at what could happen at a bigger scale.

If you are using AI just to figure out which students are good and which are bad, that’s a bad and dangerous application of Emotion AI in education. You should be figuring out how to improve low-performing students’ learning, or how to use automated tools to get those students beyond the curve. AI should also not be built with algorithms trained on biased data. You cannot take an algorithm that was built in Africa and use it in China, for example. Using AI for test scoring is also bad (unless it is for multiple-choice questions). We have seen that even bad handwriting can influence a teacher’s scoring. But AI can transcribe the handwriting into text so that the teacher judges the answer, not the handwriting. Another bad application is using AI to create the benchmarks. AI should be an assistive tool, so that teachers can come up with the right insights and make more comprehensive decisions about holistic education systems.


Emotion AI in education is an innovative concept, a new layer on learning that is personalized to each unique student. We are honored to have Biman’s Emotion AI expertise on board as we work to support students on their way to confident communication.

Interested in trying out Chatterize’s AI solution? Check out Speakia here.


Recommended by parents, kids and teachers!

The app design is vivid and beautiful. Speakia’s learning content is closely related to daily life. If parents encourage their children to learn every day, or use Speakia together with them, the children’s oral English will improve rapidly.

- Luella (English teacher)


It’s great that my child can speak English every day, and also great that I don’t have to force them to sit down and study. My child speaks much more English in Speakia than they do in English class!

- Lucy (mother of an 8-year-old)


I like playing Speakia. I can talk with my cartoon friends in English. I don’t feel scared and embarrassed to speak English anymore.

- Duke (6 years old)

Start speaking English today!
