Intel's Emotion Recognition AI for Schools
For their part, Intel and Class insist that their intent isn't to judge or penalize students.
At AiSupremacy, I've been thinking that I want to get to know my audience better, so this week I've set up some new features in the margin of the home page: a guestbook for further dialogue, a bookmark section, and a Newsletter reading-room lounge. I hope you can take the time to stop by!
Hey Guys,
Many moons ago we used to debate the importance of privacy in technology. We also used to debate the future of work in an age of automation; funny how those stories no longer get talked about much.
This week, however, a topic came up about sentiment A.I. tech. The truth is that detecting emotions using A.I. is already a big industry, and China, with its advanced and pervasive use of facial recognition technology, is the global leader.
Facial Recognition Has Found Its Way to Schools
In America, facial recognition cameras are being introduced in some schools. Curiously, schools brought in surveillance cameras to monitor mask compliance and other Covid risks – and while masks are on their way out, the cameras aren't.
Now Intel and Classroom Technologies, which sells virtual school software called Class, have partnered to integrate an AI-based technology developed by Intel with Class, which runs on top of Zoom. Intel claims its system can detect whether students are bored, distracted or confused by assessing their facial expressions and how they’re interacting with educational content.
What do you suppose the rate of camera adoption in American schools is today? Twenty years ago, security cameras were present in 19% of schools, according to the National Center for Education Statistics. Today, that number exceeds 80%.
However, the Zoom era of virtual learning also means more A.I. embedded in how we understand the student population and levels of engagement – and some potentially pretty crazy snooping.
Intel is testing new AI software that will detect the emotions of students while they are in an online classroom setting, and some are not happy about it. The software is designed to run on top of Zoom and is a collaboration with Classroom Technologies' virtual school software called Class, and is being backed by investors such as NFL quarterback Tom Brady and AOL co-founder Steve Case.
The West is Copying China’s Model of Student Surveillance
In the West, we were outraged when this was happening in China just a few years ago, and now that it's here, we seem nearly indifferent, as if it were inevitable.
By definition, sentiment analysis is an analytical technique that uses statistics, natural language processing, and machine learning to determine the emotional meaning of communications. But this system is also ripe for abuse, bias and collecting data on kids.
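To make the definition above concrete, here is a minimal, illustrative sentiment scorer in Python. To be clear, this is a toy lexicon-based sketch of my own, not Intel's or anyone's production system; real sentiment analysis uses trained statistical or neural language models, and the word lists here are invented for illustration.

```python
# Toy sentiment analysis: count hits against tiny positive/negative
# word lists and report a polarity label. Real systems replace this
# lexicon lookup with trained machine-learning models.

POSITIVE = {"great", "good", "love", "enjoy", "clear", "interesting"}
NEGATIVE = {"bad", "boring", "hate", "confusing", "lost", "frustrated"}

def sentiment(text: str) -> str:
    """Classify text as 'positive', 'negative', or 'neutral'."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

# sentiment("I love this lesson it is clear")  -> "positive"
# sentiment("this is boring and confusing")    -> "negative"
```

Even this crude version hints at the problem critics raise: the label depends entirely on what the designers decided counts as "negative."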
A.I. that determines the emotions of kids is somewhere between controversial and inevitable, depending on whom you talk to. Virtual school software startup Classroom Technologies will test the controversial "emotion AI" technology. This comes as Zoom is upgrading many of its own capabilities, for instance to empower salespeople.
What is clear is that the Zoom era is also accelerating the normalization of facial recognition in society. If China is a decade ahead of the U.S. in this, someone must want us to catch up.
Measuring Student Engagement with A.I.
The AI software is designed to detect if a student is bored, distracted, or confused by analyzing the student's facial expressions and incorporating how they are interacting with the educational content being presented. The idea is to give teachers additional insights into how the students are interacting and engaging.
Bored
Distracted
Confused
A.I. is starting to give both business development folk and education professionals more insight into their audiences and students.
As you can imagine, some parents and privacy advocacy groups aren’t happy about it.
Michael Chasen, co-founder and CEO of Classroom Technologies, hopes its software gives teachers additional insights, ultimately bettering remote learning experiences. Zoom itself is an American company founded by Chinese-American billionaire Eric Yuan. It's not clear what Eric's ties to China might be, though he worked at Cisco and WebEx as a VP of Engineering. What is clear is that Zoom's technologies have hastened facial recognition sentiment analysis, compressing perhaps five years of adoption into the last two years of the pandemic.
The AI-based technology was developed by Intel to work with Class, which runs on top of Zoom. Intel claims its system can detect whether students are bored, distracted or confused, even when it comes to poker-faced undergraduates who don't typically give teachers or professors many non-verbal clues.
The Class blog was conspicuous for what it omitted. No real mention of emotion-reading A.I., eh? They said:
Intel’s dedication to ensuring instructors and students have access to the technologies and tools needed to meet the challenges of the changing world aligns with everything that we’re working to accomplish at Class. Through our partnership with Intel, we’ll be able to bring new, immersive features to Class’ software that are driven by research-backed functionalities.
Many schools used the pandemic to impose surveillance technologies. The Guardian reported that remote learning during the pandemic ushered in a new era of digital student surveillance, as schools turned to AI-powered services like remote proctoring and digital tools that sift through billions of students' emails and classroom assignments in search of threats and mental health warning signs. I'm not a privacy expert, but that sounds rather invasive, if you ask me.
Classroom Technologies thus plans to test Intel's student "engagement analytics technology", which captures images of students' faces with a computer camera and computer vision technology and combines them with contextual information about what a student is working on at that moment to assess the student's state of understanding. One wonders whether students consent to such proceedings, or whether it's just baked into the system.
The Practice of Normalizing Surveillance Capitalism
The software makes use of students' video streams, which it feeds into the AI engine alongside contextual, real-time information that allows it to classify students' understanding of the subject matter.
Sinem Aslan, a research scientist at Intel who helped develop the technology, says that the main objective is to improve one-on-one teaching sessions by allowing the teacher to react in real-time to each student's state of mind (nudging them in whatever direction is deemed necessary).
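Intel hasn't published how its classifier actually works. Purely to illustrate the general shape described above, facial signals combined with real-time contextual information to produce an engagement label, here is a toy, rule-based sketch. Every field name and threshold below is invented; the real system presumably uses learned models rather than hand-written rules.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """One moment of combined signals (all fields are hypothetical)."""
    gaze_on_screen: bool   # did a vision model see the student facing the screen?
    brow_furrowed: bool    # crude facial-expression proxy for confusion
    seconds_idle: float    # context: time since the student last interacted

def classify(frame: Frame) -> str:
    """Map one frame of signals to a coarse engagement label."""
    if not frame.gaze_on_screen and frame.seconds_idle > 30:
        return "distracted"
    if frame.brow_furrowed and frame.seconds_idle > 10:
        return "confused"
    if frame.seconds_idle > 60:
        return "bored"
    return "engaged"
```

Notice how much the output hinges on arbitrary cutoffs: a student who looks away for 31 seconds is "distracted," at 29 seconds they are not. That brittleness is exactly what the critics quoted later in this piece are worried about.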
When we normalize mass and niche surveillance and sentiment analysis, where do you think it leads? In the commercialization of A.I., we could even make the case that this augments teachers with A.I. But where does it end? If it's already beginning to take over our schools, where does it stop? What's good for Intel and Zoom may well be bad for us.
Rating kids with A.I. is maybe not such a lovely idea.
Black Mirror Debates Linger over Affective Computing, and over Whether It Even Does What It Says It Does
Rating students on attentiveness is very Black Mirror. Do we want to pigeonhole the emotions of students like this? Is it even ethical?
According to Nese Alyuz Civitci, a machine-learning researcher at Intel, the company's model was built with the insight and expertise of a team of psychologists, who analyzed the ground truth data captured in real-life classes using laptops with 3D cameras.
The use of facial recognition like this, an "affective computing breakthrough" as they like to call it, is banned in some places. According to critics, accurate determinations about how bored or confused a person is are simply not possible from facial expressions and similar cues. Furthermore, a student's reaction, particularly in a home environment, may be caused by a factor other than the educational material.
Speaking to Protocol, critics from multiple fields expressed severe reservations over the proposed system from Class and Intel. Research has found that people express themselves in almost infinite ways, essentially nullifying such a system.
Certainly new aspects of EdTech have the potential to free knowledge or to burden students with excessive surveillance; it can be a pedagogical boon or a punitive nightmare. Imagine the weird justifications this allows. What if the A.I. deems me less compliant? What if my dark skin is harder for it to read? You can imagine the scenarios already.
Emotional expression in facial cues is also certainly not a uniform thing. The way people express emotions varies across cultures and situations. Even the best facial recognition software has issues with race, gender, and age, according to a study by the U.S. National Institute of Standards and Technology (NIST).
So Intel might be taking a step into the darkness of surveillance tech with this one. In schools, facial recognition systems have been controversial. After a lawsuit from the New York Civil Liberties Union, New York lawmakers passed a moratorium on the facial recognition system Aegis, whose maker lied about how bad its model is at identifying Black faces.
The reality is there aren’t many A.I. ethics voices left in the choir.

What do you think: should emotion-reading A.I. be used in schools and colleges?
If you want to support me so I can keep writing, please don't hesitate to leave a tip, a paid subscription or a donation. With a conversion rate of less than two percent, this Newsletter exists mostly by the grace of my goodwill (passion for A.I.) and my own experience of material poverty as I try to pivot into the Creator Economy.
Anyways I hope you enjoyed the topic, that’s all for today.