Dancey uses tried-and-tested methods such as asking students to identify their “most confusing point” — a concept or idea she said students still struggle with — after a lecture or discussion. “I ask them to write it down, share it, and we tackle it in class for everyone’s benefit,” she said.
But Intel and Classroom Technologies, which sells virtual school software called Class, think there might be a better solution. The companies have teamed up to integrate AI-powered technology developed by Intel with Class, which runs on top of Zoom. Intel says its system can detect if students are bored, distracted or confused by evaluating their facial expressions and how they interact with educational content.
“We can give the teacher additional information to help them communicate better,” said Michael Chasen, co-founder and CEO of Classroom Technologies, who said teachers had struggled to interact with students in virtual classroom environments throughout the pandemic.
His company plans to test Intel’s student engagement analytics technology, which captures images of students’ faces with a computer camera and computer vision technology and combines them with contextual information about what a student is currently working on to assess a student’s state of understanding. Intel hopes to turn the technology into a product it can distribute more widely, said Sinem Aslan, an Intel researcher who helped develop the technology.
“We’re trying to enable one-on-one tutoring at a large scale,” Aslan said, adding that the system is intended to help teachers recognize when students need help and inform how they might modify teaching materials based on how students interact with the educational content. “High levels of boredom will lead [students to] zone completely out of educational content,” Aslan said.
But critics argue that it’s not possible to accurately determine whether someone is feeling bored, confused, happy or sad based on their facial expressions or other external cues.
Some researchers have found that because people express themselves through dozens or hundreds of subtle and complex facial expressions, body gestures, or physiological signals, categorizing their internal state with a single label is a flawed approach. Other research indicates that people communicate emotions such as anger, fear, and surprise in ways that vary across cultures and situations, and that how a person expresses emotion can fluctuate even at the individual level.
“Students have different ways of presenting what’s going on inside them,” said Todd Richmond, a longtime educator who is director of the Tech and Narrative Lab and a professor at the Pardee RAND Graduate School. A student appearing distracted “may be the appropriate and necessary state for them at that point in their life,” he said, if they are dealing with personal issues, for example.
Controversial emotion AI is seeping into everyday technology
The classroom is just one arena where controversial “emotion AI” is finding its way into everyday tech products and attracting investor interest. It is also making its way into delivery and passenger vehicles, as well as virtual sales and customer service software. After Protocol’s report last week on the technology’s use in business calls, Fight for the Future launched a campaign urging Zoom not to adopt the technology in its near-ubiquitous video conferencing software.
At this early stage, it’s unclear how Intel’s technology will be integrated into Class software, said Chasen, who expects the company to partner with one of the colleges it already works with to assess the Intel system. Chasen told Protocol that Classroom Technologies is not paying Intel to test the technology. Class is backed by investors including NFL quarterback Tom Brady, AOL co-founder Steve Case and Salesforce Ventures.
Intel has established partnerships to help distribute other nascent forms of AI it has created. For example, the company has partnered with Purdue University and the football-scouting app AiScout in hopes of producing a system that turns visual data on joint and skeletal motion into analytics for monitoring and improving athletic performance.
Educators and advocacy groups have sounded the alarm about excessive student surveillance and privacy breaches associated with facial recognition being deployed in schools for identification and security purposes. These concerns have accelerated as AI-based software has been used more often than ever during the pandemic, including technologies that monitor student behavior in hopes of preventing cheating on virtual tests and systems that track the content students see on their laptops in an effort to detect if they are at risk of self-harm.
Class already tracks how often students raise their hands during a session and offers a “proctor view” feature that allows teachers to monitor what students are looking at on their computers, if students agree to share their desktop screens with instructors.
“I think we have to be very sensitive to people’s personal rights and not be too intrusive with these systems,” Chasen said.
Cameras as a social justice issue
As the virtual classroom has become the norm over the past two years, a debate has emerged among educators over whether to require students to turn on their cameras during class. Today, in Dancey’s English program, cameras are optional, in part because in virtual environments students can communicate with instructors via their microphones or via chat.
But to capture the students’ facial expressions, Intel’s technology would need those cameras turned on.
“It’s almost like having the cameras turned on has become a social justice issue,” Dancey said. Not only do some students worry that others will see where or how they live, but turning on a camera also consumes power, which can be a problem for students using a mobile hotspot to connect to class, she said.
“It’s kind of an invasion of privacy, and there are accessibility issues, because having your camera on uses a huge amount of bandwidth. It could literally cost them money to do it,” Dancey said.
We don’t want this technology to be a surveillance system.
“Students shouldn’t have to police how they appear in class,” said Nandita Sampath, a policy analyst at Consumer Reports who focuses on algorithmic bias and accountability, who said she wondered whether students would have the ability to contest inaccurate results if Intel’s system leads to negative consequences. “What cognitive and emotional states do these companies claim to be able to assess or predict, and what is the accountability?” she said.
Aslan said the goal of Intel’s technology is not to surveil or penalize students, but rather to coach teachers and provide additional information so they can better understand when students need help. “We did not start this technology as a surveillance system. In fact, we don’t want this technology to be a surveillance system,” Aslan said.
Sampath said Intel’s technology could be used to judge or penalize students even if that is not the intent. “Maybe they don’t intend it to be the ultimate decision maker, but that doesn’t mean the teacher or the administrator can’t use it that way,” she said.
Dancey said teachers also fear that surveillance will be used against them. “A lot of times surveillance is used against instructors in really unfair ways,” she said. “I don’t think it would be paranoid to say, especially if this is going to measure ‘student engagement’ (TM, in quotes), that if I go up for a promotion or a position, will this be part of my evaluation? Could they say, ‘So-and-so had a low comprehension quotient?’”
When Intel tested the system in a physical classroom, some teachers who participated in the study suggested that it provided useful insights. “I could see how I could catch some emotional challenges from students that I couldn’t have anticipated [before],” said one teacher, according to a document provided by Intel.
But while some teachers may have found it useful, Dancey said she would not want to use the Intel system. “I think most teachers, especially at the university level, would find this technology morally objectionable, like the panopticon. Frankly, if my institution offered it to me, I would refuse it, and if we were required to use it, I would think twice about continuing to work here,” she said.
AI data prepared by psychologists
At this early stage, Intel aims to find the best ways to implement the technology so that it is most useful for teachers, Aslan said: “How can we do it in a way that’s aligned with what the teacher does on a daily basis?”
I think most teachers, especially at the university level, would find this technology morally objectionable.
Intel developed its adaptive learning analytics system by incorporating data collected from students during real-life classroom sessions using laptops equipped with 3D cameras. To label the ground-truth data used to train its algorithmic models, the researchers hired psychologists who watched videos of the students and categorized the emotions they detected in their expressions.
“We don’t want to make assumptions. That’s why we hired subject matter experts to label the data,” said Nese Alyuz Civitci, a machine learning researcher at Intel. The researchers only used data for which at least two of the three labelers agreed on how a student’s expressions should be classified.
“It was really interesting to see these emotions — the states are really subtle, they’re very small differences,” Civitci said. “It was really difficult for me to identify these differences.”
Rather than evaluating Intel’s AI models on whether they accurately reflected students’ actual emotions, the researchers assessed them “based on how much a teacher can trust the models,” Aslan said.
“I don’t think it’s a technology that’s fully matured yet,” Chasen said of Intel’s system. “We need to see if the results are relevant to student performance and whether we can derive useful data from them for instructors. That’s what we’re testing to find out.”
Ultimately, he said, the Intel system will provide one piece of data that Classroom Technologies and its customers will combine with other signals to form a holistic assessment of students.
“There is never just one data point,” he said. He also suggested that information surfaced by Intel’s technology should not be used on its own, without context, to judge a student’s performance — for example, “if the AI says they’re not paying attention, and they have all A’s.”