On Emotion Recognition Technology: Can a Computer Read Your Mind?



On March 15, 2016, in Seoul, Korean Go grandmaster Lee Sedol lost his five-game match against Google's AI player AlphaGo 1:4, and with it the one-million-dollar prize, despite salvaging his dignity with his only victory on March 13. The result meant that artificial intelligence had taken an important step toward conquering Go, a game long regarded as a pearl of human wisdom. 

On the same day, inside a SCUT lab in Guangzhou, Professor Wen Guihua and his team were designing another contest between an AI system and real people, except that this time they would compete not at Go, but at mind-reading. 

The Lab of Machine Learning and Data Mining is a research facility affiliated with the SCUT School of Computer Science and Engineering. It conducts research on artificial intelligence, including emotion recognition, neural networks, clustering, deep learning, manifold learning and big data. 

On a computer screen inside the lab, faces of famous Chinese stars such as Jay Chou and Fan Bingbing flicker by, adding a touch of color to what is still, after all, a serious scientific lab. 

“This machine is learning,” said a graduate student majoring in computer science, “learning how to identify the faces in the pictures.”  

"I can tell if you are happy" 

Facial recognition technology is already widely used in criminal investigations and security checks. In Guangzhou, it was used as early as 2008 to prevent exam impersonation and other cheating. 

But Wen’s team is trying to make it do more. They want the machine to learn to identify people’s emotions, and thus to read their inner world through what are called “micro-expressions”. 

A micro-expression is a brief, involuntary facial expression that appears on a person’s face in response to an emotion being experienced. It lasts only 1/25 to 1/15 of a second. In capturing these flashes, a computer has a natural advantage, while a human observer may not be able to react and judge quickly enough. 
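To see why timing matters, consider a rough back-of-the-envelope sketch (a hypothetical illustration, not the team’s code) of how many video frames a camera catches of an expression lasting 1/25 to 1/15 of a second; the frame rates chosen below are only examples.

# Rough frame-rate arithmetic for a micro-expression; figures are illustrative.
def frames_captured(duration_s: float, fps: float) -> float:
    """Approximate number of video frames that fall within an expression of this duration."""
    return duration_s * fps

for fps in (25, 60, 120):
    # Micro-expressions last roughly 1/25 s (40 ms) to 1/15 s (about 67 ms).
    print(f"{fps:>3} fps: {frames_captured(1/25, fps):.1f} to {frames_captured(1/15, fps):.1f} frames")

At an ordinary 25 frames per second, the whole flash may occupy a single frame, which is why frame-by-frame machine analysis can outpace the unaided human eye.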

Wen has installed an app developed by his team on his phone. He uses it to take pictures of himself, then tests the software to see whether it is smart enough to know his mood. The app chooses among seven options: happy, sad, angry, scared, disgusted, surprised, or no particular emotion. 

The app’s accuracy is now above 70%: not yet impressive for an AI, but probably already better than some people at reading others’ minds. 
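For readers curious what such a classifier looks like in code, here is a minimal sketch assuming a small convolutional network in PyTorch; the team’s actual app and model are not public, so every name, layer size and input format below is purely illustrative.

# A minimal, illustrative seven-class emotion classifier; not the team's model.
import torch
import torch.nn as nn

EMOTIONS = ["happy", "sad", "angry", "scared", "disgusted", "surprised", "neutral"]

class EmotionNet(nn.Module):
    """Tiny CNN mapping a 48x48 grayscale face crop to 7 emotion scores."""
    def __init__(self, num_classes: int = len(EMOTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 48 -> 24
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 24 -> 12
        )
        self.classifier = nn.Linear(32 * 12 * 12, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(start_dim=1))

model = EmotionNet()
face = torch.randn(1, 1, 48, 48)           # stand-in for a detected, cropped face
probs = torch.softmax(model(face), dim=1)  # one probability per emotion
print(EMOTIONS[int(probs.argmax())])       # untrained, so the answer is random

In practice, a face detector would first crop the face from the photo, and the network would need training on many labelled examples before its predictions meant anything.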

“We are going to hold a match between this app and real people to see who is better at mind-reading,” Wen said, clearly looking forward to the contest, partly because no one has ever specifically measured human beings’ ability to read minds. On many occasions, for example, a gentleman may struggle to tell whether a lady is in a good mood or not. 

"I may serve you everywhere" 

Of course, the project is not just for fun or for guessing ladies’ moods. Emotion recognition technology is expected to detect concealed emotions that deserve others’ notice and concern. 

A recent public service advertisement sketches such a scenario: an elderly couple, both ill in hospital, do not want to distract their children from work, so they keep telling them that they are fine. 

In Wen’s plan, the recognition technology, working through a camera, would capture the couple’s real condition. 

“Once the percentage of negative emotions reaches a certain level, the program would automatically alert their family that they need a visit.” 
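Wen did not describe how such an alert would be implemented, but as a rough sketch, a monitoring program might track the share of negative readings over a recent window and notify the family once it crosses a threshold; the window size, threshold and emotion labels below are our assumptions.

# A hypothetical sketch of the alert rule described above; parameters are assumptions.
from collections import deque

NEGATIVE = {"sad", "angry", "scared", "disgusted"}

class EmotionMonitor:
    def __init__(self, window: int = 100, threshold: float = 0.6):
        self.recent = deque(maxlen=window)   # most recent emotion readings
        self.threshold = threshold           # fraction of negative readings that triggers an alert

    def record(self, emotion: str) -> bool:
        """Store one reading; return True when the family should be alerted."""
        self.recent.append(emotion)
        negative_share = sum(e in NEGATIVE for e in self.recent) / len(self.recent)
        return len(self.recent) == self.recent.maxlen and negative_share >= self.threshold

monitor = EmotionMonitor()
for reading in ["sad"] * 70 + ["neutral"] * 30:   # stand-in camera readings
    if monitor.record(reading):
        print("Alert: a visit may be needed")
        break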

Scientists also hope to integrate the technology with Traditional Chinese Medicine (TCM) so that the computer can judge a person’s health by the four basic diagnostic methods of TCM, namely observing, listening, inquiring and pulse-taking, just as a traditional Chinese doctor does. 

Wen also noted that the technology could be used in emotional education: “Today’s kids focus too much on themselves and pay less attention to other people. Since that is not good for them, why not start practicing the skill of understanding others with an AI?” 

"I will learn faster" 

These are just a few glimpses of what a machine with emotion recognition ability could do for our lives. And do not be too surprised that the robot cleaner, another application of artificial intelligence, has already entered your home without knocking at the door. 

Some might say those cleaners are still clumsy, but so were the bulky, non-smart mobile phones of 20 years ago. 

One of the many things that have changed since then is that big data has reshaped the way artificial intelligence is developed, as an approach called “deep learning” has emerged. 

“Human beings have changed the way they teach computers. We used to give the computer ready-made knowledge, but now we feed it data. As it looks into the data, it is programmed to work backward from the phenomena and results it sees to the reasons behind them. That is how a computer can learn by itself,” said Wen. 
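To make the contrast concrete, here is a toy sketch (our illustration, not Wen’s code) that sets a hand-written rule beside a model that learns the same rule from labelled examples, using scikit-learn’s decision tree as a small stand-in for far larger deep-learning systems.

# A toy illustration of the shift Wen describes: hand-coded knowledge vs. a model
# that infers the rule from labelled data. The library and data are our assumptions.
from sklearn.tree import DecisionTreeClassifier

# Old style: a human writes the "ready-made knowledge" as an explicit rule.
def smile_rule(mouth_curve: float) -> str:
    return "happy" if mouth_curve > 0.5 else "not happy"

# New style: we only supply examples; the model works the rule out for itself.
samples = [[0.9], [0.8], [0.7], [0.3], [0.2], [0.1]]   # a toy "mouth curvature" feature
labels  = ["happy", "happy", "happy", "not happy", "not happy", "not happy"]

model = DecisionTreeClassifier().fit(samples, labels)
print(smile_rule(0.85), model.predict([[0.85]])[0])    # both say "happy"

The hand-written rule encodes the conclusive knowledge directly; the trained model is only given examples and finds the boundary on its own, which is the shift Wen describes.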

Unlike the cloud computing that AlphaGo relies on, his team uses only ordinary computers. That is the most remarkable part: even home PCs and smartphones can be equipped with artificial intelligence. 

PCs and phones are slower than supercomputers, but in Wen’s lab the software and code are the core of the team’s work; only then does faster hardware speed up the learning process. 

“The computer we currently use can study a picture in three seconds, which means over 20,000 pictures a day. If we upgraded the server’s hardware, it would learn much faster.”


 Original Chinese article from Yangcheng Evening News 

English version rewritten by Xu Peimu and Xu Fang

Edited by Xu Peimu

