Helping machines speak our language

Researchers have developed technology to detect emotions in human speech, enabling more natural conversations with robots.

Current voice-activated technology used in virtual assistants is limited by its inability to decipher human emotions, causing it to provide irrelevant responses or miss the point of some conversations entirely.

But a team of researchers from RMIT University’s School of Engineering led by Associate Professor Margaret Lech has discovered how to add emotional capabilities to machines to make communication more natural and more socially acceptable.

“There’s always an emotional context when we talk to people and we understand it, but machines don’t understand this,” says Lech.

“When we call an automatic call centre, for example, people get very frustrated because they talk to the machine and it does not understand that they are sad, they are anxious, that they want things to be done quickly.

“They don’t understand the emotion associated with the task and we can hear from many recordings people saying, ‘I want to talk to a person, not a machine’.

“There is no way to explain certain things to a machine, including those subtle cues that we can express through emotions when we talk to each other.”

Lech and her team have spent 11 years creating new machine learning techniques that allow technology to understand human emotions from speech signals, and to analyse and predict patterns of emotional interactions in human conversations.

Equipped with these capabilities, voice-activated devices can now understand both the linguistic and emotional content of speech, and provide appropriate responses. They can recognise seven emotional states: anger, boredom, disgust, fear, happiness, sadness and neutral.
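RMIT's actual models are not published in this article, but the idea of mapping voice measurements to the seven categories above can be sketched with a toy nearest-centroid classifier. Every number below (the per-emotion centroids and the feature vector) is invented for illustration only.

```python
# Toy illustration only: the article does not publish RMIT's models or
# feature values. This sketch maps a hand-made prosodic feature vector
# (pitch, relative energy, speaking rate -- all invented numbers) to the
# seven emotion categories the article lists, via nearest-centroid matching.

# Hypothetical per-emotion centroids: (pitch in Hz, relative energy, rate).
CENTROIDS = {
    "anger":     (260.0, 0.9, 5.5),
    "boredom":   (140.0, 0.3, 3.0),
    "disgust":   (170.0, 0.5, 3.5),
    "fear":      (280.0, 0.6, 6.0),
    "happiness": (250.0, 0.8, 5.0),
    "sadness":   (150.0, 0.2, 2.5),
    "neutral":   (180.0, 0.5, 4.0),
}

def classify(pitch_hz, energy, rate):
    """Return the emotion whose centroid is closest in scaled distance."""
    def dist(label):
        p, e, r = CENTROIDS[label]
        # Scale pitch down so no single cue dominates the distance.
        return ((pitch_hz - p) / 100) ** 2 + (energy - e) ** 2 + (rate - r) ** 2
    return min(CENTROIDS, key=dist)

# Loud, high-pitched, fast speech lands nearest the "anger" centroid.
print(classify(265.0, 0.85, 5.4))
```

Real systems learn these decision boundaries from labelled speech corpora rather than hand-set centroids, but the shape of the task (acoustic features in, one of seven labels out) is the same.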

The challenge of making machines read human emotions lay in measuring the unspoken cues in the voice, such as subtle changes in tone, volume and speed.
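The cues named above can be made concrete with generic signal-processing proxies. This is a minimal sketch, not RMIT's method: it computes volume as RMS energy, a rough tone proxy as zero-crossing rate, and a crude speaking-rate proxy as energy bursts per second, over a synthetic 16 kHz waveform.

```python
import math

# Illustrative sketch only: three generic prosodic cues, computed over
# 25 ms frames with a 12.5 ms hop, from a synthetic one-second signal.

SAMPLE_RATE = 16000

def rms_energy(frame):
    """Root-mean-square amplitude of a frame -- a proxy for volume."""
    return math.sqrt(sum(x * x for x in frame) / len(frame))

def zero_crossing_rate(frame):
    """Sign changes per sample -- rises with higher-pitched, noisier audio."""
    crossings = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0)
    return crossings / len(frame)

def prosodic_features(signal, frame_len=400, hop=200):
    """Frame the signal and summarise volume, tone and rate proxies."""
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len, hop)]
    energies = [rms_energy(f) for f in frames]
    zcrs = [zero_crossing_rate(f) for f in frames]
    # Count above-average-energy frames as a very crude syllable proxy.
    mean_e = sum(energies) / len(energies)
    bursts = sum(1 for e in energies if e > mean_e)
    return {
        "mean_energy": mean_e,
        "mean_zcr": sum(zcrs) / len(zcrs),
        "bursts_per_second": bursts / (len(signal) / SAMPLE_RATE),
    }

# Synthetic one-second 220 Hz tone that gets louder in the middle.
tone = [math.sin(2 * math.pi * 220 * t / SAMPLE_RATE)
        * (1.5 if SAMPLE_RATE // 4 < t < 3 * SAMPLE_RATE // 4 else 0.5)
        for t in range(SAMPLE_RATE)]

print(prosodic_features(tone))
```

Production systems use far richer feature sets (pitch contours, spectral features) and learned models on real recordings, but these three numbers illustrate what "tone, volume and speed" look like as measurable quantities.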

23 January 2019


Associate Professor Margaret Lech specialises in language recognition for computers.

Emotion recognition will unlock many more applications and wider benefits from voice-activated technology, says Lech.

“People will accept machines more, they will trust machines, they will have the feeling that the machine really understands them and can help them better.

“People, especially the elderly, will not be so reluctant to use automatic call centres. Then we can employ machines, for example, robots as companions. An older person may actually like talking to a machine and hearing that it can laugh with her, can sympathise, and understand her feelings.

“It could also be good if used for kids’ toys. Children will interact with robotic toys that can talk emotionally, so children will learn more about emotions.” 

Lifelike artificial intelligence may have once been a futuristic dream.

But with developments like this making human-machine interactions simpler and smoother, the future has seemingly arrived.

Be part of the conversation about Ethical Innovation & Industry Transformation! Join local and international leaders from industry, research and innovation, 18-20 February at RMIT. Find out more at Engaging for Impact 2019.

Story: Michael Quin


