Can We Teach Computers to Understand Emotion?

Researching the Digital Generation

Each of us lays down invisible digital footprints as we move around and interact with the modern world. Our movements are logged by our phones and sat navs, our activity is monitored by wearable sensors, and our social media interactions form part of a worldwide conversation.

This shared digital information allows us to engage with the world in ways that were unthinkable for previous generations. Businesses use this huge amount of data to understand their customers and offer services such as finding the best route through a busy city or suggesting products you might like based on what you’ve just bought. But our digital footprints are useful for health researchers too.

They can use digital information shared by people taking part in their research to understand important issues such as the spread of disease, or the development of mental health problems. Some types of information, such as the emotions we express on social media, are easy for most humans to understand. But health researchers often work with groups of thousands of people, like Bristol’s famous Children of the Nineties study. It would be impossible and intrusive for researchers to read and interpret every social media post themselves. Computers, on the other hand, work extremely quickly, but have difficulty with some tasks that humans typically find straightforward.

Can human and artificial intelligence work together on this, combining the strengths of both to improve human health? Or, to put it another way, can we teach computers to understand emotion?

The Learning Machine

The Learning Machine is one component of our Curiosity Toolkit, which aims to demonstrate how computers can work alongside humans to learn to recognise emotion, using both emotional faces and a curated set of tweets.

Each visitor begins with a naïve machine that randomly sorts faces or words into different categories.

As the visitor sorts the faces or words into categories, the machine learns to recognise the emotions they express, updating its understanding of the categories each time the visitor makes a correction.

This is an example of how machines and humans can work together: humans have expert knowledge of emotions, while machines are far faster at processing images and words.

With human-in-the-loop online machine learning, we can get the best from both humans and machines.
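In code terms, this loop can be sketched as online learning, where each visitor correction is a single training example. The snippet below is a minimal illustration only, assuming a scikit-learn incremental classifier; the feature extraction, the emotion categories, and the function names are hypothetical and do not come from this repository.

```python
# A minimal sketch of the human-in-the-loop online learning loop described above,
# assuming a scikit-learn-style incremental classifier. The feature extraction,
# the EMOTIONS labels, and the function names are illustrative placeholders,
# not the actual implementation in this repository.
import numpy as np
from sklearn.exceptions import NotFittedError
from sklearn.linear_model import SGDClassifier

EMOTIONS = ["happy", "sad", "angry", "surprised"]  # assumed categories

# An online classifier that can be updated one example at a time.
model = SGDClassifier()


def machine_guess(features: np.ndarray) -> str:
    """Return the machine's current guess for one face or tweet."""
    try:
        return EMOTIONS[int(model.predict(features.reshape(1, -1))[0])]
    except NotFittedError:
        # Before any corrections, the machine is naive and guesses at random.
        return str(np.random.choice(EMOTIONS))


def learn_from_correction(features: np.ndarray, corrected_label: str) -> None:
    """Incrementally update the model with the visitor's correction."""
    y = np.array([EMOTIONS.index(corrected_label)])
    model.partial_fit(features.reshape(1, -1), y, classes=np.arange(len(EMOTIONS)))
```

Each call to `learn_from_correction` nudges the model towards the visitor's labelling, so the machine's guesses improve over the course of the interaction without retraining from scratch.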

Demos

Instructions on how to set up and initialise the demos are reported here

Credits and Acknowledgments

This project and the Curiosity Toolkit have been developed by the team at Dynamic Genetics Lab in partnership with the Jean Golding Institute, the Public Engagement team at the University of Bristol, and We The Curious.

The project is part of the Curiosity Challenge Competition promoted by the Jean Golding Institute for Data Science and the Alan Turing Institute.

The Curiosity Toolkit will contribute to the forthcoming Open City Lab exhibition at We The Curious. More information is available here.

(Logos: Dynamic Genetics Lab, Jean Golding Institute, University of Bristol, We The Curious, Alan Turing Institute)
