Does your computer know how you’re feeling?
Researchers in Bangladesh have designed a computer programme that can accurately recognise users’ emotional states as much as 87% of the time, depending on the emotion.
Writing in the journal Behaviour & Information Technology, A.F.M. Nazmul Haque Nahin and his colleagues describe how their study combined – for the first time – two established ways of detecting user emotions: keystroke dynamics and text-pattern analysis.
To provide data for the study, volunteers were asked to note their emotional state after typing passages of fixed text, as well as at regular intervals during their everyday (‘free text’) computer use; this provided the researchers with data about keystroke attributes associated with seven emotional states (joy, fear, anger, sadness, disgust, shame and guilt). To help them analyse sample texts, the researchers made use of a standard database of words and sentences associated with the same seven emotional states.
After running a variety of tests, the researchers found that the combined approach outperformed either method used on its own, improving accuracy for five of the seven emotion categories. Joy (87%) and anger (81%) had the highest rates of accuracy.
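The article does not spell out how the two techniques were combined, but one common way to merge two classifiers is simple weighted averaging of their per-emotion scores (‘late fusion’). The sketch below is purely illustrative, not the authors’ actual model: the classifiers, scores and weighting are all hypothetical.

```python
# Illustrative sketch (not the authors' method): blending per-emotion scores
# from a keystroke-dynamics classifier and a text-pattern classifier by
# weighted averaging, then picking the highest-scoring emotion.

EMOTIONS = ["joy", "fear", "anger", "sadness", "disgust", "shame", "guilt"]

def combine_scores(keystroke_scores, text_scores, weight=0.5):
    """Blend two per-emotion score dicts; return (top emotion, all scores).

    `weight` is the share given to the keystroke-based scores;
    the remainder goes to the text-pattern scores.
    """
    combined = {
        emotion: weight * keystroke_scores[emotion]
                 + (1 - weight) * text_scores[emotion]
        for emotion in EMOTIONS
    }
    return max(combined, key=combined.get), combined

# Hypothetical scores for a single typing sample:
keystroke = {"joy": 0.6, "fear": 0.1, "anger": 0.1, "sadness": 0.05,
             "disgust": 0.05, "shame": 0.05, "guilt": 0.05}
text = {"joy": 0.5, "fear": 0.05, "anger": 0.2, "sadness": 0.1,
        "disgust": 0.05, "shame": 0.05, "guilt": 0.05}

best, scores = combine_scores(keystroke, text)
print(best)  # joy
```

The intuition is that each signal compensates for the other’s blind spots: typing rhythm can betray an emotion even when the words are neutral, and vice versa.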
This research is an important contribution to ‘affective computing’, a growing field dedicated to ‘detecting user emotion in a particular moment’. As the authors note, for all the advances in computing power, performance and size in recent years, much more can still be done to improve how computers interact with end users. “Emotionally aware systems can be a step ahead in this regard,” they write.
“Computer systems that can detect user emotion can do a lot better than the present systems in gaming, online teaching, text processing, video and image processing, user authentication and so many other areas where user emotional state is crucial.”
While much work remains to be done, this research is an important step towards making ‘emotionally intelligent’ systems a reality: systems that recognise users’ emotional states and adapt their music, graphics, content or approach to learning accordingly.
NOTE TO JOURNALISTS
When referencing the article: Please include Journal title, Author, published by Taylor & Francis and the following statement:
* Read the full article online: http://www.tandfonline.com/doi/full/10.1080/0144929X.2014.907343
Follow us on Twitter @tandfnewsroom
If you would like further information, please contact:
Marketing Executive, Taylor & Francis Journals
About Taylor & Francis Group
Taylor & Francis Group partners with researchers, scholarly societies, universities and libraries worldwide to bring knowledge to life. As one of the world’s leading publishers of scholarly journals, books, ebooks and reference works, our content spans all areas of the Humanities, Social Sciences, Behavioural Sciences, Science, Technology and Medicine.
From our network of offices in Oxford, New York, Philadelphia, Boca Raton, Boston, Melbourne, Singapore, Beijing, Tokyo, Stockholm, New Delhi and Johannesburg, Taylor & Francis staff provide local expertise and support to our editors, societies and authors and tailored, efficient customer service to our library colleagues.