
Towards IMACA: Intelligent multimodal affective conversational agent

Hussain, Amir; Cambria, Erik; Mazzocco, Thomas; Grassi, Marco; Wang, Qiu-Feng; Durrani, Tariq


Abstract

A key aspect of achieving natural interaction in machines is multimodality. Besides verbal communication, humans also interact through many other channels, e.g., facial expressions, gestures, eye contact, posture, and voice tone. Such channels convey not only semantics but also emotional cues that are essential for interpreting the transmitted message. The importance of affective information, and the capability of properly managing it, is increasingly recognised as fundamental to the development of a new generation of emotion-aware applications in scenarios such as e-learning, e-health, and human-computer interaction. To this end, this work investigates the adoption of different paradigms in the fields of text, vocal, and video analysis, in order to lay the basis for the development of an intelligent multimodal affective conversational agent.

Presentation Conference Type: Conference Paper (Published)
Conference Name: International Conference on Neural Information Processing: ICONIP 2012
Start Date: Nov 12, 2012
End Date: Nov 15, 2012
Publication Date: 2012
Deposit Date: Sep 23, 2019
Publisher: Springer
Volume: 7663 LNCS
Pages: 656-663
Series Title: Lecture Notes in Computer Science
Series Number: 7663
Book Title: Neural Information Processing: 19th International Conference, ICONIP 2012, Doha, Qatar, November 12-15, 2012, Proceedings, Part I
ISBN: 9783642344749
DOI: https://doi.org/10.1007/978-3-642-34475-6_79
Keywords: AI, HCI, Multimodal Sentiment Analysis
Public URL: http://researchrepository.napier.ac.uk/Output/1793302