Emotions and messages in simple robot gestures

Li, Jamy; Chignell, Mark; Mizobuchi, Sachi; Yasumura, Michiaki

Authors

Jamy Li

Mark Chignell

Sachi Mizobuchi

Michiaki Yasumura

Abstract

Understanding how people interpret robot gestures will aid the design of effective social robots. In two studies, we examine the generation and interpretation of gestures for a simple social robot capable of head and arm movement. In the first study, four participants created gestures, with corresponding messages and emotions, based on 12 different scenarios provided to them. The resulting gestures were then shown in the second study to 12 participants, who judged which emotions and messages were being conveyed. Knowledge (present or absent) of the motivating scenario (context) for each gesture was manipulated as an experimental factor. Context was found to assist message understanding while providing only modest assistance to emotion recognition. Although better than chance, recognition accuracy was relatively low for both emotions (22%) and messages (40%). The results are discussed in terms of the guidelines they suggest for designing gestures for social robots.

Citation

Li, J., Chignell, M., Mizobuchi, S., & Yasumura, M. (2009, July). Emotions and messages in simple robot gestures. Presented at the International Conference on Human-Computer Interaction, San Diego, CA, USA.

Presentation Conference Type Conference Paper (published)
Conference Name International Conference on Human-Computer Interaction
Start Date Jul 19, 2009
End Date Jul 24, 2009
Publication Date 2009
Deposit Date May 7, 2024
Publisher Springer
Peer Reviewed Peer Reviewed
Volume 5611
Pages 331-340
Series Title Lecture Notes in Computer Science
Series ISSN 0302-9743
Book Title Human-Computer Interaction. Novel Interaction Methods and Techniques. HCI 2009
ISBN 9783642025761
DOI https://doi.org/10.1007/978-3-642-02577-8_36
Keywords Human-Robot Interaction, Gestures, Social Robots, Emotion