Explainable Representations of the Social State: A Model for Social Human-Robot Interactions
Hernandez Garcia, Daniel; Yu, Yanchao; Sieinska, Weronika; Part, Jose L.; Gunson, Nancie; Lemon, Oliver; Dondrup, Christian
Authors
Dr Yanchao Yu (Lecturer), Y.Yu@napier.ac.uk
Weronika Sieinska
Jose L. Part
Nancie Gunson
Oliver Lemon
Christian Dondrup
Abstract
In this paper, we propose a minimum set of concepts and signals needed to track the social state during Human-Robot Interaction. We examine the problem of complex, continuous interactions in a social context with multiple humans and robots, and discuss the creation of an explainable and tractable representation/model of their social interaction. We discuss these representations according to their representational and communicational properties, and organize them into four cognitive domains (scene-understanding, behaviour-profiling, mental-state, and dialogue-grounding).
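The abstract groups the proposed representations into four cognitive domains. As a minimal, hypothetical sketch (not the authors' actual model), the Python dataclasses below illustrate one way such an explainable social-state record could be organised around those domains; all field names (e.g. `gaze_target`, `grounded_entities`) are assumptions made purely for illustration.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple


@dataclass
class SceneUnderstanding:
    """Who and what is currently present in the scene (hypothetical fields)."""
    people: List[str] = field(default_factory=list)              # detected person IDs
    objects: List[str] = field(default_factory=list)              # detected object labels
    positions: Dict[str, Tuple[float, float]] = field(default_factory=dict)  # id -> (x, y)


@dataclass
class BehaviourProfile:
    """Observed behavioural signals for one person (hypothetical fields)."""
    gaze_target: Optional[str] = None   # what or whom the person is looking at
    engagement: float = 0.0             # engagement estimate in [0, 1]


@dataclass
class MentalState:
    """Inferred beliefs and intentions for one person (hypothetical fields)."""
    goal: Optional[str] = None          # inferred user goal
    awaiting_response: bool = False     # does the user expect the robot to act?


@dataclass
class DialogueGrounding:
    """What has been mutually established in the dialogue so far (hypothetical fields)."""
    grounded_entities: List[str] = field(default_factory=list)
    last_dialogue_act: Optional[str] = None


@dataclass
class SocialState:
    """Social-state record grouped by the four cognitive domains named in the abstract."""
    scene: SceneUnderstanding = field(default_factory=SceneUnderstanding)
    behaviour: Dict[str, BehaviourProfile] = field(default_factory=dict)  # keyed by person ID
    mental: Dict[str, MentalState] = field(default_factory=dict)          # keyed by person ID
    dialogue: DialogueGrounding = field(default_factory=DialogueGrounding)
```

Grouping the state this way keeps each domain's signals inspectable in isolation, which is one possible reading of the explainability and tractability goals stated in the abstract.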
Citation
Hernandez Garcia, D., Yu, Y., Sieinska, W., Part, J. L., Gunson, N., Lemon, O., & Dondrup, C. (2020). Explainable Representations of the Social State: A Model for Social Human-Robot Interactions.
| Working Paper Type | Preprint |
| --- | --- |
| Publication Date | Oct 9, 2020 |
| Deposit Date | Jun 28, 2023 |
| Public URL | http://researchrepository.napier.ac.uk/Output/3125802 |
| Related Public URLs | https://arxiv.org/pdf/2010.04570.pdf |