Towards visual dialogue for human-robot interaction
Part, Jose L.; Hernández García, Daniel; Yu, Yanchao; Gunson, Nancie; Dondrup, Christian; Lemon, Oliver
Authors
Jose L. Part
Daniel Hernández García
Dr Yanchao Yu (Y.Yu@napier.ac.uk), Lecturer
Nancie Gunson
Christian Dondrup
Oliver Lemon
Abstract
The goal of the EU H2020-ICT funded SPRING project is to develop a socially pertinent robot to carry out tasks in a gerontological healthcare unit. In this context, the ability to perceive its environment and to hold coherent, relevant conversations about the surrounding world is critical. In this paper, we describe current progress towards developing the necessary integrated visual and conversational capabilities for a robot to operate in such environments. Concretely, we introduce an architecture for conversing about objects and other entities present in an environment. The work described in this paper has applications that extend well beyond healthcare and can be used on any robot that needs to interact with its visual and spatial environment in order to perform its duties.
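The paper itself details the proposed architecture; as a purely illustrative aid (not the SPRING system's actual design), the sketch below shows one minimal way a robot could ground simple spoken queries in symbolic visual percepts. All names here (`SceneMemory`, `DialogueManager`, `DetectedObject`) are hypothetical and chosen only for this example.

```python
# Illustrative sketch only: a toy visual-dialogue loop in which a detector
# fills a symbolic scene memory and a rule-based dialogue manager answers
# questions grounded in those percepts. Not the SPRING architecture.
from dataclasses import dataclass
from typing import List


@dataclass
class DetectedObject:
    label: str      # e.g. "cup"
    location: str   # coarse spatial tag, e.g. "on the table"


class SceneMemory:
    """Stores the robot's current visual percepts as symbolic facts."""

    def __init__(self) -> None:
        self.objects: List[DetectedObject] = []

    def update(self, detections: List[DetectedObject]) -> None:
        # In a real system, detections would come from a vision module.
        self.objects = detections

    def find(self, label: str) -> List[DetectedObject]:
        return [o for o in self.objects if o.label == label]


class DialogueManager:
    """Maps simple user utterances to responses grounded in SceneMemory."""

    def __init__(self, memory: SceneMemory) -> None:
        self.memory = memory

    def respond(self, utterance: str) -> str:
        text = utterance.lower()
        if text.startswith("where is the "):
            label = text[len("where is the "):].rstrip("?")
            matches = self.memory.find(label)
            if matches:
                return f"The {label} is {matches[0].location}."
            return f"I cannot see a {label} right now."
        if "what do you see" in text:
            labels = sorted({o.label for o in self.memory.objects})
            if labels:
                return "I can see: " + ", ".join(labels) + "."
            return "I do not see anything at the moment."
        return "Sorry, I did not understand that."


if __name__ == "__main__":
    memory = SceneMemory()
    memory.update([DetectedObject("cup", "on the table"),
                   DetectedObject("chair", "by the window")])
    dm = DialogueManager(memory)
    print(dm.respond("Where is the cup?"))   # -> The cup is on the table.
    print(dm.respond("What do you see?"))    # -> I can see: chair, cup.
```

In the actual system described in the paper, the hand-written rules above would be replaced by the project's dialogue components, and the scene memory would be populated from real perception modules; the sketch only illustrates the general idea of grounding dialogue in visual and spatial context.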
Citation
Part, J. L., Hernández García, D., Yu, Y., Gunson, N., Dondrup, C., & Lemon, O. (2021, March). Towards visual dialogue for human-robot interaction. Presented at HRI '21: ACM/IEEE International Conference on Human-Robot Interaction, Boulder, CO, USA
| Presentation Conference Type | Conference Paper (published) |
| --- | --- |
| Conference Name | HRI '21: ACM/IEEE International Conference on Human-Robot Interaction |
| Start Date | Mar 8, 2021 |
| End Date | Mar 11, 2021 |
| Online Publication Date | Mar 8, 2021 |
| Publication Date | Mar 2021 |
| Deposit Date | Jun 27, 2023 |
| Publisher | Association for Computing Machinery (ACM) |
| Pages | 670-672 |
| Book Title | HRI '21 Companion: Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction |
| ISBN | 978-1-4503-8290-8 |
| DOI | https://doi.org/10.1145/3434074.3447278 |
| Keywords | visual dialogue, robotics, conversational systems |
| Public URL | http://researchrepository.napier.ac.uk/Output/3125707 |
You might also like
How Much do Robots Understand Rudeness? Challenges in Human-Robot Interaction
(2024)
Presentation / Conference Contribution
TaskMaster: A Novel Cross-platform Task-based Spoken Dialogue System for Human-Robot Interaction
(2023)
Presentation / Conference Contribution
MoDEsT: a Modular Dialogue Experiments and Demonstration Toolkit
(2023)
Presentation / Conference Contribution
A Visually-Aware Conversational Robot Receptionist
(2022)
Presentation / Conference Contribution
The CRECIL Corpus: a New Dataset for Extraction of Relations between Characters in Chinese Multi-party Dialogues
(2022)
Presentation / Conference Contribution