Chapter 9. Towards Improving User Experience and Shared Task Performance with Mobile Robots through Parameterized Nonverbal State Sonification
Authors: Liam Roy, Richard Attfield, Dana Kulić, and Elizabeth Croft
Abstract: Given the correct context, nonverbal interaction can express high-level information with greater universality, efficiency, and appeal than spoken words. The focus of this work is to develop a simple nonverbal communication strategy for conveying high-level robot information to facilitate human-robot collaboration. We propose a low-dimensional parameterized communication model based on nonverbal sounds (NVS). The proposed model functions by modulating a fixed number of parameters of a base sound to communicate distinguishable high-level robot states. A valence-arousal mapping is used to characterize the continuous axes of the proposed two-dimensional parameterized model. The developed communication model is validated using an online interactive survey designed to explore how well the model communicates high-level robot information, and how this communication modality affects the user's experience and shared task performance. Specifically, we investigated three measures: participants' perceived understanding of the robot while observing an interaction video, their willingness to continue observing interactions with the robot, and their assessment of the suitability of the proposed communication model for the given context. The results of this study provide insight and direction concerning the use of simplified NVS communication for human-robot collaboration. In addition, this work builds support for the development of a positive feedback loop through this modality, encompassing positive user experience, increased interest in subsequent interaction, and increased collaborative performance via familiarization.
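To make the two-dimensional parameterization concrete, the sketch below shows one way a (valence, arousal) state could be mapped onto parameters of a repeated base sound. The abstract does not specify which sound parameters are modulated or what their ranges are; the choice of pitch for valence, repetition rate and beep length for arousal, and all numeric ranges here are illustrative assumptions, not the mapping used in the study.

```python
import numpy as np


def nvs_parameters(valence: float, arousal: float) -> dict:
    """Map a (valence, arousal) state in [-1, 1]^2 to parameters of a base sound.

    Illustrative assumption: valence shifts pitch, arousal controls how fast
    and how sharply the base sound repeats.
    """
    v = float(np.clip(valence, -1.0, 1.0))
    a = float(np.clip(arousal, -1.0, 1.0))
    return {
        "pitch_hz": 440.0 * 2.0 ** (0.5 * v),    # up to half an octave above/below A4
        "beep_rate_hz": 1.0 + 1.5 * (a + 1.0),   # 1 to 4 beeps per second
        "beep_duration_s": 0.25 - 0.1 * a,       # higher arousal -> shorter, sharper beeps
    }


def synthesize(valence: float, arousal: float,
               total_s: float = 2.0, sample_rate: int = 44100) -> np.ndarray:
    """Render a mono waveform for the given affective state (sketch only)."""
    p = nvs_parameters(valence, arousal)
    t = np.arange(int(total_s * sample_rate)) / sample_rate
    # Gate a continuous sine tone on and off at the beep rate to form the
    # repeated base sound; the duty cycle is beep_duration_s * beep_rate_hz.
    gate = ((t * p["beep_rate_hz"]) % 1.0) < (p["beep_duration_s"] * p["beep_rate_hz"])
    return 0.5 * np.sin(2.0 * np.pi * p["pitch_hz"] * t) * gate


# Example: an excited, positive state yields a higher-pitched, faster beep pattern.
waveform = synthesize(valence=0.8, arousal=0.6)
```

The key design point this sketch illustrates is the low dimensionality of the model: a listener only ever has to track two continuous quantities, however the underlying sound parameters are chosen.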
Videos
Video Playlist (15 videos)