Abstract
Social robots are becoming ubiquitous. In contrast with traditional robots, they are designed to interact with humans through gestures, voice and even changes in mood. Advances in speech recognition and synthesis have promoted the development of conversational robots capable of interacting through relatively simple speech commands. We describe work aimed at developing conversational robots that adapt their interaction based on cues related to mood and behavior gathered from wearable sensors. In particular, we describe scenarios in the area of assisted living, developed through a participatory design process, and a cloud-based architecture that gathers, processes and distributes sensor data to adapt the robot. Data gathered for two of these scenarios are analyzed to illustrate the feasibility of the approach.
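As a rough illustration of the kind of adaptation pipeline the abstract describes, the sketch below maps a window of wearable biosignal readings to an inferred mood label and then to interaction parameters for the robot. It is a minimal, self-contained sketch, not the authors' implementation: the class names, signal choices, thresholds, and parameter names are all illustrative assumptions.

```python
# Hypothetical sketch of a biosignal-to-adaptation pipeline.
# All names, thresholds, and parameters are illustrative assumptions,
# not taken from the paper.
from dataclasses import dataclass
from statistics import mean
from typing import Iterable

@dataclass
class SensorSample:
    heart_rate_bpm: float       # e.g. from a wrist-worn wearable
    skin_conductance_us: float  # electrodermal activity, microsiemens

def estimate_mood(window: Iterable[SensorSample]) -> str:
    """Crude threshold-based mood label over a window of samples."""
    samples = list(window)
    hr = mean(s.heart_rate_bpm for s in samples)
    eda = mean(s.skin_conductance_us for s in samples)
    if hr > 95 and eda > 8.0:
        return "agitated"
    if hr < 65:
        return "calm"
    return "neutral"

def adaptation_cue(mood: str) -> dict:
    """Map an inferred mood to conversational-robot parameters."""
    return {
        "agitated": {"speech_rate": 0.8, "tone": "soothing"},
        "calm":     {"speech_rate": 1.0, "tone": "neutral"},
        "neutral":  {"speech_rate": 1.0, "tone": "friendly"},
    }[mood]

if __name__ == "__main__":
    window = [SensorSample(102, 9.1), SensorSample(98, 8.4)]
    mood = estimate_mood(window)
    print(mood, adaptation_cue(mood))
```

In a cloud-based deployment of the sort the abstract mentions, the mood estimation would run as a service that receives sensor streams and publishes cues like the dictionary above to the robot, rather than running in a single process as here.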
| Original language | English |
|---|---|
| Title of host publication | Mexican Conference on Human-Computer Interaction |
| Pages | 1-6 |
| Publication status | Published (in print/issue) - 29 Oct 2018 |
Keywords
- Social robots
- biosignals
- wearable sensors
- adaptive robot interfaces