Designing with Gaze: Tama – A gaze-aware smart speaker platform
Research output: Contribution to journal › Journal article (peer-reviewed)
Recent developments in gaze tracking present new opportunities for social computing. This paper presents a study of Tama, a gaze-actuated smart speaker. Tama was designed to take advantage of research on gaze in conversation: rather than being activated with a wake word (such as “Ok Google”), Tama detects the gaze of a user and moves an articulated ‘head’ to achieve mutual gaze. We tested Tama’s use in a multi-party conversation task, with users successfully activating and receiving a response to over 371 queries across 10 trials. When Tama worked well, there was no significant difference in the length of interactions. However, interactions with Tama had a higher rate of repeated queries, making interactions longer overall. Video analysis allows us to explain the problems users had when interacting with gaze. In the discussion, we describe implications for designing new gaze systems that use gaze as both input and output. We also discuss the relationship to anthropomorphic design and how such systems can take advantage of learned skills of interaction. Finally, two paths for future work are proposed: one in the field of speech agents, and the second in using human gaze as an interaction modality more widely.
Original language | English |
---|---|
Article number | 176 |
Journal | Proceedings of the ACM on Human-Computer Interaction |
Volume | 3 |
Issue number | CSCW |
ISSN | 2573-0142 |
DOIs | |
Publication status | Published - Nov 2019 |
Externally published | Yes |
Bibliographical note
Funding Information:
This work was supported by JSPS KAKENHI grant number 18H06473, Oki Electric Industry Co., Ltd., Vetenskapsrådet grant 2016-03843, and the Swedish Foundation for Strategic Research project RIT15-0046.
Publisher Copyright:
© 2019 Copyright held by the owner/author(s). Publication rights licensed to ACM.
Research areas
- Gaze Detection, Gaze Interaction, Smart Speaker, Voice Assistant