Designing with Gaze: Tama – A gaze-aware smart speaker platform
Research output: Contribution to journal › Journal article › Research › peer-review
Standard
Designing with Gaze: Tama – A gaze-aware smart speaker platform. / McMillan, Donald; Brown, Barry; Kawaguchi, Ikkaku; Jaber, Razan; Belenguer, Jordi Solsona; Kuzuoka, Hideaki.
In: Proceedings of the ACM on Human-Computer Interaction, Vol. 3, No. CSCW, 176, 11.2019.
RIS
TY - JOUR
T1 - Designing with Gaze
T2 - Tama – A gaze-aware smart speaker platform
AU - McMillan, Donald
AU - Brown, Barry
AU - Kawaguchi, Ikkaku
AU - Jaber, Razan
AU - Belenguer, Jordi Solsona
AU - Kuzuoka, Hideaki
N1 - Funding Information: This work was supported by JSPS KAKENHI grant number 18H06473, Oki Electric Industry Co., Ltd., Vetenskapsrådet grant 2016-03843, and the Swedish Foundation for Strategic Research project RIT15-0046. Publisher Copyright: © 2019 Copyright held by the owner/author(s). Publication rights licensed to ACM.
PY - 2019/11
Y1 - 2019/11
N2 - Recent developments in gaze tracking present new opportunities for social computing. This paper presents a study of Tama, a gaze-actuated smart speaker. Tama was designed to take advantage of research on gaze in conversation. Rather than being activated with a wake word (such as “Ok Google”), Tama detects the gaze of a user, moving an articulated ‘head’ to achieve mutual gaze. We tested Tama’s use in a multi-party conversation task, with users successfully activating and receiving a response to over 371 queries (over 10 trials). When Tama worked well, there was no significant difference in length of interaction. However, interactions with Tama had a higher rate of repeated queries, causing longer interactions overall. Video analysis lets us explain the problems users had interacting with gaze. In the discussion, we describe implications for designing new gaze systems, using gaze both as input and output. We also discuss the relationship to anthropomorphic design and how to take advantage of learned skills of interaction. Finally, two paths for future work are proposed, one in the field of speech agents, and the second in using human gaze as an interaction modality more widely.
AB - Recent developments in gaze tracking present new opportunities for social computing. This paper presents a study of Tama, a gaze-actuated smart speaker. Tama was designed to take advantage of research on gaze in conversation. Rather than being activated with a wake word (such as “Ok Google”), Tama detects the gaze of a user, moving an articulated ‘head’ to achieve mutual gaze. We tested Tama’s use in a multi-party conversation task, with users successfully activating and receiving a response to over 371 queries (over 10 trials). When Tama worked well, there was no significant difference in length of interaction. However, interactions with Tama had a higher rate of repeated queries, causing longer interactions overall. Video analysis lets us explain the problems users had interacting with gaze. In the discussion, we describe implications for designing new gaze systems, using gaze both as input and output. We also discuss the relationship to anthropomorphic design and how to take advantage of learned skills of interaction. Finally, two paths for future work are proposed, one in the field of speech agents, and the second in using human gaze as an interaction modality more widely.
KW - Gaze Detection
KW - Gaze Interaction
KW - Smart Speaker
KW - Voice Assistant
UR - http://www.scopus.com/inward/record.url?scp=85075061050&partnerID=8YFLogxK
U2 - 10.1145/3359278
DO - 10.1145/3359278
M3 - Journal article
AN - SCOPUS:85075061050
VL - 3
JO - Proceedings of the ACM on Human-Computer Interaction
JF - Proceedings of the ACM on Human-Computer Interaction
SN - 2573-0142
IS - CSCW
M1 - 176
ER -