In the Arms of a Robot: Designing Autonomous Hugging Robots with Intra-Hug Gestures

Publication: Contribution to journal › Journal article › Research › peer-reviewed

Standard

In the Arms of a Robot: Designing Autonomous Hugging Robots with Intra-Hug Gestures. / Block, Alexis E.; Seifi, Hasti; Hilliges, Otmar; Gassert, Roger; Kuchenbecker, Katherine J.

In: ACM Transactions on Human-Robot Interaction, Vol. 12, No. 2, 3526110, 2023.

Harvard

Block, AE, Seifi, H, Hilliges, O, Gassert, R & Kuchenbecker, KJ 2023, 'In the Arms of a Robot: Designing Autonomous Hugging Robots with Intra-Hug Gestures', ACM Transactions on Human-Robot Interaction, vol. 12, no. 2, 3526110. https://doi.org/10.1145/3526110

APA

Block, A. E., Seifi, H., Hilliges, O., Gassert, R., & Kuchenbecker, K. J. (2023). In the Arms of a Robot: Designing Autonomous Hugging Robots with Intra-Hug Gestures. ACM Transactions on Human-Robot Interaction, 12(2), [3526110]. https://doi.org/10.1145/3526110

Vancouver

Block AE, Seifi H, Hilliges O, Gassert R, Kuchenbecker KJ. In the Arms of a Robot: Designing Autonomous Hugging Robots with Intra-Hug Gestures. ACM Transactions on Human-Robot Interaction. 2023;12(2). 3526110. https://doi.org/10.1145/3526110

Author

Block, Alexis E. ; Seifi, Hasti ; Hilliges, Otmar ; Gassert, Roger ; Kuchenbecker, Katherine J. / In the Arms of a Robot: Designing Autonomous Hugging Robots with Intra-Hug Gestures. In: ACM Transactions on Human-Robot Interaction. 2023 ; Vol. 12, No. 2.

Bibtex

@article{e54dc1b490424be891085f6fb2604659,
title = "In the Arms of a Robot: Designing Autonomous Hugging Robots with Intra-Hug Gestures",
abstract = "Hugs are complex affective interactions that often include gestures like squeezes. We present six new guidelines for designing interactive hugging robots, which we validate through two studies with our custom robot. To achieve autonomy, we investigated robot responses to four human intra-hug gestures: holding, rubbing, patting, and squeezing. A total of 32 users each exchanged and rated 16 hugs with an experimenter-controlled HuggieBot 2.0. The microphone and pressure sensor in the robot's inflated torso collected data from the subjects' demonstrations, which we used to develop a perceptual algorithm that classifies user actions with 88% accuracy. Users enjoyed robot squeezes regardless of their performed action; they valued variety in the robot's responses, and they appreciated robot-initiated intra-hug gestures. From average user ratings, we created a probabilistic behavior algorithm that chooses robot responses in real time. We implemented improvements to the robot platform to create HuggieBot 3.0 and then validated its gesture perception system and behavior algorithm with 16 users. The robot's responses and proactive gestures were greatly enjoyed. Users found the robot more natural, enjoyable, and intelligent in the last phase of the experiment than in the first. After the study, they felt more understood by the robot and thought robots were nicer to hug.",
keywords = "Social-physical human-robot interaction, behavioral algorithm, haptic sensing, user study",
author = "Block, {Alexis E.} and Hasti Seifi and Otmar Hilliges and Roger Gassert and Kuchenbecker, {Katherine J.}",
note = "Funding Information: This work was partially supported by the Max Planck ETH Center for Learning Systems and the IEEE RAS Technical Committee on Haptics. Publisher Copyright: {\textcopyright} 2023 Copyright held by the owner/author(s).",
year = "2023",
doi = "10.1145/3526110",
language = "English",
volume = "12",
journal = "ACM Transactions on Human-Robot Interaction",
issn = "2573-9522",
publisher = "Association for Computing Machinery (ACM)",
number = "2",
}

RIS

TY - JOUR

T1 - In the Arms of a Robot

T2 - Designing Autonomous Hugging Robots with Intra-Hug Gestures

AU - Block, Alexis E.

AU - Seifi, Hasti

AU - Hilliges, Otmar

AU - Gassert, Roger

AU - Kuchenbecker, Katherine J.

N1 - Funding Information: This work was partially supported by the Max Planck ETH Center for Learning Systems and the IEEE RAS Technical Committee on Haptics. Publisher Copyright: © 2023 Copyright held by the owner/author(s).

PY - 2023

Y1 - 2023

N2 - Hugs are complex affective interactions that often include gestures like squeezes. We present six new guidelines for designing interactive hugging robots, which we validate through two studies with our custom robot. To achieve autonomy, we investigated robot responses to four human intra-hug gestures: holding, rubbing, patting, and squeezing. A total of 32 users each exchanged and rated 16 hugs with an experimenter-controlled HuggieBot 2.0. The microphone and pressure sensor in the robot's inflated torso collected data from the subjects' demonstrations, which we used to develop a perceptual algorithm that classifies user actions with 88% accuracy. Users enjoyed robot squeezes regardless of their performed action; they valued variety in the robot's responses, and they appreciated robot-initiated intra-hug gestures. From average user ratings, we created a probabilistic behavior algorithm that chooses robot responses in real time. We implemented improvements to the robot platform to create HuggieBot 3.0 and then validated its gesture perception system and behavior algorithm with 16 users. The robot's responses and proactive gestures were greatly enjoyed. Users found the robot more natural, enjoyable, and intelligent in the last phase of the experiment than in the first. After the study, they felt more understood by the robot and thought robots were nicer to hug.

AB - Hugs are complex affective interactions that often include gestures like squeezes. We present six new guidelines for designing interactive hugging robots, which we validate through two studies with our custom robot. To achieve autonomy, we investigated robot responses to four human intra-hug gestures: holding, rubbing, patting, and squeezing. A total of 32 users each exchanged and rated 16 hugs with an experimenter-controlled HuggieBot 2.0. The microphone and pressure sensor in the robot's inflated torso collected data from the subjects' demonstrations, which we used to develop a perceptual algorithm that classifies user actions with 88% accuracy. Users enjoyed robot squeezes regardless of their performed action; they valued variety in the robot's responses, and they appreciated robot-initiated intra-hug gestures. From average user ratings, we created a probabilistic behavior algorithm that chooses robot responses in real time. We implemented improvements to the robot platform to create HuggieBot 3.0 and then validated its gesture perception system and behavior algorithm with 16 users. The robot's responses and proactive gestures were greatly enjoyed. Users found the robot more natural, enjoyable, and intelligent in the last phase of the experiment than in the first. After the study, they felt more understood by the robot and thought robots were nicer to hug.

KW - Social-physical human-robot interaction

KW - behavioral algorithm

KW - haptic sensing

KW - user study

U2 - 10.1145/3526110

DO - 10.1145/3526110

M3 - Journal article

AN - SCOPUS:85164244102

VL - 12

JO - ACM Transactions on Human-Robot Interaction

JF - ACM Transactions on Human-Robot Interaction

SN - 2573-9522

IS - 2

M1 - 3526110

ER -

ID: 373668579