Iteratively Adapting Avatars using Task-Integrated Optimisation

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed

Virtual Reality allows users to embody avatars that do not match their real bodies. Earlier work has selected changes to the avatar arbitrarily, and it therefore remains unclear how to change avatars to improve users' performance. We propose a systematic approach for iteratively adapting the avatar, based on users' performance, so that it performs better for a given task. The approach is evaluated in a target selection task in which the forearms of the avatar are scaled to improve performance. A comparison between the optimised and real arm lengths shows a significant reduction in average tapping time of 18.7%, for forearms scaled to 5.6 times their real length. Additionally, with the adapted avatar, participants moved their real body and arms significantly less, and subjective measures show reduced physical demand and frustration. In a second study, we modify finger lengths for a linear tapping task to achieve a better-performing avatar, which demonstrates the generalisability of the approach.
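The abstract describes an iterative, performance-driven search over a single avatar parameter (here, forearm scale), repeatedly measuring users' task time and adjusting the scale. A minimal sketch of such a loop, using a golden-section search over a simulated tapping-time measurement — both the choice of search method and the `simulated` stand-in function are illustrative assumptions, not the paper's actual procedure:

```python
def optimise_scale(measure_time, lo=1.0, hi=8.0, iters=30):
    """Golden-section search: iteratively shrink the scale interval
    [lo, hi] toward the limb-scale factor with the lowest measured
    task completion time."""
    phi = (5 ** 0.5 - 1) / 2  # inverse golden ratio, ~0.618
    a, b = lo, hi
    c, d = b - phi * (b - a), a + phi * (b - a)
    fc, fd = measure_time(c), measure_time(d)
    for _ in range(iters):
        if fc < fd:
            # Minimum lies in [a, d]; reuse the inner probe c.
            b, d, fd = d, c, fc
            c = b - phi * (b - a)
            fc = measure_time(c)
        else:
            # Minimum lies in [c, b]; reuse the inner probe d.
            a, c, fc = c, d, fd
            d = a + phi * (b - a)
            fd = measure_time(d)
    return (a + b) / 2

# Hypothetical stand-in for real per-user measurements: a synthetic
# tapping-time curve with its minimum near a forearm scale of 5.6
# (the scale the study reports as best-performing).
simulated = lambda s: (s - 5.6) ** 2 + 1.0
best = optimise_scale(simulated)
```

In a real deployment each `measure_time` call would be a block of trials with the user wearing the adapted avatar, so the number of iterations is limited by session length; sample-efficient optimisers would then be preferable to this simple bracketing search.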
Original language: English
Title: Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology
Place of publication: https://dl.acm.org/doi/abs/10.1145/3379337.3415832
Publisher: Association for Computing Machinery
Publication date: 1 Oct 2020
Edition: 1
Pages: 709–721
ISBN (Print): 978-1-4503-7514-6
DOI
Status: Published - 1 Oct 2020
Event: 33rd Annual ACM Symposium on User Interface Software and Technology - UIST '20 - Virtual Event, USA
Duration: 20 Oct 2020 – 23 Oct 2020

Conference

Conference: 33rd Annual ACM Symposium on User Interface Software and Technology - UIST '20
Country: USA
City: Virtual Event
Period: 20/10/2020 – 23/10/2020