Iteratively Adapting Avatars using Task-Integrated Optimisation

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Virtual Reality allows users to embody avatars that do not match their real bodies. Earlier work selected changes to the avatar arbitrarily, so it remains unclear how to change avatars to improve users' performance. We propose a systematic approach for iteratively adapting the avatar to perform better for a given task based on users' performance. The approach is evaluated in a target selection task, where the forearms of the avatar are scaled to improve performance. A comparison between the optimised and real arm lengths shows a significant reduction in average tapping time of 18.7% for forearms scaled in length by a factor of 5.6. Additionally, with the adapted avatar, participants moved their real body and arms significantly less, and subjective measures show reduced physical demand and frustration. In a second study, we modify finger lengths for a linear tapping task to achieve a better-performing avatar, which demonstrates the generalisability of the approach.
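The abstract only outlines the idea of iteratively adapting an avatar based on measured performance; it does not specify the optimiser used in the paper. The sketch below is therefore purely illustrative: a simple hill-climbing loop over a forearm scale factor, where `measure_tapping_time` is a hypothetical stand-in for running one block of the VR target-selection task and returning the user's average tapping time. The fake objective inside it (with a minimum near the 5.6 scale reported in the abstract) exists only so the sketch runs on its own.

```python
# Illustrative sketch, not the paper's method: the optimiser and task interface
# are assumptions made for demonstration.
import random


def measure_tapping_time(scale: float) -> float:
    """Hypothetical placeholder: run one task block with the forearms scaled by
    `scale` and return the measured average tapping time (lower is better)."""
    # Fake, noisy objective with a minimum near scale ~ 5.6, only to make the
    # sketch runnable without a VR setup.
    return (scale - 5.6) ** 2 * 0.01 + 1.0 + random.gauss(0, 0.02)


def optimise_forearm_scale(start: float = 1.0, step: float = 0.5,
                           iterations: int = 20) -> float:
    """Iteratively adapt the forearm scale based on measured performance."""
    best_scale = start
    best_time = measure_tapping_time(best_scale)
    for _ in range(iterations):
        # Try a neighbouring scale; keep it if the user performs better.
        candidate = max(0.1, best_scale + random.choice([-step, step]))
        t = measure_tapping_time(candidate)
        if t < best_time:
            best_scale, best_time = candidate, t
        else:
            step *= 0.8  # shrink the search step when no improvement is found
    return best_scale


if __name__ == "__main__":
    print("optimised forearm scale:", optimise_forearm_scale())
```

In practice, each call to the objective corresponds to a participant completing a short block of trials, so a sample-efficient optimiser would be preferable to plain hill climbing; the loop structure, however, illustrates the task-integrated adaptation the abstract describes.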
Original language: English
Title of host publication: Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology
Publisher: Association for Computing Machinery
Publication date: 1 Oct 2020
Edition: 1
Pages: 709–721
ISBN (Print): 978-1-4503-7514-6
DOIs: 10.1145/3379337.3415832 (https://dl.acm.org/doi/abs/10.1145/3379337.3415832)
Publication status: Published - 1 Oct 2020
Event: 33rd Annual ACM Symposium on User Interface Software and Technology - UIST '20 - Virtual Event, United States
Duration: 20 Oct 2020 – 23 Oct 2020

Conference

Conference: 33rd Annual ACM Symposium on User Interface Software and Technology - UIST '20
Country: United States
City: Virtual Event
Period: 20/10/2020 – 23/10/2020
