Augmented reality views for occluded interaction

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Standard

Augmented reality views for occluded interaction. / Lilija, Klemen; Pohl, Henning; Boring, Sebastian; Hornbæk, Kasper.

CHI 2019 - Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, 2019. 446.
Harvard

Lilija, K, Pohl, H, Boring, S & Hornbæk, K 2019, Augmented reality views for occluded interaction. in CHI 2019 - Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems., 446, Association for Computing Machinery, 2019 CHI Conference on Human Factors in Computing Systems, CHI 2019, Glasgow, United Kingdom, 04/05/2019. https://doi.org/10.1145/3290605.3300676

APA

Lilija, K., Pohl, H., Boring, S., & Hornbæk, K. (2019). Augmented reality views for occluded interaction. In CHI 2019 - Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems [446]. Association for Computing Machinery. https://doi.org/10.1145/3290605.3300676

Vancouver

Lilija K, Pohl H, Boring S, Hornbæk K. Augmented reality views for occluded interaction. In CHI 2019 - Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery. 2019. 446. https://doi.org/10.1145/3290605.3300676

Author

Lilija, Klemen ; Pohl, Henning ; Boring, Sebastian ; Hornbæk, Kasper. / Augmented reality views for occluded interaction. CHI 2019 - Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, 2019.

Bibtex

@inproceedings{ee415d6fdcb74ae5bc3859c7d97822af,
title = "Augmented reality views for occluded interaction",
abstract = "We rely on our sight when manipulating objects. When objects are occluded, manipulation becomes difficult. Such occluded objects can be shown via augmented reality to re-enable visual guidance. However, it is unclear how to do so to best support object manipulation. We compare four views of occluded objects and their effect on performance and satisfaction across a set of everyday manipulation tasks of varying complexity. The best performing views were a see-through view and a displaced 3D view. The former enabled participants to observe the manipulated object through the occluder, while the latter showed the 3D view of the manipulated object offset from the object{\textquoteright}s real location. The worst performing view showed remote imagery from a simulated hand-mounted camera. Our results suggest that alignment of virtual objects with their real-world location is less important than an appropriate point-of-view and view stability.",
keywords = "Augmented reality, Finger-camera, Manipulation task",
author = "Klemen Lilija and Henning Pohl and Sebastian Boring and Kasper Hornb{\ae}k",
year = "2019",
doi = "10.1145/3290605.3300676",
language = "English",
booktitle = "CHI 2019 - Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems",
publisher = "Association for Computing Machinery",
note = "2019 CHI Conference on Human Factors in Computing Systems, CHI 2019 ; Conference date: 04-05-2019 Through 09-05-2019",
}

RIS

TY - GEN

T1 - Augmented reality views for occluded interaction

AU - Lilija, Klemen

AU - Pohl, Henning

AU - Boring, Sebastian

AU - Hornbæk, Kasper

PY - 2019

Y1 - 2019

N2 - We rely on our sight when manipulating objects. When objects are occluded, manipulation becomes difficult. Such occluded objects can be shown via augmented reality to re-enable visual guidance. However, it is unclear how to do so to best support object manipulation. We compare four views of occluded objects and their effect on performance and satisfaction across a set of everyday manipulation tasks of varying complexity. The best performing views were a see-through view and a displaced 3D view. The former enabled participants to observe the manipulated object through the occluder, while the latter showed the 3D view of the manipulated object offset from the object’s real location. The worst performing view showed remote imagery from a simulated hand-mounted camera. Our results suggest that alignment of virtual objects with their real-world location is less important than an appropriate point-of-view and view stability.

AB - We rely on our sight when manipulating objects. When objects are occluded, manipulation becomes difficult. Such occluded objects can be shown via augmented reality to re-enable visual guidance. However, it is unclear how to do so to best support object manipulation. We compare four views of occluded objects and their effect on performance and satisfaction across a set of everyday manipulation tasks of varying complexity. The best performing views were a see-through view and a displaced 3D view. The former enabled participants to observe the manipulated object through the occluder, while the latter showed the 3D view of the manipulated object offset from the object’s real location. The worst performing view showed remote imagery from a simulated hand-mounted camera. Our results suggest that alignment of virtual objects with their real-world location is less important than an appropriate point-of-view and view stability.

KW - Augmented reality

KW - Finger-camera

KW - Manipulation task

UR - http://www.scopus.com/inward/record.url?scp=85067633165&partnerID=8YFLogxK

U2 - 10.1145/3290605.3300676

DO - 10.1145/3290605.3300676

M3 - Article in proceedings

AN - SCOPUS:85067633165

BT - CHI 2019 - Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems

PB - Association for Computing Machinery

T2 - 2019 CHI Conference on Human Factors in Computing Systems, CHI 2019

Y2 - 4 May 2019 through 9 May 2019

ER -