OVRlap: Perceiving Multiple Locations Simultaneously to Improve Interaction in VR

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Standard

OVRlap: Perceiving Multiple Locations Simultaneously to Improve Interaction in VR. / Schjerlund, Jonas; Hornbæk, Kasper; Bergström, Joanna.

CHI 2022 - Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, 2022. p. 1-13, article 355.

Harvard

Schjerlund, J, Hornbæk, K & Bergström, J 2022, OVRlap: Perceiving Multiple Locations Simultaneously to Improve Interaction in VR. in CHI 2022 - Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems., 355, Association for Computing Machinery, pp. 1-13, 2022 CHI Conference on Human Factors in Computing Systems, CHI 2022, Virtual, Online, United States, 30/04/2022. https://doi.org/10.1145/3491102.3501873

APA

Schjerlund, J., Hornbæk, K., & Bergström, J. (2022). OVRlap: Perceiving Multiple Locations Simultaneously to Improve Interaction in VR. In CHI 2022 - Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems (pp. 1-13). [355] Association for Computing Machinery. https://doi.org/10.1145/3491102.3501873

Vancouver

Schjerlund J, Hornbæk K, Bergström J. OVRlap: Perceiving Multiple Locations Simultaneously to Improve Interaction in VR. In CHI 2022 - Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery. 2022. p. 1-13. Article 355. https://doi.org/10.1145/3491102.3501873

Author

Schjerlund, Jonas ; Hornbæk, Kasper ; Bergström, Joanna. / OVRlap: Perceiving Multiple Locations Simultaneously to Improve Interaction in VR. CHI 2022 - Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, 2022. pp. 1-13

Bibtex

@inproceedings{043976fe855843c08f65fb70aa956076,
title = "OVRlap: Perceiving Multiple Locations Simultaneously to Improve Interaction in VR",
abstract = "We introduce OVRlap, a VR interaction technique that lets the user perceive multiple places at the same time from a first-person perspective. OVRlap achieves this by overlapping viewpoints. At any time, only one viewpoint is active, meaning that the user may interact with objects therein. Objects seen from the active viewpoint are opaque, whereas objects seen from passive viewpoints are transparent. This allows users to perceive multiple locations at once and easily switch to the one in which they want to interact. We compare OVRlap and a single-viewpoint technique in a study where 20 participants complete object-collection and monitoring tasks. We find that in both tasks, participants are significantly faster and move their head significantly less with OVRlap. We propose how the technique might be improved through automated switching of the active viewpoint and intelligent viewpoint rendering.",
keywords = "interaction techniques, large environments, user studies, virtual reality",
author = "Jonas Schjerlund and Kasper Hornb{\ae}k and Joanna Bergstr{\"o}m",
note = "Publisher Copyright: {\textcopyright} 2022 ACM.; 2022 CHI Conference on Human Factors in Computing Systems, CHI 2022 ; Conference date: 30-04-2022 Through 05-05-2022",
year = "2022",
doi = "10.1145/3491102.3501873",
language = "English",
pages = "1--13",
booktitle = "CHI 2022 - Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems",
publisher = "Association for Computing Machinery",
}
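
The abstract describes OVRlap's core mechanic concretely enough to sketch: several locations are rendered on top of one another from a first-person perspective, objects at the single active viewpoint are drawn opaque, objects at passive viewpoints are drawn transparent, and the user switches which viewpoint is active. As a rough illustration of that rendering logic (not the authors' implementation), here is a minimal TypeScript sketch against a three.js scene graph; the OverlappingViewpoints class, its member names, and the one-group-per-location organization are all assumptions.

import * as THREE from 'three';

// Sketch of the overlapping-viewpoints rendering described in the abstract:
// each overlapped location keeps its objects in a THREE.Group, all groups
// are drawn in the same view, and only the active group is rendered opaque.
// Class and member names are illustrative, not taken from the paper.
class OverlappingViewpoints {
  private activeIndex = 0;

  constructor(
    private readonly viewpointGroups: THREE.Group[],
    private readonly passiveOpacity = 0.3, // transparency of passive viewpoints
  ) {
    this.applyOpacities();
  }

  // Switch which viewpoint is active (opaque and interactive).
  setActive(index: number): void {
    this.activeIndex = index % this.viewpointGroups.length;
    this.applyOpacities();
  }

  private applyOpacities(): void {
    this.viewpointGroups.forEach((group, i) => {
      const opacity = i === this.activeIndex ? 1.0 : this.passiveOpacity;
      group.traverse((obj) => {
        if (obj instanceof THREE.Mesh) {
          // Assumes one non-shared material per mesh; shared materials would
          // need cloning so alpha changes stay local to one viewpoint.
          const material = obj.material as THREE.MeshStandardMaterial;
          material.transparent = opacity < 1.0;
          material.opacity = opacity;
        }
      });
    });
  }
}

In the paper's study the switch is driven by VR input and only the active viewpoint accepts interaction; a controller binding would simply call the hypothetical setActive, e.g. viewpoints.setActive((current + 1) % groupCount) to cycle viewpoints.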

RIS

TY - GEN
T1 - OVRlap: Perceiving Multiple Locations Simultaneously to Improve Interaction in VR
T2 - 2022 CHI Conference on Human Factors in Computing Systems, CHI 2022
AU - Schjerlund, Jonas
AU - Hornbæk, Kasper
AU - Bergström, Joanna
N1 - Publisher Copyright: © 2022 ACM.
PY - 2022
Y1 - 2022
N2 - We introduce OVRlap, a VR interaction technique that lets the user perceive multiple places at the same time from a first-person perspective. OVRlap achieves this by overlapping viewpoints. At any time, only one viewpoint is active, meaning that the user may interact with objects therein. Objects seen from the active viewpoint are opaque, whereas objects seen from passive viewpoints are transparent. This allows users to perceive multiple locations at once and easily switch to the one in which they want to interact. We compare OVRlap and a single-viewpoint technique in a study where 20 participants complete object-collection and monitoring tasks. We find that in both tasks, participants are significantly faster and move their head significantly less with OVRlap. We propose how the technique might be improved through automated switching of the active viewpoint and intelligent viewpoint rendering.
AB - We introduce OVRlap, a VR interaction technique that lets the user perceive multiple places at the same time from a first-person perspective. OVRlap achieves this by overlapping viewpoints. At any time, only one viewpoint is active, meaning that the user may interact with objects therein. Objects seen from the active viewpoint are opaque, whereas objects seen from passive viewpoints are transparent. This allows users to perceive multiple locations at once and easily switch to the one in which they want to interact. We compare OVRlap and a single-viewpoint technique in a study where 20 participants complete object-collection and monitoring tasks. We find that in both tasks, participants are significantly faster and move their head significantly less with OVRlap. We propose how the technique might be improved through automated switching of the active viewpoint and intelligent viewpoint rendering.
KW - interaction techniques
KW - large environments
KW - user studies
KW - virtual reality
UR - http://www.scopus.com/inward/record.url?scp=85130531476&partnerID=8YFLogxK
U2 - 10.1145/3491102.3501873
DO - 10.1145/3491102.3501873
M3 - Article in proceedings
AN - SCOPUS:85130531476
SP - 1
EP - 13
BT - CHI 2022 - Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems
PB - Association for Computing Machinery
Y2 - 30 April 2022 through 5 May 2022
ER -
