GazeProjector: accurate gaze estimation and seamless gaze interaction across multiple displays

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Standard

GazeProjector: accurate gaze estimation and seamless gaze interaction across multiple displays. / Lander, Christian; Gehring, Sven; Krüger, Antonio; Boring, Sebastian; Bulling, Andreas.

UIST '15 Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology. Association for Computing Machinery, 2015. p. 395-404.

Harvard

Lander, C, Gehring, S, Krüger, A, Boring, S & Bulling, A 2015, GazeProjector: accurate gaze estimation and seamless gaze interaction across multiple displays. in UIST '15 Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology. Association for Computing Machinery, pp. 395-404, 28th Annual ACM Symposium on User Interface Software and Technology, UIST 2015, Charlotte, United States, 08/11/2015. https://doi.org/10.1145/2807442.2807479

APA

Lander, C., Gehring, S., Krüger, A., Boring, S., & Bulling, A. (2015). GazeProjector: accurate gaze estimation and seamless gaze interaction across multiple displays. In UIST '15 Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (pp. 395-404). Association for Computing Machinery. https://doi.org/10.1145/2807442.2807479

Vancouver

Lander C, Gehring S, Krüger A, Boring S, Bulling A. GazeProjector: accurate gaze estimation and seamless gaze interaction across multiple displays. In UIST '15 Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology. Association for Computing Machinery. 2015. p. 395-404 https://doi.org/10.1145/2807442.2807479

Author

Lander, Christian ; Gehring, Sven ; Krüger, Antonio ; Boring, Sebastian ; Bulling, Andreas. / GazeProjector: accurate gaze estimation and seamless gaze interaction across multiple displays. UIST '15 Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology. Association for Computing Machinery, 2015. pp. 395-404

Bibtex

@inproceedings{c289022d85e74d049ba8e55071279f1a,
title = "Gazeprojector: accurate gaze estimation and seamless gaze interaction across multiple displays",
abstract = "Mobile gaze-based interaction with multiple displays may occur from arbitrary positions and orientations. However, maintaining high gaze estimation accuracy in such situations remains a significant challenge. In this paper, we present GazeProjector, a system that combines (1) natural feature tracking on displays to determine the mobile eye tracker?s position relative to a display with (2) accurate point-of-gaze estimation. GazeProjector allows for seamless gaze estimation and interaction on multiple displays of arbitrary sizes independently of the user?s position and orientation to the display. In a user study with 12 participants we compare GazeProjector to established methods (here: visual on-screen markers and a state-of-the-art video-based motion capture system). We show that our approach is robust to varying head poses, orientations, and distances to the display, while still providing high gaze estimation accuracy across multiple displays without re-calibration for each variation. Our system represents an important step towards the vision of pervasive gaze-based interfaces.",
keywords = "Calibration, Eye tracking, Gaze estimation, Large displays, Multi-display environments, Natural feature tracking",
author = "Christian Lander and Sven Gehring and Antonio Kr{\"u}ger and Sebastian Boring and Andreas Bulling",
year = "2015",
doi = "10.1145/2807442.2807479",
language = "English",
pages = "395--404",
booktitle = "UIST '15 Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology",
publisher = "Association for Computing Machinery",
note = "28th Annual ACM Symposium on User Interface Software and Technology, UIST 2015 ; Conference date: 08-11-2015 Through 11-11-2015",

}

RIS

TY - GEN

T1 - GazeProjector: accurate gaze estimation and seamless gaze interaction across multiple displays

T2 - 28th Annual ACM Symposium on User Interface Software and Technology, UIST 2015

AU - Lander, Christian

AU - Gehring, Sven

AU - Krüger, Antonio

AU - Boring, Sebastian

AU - Bulling, Andreas

PY - 2015

Y1 - 2015

N2 - Mobile gaze-based interaction with multiple displays may occur from arbitrary positions and orientations. However, maintaining high gaze estimation accuracy in such situations remains a significant challenge. In this paper, we present GazeProjector, a system that combines (1) natural feature tracking on displays to determine the mobile eye tracker's position relative to a display with (2) accurate point-of-gaze estimation. GazeProjector allows for seamless gaze estimation and interaction on multiple displays of arbitrary sizes independently of the user's position and orientation to the display. In a user study with 12 participants we compare GazeProjector to established methods (here: visual on-screen markers and a state-of-the-art video-based motion capture system). We show that our approach is robust to varying head poses, orientations, and distances to the display, while still providing high gaze estimation accuracy across multiple displays without re-calibration for each variation. Our system represents an important step towards the vision of pervasive gaze-based interfaces.

AB - Mobile gaze-based interaction with multiple displays may occur from arbitrary positions and orientations. However, maintaining high gaze estimation accuracy in such situations remains a significant challenge. In this paper, we present GazeProjector, a system that combines (1) natural feature tracking on displays to determine the mobile eye tracker's position relative to a display with (2) accurate point-of-gaze estimation. GazeProjector allows for seamless gaze estimation and interaction on multiple displays of arbitrary sizes independently of the user's position and orientation to the display. In a user study with 12 participants we compare GazeProjector to established methods (here: visual on-screen markers and a state-of-the-art video-based motion capture system). We show that our approach is robust to varying head poses, orientations, and distances to the display, while still providing high gaze estimation accuracy across multiple displays without re-calibration for each variation. Our system represents an important step towards the vision of pervasive gaze-based interfaces.

KW - Calibration

KW - Eye tracking

KW - Gaze estimation

KW - Large displays

KW - Multi-display environments

KW - Natural feature tracking

U2 - 10.1145/2807442.2807479

DO - 10.1145/2807442.2807479

M3 - Article in proceedings

AN - SCOPUS:84959287354

SP - 395

EP - 404

BT - UIST '15 Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology

PB - Association for Computing Machinery

Y2 - 8 November 2015 through 11 November 2015

ER -
