GazeProjector: accurate gaze estimation and seamless gaze interaction across multiple displays
Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review
Standard
GazeProjector: accurate gaze estimation and seamless gaze interaction across multiple displays. / Lander, Christian; Gehring, Sven; Krüger, Antonio; Boring, Sebastian; Bulling, Andreas.
UIST '15 Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology. Association for Computing Machinery, 2015. p. 395-404.
Bibtex
@inproceedings{lander2015gazeprojector,
  title     = {{GazeProjector}: accurate gaze estimation and seamless gaze interaction across multiple displays},
  author    = {Lander, Christian and Gehring, Sven and Kr{\"u}ger, Antonio and Boring, Sebastian and Bulling, Andreas},
  booktitle = {UIST '15 Proceedings of the 28th Annual ACM Symposium on User Interface Software \& Technology},
  publisher = {Association for Computing Machinery},
  year      = {2015},
  pages     = {395--404},
  doi       = {10.1145/2807442.2807479}
}
RIS
TY - GEN
T1 - GazeProjector: accurate gaze estimation and seamless gaze interaction across multiple displays
T2 - 28th Annual ACM Symposium on User Interface Software and Technology, UIST 2015
AU - Lander, Christian
AU - Gehring, Sven
AU - Krüger, Antonio
AU - Boring, Sebastian
AU - Bulling, Andreas
PY - 2015
Y1 - 2015
N2 - Mobile gaze-based interaction with multiple displays may occur from arbitrary positions and orientations. However, maintaining high gaze estimation accuracy in such situations remains a significant challenge. In this paper, we present GazeProjector, a system that combines (1) natural feature tracking on displays to determine the mobile eye tracker's position relative to a display with (2) accurate point-of-gaze estimation. GazeProjector allows for seamless gaze estimation and interaction on multiple displays of arbitrary sizes independently of the user's position and orientation to the display. In a user study with 12 participants we compare GazeProjector to established methods (here: visual on-screen markers and a state-of-the-art video-based motion capture system). We show that our approach is robust to varying head poses, orientations, and distances to the display, while still providing high gaze estimation accuracy across multiple displays without re-calibration for each variation. Our system represents an important step towards the vision of pervasive gaze-based interfaces.
AB - Mobile gaze-based interaction with multiple displays may occur from arbitrary positions and orientations. However, maintaining high gaze estimation accuracy in such situations remains a significant challenge. In this paper, we present GazeProjector, a system that combines (1) natural feature tracking on displays to determine the mobile eye tracker's position relative to a display with (2) accurate point-of-gaze estimation. GazeProjector allows for seamless gaze estimation and interaction on multiple displays of arbitrary sizes independently of the user's position and orientation to the display. In a user study with 12 participants we compare GazeProjector to established methods (here: visual on-screen markers and a state-of-the-art video-based motion capture system). We show that our approach is robust to varying head poses, orientations, and distances to the display, while still providing high gaze estimation accuracy across multiple displays without re-calibration for each variation. Our system represents an important step towards the vision of pervasive gaze-based interfaces.
KW - Calibration
KW - Eye tracking
KW - Gaze estimation
KW - Large displays
KW - Multi-display environments
KW - Natural feature tracking
U2 - 10.1145/2807442.2807479
DO - 10.1145/2807442.2807479
M3 - Article in proceedings
AN - SCOPUS:84959287354
SP - 395
EP - 404
BT - UIST '15 Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology
PB - Association for Computing Machinery
Y2 - 8 November 2015 through 11 November 2015
ER -
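
As the abstract describes, GazeProjector combines natural feature tracking on displays with point-of-gaze estimation: gaze is first estimated in the head-mounted tracker's scene camera, then projected into the coordinate system of whichever display the user is looking at. Below is a minimal sketch of such a projection step, assuming OpenCV, a scene-camera frame, and a screenshot of the current display content; the names (project_gaze, scene_frame, display_content, gaze_px) are illustrative only, and this is not the authors' implementation.

import cv2
import numpy as np

def project_gaze(scene_frame, display_content, gaze_px):
    # Detect and match natural features between the scene-camera frame
    # and a screenshot of the content currently shown on the display.
    orb = cv2.ORB_create(nfeatures=1000)
    kp_s, des_s = orb.detectAndCompute(scene_frame, None)
    kp_d, des_d = orb.detectAndCompute(display_content, None)
    if des_s is None or des_d is None:
        return None  # not enough visual texture to match

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    pairs = matcher.knnMatch(des_s, des_d, k=2)
    # Lowe's ratio test keeps only distinctive correspondences.
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    if len(good) < 4:
        return None  # a homography needs at least 4 point pairs

    src = np.float32([kp_s[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_d[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None

    # Project the point of gaze (pixels in the scene camera) into
    # display coordinates using the estimated homography.
    pt = np.float32([[gaze_px]])  # shape (1, 1, 2)
    return cv2.perspectiveTransform(pt, H).reshape(2)

Because the mapping is recomputed from whatever content each display currently shows, the same eye-tracker calibration carries across displays of different sizes and across changing head poses, which is the property behind the abstract's claim of seamless interaction without per-display re-calibration.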