Generating Spatial Attention Cues via Illusory Motion
Research output: Contribution to conference › Paper › Research › peer-review
Standard
Generating Spatial Attention Cues via Illusory Motion. / Belongie, Serge; Jensen, Janus Nørtoft; Hannemose, Morten; Wilm, Jakob; Dahl, Anders Bjorholm; Frisvad, Jeppe Revall.
2019. Paper presented at Third Workshop on Computer Vision for AR/VR, Long Beach, United States.
RIS
TY - CONF
T1 - Generating Spatial Attention Cues via Illusory Motion
AU - Belongie, Serge
AU - Jensen, Janus Nørtoft
AU - Hannemose, Morten
AU - Wilm, Jakob
AU - Dahl, Anders Bjorholm
AU - Frisvad, Jeppe Revall
PY - 2019
Y1 - 2019
N2 - For many applications in augmented reality (AR), the user has a much more enjoyable experience if the AR system is able to properly guide the user’s attention. In this extended abstract, we explain how to create patterns of light that, when projected onto an object, are perceived as if the object itself is moving. This can be used as a spatial attention cue. We accomplish this with a calibrated projector-camera setup to synthesize an image from the projector’s point of view. This image is filtered to create local phase changes that are then projected back onto the object and perceived as motion. Our method will be shown as a live demonstration at the CV4AR/VR workshop at CVPR 2019.
AB - For many applications in augmented reality (AR), the user has a much more enjoyable experience if the AR system is able to properly guide the user’s attention. In this extended abstract, we explain how to create patterns of light that, when projected onto an object, are perceived as if the object itself is moving. This can be used as a spatial attention cue. We accomplish this with a calibrated projector-camera setup to synthesize an image from the projector’s point of view. This image is filtered to create local phase changes that are then projected back onto the object and perceived as motion. Our method will be shown as a live demonstration at the CV4AR/VR workshop at CVPR 2019.
M3 - Paper
T2 - Third Workshop on Computer Vision for AR/VR
Y2 - 17 June 2019 through 17 June 2019
ER -
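The abstract describes projecting patterns whose local phase changes over time so the illuminated object appears to move. As a rough, simplified illustration of that idea only (not the authors' implementation, which uses a calibrated projector-camera setup and filtering of a synthesized projector-view image), the sketch below generates frames of a 1-D sinusoidal projection pattern whose spatial phase oscillates over time; all function names are hypothetical.

```python
import math

def projection_pattern(width, wavelength, phase):
    """1-D sinusoidal light pattern (intensity in [0, 1]) with a given spatial phase."""
    return [0.5 + 0.5 * math.cos(2 * math.pi * x / wavelength + phase)
            for x in range(width)]

def phase_shifted_frames(width, wavelength, amplitude, n_frames):
    """Frames whose phase oscillates sinusoidally over time.

    Small, smoothly varying phase offsets between successive frames are
    perceived as motion of the underlying pattern (and, when projected onto
    an object, of the object itself).
    """
    frames = []
    for t in range(n_frames):
        phase = amplitude * math.sin(2 * math.pi * t / n_frames)
        frames.append(projection_pattern(width, wavelength, phase))
    return frames
```

Played back in sequence, the frames show the same pattern drifting back and forth; the paper's contribution is producing such phase changes in the projector's image space so they register correctly on the physical object.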