Mobility and large displays – University of Copenhagen


Picture from Touch Projector project

TouchProjector: In 1992, Tani et al. envisioned how users could interact with a distant real-world device through live video. Cameras observed industrial machinery, and users could manipulate mechanical switches and sliders remotely by clicking and dragging within the live video image with a mouse. This was made possible by mapping portions of the video frame to the corresponding parts of the remote hardware. The system was revolutionary in that it established a particularly direct type of affordance, in many ways similar to the affordance of direct touch.
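The frame-to-hardware mapping can be pictured as simple hit-testing: regions of the video image are registered against device controls, and a click inside a region selects the corresponding control. The sketch below is a hypothetical illustration of that idea; the `Control` class, region coordinates, and control names are invented for the example and are not from the original system.

```python
# Sketch: map click positions in a live-video frame to remote controls.
# All names and coordinates are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class Control:
    name: str      # a physical switch or slider on the remote machine
    x: int         # region of the video frame showing this control
    y: int
    width: int
    height: int

    def contains(self, px: int, py: int) -> bool:
        """True if the click (px, py) falls inside this control's region."""
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.height)

def control_at(controls, px, py):
    """Return the control whose video-frame region contains the click,
    or None if the click hits no registered region."""
    for c in controls:
        if c.contains(px, py):
            return c
    return None

controls = [
    Control("power_switch", 100, 50, 40, 40),
    Control("speed_slider", 200, 50, 20, 120),
]

hit = control_at(controls, 115, 70)
print(hit.name)  # -> power_switch
```

A real system would then forward the selected control and drag gesture to actuators on the machine; the sketch only covers the frame-to-control lookup.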

VirtualProjection: Portable projectors in mobile devices offer a promising way to overcome screen-space limitations on handhelds, navigate information, or augment reality. One of their appeals is the simplicity of interaction: aiming at a suitable surface projects the image, and changing the projector's posture and direction adjusts the image's position and orientation. This behavior is purely optical, so we can grasp it intuitively from our everyday experience of the physical world. However, strict adherence to the laws of physics also has drawbacks: the intensity of the projected light varies with the projector's distance to the surface, and the projected image is tightly coupled to the projector's movement.
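The distance drawback follows directly from geometry: the image width grows linearly with throw distance, so the image area grows with the square of the distance and the illuminance of the projection falls off with the inverse square. The sketch below models this with assumed parameters (throw ratio, lumen output, and aspect ratio are placeholder values, not specs of any particular device):

```python
def projection_geometry(distance_m, throw_ratio=1.5, lumens=300.0):
    """Illustrative optics model with assumed parameters:
    image width = distance / throw ratio, so area grows with distance^2
    and illuminance (lux) falls with the inverse square of distance."""
    width = distance_m / throw_ratio   # image width in meters
    height = width * 9 / 16            # assume a 16:9 aspect ratio
    area = width * height              # projected area in m^2
    lux = lumens / area                # lumens spread over that area
    return width, height, lux

for d in (1.0, 2.0, 4.0):
    w, h, lux = projection_geometry(d)
    print(f"{d:.0f} m: image {w:.2f} x {h:.2f} m, {lux:.0f} lx")
```

Doubling the distance quarters the illuminance, which is why a handheld projection quickly dims as the user steps back from the surface.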

Publication: Making public displays interactive everywhere (PDF)

Interacting with media facades (IRIS): More and more urban landscapes are equipped with media facades. The National Aquatics Center in Beijing, China, and the ARS Electronica Center in Linz, Austria, are two prominent examples out of hundreds of such facades. However, due to their size and the required viewing distance, interacting with them directly (e.g., by touching them) is normally impossible. Recent advances in mobile computing allow users to interact with such facades in several ways. Current approaches include controlling pointers on the facade's canvas or pushing content to it through multimedia messages.
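One way pointer control from a handheld can work is to map normalized touch positions on the phone screen onto the facade's canvas, which typically has a very coarse pixel grid (often one "pixel" per window or lighting element). The grid size below is a made-up example, not the resolution of any real facade:

```python
def touch_to_facade(nx, ny, facade_cols, facade_rows):
    """Map a normalized touch position (nx, ny in [0, 1]) on the phone
    screen to a cell of the facade's low-resolution pixel grid.
    Clamping keeps an edge touch (nx or ny == 1.0) inside the grid."""
    col = min(int(nx * facade_cols), facade_cols - 1)
    row = min(int(ny * facade_rows), facade_rows - 1)
    return col, row

# Assume a 60 x 20 element grid for illustration.
print(touch_to_facade(0.5, 0.5, 60, 20))   # -> (30, 10)
print(touch_to_facade(1.0, 1.0, 60, 20))   # -> (59, 19)
```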

GazeProjector (PDF): GazeProjector enables interaction by gaze independently of the user's position and orientation relative to a display, and across multiple displays. This had not been possible before, because eye trackers required a fixed position and orientation between the user's head and the display.
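Decoupling gaze from a fixed head-to-display pose amounts to continuously estimating a projective transformation (homography) from the eye tracker's scene-camera view to display coordinates and mapping each gaze point through it. The sketch below shows only that final mapping step; the matrix values are arbitrary placeholders, not calibrated data from the system:

```python
def apply_homography(H, x, y):
    """Map a gaze point (x, y) from scene-camera coordinates to display
    coordinates using a 3x3 homography H (projective transform).
    The result is obtained by dividing out the homogeneous coordinate w."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / w, yh / w

# Placeholder matrix: identity plus a translation of (100, 50) pixels.
H = [[1.0, 0.0, 100.0],
     [0.0, 1.0, 50.0],
     [0.0, 0.0, 1.0]]

print(apply_homography(H, 320.0, 240.0))   # -> (420.0, 290.0)
```

In practice the homography would be re-estimated as the user moves, e.g. by tracking visual features of the display in the scene-camera image.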

Picture from Virtual Projector project