Monday, September 26, 2011

Paper Reading #12: Enabling Beyond-Surface Interactions for Interactive Surface with an Invisible Projection

Li-Wei Chan, Hsiang-Tao, Hui-Shan Kao, Ju-Chun Ko, Home-Ru Lin, and Mike Y. Chen are graduate students at National Taiwan University.
Jane Hsu is a computer science professor at National Taiwan University.
Yi-Ping Hung is also a professor at National Taiwan University.

This paper was presented at UIST 2010.

Summary


Hypothesis
The researchers set out to show that interactions can occur beyond the surface of a touch display. By using an invisible infrared projection, users can employ other devices to interact with the scene.

Methods
In order to test the hypothesis, the researchers had to create their own custom table design. The table makes use of multiple IR cameras as well as a color projector coupled with an IR projector. The color projector displays the image that is seen by the human eye, while the IR projector simultaneously displays invisible markers that allow other devices to interact with the surface.
By providing this IR projection layer, mobile devices can easily determine their position and orientation relative to the table.
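The paper does not spell out the math in this summary, but a common way for a camera-equipped device to recover its pose relative to a flat surface is to estimate a planar homography from detected marker corners. This is a minimal sketch of that idea (not necessarily the authors' exact method), using the direct linear transform with NumPy; the marker layout and pixel coordinates below are made up for illustration:

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 homography mapping src -> dst via the DLT method.

    src, dst: arrays of shape (N, 2) with N >= 4 point correspondences.
    """
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The solution is the right-singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def project(H, pt):
    """Apply homography H to a 2-D point (homogeneous divide included)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Hypothetical corners of one IR marker in table coordinates.
table_pts = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
# Where the device's IR camera observed those corners (made-up pixels).
image_pts = np.array([[100, 100], [300, 120], [320, 300], [90, 280]], dtype=float)

H = homography_from_points(table_pts, image_pts)
center = project(H, (0.5, 0.5))  # where the table's center lands in the image
```

With the homography in hand, the device knows how any point on the table maps into its camera view, which is exactly the information a lamp, flashlight, or tablet needs to align its overlay with the surface.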

The researchers also provided three ways to interact with the table.
The first was the i-m-Lamp. The lamp has a pico projector and an IR camera attached to it. When the lamp is pointed at the table, the IR camera detects which region it is aimed at, and the pico projector overlays additional imagery on that part of the color display.
The i-m-Flashlight is very similar to the i-m-Lamp, but it is handheld, providing a more dynamic form of interaction (as opposed to the more stationary lamp).
The final interaction method was the i-m-View, a tablet with a camera that detects which part of the table it is looking at and displays a 3D view of the corresponding portion of the map.

Results


In initial trials, they found that users treated the lamp as a static object (as predicted) and used the flashlight to quickly select objects and display relevant information.
The flashlight was a much more dynamic object for the participants.
Participants found the view interesting, but it broke down when a user tried to look at the 3D buildings from certain angles: once the tablet's camera could no longer see the table, it lost track of what it was looking at.

Discussion


In my opinion, the system the researchers created could easily find a place in the field of augmented reality. In fact, the i-m-View was really an experiment in augmented reality.

I think an interesting direction for this kind of technology would be to let mobile phones interact with the large table as well.
I envision a board game (or something similar) where players share the same table view, but each player's mobile device provides a unique, private view based on where they are looking at the board.
This kind of technology has many other applications as well. For example, imagine looking at a large phone directory on your table surface. By hovering your phone over the table, it could detect which name you were looking at and present a "Call Number?" message.
