Tuesday, October 25, 2011

Paper Reading #23: User-Defined Motion Gestures for Mobile Interaction

Jaime Ruiz is a PhD student in HCI at the University of Waterloo.
Yang Li is a senior research scientist at Google.
Edward Lank is an assistant professor of computer science at the University of Waterloo.

This paper was presented at CHI 2011.

Summary


Hypothesis
The researchers hypothesized that some motion gesture sets are more natural than others for mobile interaction, and that end users themselves are a good source for discovering them.

Methods
In this experiment, the researchers let the participants themselves create the gestures to be investigated. The authors compiled a list of tasks of several different types; each task fell into either an "action" category or a "navigation" category, which were then subdivided into smaller tasks.

Participants were given the full list of tasks along with a smartphone. They were instructed to complete each simple task (such as pretending to answer a call or returning to the home screen) under the assumption that the phone would know how to execute whatever gesture they performed.

The phone was equipped with specially designed software to record and keep track of the gestures each participant performed.

After the test, each participant was asked to comment on the gestures they had used: whether they were easy to perform and a good match for their intended use.



Results
Across the tests, the researchers discovered that many users independently preferred and performed the same gestures.

The authors found several trends in how participants designed motion gestures. Many of the gestures mimicked normal phone use; for example, 17 out of 20 participants answered the phone by putting it to their ear. The gestures also drew on real-world metaphors.

From these results, the authors analyzed the user-created gestures and derived a taxonomy of these natural gestures.

Discussion

Research like this, in my opinion, should occur more often. By giving participants free rein over their actions, they can come up with whatever input or gestures feel natural to them. After analyzing these gestures, we can figure out which ones are the most natural.

By running more studies like this, we can find out which types of input users actually prefer in a system. For example, do most people prefer to execute gesture XYZ to access the menu, or do they prefer gesture ABC instead?
