Sensing Foot Gestures from the Pocket
Jeremy Scott is currently a graduate student at MIT but previously earned his bachelor of science at the University of Toronto.
David Dearman is a PhD student at the University of Toronto focusing on HCI.
Koji Yatani is also currently a PhD student at the University of Toronto with an interest in HCI.
Khai N. Truong is an associate professor at the University of Toronto involved in HCI.
This paper was presented at UIST 2010.
Summary
Hypothesis
The main hypothesis of this paper was that foot gestures can provide a means of eyes-free input, with the gestures recognized by a mobile phone carried in the user's pocket.
Methods
The researchers tested their hypothesis by completing two studies.
In the first study, they had participants make various foot gestures that were recorded and analyzed with a motion-capture camera system. The four gestures they tested were dorsiflexion, plantar flexion, heel rotation, and toe rotation. Participants attempted to select a target by performing the specified gesture at a given rotation angle.
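To make the target-selection task concrete, here is a minimal sketch (my own illustration, not code from the paper) of how a measured rotation angle could be mapped to one of several discrete targets laid out across an angular range. The target count and range are assumed values for the example.

```python
def select_target(angle_deg, n_targets=4, range_deg=60.0):
    """Divide [0, range_deg) into n_targets equal angular bins and
    return the index of the bin containing angle_deg, or None if the
    angle falls outside the selectable range."""
    if not 0 <= angle_deg < range_deg:
        return None
    bin_width = range_deg / n_targets  # 15 degrees per target here
    return int(angle_deg // bin_width)

print(select_target(25.0))  # 25 degrees falls in the second 15-degree bin: 1
```

With equal-width bins like this, larger rotation angles leave the same angular tolerance per target, which is consistent with the study's finding (below) that selection accuracy varied with rotation angle.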
In the second study, they used a mobile device and its accelerometer to detect foot gestures. Each participant wore three mobile devices: one in a front pocket, one in a side pocket, and one in a back pocket. The user would initiate a foot gesture by "double tapping" their foot before performing the gesture. The two gestures they focused on were heel rotation and plantar flexion.
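The "double tap" delimiter described above can be sketched as a simple spike detector on the accelerometer magnitude signal: two sharp peaks within a short time window signal the start of a gesture. This is an illustrative sketch only; the thresholds, sampling rate, and timing windows are assumptions, not the authors' actual recognition pipeline.

```python
def detect_double_tap(magnitudes, rate_hz=100, threshold=2.5,
                      min_gap_s=0.05, max_gap_s=0.5):
    """Return True if two accelerometer-magnitude spikes above
    `threshold` (in g) occur between min_gap_s and max_gap_s apart.
    All parameter values are illustrative assumptions."""
    spike_times = []
    for i, mag in enumerate(magnitudes):
        if mag > threshold:
            t = i / rate_hz
            # Debounce: ignore samples belonging to the same spike.
            if not spike_times or t - spike_times[-1] > min_gap_s:
                spike_times.append(t)
    # Check consecutive spikes for a valid double-tap interval.
    for a, b in zip(spike_times, spike_times[1:]):
        if min_gap_s < b - a <= max_gap_s:
            return True
    return False

# One second of quiet signal (~1 g) with two taps 0.2 s apart at 100 Hz.
signal = [1.0] * 100
signal[20] = 3.0   # first tap
signal[40] = 3.1   # second tap
print(detect_double_tap(signal))  # True
```

A delimiter like this helps reject incidental leg movement: the recognizer only has to classify motion that follows a deliberate double tap.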
Results
In the first study, they found that target selection was generally more accurate at smaller rotation angles. They also found that participants preferred heel rotation because it was more comfortable.
In the second study, they found that the phone in the side pocket generally gave much higher recognition rates. Across all gestures, the side-pocket placement of the phone gave an accuracy rate of 85.7%.
Based on this high accuracy, they concluded that foot gestures are indeed a viable input technique for eyes-free interaction, and they outlined directions for future studies using foot gestures.
Discussion
While this is mainly a reiteration of a point they discussed in the paper, having a foot sensor that could control your phone would be very helpful. For example, if you're sitting in class or standing around having a conversation and you have an incoming call, it'd be very helpful to automatically forward the call to voicemail by making a discreet foot gesture.
A natural extension of this study would be to remove the phone from the detection system: for example, a small, unobtrusive sensor that the user could put in their shoe and that would interface with the phone wirelessly. That way, one might achieve even higher success rates, since the sensor would be right at the foot as opposed to on the hip as in the study. Having the phone act as the sensor is very interesting, but a simple wireless sensor wouldn't be obtrusive or complicated either.