Abstract

We present a novel machine-learning-based algorithm that extends the interaction space around mobile devices. The technique uses only the RGB camera now commonplace on off-the-shelf mobile devices. Our algorithm robustly recognizes a wide range of in-air gestures and is robust to user variation and varying lighting conditions. We demonstrate that our algorithm runs in real time on unmodified mobile devices, including resource-constrained smartphones and smartwatches.
Our goal is not to replace the touchscreen as the primary input device, but rather to augment and enrich the existing interaction vocabulary using gestures. While touch input works well for many scenarios, we demonstrate numerous interaction tasks, such as mode switches, application and task management, menu selection, and certain types of navigation, where touch input can be either complemented or better served by in-air gestures. This alleviates screen real-estate issues on small touchscreens and expands input into the 3D space around the device. We present results for recognition accuracy (93% test, 98% train) and the impact of memory footprint and other model parameters. Finally, we report results from preliminary user evaluations, discuss advantages and limitations, and conclude with directions for future work.


Dataset

By popular request, we have released the training and test data used in this project. The data consists of 7 static gesture classes (6 meaningful gestures and 1 no-gesture class). Each gesture is provided in three forms:

  • RGB: RGB images (raw data; no segmentation)
  • Clean segmentation: Binary images of clean hand segmentation
  • Noisy segmentation: Binary images of the segmented hand with artificial segmentation noise

The data covers variations in both hand size and hand orientation and is hence quite challenging to recognize. There are two sets: one for training and one for testing.
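
For readers who want to experiment with the data, the following Python sketch shows one way to load a split and train a simple baseline classifier on the segmentation masks. The directory layout (dataset/{train,test}/<class>/clean/*.png), the 32x32 downsampling, and the random-forest baseline are our own illustrative assumptions, not the pipeline from the paper; adapt the paths to however you unpack the archives.

  # Minimal sketch of loading the released data and training a simple
  # baseline classifier. Directory layout, image size, file extension and
  # classifier choice are assumptions for illustration only.
  from pathlib import Path

  import numpy as np
  from PIL import Image
  from sklearn.ensemble import RandomForestClassifier

  def load_split(root, modality="clean", size=(32, 32)):
      """Load one split as flattened binary masks plus integer class labels."""
      X, y = [], []
      for label, class_dir in enumerate(sorted(Path(root).iterdir())):
          for img_path in sorted((class_dir / modality).glob("*.png")):
              img = Image.open(img_path).convert("L").resize(size)
              # Binarize the mask and flatten it into a feature vector.
              X.append((np.asarray(img) > 127).astype(np.float32).ravel())
              y.append(label)
      return np.stack(X), np.array(y)

  X_train, y_train = load_split("dataset/train")
  X_test, y_test = load_split("dataset/test")

  clf = RandomForestClassifier(n_estimators=100, random_state=0)
  clf.fit(X_train, y_train)
  print("test accuracy: %.3f" % clf.score(X_test, y_test))

Training on the noisy segmentations instead (modality="noisy") is a simple way to probe robustness to segmentation errors, mirroring the clean/noisy distinction in the list above.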


Published at

ACM Symposium on User Interface Software and Technology (UIST), 2014

Bibtex

@inproceedings{Song:2014:InAirGestures,
  author    = {Song, Jie and Soros, Gabor and Pece, Fabrizio and Fanello, Sean Ryan and Izadi, Shahram and Keskin, Cem and Hilliges, Otmar},
  title     = {{In-air Gestures Around Unmodified Mobile Devices}},
  booktitle = {{Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology}},
  series    = {{UIST '14}},
  year      = {2014},
  publisher = {ACM},
  url       = {http://doi.acm.org/10.1145/2642918.2647373},
}