Gesture Based Computing
What is Gesture Based Computing?
Gesture-based computing is computing controlled by human gestures rather than by devices such as a controller, keyboard, or mouse. Most people are familiar with systems like the Wii, where a handheld controller translates physical movement into interactions on a digital screen. Microsoft will release its gesture-based Kinect system for the Xbox 360 in November 2010; Kinect lets users interact with games and content using only body motion and gestures, and its full-body tracking makes the user the controlling device. Devices like the iPhone and iPad similarly allow a user to manipulate on-screen objects with simple finger movements. Gesture control makes interaction simpler and more natural. In teaching and learning, multi-touch displays and gesture-based applications engage learners in new ways, allow for simulations that mirror the real world, and support collaborative interaction. It puts learning directly into the hands of the user.
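As a concrete illustration of the finger-movement interaction described above, the sketch below shows one common way a simple swipe gesture can be recognized from raw browser touch events. This is a minimal example only, not drawn from the Horizon Report: the element id and the 50-pixel threshold are assumptions chosen for the illustration.

```typescript
// Minimal sketch: classifying a horizontal swipe from raw touch input in a browser.
// The element id "activity-canvas" and the 50px threshold are illustrative assumptions.
const SWIPE_THRESHOLD_PX = 50;
let startX = 0;
let startY = 0;

const target = document.getElementById("activity-canvas");
if (target) {
  target.addEventListener("touchstart", (e: TouchEvent) => {
    // Record where the finger first touches the screen.
    startX = e.touches[0].clientX;
    startY = e.touches[0].clientY;
  });

  target.addEventListener("touchend", (e: TouchEvent) => {
    // Compare the lift-off point with the start point to classify the gesture.
    const dx = e.changedTouches[0].clientX - startX;
    const dy = e.changedTouches[0].clientY - startY;
    if (Math.abs(dx) > SWIPE_THRESHOLD_PX && Math.abs(dx) > Math.abs(dy)) {
      console.log(dx > 0 ? "swipe right" : "swipe left");
    }
  });
}
```

In practice, an educational application would replace the console message with an action on the learning content, such as turning a page or moving an on-screen object.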
According to the 2010 Horizon Report, examples of gesture-based computing include:
- Researchers at Georgia Tech have developed gesture-based games to help deaf children learn linguistic skills.
- The SixthSense project from the MIT Media Lab provides a gesture interface that overlays digital information onto real-world objects and spaces.
- Wii-based medical training: after discovering the significant (48%) improvement in dexterity that surgeons-in-training gained from playing the Wii, researchers are developing a set of Wii-based medical training materials.
Gesture Based Example
Next up, see Visual Data Analysis