Gesture Recognition on Google Glass

Google Glass handles voice recognition and touch pretty well, but it has no gesture control at all. You'd think a device like Google Glass would ship with gesture recognition, and I'm sure its absence has plenty of geeks out there disappointed. But don't get too down in the dumps, because gesture recognition on Google Glass might not be too far off. Yup, you read that correctly.

A company in Oregon called "OnTheGo Platforms" is looking to bring gesture recognition to Google Glass in the form of an SDK. The SDK will let developers integrate gesture recognition into apps made for Glass and other Android-based smart glasses.

In the YouTube video below, you can see Alexis (from Engadget) demoing a photo-snapping and gallery app built on the SDK. The app/SDK can recognize swipes from the left and right, a closed fist, and an open hand.
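The SDK is still alpha-only, so there's no public API to quote, but the idea of mapping a small set of recognized gestures to app actions can be sketched roughly like this. Everything below (class names, `Gesture`, `on`, `dispatch`) is hypothetical and made up for illustration, not the actual OnTheGo Platforms API:

```java
import java.util.EnumMap;
import java.util.Map;

// Hypothetical sketch of a gesture-to-action dispatcher; NOT the real OnTheGo SDK.
public class GestureDemo {
    // The four gestures the demo app reportedly recognizes.
    public enum Gesture { SWIPE_LEFT, SWIPE_RIGHT, CLOSED_FIST, OPEN_HAND }

    // Map each recognized gesture to an app action name.
    private final Map<Gesture, String> handlers = new EnumMap<>(Gesture.class);

    // Register an action for a gesture.
    public void on(Gesture g, String action) {
        handlers.put(g, action);
    }

    // Look up and return the action for a detected gesture, or "ignored".
    public String dispatch(Gesture g) {
        return handlers.getOrDefault(g, "ignored");
    }

    public static void main(String[] args) {
        GestureDemo demo = new GestureDemo();
        // A photo app might wire the gestures up like this:
        demo.on(Gesture.CLOSED_FIST, "snap photo");
        demo.on(Gesture.SWIPE_RIGHT, "next photo in gallery");
        demo.on(Gesture.SWIPE_LEFT, "previous photo in gallery");

        System.out.println(demo.dispatch(Gesture.CLOSED_FIST));
        System.out.println(demo.dispatch(Gesture.OPEN_HAND));
    }
}
```

The real SDK would of course be feeding gestures in from the camera rather than from code, but the shape of the app-side logic would likely look something like this.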

As you can imagine, the SDK still needs refining, and it will be a while before it's polished. Right now, gestures have to be performed slowly for Google Glass to pick them up, which isn't ideal for real-world use. I mean, clenching a fist in front of your face for more than a second is a bit weird. If you're interested in helping the company polish the SDK, you can contact them for alpha access. Or you could just wait a few months for the beta release.

What Do You Think About Gesture Recognition?

Google Glass is pretty awkward to use in public as it is, and I'm sure gesture recognition will only make it more awkward. Still, I think gesture recognition would be pretty dang handy. What do you think? Would you ever see yourself clenching a fist to take a picture, or would you rather just press the button? In the current state of the SDK and that app, pressing the button would probably be faster, but still. Would you ever clench your fist to get a good selfie?