Gesture-based interaction: the future of input technology?

Can gesture-based input methods emulate or even surpass the connections we make with touch input? What sort of applications will gesture control bring? Read on, as we take a three-dimensional adventure into the world of gesture-based interaction.
Published on May 10, 2013

Samsung Galaxy S4 – Air View

The way we interact with our devices determines more than what we can do with them. Smartphones became popular not just because of their ability to connect us with the rest of the world, but also because of the connections we built with them, thanks primarily to touch input. Just as the mouse accelerated personal computing, touch input accelerated the growth of smartphones and tablets. When Steve Jobs walked up to announce the iPhone, he spoke of the connection we were about to experience thanks to touch.

“We’re going to use the best pointing device in the world. We’re going to use a pointing device that we’re all born with – born with ten of them. We’re going to use our fingers.” – Steve Jobs

He was right, too: you’re more likely to use a device that you feel connected to than one that makes you feel alienated. A keyboard and mouse can feel disconnected, but physically touching the screen of your phone brought a new kind of connection to the table. Apple may not have been the first company to use touch as an input method, but it was certainly the most successful. Fast-forward to 2013 and a new trend is emerging in the form of gesture-based interaction, showing just how quickly technology adapts and changes.

But can gesture-based input methods emulate or even surpass the connections we feel with touch input? What sort of applications will gesture control bring? Read on, as we take a three-dimensional adventure into the world of gesture-based interaction.


Kinect

Possibly the most popular form of gesture-based input is Kinect. It’s easy to forget that Kinect is only a little over two years old, and in that short period of time Microsoft has sold over 24 million units.

Kinect utilises an RGB camera, a depth sensor and a multi-array microphone, allowing it to provide full-body 3D motion capture, as well as voice and facial recognition. For a good look at just how Kinect works, check out the video below:

Kinect has some wonderful applications away from gaming and Skyping, especially in the field of medicine. Researchers at the University of Minnesota have used Kinect to measure symptoms of disorders such as autism and obsessive-compulsive disorder in children. Kinect’s potential is sure to expand as more developers jump on board, and with the Xbox 720 coming soon, the Kinect 2 may just be on the way too.
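To give a rough sense of how full-body tracking becomes something an app can actually react to, here’s a minimal Python sketch of a simple swipe detector working on skeleton joint positions. The get_skeleton_frame() helper and the "hand_right" joint name are hypothetical stand-ins for whatever SDK supplies the tracking data; this isn’t Microsoft’s actual Kinect API, just an illustration of the idea.

    # Hypothetical sketch: detecting a horizontal "swipe" from skeleton joint data.
    # get_skeleton_frame() stands in for an SDK call returning joint positions in
    # metres; it is NOT the real Kinect API.
    import time

    SWIPE_DISTANCE = 0.40   # hand must travel 40 cm horizontally...
    SWIPE_WINDOW = 0.5      # ...within half a second

    def detect_swipe(get_skeleton_frame):
        """Yield 'left' or 'right' whenever the right hand sweeps across the body."""
        history = []  # (timestamp, hand_x) samples
        while True:
            frame = get_skeleton_frame()      # assumed: dict of joint name -> (x, y, z)
            if frame is None:
                continue
            x, _, _ = frame["hand_right"]
            now = time.time()
            history.append((now, x))
            # Keep only samples from the last SWIPE_WINDOW seconds.
            history = [(t, hx) for (t, hx) in history if now - t <= SWIPE_WINDOW]
            if len(history) >= 2:
                travel = history[-1][1] - history[0][1]
                if travel > SWIPE_DISTANCE:
                    history = []
                    yield "right"
                elif travel < -SWIPE_DISTANCE:
                    history = []
                    yield "left"

In practice, each yielded swipe would be wired up to whatever the app is doing, whether that’s turning a page or skipping a song.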

Leap Motion

Smaller than an iPhone and thinner than a MacBook Air, Leap Motion is a nifty gesture-based device that plugs into your PC via USB and attempts to drag the desktop into the 21st century.

Using hand gestures, you can control your PC much as you would with a touchscreen or a mouse, but what sets this product apart is that the gestures are modelled on actions we perform in everyday life. If you’re in the mood to transform your room into Hogwarts, check out the demonstration video below:

Mum’s the word when it comes to the exact technology embedded within the Leap controller, but the developers will tell us that it can track in-air movements down to 1/100th of a millimeter, which the company claims makes it around 200 times more sensitive than Kinect. WOW!
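For a feel of what that precision looks like from a developer’s chair, here’s a small Python sketch loosely modelled on the bindings Leap shipped with its SDK at the time. The names used here (Leap.Controller, frame.hands, palm_position) are my best recollection of that API rather than anything taken from the article, so treat them as assumptions.

    # Rough sketch loosely modelled on the Leap Motion SDK's Python bindings of the
    # era; the exact class and property names are assumptions.
    import time
    import Leap  # Leap Motion bindings, assumed to be on the Python path

    def watch_hands(poll_interval=0.05):
        """Poll the controller and print each visible hand's palm position (in mm)."""
        controller = Leap.Controller()
        while True:
            frame = controller.frame()        # latest tracking frame
            for hand in frame.hands:
                pos = hand.palm_position      # Leap reports positions in millimetres
                print("hand at x=%.2f  y=%.2f  z=%.2f mm" % (pos.x, pos.y, pos.z))
            time.sleep(poll_interval)

    if __name__ == "__main__":
        watch_hands()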

Leap Motion also has a few big names signed up to use its technology, with ASUS and HP pairing up with the company to bundle it with their PCs. The company also plans to bring the technology to tablets and phones, so I’m definitely holding my breath. With its ability to sense multiple fingers, hands and objects, Leap has an incredible future ahead of it. The possible applications are endless; from the boardroom table to the emergency room, the future is bright.

Samsung and all the S-(insert name here) stuff

Samsung has shown an incredible amount of interest in gesture-based interaction, beginning with the Samsung Galaxy S2 and becoming an ever-present feature in the Galaxy S3 and Galaxy S4. Gesture input was even ported to Samsung’s Smart TV lineup. What began as a simple “turn to mute” gesture turned into a full-blown fixation on gesture-based interfaces, one that reached new heights when the S4 was announced.

If the plethora of camera features weren’t enough to satisfy your insatiable hunger, then the overabundance of ways to control the Galaxy S4 was sure to calm your senses. The features from the S2 and S3 remained, but they were taken to new levels with “Air View” and “Air Gesture”, proving that you don’t even have to touch your phone to interact with it. Perfect for those countless times you’ve had suntan lotion or juicy rib sauce slathered on your fingers. Check out Samsung’s Galaxy S4 advertisement below if you’re not yet convinced that “Air Gestures” are the future of mobile interaction.

The technology game is a fast-moving business and Samsung isn’t resting on its laurels: it has already begun developing a method of interaction that uses nothing but your mind. This could help people with disabilities interact with their phones more easily and give them better access to the internet. If you want to learn more about how Samsung plans to transform us all into Professor X, check out the full article here.

SixthSense

A major difference between SixthSense and other gesture-based technologies is that its goal is to merge the physical and digital worlds into one. What began as a simple contraption using nothing but the rollers from a mouse and some pulleys has transformed into a neck-worn pendant, complete with a projector and a camera.

SixthSense allows you to convert a paper map into a digital one, transform a piece of paper into a tablet, and pull information off pieces of paper and into your computer. Through gestures, SixthSense can take photos, zoom in or pan on a map, and even transform your wrist into an analog watch. Check out founder Pranav Mistry’s TED talk for a complete look at this fascinating technology.
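The published SixthSense prototype tracked coloured marker caps worn on the user’s fingertips with its camera. A bare-bones sketch of that style of colour tracking, using OpenCV in Python, might look like the following; the HSV colour range and camera index are assumptions made for illustration, not values from the actual project.

    # Minimal sketch of SixthSense-style fingertip tracking: find a brightly
    # coloured marker in each webcam frame by HSV thresholding.
    # The HSV range below (a red-ish marker) and camera index 0 are assumptions.
    import cv2
    import numpy as np

    LOWER_HSV = np.array([0, 120, 120])
    UPPER_HSV = np.array([10, 255, 255])

    def track_marker():
        cap = cv2.VideoCapture(0)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
            mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            if contours:
                # Centroid of the largest blob = estimated fingertip position.
                largest = max(contours, key=cv2.contourArea)
                m = cv2.moments(largest)
                if m["m00"] > 0:
                    cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
                    cv2.circle(frame, (cx, cy), 8, (0, 255, 0), 2)
            cv2.imshow("marker", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
        cap.release()
        cv2.destroyAllWindows()

    if __name__ == "__main__":
        track_marker()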

Wrap up

Gesture-based interaction is here to stay. Interacting with your devices in 3D space is a special, almost surreal kind of feeling, and the applications for gesture-based input are limitless. With brilliant contraptions like Leap Motion and SixthSense, the future looks dazzlingly bright for gesture-based input.

Do you ever use Kinect on your Xbox? How about “Air Gestures” on your Galaxy S4? Interested in Leap and SixthSense? Let us know in the comments below.