Is this the future of multi-touch on mobile?
Multi-touch was a true breakthrough for mobile, and for smartphones in particular. But beyond the familiar swipe, pinch, zoom, and rotate gestures, some think we’ve barely begun to tap the massive potential multi-touch has to offer. Take TouchTools, a multi-touch system that lets you pull up all manner of virtual on-screen tools with intuitive gestures.
Mountain View-based Qeexo developed the software to enhance existing hardware. That means these gestures work on the current crop of devices and don’t require expensive upgrades or brand new technology. That alone is a big deal for manufacturers (and for users, for that matter). But it’s the awesomeness of the tools themselves that really makes TouchTools stand out.
TouchTools uses a range of natural gestures, typically modeled on how you would grip the object in real life. The software layer recognizes the touchscreen gesture and brings up the relevant on-screen tool. Tools include handy items like a ruler or tape measure, a magnifying glass, a camera, even a mouse or an eraser. It’s pretty wild to watch, mostly because it works so well it looks like the on-screen element is actually there in the user’s hands.
But it’s not as simple as recognizing where the fingers land on the screen. TouchTools uses machine learning to understand the orientation of the hand and the rotation of the fingertips, so it can accurately recognize each gesture. In case this all sounds like pie-in-the-sky stuff, Qeexo is also the company behind FingerSense, the technology that powers Huawei’s Knuckle Sense gestures.
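Qeexo hasn’t published how TouchTools actually works under the hood, but the general idea of turning a set of touch contacts into pose features and matching them against learned gesture templates can be sketched in a few lines. Everything below is illustrative: the feature choices, tool names, and template values are made up for the example, not taken from Qeexo.

```python
import math

def features(points):
    """Reduce a set of (x, y) touch contacts to simple pose features:
    contact count, spread (mean distance from the centroid), and the
    angle of the line through the two farthest-apart contacts."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    spread = sum(math.hypot(x - cx, y - cy) for x, y in points) / n
    angle = 0.0
    if n >= 2:
        # Pick the farthest-apart pair of contacts and take its angle.
        (ax, ay), (bx, by) = max(
            ((p, q) for p in points for q in points),
            key=lambda pq: math.hypot(pq[0][0] - pq[1][0],
                                      pq[0][1] - pq[1][1]))
        angle = math.atan2(by - ay, bx - ax) % math.pi
    return (n, spread, angle)

# Toy "learned" templates: a feature centroid per tool (invented values).
TEMPLATES = {
    "magnifier": (5, 60.0, 0.5),   # splayed five-finger grip
    "eraser":    (4, 25.0, 1.2),   # flat, clustered fingers
    "mouse":     (3, 40.0, 0.1),   # relaxed three-finger rest
}

def classify(points):
    """Return the tool whose template is nearest to the observed features
    (a nearest-centroid classifier, the simplest possible 'ML' model)."""
    f = features(points)
    return min(TEMPLATES, key=lambda t: math.dist(f, TEMPLATES[t]))
```

A real system would train on many labeled hand poses and use far richer features (contact shape, pressure, timing), but the pipeline shape is the same: contacts in, features out, nearest known gesture wins.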
TouchTools is already available to manufacturers and developers, and it isn’t restricted to tablets, either. Qeexo also sees potential in digital signage, virtual whiteboards, and vehicular applications. With Project Soli hopefully making it into real-world applications this year too, we’ll soon have a number of entirely new ways to interact with our devices.
What do you think of TouchTools? How do you see the future of device interaction?