Qeexo FingerSense: When taps and swipes are not enough
We’ve come a long way in terms of phone interfaces, far beyond the multi-tap keypads, predictive Tegic T9 input and the Navi Key of past decades. One could argue that the paradigm shift in smartphone user interfaces started with PDAs like the Compaq iPAQ of the early 2000s, but it was the first iPhone that brought a revolutionary change to touch-based UIs. Since then, Android devices have proliferated across all price ranges.
But one thing remains: the touch-based user interface is still modeled after the same tapping, swiping, pinching and rotating gestures we have all grown accustomed to.
Don’t you feel this is limiting? Could we do more with our smartphone screens than just type and swipe to interact with on-screen elements? A startup called Qeexo plans to change the way we interact with our smartphones by letting the touchscreen respond to different parts of the hand and other surfaces.
Qeexo’s FingerSense technology will be able to distinguish between a fingertip, knuckle, fingernail and stylus, among other inputs, expanding the ways we could interact with on-screen elements on smartphones, tablets and other touch-enabled devices. For example, you can tap on the screen with your knuckle and it will understand that you want to select text, without any extra steps.
“You can imagine it’d be like having different buttons in your hand,” says Sang Won Lee, co-founder and CEO at the San Jose, CA-based startup, which has so far raised $2.3 million in seed funding.
Why reinvent the wheel?
Touch-enabled devices today have evolved quite well, past the single-touch resistive technologies of old. With capacitive screens and ever-thinner glass displays, the experience is richer and more intuitive. In many ways, however, it has grown stale. The idea of reinventing touchscreen interaction took shape while two other co-founders, Chris Harrison and Julia Schwarz, were pursuing their PhDs at Carnegie Mellon’s Human-Computer Interaction Institute. It started out as an experiment in identifying objects through their different vibration patterns.
The resulting technology is Qeexo’s FingerSense, which identifies different objects by the distinct vibration patterns they produce, captured with the device’s accelerometer. For example, your fingertip sets off a different vibration pattern when it touches the screen compared to, say, your knuckle or a fingernail.
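Qeexo hasn’t published how FingerSense works internally, but the general idea — telling touch types apart by their vibration signatures — can be sketched with a toy classifier. Everything here (the feature names, thresholds and labels) is invented for illustration: a hard knuckle knock should spike the accelerometer harder and faster than a soft fingertip pad.

```python
# Illustrative sketch only: Qeexo has not published FingerSense's internals.
# Idea: each touch type excites the device chassis differently, so a short
# accelerometer window around the tap carries enough signal to classify it.

def touch_features(accel_window):
    """Reduce a short accelerometer trace (list of floats, in g) to two
    toy features: peak amplitude, and how sharply the signal rises."""
    peak = max(abs(a) for a in accel_window)
    rise = max(abs(accel_window[i + 1]) - abs(accel_window[i])
               for i in range(len(accel_window) - 1))
    return peak, rise

def classify_touch(accel_window):
    """Toy rule-based classifier: knuckle knocks spike harder and faster
    than fingertip pads. A real system would train a model over many
    more features; these thresholds are made up."""
    peak, rise = touch_features(accel_window)
    if peak > 1.5 and rise > 0.8:
        return "knuckle"
    if peak > 0.6:
        return "fingernail"
    return "fingertip"

# A soft, slow press vs. a sharp knock:
print(classify_touch([0.05, 0.1, 0.2, 0.3, 0.25, 0.1]))  # fingertip
print(classify_touch([0.2, 1.2, 2.4, 1.0, 0.3, 0.1]))    # knuckle
```

The point isn’t the specific thresholds, but that the accelerometer — a sensor every smartphone already has — carries enough information to separate these cases.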
Think of it as akin to having different mouse buttons. On your desktop or notebook computer, a left-click is different from a right-click. On some systems, like Macs, the whole concept of left- and right-clicking has even been reinterpreted altogether — through keyboard-trackpad combinations or multi-finger clicks.
With FingerSense, different kinds of taps, drags, swipes and stretches could mean different things, too. A knuckle could bring out a context menu. An eraser could, well, erase on-screen text or drawings.
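In app terms, that amounts to a dispatch table: one tap gesture, several meanings, chosen by what touched the screen. The handler names and touch-type labels below are hypothetical, not Qeexo’s API — just a sketch of how the examples above could be wired up.

```python
# Hypothetical sketch of routing FingerSense-style touch types to actions,
# like mouse buttons. Names are illustrative, not Qeexo's actual API.

def move_cursor(pos):
    return f"cursor to {pos}"

def open_context_menu(pos):
    return f"context menu at {pos}"

def select_text(pos):
    return f"select text at {pos}"

def erase(pos):
    return f"erase at {pos}"

# One gesture, several meanings, chosen by what touched the screen.
TAP_HANDLERS = {
    "fingertip": move_cursor,
    "knuckle": open_context_menu,
    "fingernail": select_text,
    "eraser": erase,
}

def on_tap(touch_type, pos):
    # Unknown inputs fall back to ordinary fingertip behavior.
    handler = TAP_HANDLERS.get(touch_type, move_cursor)
    return handler(pos)

print(on_tap("knuckle", (120, 80)))  # context menu at (120, 80)
```

The fallback matters: a device that misclassifies a touch should degrade to the familiar fingertip behavior rather than do something surprising.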
Not just phones
Going beyond smartphones and tablets, FingerSense could of course make its way into automotive technology or even wearable devices, which offer an even more limited means of interactivity. Cars require the user to concentrate on driving rather than fiddle with on-screen controls, so knocking and tapping with different parts of the hand could toggle certain car settings through a heads-up display, for example. Or a smartwatch might respond differently when tapped with your finger as opposed to knocked with your knuckle.
Of course, the question here is whether device-makers will warm up to the technology in the first place. If companies like Google and Apple are keen on exploring alternative touchscreen technologies, then the startup might even be on their radars for a potential acquisition.
For smartphone users, we’ll have to wait and see which technologies make their way into our devices in the future. Will it be evolutionary changes, like a new way of tapping on touchscreens? Or will it be a big, drastic change, like when the majority of the smartphone world suddenly shifted from numeric and QWERTY keypads to full touch interfaces?