Touchscreens are accessible and easy to use, but how do we take advantage of the technology to perform more complex interactions? Multi-touch and gestures have opened up new possibilities, but what if the screen could tell what was touching it? That’s the idea behind Qeexo’s FingerSense – the device can tell the difference between a fingertip, a fingernail, and a knuckle, as well as a stylus.
FingerSense works by measuring the vibrations of each touch, using either a dedicated acoustic sensor or the smartphone’s built-in microphone. Because different parts of the finger – and different parts of a stylus – produce distinct vibration signatures, each one can be mapped to a dedicated control. A good example would be a knuckle tap serving the same function as a right mouse click and popping up a menu of options. The technology could also open up new possibilities for game controls or for art packages.
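To make the idea concrete, here is a minimal sketch of how a touch classifier and control mapping might fit together. The feature names, thresholds, and action names are all illustrative assumptions – Qeexo has not published how its actual model works.

```python
# Hypothetical sketch: guess the touch type from two simple acoustic
# features, then dispatch to a per-type control. Thresholds are made up
# for illustration and are NOT Qeexo's real classifier.

def classify_touch(peak_amplitude: float, dominant_freq_hz: float) -> str:
    """A knuckle strike is a hard, low-frequency thud; a fingernail is a
    sharp, high-frequency click; a fingertip pad sits in between."""
    if dominant_freq_hz > 3000:
        return "fingernail"
    if peak_amplitude > 0.8 and dominant_freq_hz < 500:
        return "knuckle"
    return "fingertip"

# Each touch type gets its own control, e.g. in a drawing app.
ACTIONS = {
    "knuckle": "open_context_menu",   # plays the role of a right click
    "fingernail": "erase",
    "fingertip": "draw",
}

touch = classify_touch(peak_amplitude=0.9, dominant_freq_hz=300)
print(touch, "->", ACTIONS[touch])  # knuckle -> open_context_menu
```

The point is not the thresholds but the shape of the pipeline: once the hardware can label a touch, mapping labels to actions is an ordinary lookup.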
The big advantage is that it removes the need for menu systems and layers of choices by mapping simple one-touch actions to specific controls. A double knuckle tap, for instance, could be a customizable shortcut for opening an email client or a notepad app. The video below shows it off perfectly, so take a look.
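The shortcut idea described above could be sketched as a small user-editable table keyed on touch type and tap count. The gesture and app names here are placeholders, not part of FingerSense itself.

```python
# Hypothetical sketch: user-customizable shortcuts keyed on
# (touch type, tap count). Names are placeholders for illustration.

shortcuts = {
    ("knuckle", 1): "open_context_menu",
    ("knuckle", 2): "open_email",       # the double knuckle tap example
}

def handle_tap(touch_type: str, tap_count: int) -> str:
    # Fall back to ordinary touch handling when no shortcut is bound.
    return shortcuts.get((touch_type, tap_count), "default_touch")

print(handle_tap("knuckle", 2))  # open_email
print(handle_tap("fingertip", 1))  # default_touch
```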
Qeexo is a startup that grew out of a project at Carnegie Mellon University’s Human-Computer Interaction Institute. The company is based in San Jose, California, and it aims to have FingerSense rolling out on smartphones within a year. The range of potential applications just goes on and on, so let’s hope it arrives soon.
I would really like to have that painting app. Now!
WOW, the Note 2!
That’s a galaxy s3 not note 2. Lol