The blind can now “see” with Google Glass

March 21, 2013
    [Image: Yes, I can see you now! (Credit: DVF)]

    Google launched its Glass project with the developer community earlier this year, and there has been much buzz about what the platform can do. But while Glass applications mostly add an augmented-reality layer of visual data meant for seeing, one big potential application is enabling the visually impaired to “see.”

    A Google Glass developer is building tools to enable the blind to see through audible means. The system is intended to work like sonar — the positioning technology used in submarines and by some sea mammals — in determining the position of objects around the person. There are actually existing technologies and applications that use the same principle, although these have limitations. For instance, the vOICe app for Android will basically describe things around you through speech. The app tries to identify objects and proximity through the Android device’s camera.
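    The vOICe project page describes the general principle behind this kind of sight-to-sound mapping: the camera frame is swept left to right, and each column of pixels becomes a moment of sound. The sketch below is my own illustration of that idea, not the app's actual code; the frequency range, sweep duration, and sample rate are assumptions chosen for readability.

```python
import math

def image_to_soundscape(image, duration=1.0, f_low=500.0, f_high=5000.0,
                        sample_rate=8000):
    """Turn a grayscale image (list of rows, pixel values 0.0-1.0) into
    audio samples. One left-to-right sweep lasts `duration` seconds.
    Illustrative sketch only: a pixel's row sets the pitch of a sine tone
    (higher in the image = higher pitch), and its brightness sets loudness.
    """
    rows, cols = len(image), len(image[0])
    samples_per_col = int(duration * sample_rate / cols)
    # One oscillator frequency per row, spaced exponentially (like pitch).
    if rows > 1:
        freqs = [f_low * (f_high / f_low) ** (1 - r / (rows - 1))
                 for r in range(rows)]
    else:
        freqs = [f_low]
    samples = []
    for c in range(cols):                      # time axis = horizontal position
        for n in range(samples_per_col):
            t = (c * samples_per_col + n) / sample_rate
            # Sum one sine per row, weighted by that pixel's brightness.
            s = sum(image[r][c] * math.sin(2 * math.pi * freqs[r] * t)
                    for r in range(rows))
            samples.append(s / rows)           # keep output in [-1, 1]
    return samples
```

    A bright dot near the top of the frame thus becomes a brief high-pitched tone partway through the sweep, which is roughly the kind of audible cue described above.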

    However, the limitation is that the camera actually needs to “see” the environment, and you need to wear earphones. That’s not exactly a practical solution. Firstly, it’s not hands-free, unless you can mount your smartphone on your body or clothing. Secondly, earphones can be quite cumbersome. Here’s where Google Glass comes into play.

    Glass comes with two things that will be important in this application:

    • First is the head-mounted camera, which moves along with your own head movements.
    • Second is the use of bone conduction to transfer audio to your inner ear. This means no more need for cumbersome earphones and headphones. Just wear the Glass headset and you’re good to go.

    This assumes, of course, that the headset is paired with an Android device that handles the sight-to-sound translation. A current limitation is that sight-to-sound works best with stereo headphones, which give the brain a better way of positioning a sound in three-dimensional space. Otherwise, the output can be confusing: you hear an object described, but you don’t know exactly where it is.
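    The reason stereo matters can be shown with a small sketch. This is my own illustration (not from the app): constant-power panning, a standard audio technique, splits a mono cue between left and right channels according to the object's horizontal position, so the listener hears roughly where it is.

```python
import math

def pan_stereo(mono_samples, position):
    """Place a mono signal in the stereo field using constant-power panning.
    position: -1.0 = far left, 0.0 = center, +1.0 = far right.
    Returns (left_channel, right_channel) sample lists."""
    angle = (position + 1) * math.pi / 4       # map [-1, 1] -> [0, pi/2]
    left_gain, right_gain = math.cos(angle), math.sin(angle)
    return ([s * left_gain for s in mono_samples],
            [s * right_gain for s in mono_samples])
```

    With a single earpiece (or mono playback), both channels collapse into one, and this positional cue is lost; only the object description remains.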

    This new development will not exactly restore vision, although it’s a good substitute until that time in the distant future when we can directly interface our devices with our brains. Check out the video for a sample of how vOICe gives audible cues (like “pings”) to help navigate through a 3D space.

    You can also check out the vOICe for Android project page (in the source links) for a better description of how the application works. If you’re curious as to the capitalization, think of OIC as “oh, I see.”


    Comments

    • cronic

      Geordi La Forge VISOR become reality

    • magnifico17

      at least the blind are not totally blind thanks to google.

    • LeDerpino

      DAREDEVIL BRO

    • Phillip

      Thanks for Google!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Please MAKE IT HAPPEN!!!!!!!

    • Martijn van der Spek

      While we are waiting for glass, we have implemented the app ‘Talking Goggles’. It is available on the Apple app store or Google Play, and we are getting very positive feedback from the visually impaired community. Check it at sparklingapps.com/goggles . It basically speaks out anything it sees with your smartphone camera.
