Links on Android Authority may earn us a commission. Learn more.

MindRDR is mind-control for your Google Glass

Wouldn't it be cool to control Google Glass or other wearable devices just by thinking of a command in your mind? An open-source app called MindRDR might just be the solution.
July 11, 2014

Wearable devices are the next big trend in mobile, but they currently share one inherent limitation: the user interface. To date, users interact with wearables like Google Glass through voice commands or touch-based interfaces. But wouldn’t it be cool to control your wearable just by thinking of a command? An open-source app called MindRDR might just be the solution.

Developed by a startup called This Place, MindRDR enables users to execute tasks on devices through brain activity. The app works in conjunction with a sensor like the NeuroSky EEG biosensor headset, which translates brain activity into signals that electronic gadgets can read. MindRDR acts as a go-between for the biosensor headset and a mobile handset or wearable — in this case, Google Glass.
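To make that go-between role concrete, here is a minimal sketch of the bridging loop, written in Python purely for illustration. The real app targets Android and Glass, and the class and method names below are assumptions, not taken from MindRDR's actual source code:

```python
# Hypothetical sketch of MindRDR's "go-between" role: poll the EEG headset
# for an attention reading and forward it to the Glass overlay. The headset
# and display interfaces here are illustrative stand-ins, not real APIs.

class BiosensorBridge:
    """Relays attention readings from a biosensor headset to a display."""

    def __init__(self, headset, display):
        self.headset = headset  # anything exposing read_attention() -> 0..100
        self.display = display  # anything exposing set_line_height(level)

    def step(self):
        """Read one attention sample and push it to the on-screen line."""
        level = self.headset.read_attention()
        self.display.set_line_height(level)
        return level
```

In this framing, the app itself never interprets raw brain waves; it simply relays the sensor's already-digested attention signal to the wearable's display.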

The current iteration of MindRDR is limited to taking photos and uploading them to Facebook and Twitter. The app appears as a horizontal white line on Google Glass, which moves up the screen as the user concentrates. With enough concentration, Glass takes a shot. Repeating the process, concentrating until the white line reaches the top of the screen, then commands Glass to upload the photo to either a Facebook or Twitter account, whichever is configured.
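That concentrate-until-the-line-reaches-the-top mechanic can be sketched as a simple threshold trigger. This is a guess at the logic, assuming the headset reports an attention level from 0 to 100; none of the names or constants below come from MindRDR's source:

```python
# Hypothetical sketch of the concentration trigger: smooth incoming attention
# readings into a line position, and fire the capture action once the line is
# near the top of the screen. Names and constants are assumptions.

TOP_OF_SCREEN = 100.0  # line position at which Glass would take the shot

def update_line(position, attention, smoothing=0.2):
    """Nudge the line toward the latest attention reading (0-100)."""
    return position + smoothing * (attention - position)

def concentration_trigger(readings, threshold=0.95):
    """Return True if sustained concentration pushes the line to the top."""
    position = 0.0
    for attention in readings:
        position = update_line(position, attention)
        if position >= threshold * TOP_OF_SCREEN:
            return True
    return False
```

Smoothing the signal this way would explain why the line rises gradually rather than jumping, and why a momentary spike in attention is not enough to trigger a photo.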

Limitations aside, the developers at This Place have released the source code so that other developers or teams can build on the mind-reading concept. Concentration is a crude yet novel way to control Glass, and with better brain wave sensitivity, developers might be able to expand on its use. They could also enable Glass to work with other forms of interaction, perhaps including three-dimensional motion.

What would be interesting is for Google to build brain wave sensitivity into Glass itself, reducing the number of add-on devices needed for this purpose. An added headset might be necessary for using MindRDR with devices worn on other parts of the body, such as smartwatches or clothing-based wearables. But eyeglasses are already a perfect place to incorporate a portable EEG, which could enable such devices to read brain waves.

Beyond novelty, however, one practical use of MindRDR is for people who have limited mobility or are otherwise unable to interact with devices through motion, touch, or voice. A possible concern for everyone, though, is whether this is too intrusive as a user interface. Who knows whether Google or another company is already eavesdropping on what we’re thinking?