Nobody in mobile technology seems to want you to touch your mobile device anymore. Apple would rather you speak to your phone; Samsung wants you to use your eyes, or to hover your fingers above the screen; and Google doesn’t even want the phone to be in your hands, because that’s “too emasculating” (says the guy with a screen in front of his eyeball).
But a quick look at the majority of sci-fi movies suggests that the input method some people would really love to use is telekinesis, or mind control. The very thought of it (pun intended) sends shivers down my spine. The possibilities are endless: checking and replying to emails without ever touching your phone, perhaps without even pulling the device out of your pocket. Gaming could be taken to another level too, because let’s face it, sometimes those controllers just don’t respond fast enough.
More importantly, people with disabilities could use mobile devices much more effectively and that’s where Samsung steps in.
Samsung has decided that the existing plethora of ways to interact with mobile devices is simply not enough. For those who aren’t keeping count, there are already four ways of interacting with Samsung’s latest flagship phone, the Galaxy S4: touch, eye tracking, gestures, and voice control, all crammed into its slim 5-inch frame. So what other avenue of device interaction can Samsung explore? Well, telekinesis seemed like a good place to start.
Now, before you conclude that somebody at Samsung has read one too many X-Men comics, the company is actually a lot closer than you’d think. In fact, with the help of Roozbeh Jafari, an assistant professor of electrical engineering at the University of Texas at Dallas, Samsung has already got a crude method working on a Samsung Galaxy Note 10.1 (to find out exactly how Samsung is utilizing the technology, check out the video at the Source link at the end of this article):
To use EEG-detected brain signals to control a smartphone, the Samsung and UT Dallas researchers monitored well-known brain activity patterns that occur when people are shown repetitive visual patterns. In their demonstration, the researchers found that people could launch an application and make selections within it by concentrating on an icon that was blinking at a distinctive frequency.
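The mechanism described above, detecting which flickering icon a user is concentrating on, is a classic SSVEP (steady-state visual evoked potential) setup: staring at an icon blinking at a given rate produces extra EEG power at that same frequency. As a rough illustration only (not Samsung’s actual pipeline; the sampling rate, flicker frequencies, and synthetic signal below are all assumptions for the sake of the demo), the core idea can be sketched in a few lines of Python:

```python
import numpy as np

FS = 250                       # assumed EEG sampling rate in Hz
DURATION = 5.0                 # seconds of data per selection
ICON_FREQS = [7.0, 9.0, 11.0]  # hypothetical flicker rates, one per on-screen icon

def detect_attended_icon(signal, fs, candidate_freqs):
    """Pick the icon whose flicker frequency carries the most spectral power.

    A crude stand-in for SSVEP classification: compute the magnitude
    spectrum of the EEG trace and compare the power at each candidate
    flicker frequency, returning the index of the strongest one.
    """
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in candidate_freqs]
    return int(np.argmax(powers))

# Simulate a user staring at the second icon (9 Hz flicker):
# a weak 9 Hz sinusoid buried in much louder noise.
t = np.arange(0, DURATION, 1.0 / FS)
rng = np.random.default_rng(0)
eeg = 0.5 * np.sin(2 * np.pi * 9.0 * t) + rng.normal(0.0, 1.0, t.shape)

print(detect_attended_icon(eeg, FS, ICON_FREQS))  # prints 1 (the 9 Hz icon)
```

Even this toy version shows why selections take a few seconds: you need enough samples for the attended frequency to stand out clearly from the noise in the spectrum.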
We here at Android Authority don’t like to be party poopers, but to save you some money on a Charles Xavier costume (who, by the way, did not possess the power of telekinesis), we’d like to point out that the current method is cumbersome and requires bulky headgear (which, ironically, resembles the headgear Professor X uses in Cerebro).
Samsung isn’t using any new technology for this project, as it relies on EEG monitoring electrodes to get the job done. However, it is its interest in creating more ways for you to interact with your device that is of paramount importance here. While the clear emphasis is on people with disabilities, the ultimate goal of Samsung’s Emerging Technology Lab is to broaden the ways in which all people can interact with their smart devices.
Currently, the technology has a success rate of 80 to 95 percent, which many would argue beats voice control, and it allows users to make a selection every five seconds. Samsung is hoping to make the headgear less obtrusive so that people can wear it throughout the day.
Even with all the promising development, we can’t say that telekinesis will be “The Next Big Thing” on the Galaxy S5 or any other future mobile device from Samsung.
Is telekinesis top of your features list for phones? Is this just another arrow in Samsung’s featuritis bow? Would you like to see the input method in any other devices (Google Glass)?