
Next Generation Interfaces Give New Meaning to Mind Control

1 - Cover - next gen interfaces give new meaning to mind control 
In December 2009 we explored the SixthSense wearable device from MIT's Pranav Mistry, which merged an iPod-like device with air-gesturing capabilities as part of a future interface concept. In June 2010 we delved into John Underkoffler's interface vision involving what he calls "g-speak - a Spatial Operating Environment." In today's report, we'll provide you with a brief 3D Internet update and wander into the workings of a futuristic interface that gives new meaning to the term Mind Control.


A 3D Internet Update


In October 2009, we presented you with a report titled "Intel, Apple Working on 3D Internet," which covered Intel's initial vision for advancing the internet's interface to include 3D experiences. We explained at the time that these capabilities would begin arriving with their Sandy Bridge architecture. Fast forwarding to Intel's introduction of Sandy Bridge at CES 2011, we see that while they're still committed to their vision for 3D internet experiences that begins with Sandy Bridge, they also realistically note that the vision has been pushed out to the 2014-2015 timeframe.


2B - 3D Internet Update Graphic - Jan 2011 

Intel's Shmuel (Mooly) Eden, VP and general manager of the PC Client Group, introduced Sandy Bridge at a CES 2011 presentation. Sandy Bridge opens the door to future interface advances relating to social networking and gaming. With gaming, you and your friends will be able to place your faces on the very characters that you wish to play in order to personalize games at a much deeper level. The second half of the video presented below illustrates that very point in "live" motion. You can also see, in the bottom-left photo above, Mooly Eden turned into an avatar in the bottom-right photo.




Are Dramatic Shifts Coming to the User Interface?


Yet the key to Eden's presentation was found in his closing forecast: "In three, four years, the way that we're going to communicate with our devices is going to be totally different. When we look at the keyboard and the screen the way they are today, they'll look like the middle ages."


In the context of the "g-speak interface" that we presented to you back in June 2010, you can easily understand what Eden was alluding to. Of course, Intel has to raise the power bar considerably, and at a much faster pace, if they're to keep the desktop relevant in a world that is quickly shifting away from the desktop and choosing to go with all things mobile (smartphones and tablets).


But if you thought that the "g-speak" interface was really "out there," then this next interface may blow you away. Just bear in mind that I'm not suggesting that we'll be seeing this next generation interface for everyday usage – but I do think that we could find this type of interface as a supplement or competitor to gaming systems like the Wii and Kinect. Then again, who knows how quickly this interface could advance.


Mind Control: A Future User Interface


Tan Le is the co-founder and president of Emotiv Systems, a firm that's working on a new form of remote control that uses brainwaves to control digital devices and digital media. It's long been a dream to bypass the mechanical (mouse, keyboard and clicker) and have our digital devices respond directly to what we think. Emotiv's recently released EPOC headset uses 16 sensors to listen to activity across the entire brain. Software "learns" what each user's brain activity looks like when one, for instance, imagines a left turn or a jump.
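For readers curious about what that "learning" step might look like, here's a minimal sketch. Emotiv's actual signal processing and SDK are proprietary, so every name and number below is a hypothetical stand-in: the idea of software learning each user's brain-activity signature for an imagined command can be modeled as simple supervised classification, where feature vectors from short windows of multi-sensor EEG data are averaged per command and new windows are matched to the nearest centroid.

```python
import math
import random

# Illustrative sketch only -- not Emotiv's actual algorithm or API.
# Each sample is a hypothetical 16-value feature vector derived from a
# short window of multi-sensor EEG data recorded while the user imagines
# a particular command.

def train_centroids(samples, labels):
    """Average the feature vectors recorded for each imagined command."""
    centroids = {}
    for label in set(labels):
        vectors = [s for s, l in zip(samples, labels) if l == label]
        centroids[label] = [sum(col) / len(vectors) for col in zip(*vectors)]
    return centroids

def classify(centroids, sample):
    """Return the command whose training centroid is closest to the window."""
    def dist(label):
        return math.sqrt(sum((x - y) ** 2
                             for x, y in zip(sample, centroids[label])))
    return min(centroids, key=dist)

if __name__ == "__main__":
    random.seed(0)
    # Simulated 16-feature windows for a "neutral" state and an imagined "push".
    neutral = [[random.gauss(0.0, 0.1) for _ in range(16)] for _ in range(20)]
    push = [[random.gauss(1.0, 0.1) for _ in range(16)] for _ in range(20)]
    model = train_centroids(neutral + push, ["neutral"] * 20 + ["push"] * 20)
    print(classify(model, [random.gauss(1.0, 0.1) for _ in range(16)]))
```

In practice the calibration phase would ask the user to repeatedly imagine each command while the headset records, which is exactly the per-user training the article describes.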


3 - Emotiv mind control systems - on a MacBook at TED 

Up until now, our communication with machines has always been limited to conscious and direct forms. Whether it's something simple like turning on the lights with a switch, or as complex as programming robotics, we have always had to give a machine a command, or even a series of commands, in order for it to do something for us. Communication between people, on the other hand, is far more complex and a lot more interesting, because we take into account so much more than what is explicitly expressed. We observe facial expressions and body language, and we can intuit feelings and emotions from our dialogue with one another. This actually forms a large part of our decision-making process. Emotiv's vision is to introduce this whole new realm of human interaction into human-computer interaction, so that computers can understand not only what you direct them to do, but can also respond to your facial expressions and emotional experiences. And what better way to do this than by interpreting the signals naturally produced by our brain, our center for control and experience.


Although I could continue to describe this system, you have to see it in action for yourself in order to believe it. In the video presented below, you'll see this fascinating mind control interface come to life on a MacBook. Is it at a primitive stage? Perhaps. But you're definitely going to scratch your head at the fact that this interface has even progressed to the point that it has. You might even think that it's just a Vegas magic act, but it's not. They already have software development kits for game developers and EEG researchers.


The Emotiv System Presented at TED on a MacBook



One of the more practical applications for this type of interface would be assisting the handicapped in controlling an electric wheelchair with simple mind commands. This is presented as the last example in the noted video.
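The back half of such a system can be sketched in a few lines. The command names and confidence values below are illustrative assumptions, not Emotiv's actual API: once an imagined command has been recognized, mapping it to a wheelchair action is a small dispatch table, and the safety-critical design choice is defaulting to "stop" whenever the detection isn't confident enough.

```python
# Hypothetical sketch -- the command vocabulary and threshold are assumptions.
ACTIONS = {
    "push": "drive forward",
    "pull": "stop",
    "left": "turn left",
    "right": "turn right",
}

def dispatch(command, confidence, threshold=0.8):
    """Act only on confident detections; otherwise keep the chair stopped."""
    if confidence < threshold or command not in ACTIONS:
        return "stop"  # fail safe: an uncertain reading must not move the chair
    return ACTIONS[command]
```

For a wheelchair, unlike a game, a false positive is worse than a miss, which is why an unrecognized or low-confidence command maps to a safe state rather than the last action.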


  V2 -- Dartmouth College experiments with the iPhone 

In another Emotiv video, between the 15:00 and 15:16 time marks, we're told that the Ivy League's Dartmouth College is working on the Emotiv system with an iPhone. The idea is to simply look at the photo of the person you want to call; the screen lights up and places the call with simple mind controls. In this instance, one could imagine future advanced headphones that are able to conceal some of the sensors associated with such an application. Also note that Einstein's image is positioned center-left while Steve Jobs' image is positioned center-right.



At the end of the day


At the end of the day, we couldn't even conceive of using a multi-touch interface on smartphones or tablets just five years ago. You would have been laughed at for the mere mention of such a crazy notion. It's pretty well understood that society in general has a way of not being able to see beyond its collective nose, let alone something five years away. That's what makes these future interfaces so fascinating. We may easily mock that which we can't conceive, but that's not going to stop progress. No, to the Crazy Ones, anything is possible. The pioneering spirit pushes us forward even when others want to remain with the status quo.


The time frame that Intel's Mooly Eden provided for some of these next generation interfaces puts Apple's own 3D head tracking OS research into better perspective. Ideas and concepts as revolutionary as these take time. But they're coming, whether we detest, resist or deny their inevitable arrival. It's progress and that's never going to change.


Hold on to your seats, because in the next five to ten years, we'll definitely look back and laugh at the way we used to use our computers. You can bank on it.





