Intel's IDF Keynote Elevated the 'Sensification' of Computing via RealSense, a Competitor to Apple's PrimeSense Technology
Intel's IDF developer conference is in full swing this week, and the opening keynote by CEO Brian Krzanich was delayed until mid-morning. The sense I generally got from the keynote was that Intel is shifting gears and is about to take the computing industry into a whole new dimension covering the internet of things, wearables and technologies far beyond just processors. In fact, the most in-depth segment of the opening keynote revolved around their RealSense 3D camera technology, a technology (or family of technologies) that Apple also possesses via their PrimeSense acquisition. The more Krzanich spoke of the expansion of RealSense technology, the more I wondered how Apple would utilize its own 3D-depth sensing technology that we've been reading so much about of late in patent reports. Intel's keynote touched on a lot of technology that many will appreciate, and we've done our best to highlight the best parts in this report.
After playing a promotional, developer-centric introductory video, Intel's CEO Brian Krzanich stepped on stage and began his keynote. According to Krzanich, "There's never been a better time to be a developer. One thing is clear. Computing is everywhere. It's everywhere in our lives today. It's in our bags, our clothes, our homes, our cars and just about everything we do, computing is there with us. And what is changing is that computing and the computing experience is becoming personalized. And from this, we've developed three assumptions about the future of computing. These three assumptions drive our innovation at Intel."
The Three Assumptions
The theme of Krzanich's keynote was built around three key assumptions about computing. Intel's CEO stated that "The first assumption driving our innovation is that as computing becomes ultra-personal you want that computing to have sight, sound and touch. And all of that sensing becomes more critical. We call that the 'sensification' of compute. Secondly, as computing gains these senses it opens up a huge number of opportunities and applications, which brings us to our second assumption."
Krzanich noted that the "second assumption was about the opportunity for everything to become smart and connected." I discovered later in the keynote that this assumption centered on Intel's IoT Platform, which we're not going to cover in this report, as some of the key technology behind it was presented in a secondary keynote we haven't yet reviewed. In that keynote Intel will reportedly reveal some interesting information about the coming 5G wireless revolution.
Krzanich noted that "the third assumption was about computing becoming an extension of you. Our physical environment comes alive with wearables and other technology that brings it to be a part of you." It's a segment not covered in this report as it was limited in scope.
I found that the first assumption was the most interesting of the three presented today and it's the focus of this report.
Sensification of Compute
In this segment of Krzanich's keynote he noted that "Computing used to be defined by really a two dimensional world. You had the keyboard, a mouse … and maybe if you were more advanced you'd have a touchscreen. But that's not enough in today's world. Now it's expanding to include more human-like senses providing new and immersive experiences for users. Let's start with the first example of those senses, and that would be sound."
After a number of common computer sounds were played over the speakers in the auditorium, Krzanich continued: "So all of us know what these sounds are. They're the sounds of computing. But those sounds are one way: they come at us. They come from the PC, they come from the device. Whatever the computing device is the sound comes from it. But we want more in the future.
We want our devices to behave more like humans. We want to have them always listening and always responding in a natural and realtime way. This means that a voice controlled experience shouldn't start by having to push a button. It shouldn't require the device be plugged in or that it's connected to the cloud.
Talking to a device should feel like a two-way conversation like we all have every day of our lives. But until now, that is not how most of our experiences have been. Even devices that do listen have typically had to be plugged in. So we went out and worked on a new technology: Intel smart sound technology – and it completely changes that experience.
It's one of the most significant breakthroughs in digital audio we have seen for some time." Krzanich then went into demo mode on stage with Intel's new 'Wake on Voice' feature. The demo paired Windows 10 running on a PC powered by Intel's 6th generation processor, dubbed Skylake, with Windows' new Siri-like digital assistant, Cortana. Users will be able to talk to Cortana at any time to wake a notebook or desktop, play the music they want, look up a file or photograph and, over time, go much deeper.
Parallels Desktop 11 for Mac reportedly offers this new feature. Whether Cortana will work under OS X El Capitan only on systems with Skylake processors is unknown at this time.
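To make the 'always listening' idea concrete, here's a minimal sketch of a wake-word gate, assuming audio frames have already been transcribed to text upstream. The wake phrase, function names and frame format here are all hypothetical; Intel's actual Smart Sound implementation, which runs on dedicated low-power audio hardware, wasn't disclosed in the keynote.

```python
# Illustrative sketch only: not Intel's Smart Sound implementation.
# Real systems do this on low-power DSP hardware with acoustic models;
# here each "frame" is assumed to be already-transcribed text.

WAKE_WORD = "hey cortana"  # hypothetical trigger phrase

def detect_wake_word(transcribed_frame: str) -> bool:
    """Return True when the wake phrase appears in a transcribed audio frame."""
    return WAKE_WORD in transcribed_frame.lower()

def run_assistant(frames):
    """Consume a stream of transcribed frames; act only after the wake word."""
    awake = False
    handled = []
    for frame in frames:
        if not awake:
            awake = detect_wake_word(frame)  # cheap check runs continuously
        else:
            handled.append(frame)            # full assistant logic engages here
            awake = False                    # return to low-power listening
    return handled
```

The key point the demo illustrated is the split between a cheap, always-on detector and the heavier assistant logic that only wakes on demand.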
Intel Introduced Breakthrough Realtime Audio for Android Tablets … Equal to what the iPad had for Years
Another audio project Intel worked on was bringing realtime audio to Google tablets. Sounds, like the one triggered when pressing a key on an on-screen keyboard, should always play in realtime, but today they don't. Krzanich said, "it should be simple right? Well, audio application developers will tell you that it really isn't that simple."
Krzanich then demonstrated an Android tablet with a keyboard app on it that could now play the audio in realtime when tapping on a key. Wait a minute; Apple users have been able to do that on an iPad for years now with GarageBand. That's far from being an audio breakthrough, that's just catching up to Apple's A-Series processors and superior iOS.
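Why realtime audio is hard comes down largely to buffering: audio is delivered to the hardware in fixed-size buffers, and every queued buffer adds delay before a key tap is heard. A rough back-of-the-envelope calculation (the buffer sizes and counts below are illustrative, not Intel's or Google's actual figures):

```python
def buffer_latency_ms(frames_per_buffer: int, sample_rate_hz: int,
                      num_buffers: int = 1) -> float:
    """Delay (ms) added by buffering alone: time to play out all queued buffers."""
    return 1000.0 * frames_per_buffer * num_buffers / sample_rate_hz

# Large, multiply-queued buffers (a typical high-latency path):
slow_path = buffer_latency_ms(frames_per_buffer=2048, sample_rate_hz=44100, num_buffers=4)

# Small buffers on a low-latency path:
fast_path = buffer_latency_ms(frames_per_buffer=128, sample_rate_hz=48000, num_buffers=2)
```

At roughly 186 ms, the slow path makes a virtual keyboard feel laggy; at roughly 5 ms, the fast path feels instantaneous. Shrinking buffers and reducing how many are queued is most of what "realtime audio" means in practice.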
Intel's RealSense Camera
The next sense discussed involved Intel's RealSense Technology. We first covered Intel's RealSense technology back in January 2014, before it was branded as such. More recently, back in April during IDF China, we covered Krzanich revealing that a RealSense 3D camera would be coming to a smartphone this fall.
Yesterday Krzanich revealed that at least one of the smartphones that will be using their RealSense camera this fall will be based on Google's Project Tango.
According to Krzanich, "The RealSense Camera harnesses the power of human sight on a computing device. It not only creates a great image, but it measures the depth and relative position of all the objects in its field of view. And it has the capabilities of a human eye." Krzanich added: "Why do we need depth sensing on a smartphone? RealSense breaks the limits of what a phone can do and there's no better example for this than combining Intel's RealSense with Google's project Tango."
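Measuring the "depth and relative position" of objects maps to the standard pinhole-camera model: given a pixel's coordinates, its measured depth, and the camera's intrinsics, you can recover a 3D point in camera space. Here's a minimal sketch; the intrinsic values are made up for illustration, as a real RealSense or PrimeSense pipeline would read them from the device.

```python
def deproject(u: float, v: float, depth_m: float,
              fx: float, fy: float, cx: float, cy: float) -> tuple:
    """Back-project pixel (u, v) with measured depth (meters) into camera-space (X, Y, Z).

    fx, fy are focal lengths in pixels; (cx, cy) is the principal point.
    """
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Hypothetical intrinsics for a 640x480 depth stream:
FX, FY, CX, CY = 600.0, 600.0, 320.0, 240.0

# A pixel at the image center, 2 meters away, lands on the optical axis:
center_point = deproject(320, 240, 2.0, FX, FY, CX, CY)  # (0.0, 0.0, 2.0)
```

Run this over every pixel of a depth frame and you get a point cloud, which is the raw material for the 3D scanning and room-meshing demos described below.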
Google was actually further ahead with Project Tango 16 months ago, when their 3D camera technology came from PrimeSense, a company that Apple has since acquired. A schematic of the PrimeSense technology being used in Project Tango, from a report that we posted here, is noted below.
Krzanich further noted that the RealSense camera on smartphones will allow users to scan objects in 3D. The images could then be sent to 3D printers. And with the RealSense camera using Google's new "Meshing" app in Project Tango based smartphones, developers and users will be able to scan entire rooms and turn the images into virtual reality environments.
Krzanich went on to demonstrate how future robots will use RealSense cameras for vision by presenting a video introducing 'Relay,' a robot by Savioke that will be used in hotels to deliver items to guests from the front desk or kitchen, according to Savioke CEO Steve Cousins.
While the technology and robot have been in the press for close to a year now, Mr. Cousins revealed in Intel's video for IDF that their robots are now using Intel's RealSense camera technology to ensure that the robots are able to safely navigate through hotel hallways and enter and exit elevators without bumping into items or more importantly staff and/or guests.
Shifting gears a bit, Krzanich noted that under the hood of the Relay robot is ROS, the Robot Operating System, considered the de facto standard in the robot maker community. RealSense will now support ROS and, more importantly, Mac OS X, Linux, Android and other operating systems.
The last segment on sensification was about video gaming.
Intel presented a gaming simulator from VRX, as noted above. The rig is driven by Intel's Skylake 6700K processor powering three high definition displays at screaming frame rates (three 4K displays, as we reported way back in April). A RealSense camera, seen hanging over the seam between the displays in the photo, tracks the player's head movements in realtime, giving the game that extra level of immersion.
As Intel showcases the value of RealSense cameras, one could only imagine what Apple has in store for future devices using advanced PrimeSense 3D-depth cameras.
The last point of the keynote was about Intel's revolutionary new 3D XPoint memory that Intel and Micron introduced back on July 29. When it comes to market, the technology will be known as Optane, "based on 3D XPoint."
Optane, coming to market next year, will be available for everything from high-end servers and workstations like the Mac Pro right through to ultrabooks and ultraportables, which include the MacBook Air and new MacBook.
Considering that IDF's opening keynote by Krzanich spent so much time focused on the 'sensification' of computing, it only stands to reason that their revolutionary Optane memory is going to be what's needed to power the sensory-packed computer features coming to both Windows and OS X computers in the years ahead. After hearing this keynote, something tells me that my workstation upgrade should be delayed another year, but that's another story for another day.