Facebook reveals initial work on wrist-based input devices for future AR Glasses & plans to reveal its work on Haptic Gloves later this year
Last week Patently Apple posted two reports (01 & 02) covering Facebook's CEO and his team discussing the company's upcoming move into delivering its first-generation AR glasses. Mark Zuckerberg dreamed aloud about AR teleporting, which he believes is likely to become a reality within a decade, while his team spoke of AR as possibly being the holy grail of social media experiences.
Today, Facebook is sharing some nearer-term AR glasses research: wrist-based input combined with usable but limited contextualized AI that dynamically adapts to you and your environment. Later this year, the company will present groundbreaking work in soft robotics for building comfortable, all-day wearable devices and will give an update on its haptic glove research.
At Facebook Reality Labs (FRL) Research, the team is building an interface for AR that won’t force us to choose between interacting with our devices and the world around us. They’re developing natural, intuitive ways to interact with always-available AR glasses because they believe this will transform the way we connect with people near and far.
The future of human-computer interaction demands an exceptionally easy-to-use, reliable and private interface that lets us remain completely present in the real world at all times. That interface will require many innovations in order to become the primary way we interact with the digital world.
Two of the most critical elements are contextually aware AI, which understands your commands and actions as well as the context and environment around you, and technology that lets you communicate with the system effortlessly, an approach Facebook calls ultra-low-friction input.
The AI will make deep inferences about what information you might need or things you might want to do in various contexts, based on an understanding of you and your surroundings, and will present you with a tailored set of choices.
The input will make selecting a choice effortless — using it will be as easy as clicking a virtual, always-available button through a slight movement of your finger. But this system is many years off.
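As a thought experiment, the loop described above, where contextual AI proposes a tailored set of choices and a single low-friction click confirms one, could be sketched as follows. Every name and context rule here is a hypothetical illustration, not Facebook's design:

```python
# Hypothetical sketch of a "contextual AI + intelligent click" loop.
# The context rules and action names are invented for illustration only.

CONTEXT_RULES = {
    "kitchen_morning": ["start coffee maker", "show recipe", "play news"],
    "office_meeting":  ["mute notifications", "open agenda", "share screen"],
}

def suggest_actions(context: str) -> list[str]:
    """Contextual AI stand-in: map an inferred situation to a short list of choices."""
    return CONTEXT_RULES.get(context, [])

def intelligent_click(choices: list[str], clicked: bool):
    """Ultra-low-friction input stand-in: one finger click confirms the top suggestion."""
    if clicked and choices:
        return choices[0]
    return None

actions = suggest_actions("kitchen_morning")
selected = intelligent_click(actions, clicked=True)
print(selected)  # -> start coffee maker
```

In a real system the rule table would be replaced by inference over sensor data, but the shape of the interaction, a ranked list narrowed to a single confirmable choice, is the point of the sketch.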
Wrist-Based Input with Limited Contextualized AI
Today, Facebook is taking a closer look at a version of the AR interface that may be possible much sooner: wrist-based input paired with that usable but limited contextualized AI.
Facebook's Neural Interfaces: Typing Demo
Facebook EMG Demo: Controlling Virtual Objects
Haptics in focus
Facebook further notes that while ultra-low-friction inputs like a finger click or microgestures will enable us to interact with adaptive interfaces, we also need a way to close the feedback loop, letting the system communicate back to the user and making virtual objects feel tangible. That’s where haptics come into play.
Statement from FRL Research Science Director Sean Keller: "From your first grasp at birth all the way to dexterous manipulation of objects and typing on a keyboard, there’s this really rich feedback loop, where you see and do things with your hands and fingers and then you feel sensations coming back as you interact with the world. We’ve evolved to leverage those haptic signals to learn about the world. It’s haptics that lets us use tools and fine control. From a surgeon using a scalpel to a concert pianist feeling the edges of the keys — it all depends on haptics. With a wristband, it’s the beginning. We can’t reproduce every sensation in the virtual world you might feel when interacting with a real object in the real world, but we’re starting to produce a lot of them."
Take a virtual bow and arrow. With wrist-based haptics, we’re able to approximate the sensation of pulling back the string of a bow in order to give you confidence that you’re performing the action correctly.
You might feel a series of vibrations and pulses to alert you when you receive an email marked “urgent,” while a normal email might have a single pulse or no haptic feedback at all, depending on your preferences. When a phone call comes in, a custom piece of haptic feedback on the wrist could let you know who’s calling. This would then let you complete an action (in this case, an intelligent click to either pick up the call or send it to voicemail) with little or no visual feedback. These are all examples of haptic feedback helping HCI become a two-way conversation between you and your devices.
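One way to picture the event-to-pattern mapping described above is as a lookup from notification type to a pulse sequence, gated by user preferences. This is purely a hypothetical sketch; the event names, pattern encoding, and preference keys are invented:

```python
# Hypothetical mapping from notification events to wrist haptic patterns.
# A pattern is a sequence of (intensity 0..1, duration_ms) pulses; a zero-intensity
# entry is a pause. All names and values are illustrative, not Facebook's.

HAPTIC_PATTERNS = {
    "email_urgent":       [(1.0, 80), (0.0, 60), (1.0, 80), (0.0, 60), (0.6, 200)],
    "email_normal":       [(0.5, 60)],
    "call_known_contact": [(0.8, 120), (0.0, 80), (0.8, 120)],
}

def pattern_for(event: str, prefs: dict) -> list:
    """Return the pulse sequence for an event, honoring a per-event mute preference."""
    if prefs.get("mute_" + event, False):
        return []  # the user opted out of feedback for this event type
    return HAPTIC_PATTERNS.get(event, [])
```

For example, `pattern_for("email_normal", {})` yields the single soft pulse, while a muted urgent email produces no feedback at all, matching the "depending on your preferences" behavior in the text.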
FRL Research Science Manager Nicholas Colonnese: "Haptics might also be able to convey different emotions — we call this haptic emojis. If you’re in the right context, different types of haptic feedback could correspond to popular emojis. This could be a new playful way for better social communication."
We’re currently building a series of research prototypes meant to help us learn about wristband haptics. One prototype is called “Bellowband,” a soft and lightweight wristband named for the eight pneumatic bellows placed around the wrist. The air within the bellows can be controlled to render pressure and vibration in complex patterns in space and time. This is an early research prototype helping us determine the types of haptic feedback worthy of further exploration.
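As a toy illustration of "pressure and vibration in complex patterns in space and time," here is a hypothetical sketch (not Bellowband's actual control code) of a smooth pressure pulse traveling around the eight bellows the prototype is described as having:

```python
NUM_BELLOWS = 8  # Bellowband is described as having eight pneumatic bellows

def traveling_pulse(t: float, speed_hz: float = 1.0, width: float = 1.5) -> list:
    """Return a pressure level (0..1) for each bellow at time t seconds:
    a single smooth pulse that circles the wrist speed_hz times per second.
    The quadratic falloff and parameters are arbitrary illustrative choices."""
    center = (t * speed_hz * NUM_BELLOWS) % NUM_BELLOWS
    levels = []
    for i in range(NUM_BELLOWS):
        # circular distance from the pulse center to bellow i
        d = min(abs(i - center), NUM_BELLOWS - abs(i - center))
        levels.append(max(0.0, 1.0 - (d / width) ** 2))
    return levels
```

Sampling this function at the actuator update rate would drive each bellow's pressure; varying `speed_hz` and `width` changes how the pattern feels, which is exactly the kind of design space such a prototype would let researchers explore.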
In contrast to Apple and other tech companies that keep their major projects as secret as possible, barring patent filings, Facebook has decided to use elaborate conceptual videos and graphics about ideas that may or may not ever come to life in order to persuade consumers that it will be the true leader in future AR glasses.
In the big picture, Facebook wants to join Apple and Google in the big leagues by owning a platform with tightly integrated hardware to make it happen. With a customer base of over two billion users, that would seem inevitable at some point.