Apple wins a Patent relating to Micro-Gestures designed to Control the Apple Vision Pro's visionOS
Today, the U.S. Patent and Trademark Office officially granted Apple a patent relating to micro-gestures of a user's hand, an invention that came to fruition when Apple introduced the Vision Pro back in June.
During the event, Alan Dye, VP of Human Interface Design, stated: "Every major Apple platform was driven by an innovative new input model. Mac with the mouse. iPod with the Click Wheel. And iPhone with Multi-Touch. With Vision Pro, we set the ambitious goal to design an incredibly intuitive input model for spatial computing – one that could be used without controllers or additional hardware. Apple Vision Pro relies solely on your eyes, hands and voice. It's just you and your content. It's remarkable, and it feels like magic. You browse the system simply by looking. It's effortless. Simply tap your fingers together to select an icon and gently flick to scroll a page. And we designed every gesture to be as subtle and natural as possible, so you could keep your hands where they're most comfortable, like resting in your lap, or on the sofa."
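Apple hasn't published code for this input model, but a minimal SwiftUI sketch (with a hypothetical GestureDemoView) shows how the gestures Dye describes reach a visionOS app: the system resolves gaze plus a finger tap into an ordinary tap on the targeted control, and a pinch-and-flick drives standard scrolling, so developers don't write any controller-specific code.

```swift
import SwiftUI

// Hypothetical sketch: standard SwiftUI controls on visionOS already
// respond to the eyes-and-hands input model described above.
struct GestureDemoView: View {
    @State private var selectedItem: Int?

    var body: some View {
        ScrollView {                      // a pinch-and-flick scrolls this list
            VStack(spacing: 12) {
                ForEach(0..<20, id: \.self) { index in
                    Button("Item \(index)") {
                        // Fires when the user looks at the row and taps
                        // their fingers together (the system "tap").
                        selectedItem = index
                    }
                }
            }
            .padding()
        }
    }
}
```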
Apple's FIG. 7 illustrates a series of 'Micro-Gestures' now used with Vision Pro.
Apple further notes that in some embodiments, "A GUI menu is displayed in a mixed reality environment (e.g., floating in the air or overlaying a physical object in a three-dimensional environment, and corresponding to operations associated with the mixed reality environment or operations associated with the physical object)." This, of course, directly relates to the WWDC23 Apple Vision Pro presentation.
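As a rough illustration of the kind of floating GUI menu the patent describes, the sketch below attaches an ornament to a visionOS window so a small menu hovers alongside the content in the mixed reality space. This is only an assumption about how such a menu might be built with today's SwiftUI APIs; the "Open" and "Share" actions are hypothetical placeholders, not the patent's implementation.

```swift
import SwiftUI

// Hypothetical sketch: a floating menu presented as a visionOS ornament,
// hovering near the window in the shared space.
struct ContentView: View {
    var body: some View {
        Text("Main content")
            .ornament(attachmentAnchor: .scene(.bottom)) {
                HStack {
                    Button("Open") { /* hypothetical action */ }
                    Button("Share") { /* hypothetical action */ }
                }
                .padding()
                .glassBackgroundEffect()  // standard visionOS material
            }
    }
}
```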
The patent goes on to describe how micro-gestures may also one day work with other Apple devices, which are listed generically but translate to the Mac, iPad, iPhone and Apple Watch.
For more details, review Apple's granted patent 11768579. Apple began work on this invention prior to their original patent filing back in 2019.