A European patent from Apple reveals aspects of Apple Vision Pro's EyeSight feature and how Siri may interact with the headset user
On August fifth, Patently Apple posted a trademark report covering Apple Vision Pro's "EyeSight" feature. Today, Patently Apple discovered a patent that Apple filed in Europe, published on August tenth, that describes the fundamentals of EyeSight. The patent also covers provisions for real-time social intelligence and more.
One aspect of Apple's patent covers a method that determines whether a user wearing a headset is in an engaged state based on at least one of: detecting a human in a near-field scene of the user, determining that the user is gazing at the human, and detecting a speech input from at least one of the user or the human while the user is gazing at the human. In response to determining that the user is in the engaged state, the method includes foregoing providing outputs to the user while the user remains in the engaged state.
In another example method, while providing one or more outputs to a user of the electronic device, the method includes determining whether the user is in an engaged state based on: detecting a human in a near-field scene of the user, determining that the user is gazing at the human, and detecting a speech input from at least one of the user or the human while the user is gazing at the human. In response to determining that the user is in the engaged state, the method includes foregoing providing the one or more outputs to the user while the user is in the engaged state.
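The engagement logic described in the method above can be sketched in a few lines. This is a minimal illustration, not Apple's implementation; the `Observation` fields and function names are hypothetical stand-ins for whatever sensor signals the headset actually reports.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """Per-frame sensor signals (hypothetical names)."""
    human_in_near_field: bool      # a human is detected in the near-field scene
    user_gazing_at_human: bool     # gaze tracking says the user is looking at them
    speech_detected: bool          # speech from the user or the human during that gaze

def is_engaged(obs: Observation) -> bool:
    """The user counts as engaged when a human is nearby, the user is
    gazing at that human, and speech is detected while gazing."""
    return (obs.human_in_near_field
            and obs.user_gazing_at_human
            and obs.speech_detected)

def deliver_outputs(obs: Observation, pending: list[str]) -> list[str]:
    """Forego (hold back) outputs such as notifications while engaged."""
    return [] if is_engaged(obs) else pending
```

With all three conditions true, pending notifications are held back; if any condition fails, they are delivered as usual.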
Apple's patent FIG. 9 below illustrates a system for determining a real-time social state for a user wearing Apple Vision Pro (the HMD). In some examples, an engagement score may be determined upon satisfying both conditions: 1) a speech input or utterance is received, and 2) the user is gazing at, or making eye contact with, another human. If the score satisfies a predetermined threshold, the state machine may transition from the disengaged state (#920) to the engaged state (#930). Technically, this is what triggers EyeSight to kick in and allows the nearby person to see your eyes, or a rendered image of your eyes.
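The scored state transition described in FIG. 9 can be sketched as a tiny state machine. This is an illustrative reading of the patent text, not Apple's code; the threshold value and the way the state falls back to disengaged are assumptions.

```python
DISENGAGED, ENGAGED = "disengaged", "engaged"

class SocialStateMachine:
    """Moves from a disengaged to an engaged state once both conditions
    hold and the engagement score crosses a threshold (value illustrative)."""

    def __init__(self, threshold: float = 0.8):
        self.state = DISENGAGED
        self.threshold = threshold

    def update(self, speech_received: bool, eye_contact: bool, score: float) -> str:
        # Both conditions must hold before the score is even considered.
        if speech_received and eye_contact and score >= self.threshold:
            self.state = ENGAGED      # e.g. the point at which EyeSight kicks in
        elif not (speech_received and eye_contact):
            self.state = DISENGAGED   # lose a condition, fall back to disengaged
        # If both conditions hold but the score is below threshold,
        # the current state is simply kept.
        return self.state
```

In this sketch, an engaged state is entered only when speech, eye contact, and a sufficient score coincide, and it is dropped as soon as either condition goes away.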
Apple's patent FIG. 10B above illustrates an example use of the social engagement system on an extended reality device while a user is socially engaging with a human.
Another aspect of the patent relates to how Siri could interact with the Apple Vision Pro user. In the examples below in FIGS. 11A and 11B, we see that Siri could send text messages and notifications directly to the user's display.
In FIG. 11B we see that Siri's icon is present. In some examples, while the digital assistant icon #1102 is displayed on the screen or display of the extended reality device worn by the user, a human #1106 may be detected in a near field scene of the user. Upon detection of a human in a near-field scene, a square or rectangle (#1104) surrounding the human may be displayed on the screen, as shown in FIG. 11B.
In addition to detecting a human in the near-field scene, the social engagement system implemented on the extended reality device may determine that the user is making eye contact with the human. To show that the user is making eye contact with the human, real-time eye movement of the human and/or the user may be displayed on the screen of the extended reality device, as shown in FIG. 11B.
For more details, review Apple's patent application number WO2023150303 published in Europe August tenth.
Although we know that Apple has 5,000+ patents protecting Apple Vision Pro, it's a little difficult to know which patents relate to the current or future versions of Apple's XR headset / Spatial Computer.
Today's patent presents the rudimentary aspects of EyeSight, and we're bound to learn more over time, because every feature is likely backed by several patents rather than a single one, the better to protect the invention.