A European patent published this week describes some of Apple's work on eliminating motion sickness when viewing content on Apple Vision Pro
This week the European Patent Office published an Apple patent application that relates to some of the work behind eliminating motion sickness for the Apple Vision Pro. At WWDC23, Apple described motion sickness as being significantly reduced, if not eliminated, for most Apple Vision Pro users, an achievement it credited to the new R1 processor. Apple's patent provides us with some of the work likely behind what that processor does. Apple notes in patent point #86: "In some implementations, the method is performed on a processor."
Apple's invention covers devices, systems and methods that adjust image content to reduce motion sickness or otherwise improve the user experience. Some implementations adjust content outside of a foveated gaze zone (FGZ), also referred to herein as the extrafoveal zone (e.g., the peripheral and parafoveal zones), to reduce motion sickness during an experience (e.g., visual and/or auditory content that could include a real-world physical environment, virtual content, or a combination of the two).
In some implementations, unlike prior techniques that blacked out content outside the FGZ, the techniques adjust (e.g., reduce) the contrast or spatial frequency of content outside the FGZ. This may reduce motion sickness without detracting from the user experience the way blacking out content does.
In some implementations, the techniques adjust an extrafoveal area outside of the FGZ by adding content associated with the three-dimensional (3D) space of the user's physical environment. For example, the added elements may be fixed with respect to the inertial world (e.g., real world physical environment) that the user is in (e.g., a room). As a result, the user may move relative to added elements in a computer-generated reality (CGR) environment and the motion that the user perceives (with respect to the added elements) matches the information from the user's vestibular system (e.g., vestibular cues). This matching of visual cues with vestibular cues may reduce motion sickness or otherwise improve the user experience.
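The world-fixed behavior described above can be sketched in a few lines. This is purely an illustrative model (not Apple's implementation): a point anchored in the inertial room frame is transformed into the user's view space, so when the head moves, the anchor's apparent motion is equal and opposite, agreeing with vestibular cues.

```python
import numpy as np

# Illustrative sketch (names and math are my own, not from the patent):
# a world-fixed anchor keeps its position in the inertial (room) frame,
# so its view-space position changes exactly opposite to head motion.

def view_space_position(anchor_world, head_position, head_rotation):
    """Transform a world-fixed anchor into the user's view space.
    head_rotation is a 3x3 world-from-head rotation matrix."""
    return head_rotation.T @ (anchor_world - head_position)

anchor = np.array([0.0, 0.0, -2.0])  # fixed point 2 m in front of the origin
head_r = np.eye(3)                   # user looking straight ahead

p0 = view_space_position(anchor, np.array([0.0, 0.0, 0.0]), head_r)
p1 = view_space_position(anchor, np.array([0.1, 0.0, 0.0]), head_r)  # head moves 10 cm right

# The anchor appears to shift 10 cm in the opposite direction, so the
# visual motion of the added element matches the vestibular sense of
# self-motion.
print(p1 - p0)  # x component shifts by -0.1 m
```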
In some implementations, a device provides an experience (e.g., a visual and/or auditory experience) to the user and obtains, with a sensor, physiological data (e.g., gaze characteristics) and motion data (e.g., controller moving the avatar, head movements, etc.) associated with a response of the user to the experience.
Based on the obtained physiological data, the techniques described in the patent can determine a user's vestibular cues during the experience (e.g., a CGR experience) by tracking the user's gaze characteristic(s) and other interactions (e.g., user movements in the physical environment, or moving within the CGR environment such as moving a virtual avatar with a controller).
Based on the vestibular cues, the techniques can adjust content (e.g., reduce contrast or spatial frequency) and/or add additional content (e.g., noise, structured patterns, etc.) to reduce motion sickness.
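The cue comparison described in the two paragraphs above can be sketched as follows. This is a hypothetical model of my own, not Apple's method: physical self-motion (e.g., from head tracking) is compared with the virtual motion implied by the CGR content (e.g., an avatar driven by a controller), and a large mismatch, the classic trigger for motion sickness, gates how aggressively content is adjusted.

```python
import numpy as np

# Hypothetical sketch (not from the patent): compare physical self-motion
# with virtual motion; a mismatch between the two gates the strength of
# the content adjustment (e.g., contrast/spatial-frequency reduction).

def cue_mismatch(head_velocity, virtual_velocity):
    """Magnitude of disagreement between physical and virtual motion (m/s)."""
    return float(np.linalg.norm(np.asarray(virtual_velocity) - np.asarray(head_velocity)))

def adjustment_strength(mismatch, threshold=0.2, scale=2.0):
    """0 when cues agree; ramps toward 1 as the mismatch grows.
    threshold and scale are arbitrary illustrative constants."""
    return float(np.clip((mismatch - threshold) * scale, 0.0, 1.0))

# The user is physically still but the avatar glides forward at 0.5 m/s:
m = cue_mismatch([0.0, 0.0, 0.0], [0.0, 0.0, 0.5])
print(round(adjustment_strength(m), 3))  # -> 0.6
```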
In general, one innovative aspect of the subject matter described in this patent can be embodied in methods that include the actions of, at an electronic device having a processor, determining a first zone and a second zone of a display, generating images of a three-dimensional (3D) environment based on the first zone and the second zone, identifying image content of each of the images corresponding to the second zone of the display, and reducing at least one of contrast or spatial frequency of the image content of each of the images corresponding to the second zone of the display.
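The claimed method above can be illustrated with a minimal sketch. All names and constants below are my own assumptions: the first zone is a foveal circle around the gaze point, the second zone is everything else, and in the second zone both contrast (pixels pulled toward the mean) and spatial frequency (a crude box blur) are reduced, while the first zone is left untouched.

```python
import numpy as np

# Minimal sketch of the claimed method (illustrative, not Apple's code):
# zone 1 = foveal circle around the gaze point, zone 2 = everything else.

def zone_mask(h, w, gaze, radius):
    """Boolean mask that is True inside the first (foveal) zone."""
    ys, xs = np.mgrid[0:h, 0:w]
    return (ys - gaze[0]) ** 2 + (xs - gaze[1]) ** 2 <= radius ** 2

def box_blur(img, k=5):
    """Separable-free k x k box blur -- a simple way to cut spatial frequency."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def adjust_extrafoveal(img, gaze, radius, contrast=0.4):
    """Reduce contrast and spatial frequency only in the second zone."""
    foveal = zone_mask(*img.shape, gaze, radius)
    low_contrast = img.mean() + contrast * (img - img.mean())
    adjusted = box_blur(low_contrast)
    return np.where(foveal, img, adjusted)

frame = np.random.default_rng(0).random((64, 64))
out = adjust_extrafoveal(frame, gaze=(32, 32), radius=10)
# Pixel variance (a contrast proxy) drops outside the foveal zone only.
```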
Apple's patent Figure 2 below illustrates a pupil of the user and a diagram of a vision field of the pupil including the fovea, parafovea, and peripheral vision regions.
During the WWDC23 Keynote, Mike Rockwell noted that "Latency between sensors and displays can contribute to motion sickness or discomfort. Apple's R1 chip virtually eliminates lag, streaming new images to the displays within 12 milliseconds." Apple's patent Figure 3 below describes a flowchart representation of adjusting image content to reduce motion sickness.
Apple's patent Figure 7 above illustrates tracking movement of a user, providing inertial cues in three degrees of freedom (DoF) during a computer-generated reality (CGR) experience; Figure 11 illustrates tracking movement of a user, providing inertial cues in six DoF during a CGR experience.
Apple's patent Figure 15 below is a block diagram illustrating device components of an exemplary XR headset; Figure 16 is a block diagram of an example XR head-mounted device.
Apple's patent Figure 13 above illustrates tracking movement of a user, providing inertial cues in six DoF during a CGR experience.
For more details, review Apple's European patent application number EP4189527, which was published by the European Patent Office on Wednesday, June 7, 2023.
- Herman Damveld: Senior Research Lead, Technology Development Group
- Grant Mulliken: Senior Manager, Technology Development Group