
Apple has won a major patent for Vision Pro Displays that include Foveal and Peripheral Projectors that limit Motion Discomfort


Yesterday the U.S. Patent and Trademark Office officially granted Apple a patent that relates to addressing "accommodation-convergence mismatch problems" that could cause eyestrain, headaches, and/or nausea in some headsets. Apple addresses part of this problem with the new R1 chip. During the Apple Vision Pro's introduction, Mike Rockwell, VP of the Technology Development Group, stated that "Latency between sensors and displays can contribute to motion discomfort. Apple's new R1 processor virtually eliminates lag by streaming new images to the displays within 12 milliseconds. The R1 ensures that experiences feel like they're taking place right in front of your eyes."

Apple's invention, first filed in 2016, was granted another patent yesterday. In it Apple explains that in a stereoscopic system the images displayed to the user may trick the eye(s) into focusing at a far distance while an image is physically being displayed at a closer distance. In other words, the eyes may be attempting to focus on a different image plane or focal depth than that of the projected image, thereby leading to eyestrain and/or increased mental stress.

Accommodation-convergence mismatch problems are undesirable and may distract users or otherwise detract from their enjoyment and endurance levels (i.e. tolerance) of virtual reality or augmented reality environments.

Apple's granted patent covers various embodiments of an augmented reality (AR) and/or mixed reality (MR) direct retinal projector system that may, for example, resolve the convergence-accommodation conflict in head-mounted AR, MR, and VR systems.

Embodiments of an AR headset (e.g., a helmet, goggles, or glasses) are described that may include or implement different techniques and components of the AR system.

In some embodiments, an AR headset may include a reflective holographic combiner to direct light from a projector light engine into the user's eye, while also transmitting light from the user's environment to thus provide an augmented view of reality.

In some embodiments, the holographic combiner may be recorded with a series of point-to-point holograms; one projection point interacts with multiple holograms to project light onto multiple eye box points. In some embodiments, the holograms are arranged so that neighboring eye box points are illuminated from different projection points.
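One way to picture the interleaving described above is a round-robin mapping of projection points onto a row of eye box points. This is a purely illustrative sketch; the counts and names are hypothetical and not taken from the patent:

```python
# Hypothetical counts: 3 projection points lighting a row of 9 eye-box points.
num_projection_points = 3
eye_box_points = [f"E{i}" for i in range(9)]

# Round-robin assignment (E0 -> projector 0, E1 -> 1, E2 -> 2, E3 -> 0, ...)
# guarantees that any two neighboring eye-box points use different projectors.
assignment = {e: i % num_projection_points for i, e in enumerate(eye_box_points)}

print(assignment)
```

With more than two projection points, a simple modulo assignment like this always keeps neighbors on different projectors.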

In some embodiments, the holographic combiner and light engine may be arranged to separately project light fields with different fields of view and resolutions that optimize performance, system complexity, and efficiency so as to match the visual acuity of the eye. In some embodiments, the light engine may include foveal projectors that generally project wider diameter beams over a smaller central field of view, and peripheral projectors that generally project smaller diameter beams over a wider field of view.
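The foveal/peripheral split tracks basic diffraction physics: a wider beam entering the eye produces a smaller retinal spot and therefore finer angular resolution, so the wide beams are spent where the eye can actually see the detail. The sketch below illustrates this with the Rayleigh criterion (θ ≈ 1.22 λ/D); the beam diameters and wavelength are illustrative assumptions, not figures from the patent:

```python
import math

def diffraction_limited_resolution_arcmin(beam_diameter_mm, wavelength_nm=532):
    """Angular resolution (arcminutes) of a collimated beam entering the eye,
    per the Rayleigh criterion: theta = 1.22 * lambda / D."""
    theta_rad = 1.22 * (wavelength_nm * 1e-9) / (beam_diameter_mm * 1e-3)
    return math.degrees(theta_rad) * 60

# Illustrative beam diameters (not from the patent): a wider "foveal" beam
# resolves finer detail than a narrow "peripheral" beam.
foveal = diffraction_limited_resolution_arcmin(2.0)      # ~2 mm beam
peripheral = diffraction_limited_resolution_arcmin(0.5)  # ~0.5 mm beam
print(f"foveal: {foveal:.2f} arcmin, peripheral: {peripheral:.2f} arcmin")
```

A ~2 mm beam lands near the fovea's roughly one-arcminute acuity, while the narrower beam's coarser resolution is adequate for peripheral vision.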

In some embodiments, the light engine may include multiple independent light sources (e.g., laser diodes, LEDs, etc.) that can independently project from the different projection points, with a proportion being foveal projectors and a proportion being peripheral projectors.

In some embodiments, the light engine includes two or more two-axis scanning mirrors to scan the light sources; the light sources are appropriately modulated to generate the desired image.

In some embodiments, the light engine includes a series of optical waveguides with holographic or diffractive gratings that move the light from the light sources to generate beams at the appropriate angles and positions to illuminate the scanning mirrors; the light is then directed into additional optical waveguides with holographic film layers recorded with diffraction gratings to expand the projector aperture and to maneuver the light to the projection positions required by the holographic combiner.

In some embodiments, the light engine includes a lens for each projector to focus emitted light beams such that, once reflected off the holographic combiner, the light is substantially collimated again when it enters the subject's eye. The required focal surface may be complicated by the astigmatism of the holographic combiner, but is a curved surface in front of the combiner.

The ideal focal surface is different for different eye box positions, and errors may lead to less collimated output. However, in some embodiments, this can be compensated for by reducing the beam diameter at angles where the errors between the ideal focal surface and the actual best-fit focal surface are greatest, which alleviates the problem by increasing the F-number, and hence the depth of focus, of the beam. In some embodiments, these features may be incorporated into a holographic lens.
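The F-number trade-off mentioned here can be sketched numerically. For a beam of diameter D and effective focal length f, F# = f/D, and the diffraction-limited depth of focus grows roughly as 2·λ·F#², so narrowing the beam relaxes the tolerance on the ideal focal surface. All numbers below are illustrative assumptions, not values from the patent:

```python
def f_number(focal_length_mm, beam_diameter_mm):
    """F-number of a beam: focal length divided by beam (aperture) diameter."""
    return focal_length_mm / beam_diameter_mm

def depth_of_focus_um(f_num, wavelength_nm=532):
    """Diffraction-limited depth of focus, ~ 2 * lambda * F#^2 (classic
    approximation), returned in micrometres."""
    return 2 * (wavelength_nm * 1e-3) * f_num ** 2

# Illustrative: halving the beam diameter doubles the F-number and
# quadruples the depth of focus.
for d in (2.0, 1.0, 0.5):                 # beam diameters in mm (assumed)
    fn = f_number(20.0, d)                # assume a 20 mm effective focal length
    print(f"D={d} mm -> F/{fn:.0f}, depth of focus ~ {depth_of_focus_um(fn):.0f} um")
```

The quadratic scaling is the point: even a modest reduction in beam diameter buys a large margin against focal-surface error.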

In some embodiments, active beam focusing elements may be provided for each projection point. This may reduce or eliminate the need to change beam diameter with angle. This may also enable beams that diverge into the eye to, rather than being collimated, match the beam divergence expected for the apparent depth of the virtual object(s) being projected by the light engine.

With the methods and apparatus presented above, the AR system may not require extra moving parts or mechanically active elements to compensate for the eye changing position in the eye box or for the changing optical power from the holographic combiner during the scan, which simplifies the system architecture when compared to other direct retinal projector systems.

Apple's patent FIG. 5 below illustrates an augmented reality (AR) system that uses a reflective holographic combiner to direct light from a light engine into a subject's eye, while also transmitting light from the environment to the subject's eye.

(Apple Vision Pro patent FIGS. 5 and 6)

Apple's patent FIG. 6 above illustrates an AR headset that includes a reflective holographic combiner to direct light from a light engine into a subject's eye, while also transmitting light from the environment to the subject's eye.

Apple's patent FIG. 7 below illustrates high-level components of an AR system; FIG. 8 illustrates foveal and peripheral projectors of a light engine in an AR headset.

(Apple patent FIGS. 7 and 8)

Apple's patent FIG. 9 below illustrates light beams from foveal projectors in an AR system; FIG. 27 illustrates a best fit focus curve and a focusing element for foveal projections in an AR system.

(Apple patent FIGS. 9 and 27)


For more details, review Apple's granted patent 11714284, which is one of the 5,000 patents behind the Apple Vision Pro headset. Considering that this is Apple's second granted patent for this invention, Apple has added an additional 20 technical points in the patent claims presented at the bottom of the patent. In new patent claim #4, for instance, Apple notes that "The system of claim 1, wherein field of view of the foveal light beams is 20° horizontal × 20° vertical, and wherein field of view of the peripheral light beams is 120° horizontal × 74° vertical."
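As a rough sense of scale, multiplying out the angular fields cited in claim #4 (a flat small-angle product, not a true solid-angle calculation) shows the foveal field covers only a few percent of the peripheral field, which is why reserving the high-resolution beams for it is efficient:

```python
# Fields of view from claim #4 of patent 11714284.
foveal_deg2 = 20 * 20        # foveal: 20 deg x 20 deg  = 400 square degrees
peripheral_deg2 = 120 * 74   # peripheral: 120 deg x 74 deg = 8880 square degrees

ratio = foveal_deg2 / peripheral_deg2
print(f"foveal field is ~{ratio:.1%} the angular area of the peripheral field")
```

Roughly 4.5%: the system only needs to deliver its finest resolution over a small central window.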

Some of the Team Members on this Apple Project

  • Alexander Shpunt: Architect. Shpunt came to Apple via the acquisition of PrimeSense where he was the Chief Technical Officer.  
  • Richard Topliss: Senior Camera Technology Specialist at Apple
  • Paul Gelsinger: Display Exploration Engineer
  • Gregory Thomas: Senior Area Manager
  • Richard Tsai: Sr. Camera Design Engineer

