
A new Apple Patent Focuses on Optical Modules for a Future Head-Mounted Display

(Cover image: HMD with optical module)

 

Today the US Patent & Trademark Office published a patent application from Apple that relates to an optical module and head-mounted displays having the optical module.

 

Display systems may include a head-mounted display unit (HMD). The head-mounted display may include one or more displays (e.g., screens) that display digital images to a user wearing the head-mounted display and one or more corresponding lenses through which the user views the digital images.

 

The digital images may include a scene having both foreground and background features, such as a person in the foreground and a landmark in the background. To simulate changing focal distances as the user looks between the foreground and background features of the digital image, the distance between the one or more displays and the one or more lenses of the system may be changed.
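
As a rough illustration of how a varifocal arrangement like this works, the thin-lens equation relates the screen-to-lens spacing to the distance of the virtual image the user perceives. The sketch below is a back-of-the-envelope example in Swift, not anything published in the patent; the focal length, distances, and function name are illustrative assumptions.

```swift
import Foundation

/// Thin-lens sketch (assumption, not from the patent): for a lens of focal
/// length `f`, placing the display at distance `d < f` produces a virtual
/// image at distance v = f*d / (f - d). Solving for `d` gives the spacing
/// needed to put the virtual image at a target distance `v`.
func screenToLensDistance(focalLength f: Double, virtualImageDistance v: Double) -> Double {
    // Rearranged from 1/f = 1/d - 1/v (virtual image on the display side of the lens).
    return (f * v) / (f + v)
}

// Example: a 40 mm lens, with the virtual image moved from 0.5 m out to 2.0 m.
let near = screenToLensDistance(focalLength: 0.040, virtualImageDistance: 0.5)  // ≈ 37.0 mm
let far  = screenToLensDistance(focalLength: 0.040, virtualImageDistance: 2.0)  // ≈ 39.2 mm
print(String(format: "near focus: %.1f mm, far focus: %.1f mm", near * 1000, far * 1000))
```

Under these assumed numbers, only a couple of millimeters of travel between the screen and the lens shifts the perceived focal plane substantially, which is why a compact movement mechanism is plausible here.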

 

In the bigger picture, the embodiments of the head-mounted displays (HMDs) and display units illustrated further below are for use in display systems, such as those used for computer-generated reality (e.g., virtual reality or mixed reality).

 

The display unit includes a display, a lens, and a movement mechanism that moves the display and the lens relative to each other. Each display unit further includes a chamber that is defined between the display and the lens.

 

The chamber is sealed to prevent or hinder debris (e.g., dust, moisture droplets) from entering the chamber and, thereby, interfering with the user's view of digital images on the display.

 

As the display and the lens move relative to each other, the volume of the chamber changes, such that pressure within the chamber changes. Such changes in volume and pressure are accounted for by one or more vents, materials, or mechanisms in communication with the chamber, which relieve (i.e., hinder) pressure changes and may thereby allow for a less powerful movement mechanism and, in turn, lighter and/or more compact display units.
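
To see why those pressure changes matter, an isothermal (Boyle's-law) estimate gives the order of magnitude of the pressure rise in a fully sealed chamber with no relief at all. The dimensions below are assumed for illustration and are not taken from the patent.

```swift
import Foundation

/// Boyle's-law sketch (assumption): in a sealed chamber at constant temperature,
/// P1 * V1 = P2 * V2, so shrinking the chamber raises the internal pressure.
func sealedChamberPressure(ambient p1: Double, initialVolume v1: Double, newVolume v2: Double) -> Double {
    return p1 * v1 / v2
}

// Example: a 30 mm-deep chamber over a 40 mm x 40 mm display, compressed by 3 mm.
let area = 0.040 * 0.040                      // display area, m^2
let v1 = area * 0.030                         // initial chamber volume, m^3
let v2 = area * 0.027                         // volume after a 3 mm stroke, m^3
let p2 = sealedChamberPressure(ambient: 101_325.0, initialVolume: v1, newVolume: v2)
print(String(format: "pressure rise without relief ≈ %.0f Pa", p2 - 101_325.0))  // ≈ 11,000 Pa
```

A pressure swing on that order would push back against the movement mechanism, which is consistent with the patent's point that relieving the chamber with vents, adsorbent materials, or passive radiators allows a smaller, lighter mechanism.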

 

Technically, Apple's invention covers implementations of an optical module and head-mounted displays having the optical module.

 

In one aspect, an optical module for a display system includes a lens, a display screen, a movement mechanism, and one or more pressure-relieving features.

 

In one aspect, a head-mounted display includes a housing, a support, a sensor, and an optical module. The support is coupled to the housing for supporting the housing on a head of a user.

 

The sensor measures a parameter of an eye of the user. The optical module includes a lens, a display screen, a movement mechanism, and one or more of an adsorbent material or a passive radiator.

 

The one or more of the adsorbent material or the passive radiator are in fluid communication with the chamber to hinder changes of pressure in the chamber as the lens and the display are moved relative to each other.

 

A zeolite or a flexible membrane passively relieves air pressure on the display; this passive pressure-relieving feature hinders pressure changes in the chamber.

 

Apple's patent FIG. 1A below is a side view of a head-mounted display of a display system with hidden components depicted in broken lines; FIG. 1B is a top view of the head-mounted display of FIG. 1A. The head-mounted display #100 may be considered a computer-generated reality system (e.g., a virtual or mixed reality system), or part of one.

 

(Patent figures: FIGS. 1A & 1B, head-mounted display system)

 

More specifically, the head-mounted display is configured to display images for computer-generated reality with the optical modules #130. For example, the head-mounted display may include a controller #122 and sensors #124 as depicted above.

 

The sensors may detect various parameters related to the head-mounted display and/or the user. For example, the sensors may measure the position, orientation, and/or changes thereof of the head-mounted display and, thereby, of the head "H" of the user. The sensors may also include a right eye sensor #124r and a left eye sensor #124l that measure one or more parameters associated with the right eye and the left eye of the user, respectively, such as gaze direction or a focal characteristic.
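
One plausible use of per-eye gaze data in a varifocal headset is to estimate the user's fixation distance from the vergence of the two gaze directions. The patent doesn't spell out this computation; the geometry below (interpupillary distance plus an inward gaze angle for each eye) is an illustrative assumption.

```swift
import Foundation

/// Vergence-distance sketch (assumption): with the eyes separated by `ipd` and
/// each gaze rotated inward by `leftYaw` / `rightYaw` (radians) toward the
/// fixation point, simple triangulation gives the fixation distance.
func fixationDistance(ipd: Double, leftYaw: Double, rightYaw: Double) -> Double {
    let vergence = leftYaw + rightYaw             // total convergence angle
    guard vergence > 0 else { return .infinity }  // parallel gaze -> focus at infinity
    return (ipd / 2.0) / tan(vergence / 2.0)
}

// Example: 63 mm IPD, each eye rotated inward by about 1.8 degrees.
let distance = fixationDistance(ipd: 0.063,
                                leftYaw: 1.8 * Double.pi / 180,
                                rightYaw: 1.8 * Double.pi / 180)
print(String(format: "estimated fixation distance ≈ %.2f m", distance))  // ≈ 1.0 m
```

An estimate like this could then drive the movement mechanism described earlier, so the screen-to-lens distance tracks where the user is actually looking.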

 

The controller #122, based on sensor information received from the sensors, sends image signals to the optical modules #130, according to which the optical modules display images. For example, the sensors may detect a change in orientation of the head-mounted display and, thereby, the head of the user (e.g., moving leftward), and the controller sends image signals to the optical modules for displaying images that pan appropriately within the computer-generated reality environment (e.g., panning leftward).
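
A minimal sketch of that control loop is below, written against a hypothetical sensor and optical-module API; the type and method names are illustrative stand-ins for the controller (#122), sensors (#124), and optical modules (#130) described in the filing, not Apple's actual interfaces.

```swift
// Hypothetical stand-ins for the patent's sensors (#124) and optical modules (#130).
struct HeadPose { var yaw: Double; var pitch: Double; var roll: Double }

protocol HeadSensor { func currentPose() -> HeadPose }
protocol OpticalModule {
    func display(imageRenderedWithCameraYaw yaw: Double, pitch: Double, roll: Double)
}

/// Per-frame sketch of the controller's job: read the head pose and drive both
/// eye modules with a virtual camera that follows the head, so the rendered
/// scene pans appropriately as the head turns (e.g., leftward).
func updateFrame(sensor: any HeadSensor,
                 leftModule: any OpticalModule,
                 rightModule: any OpticalModule) {
    let pose = sensor.currentPose()
    for module in [leftModule, rightModule] {
        module.display(imageRenderedWithCameraYaw: pose.yaw,
                       pitch: pose.pitch, roll: pose.roll)
    }
}
```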

 

In Apple's patent FIGS. 2A-2D below we're able to see that each of the optical modules #130 includes a display screen #232, a lens #234, a housing #236, and a movement mechanism #238 that moves the display screen and the lens relative to each other to change the screen-to-lens distance between them.

 

(Patent figures: FIGS. 2A-2D, optical modules)

 

Apple notes that a physical environment refers to a physical world that people can sense and/or interact with without aid of electronic systems. Physical environments, such as a physical park, include physical articles, such as physical trees, physical buildings, and physical people. People can directly sense and/or interact with the physical environment, such as through sight, touch, hearing, taste, and smell.

 

In contrast, a computer-generated reality (CGR) environment refers to a wholly or partially simulated environment that people sense and/or interact with via an electronic system. In CGR, a subset of a person's physical motions, or representations thereof, are tracked, and, in response, one or more characteristics of one or more virtual objects simulated in the CGR environment are adjusted in a manner that comports with at least one law of physics.

 

For example, a CGR system may detect a person's head turning and, in response, adjust graphical content and an acoustic field presented to the person in a manner similar to how such views and sounds would change in a physical environment. In some situations (e.g., for accessibility reasons), adjustments to characteristic(s) of virtual object(s) in a CGR environment may be made in response to representations of physical motions (e.g., vocal commands).

 

A person may sense and/or interact with a CGR object using any one of their senses, including sight, sound, touch, taste, and smell. For example, a person may sense and/or interact with audio objects that create a 3D or spatial audio environment that provides the perception of point audio sources in 3D space. In another example, audio objects may enable audio transparency, which selectively incorporates ambient sounds from the physical environment with or without computer-generated audio. In some CGR environments, a person may sense and/or interact only with audio objects.

 

Apple's patent application 20190339523, which was published today by the U.S. Patent Office, was filed back in Q3 2018. Considering that this is a patent application, the timing of such a product coming to market is unknown at this time.

 

Apple's Inventors

 

Christopher Wilk: Acoustic Simulation Engineer

Neal Evans: Acoustic Design Engineer

Anthony Montevirgen: Product Design Engineer

James Vandyke: Product Design Engineer

 

