
Apple Advances Gaze Controls as a Key Input Function along with Touch & In-Air Gesturing to Control a Headset

(Cover graphic: Apple's mixed reality headset advances eye gaze technology.)

 

Last week the European Patent Office published a patent application from Apple that relates to user interfaces for interacting with an electronic device, and more specifically to interacting with a Head Mounted Display (HMD) device using eye gaze techniques. Apple's HMD system will also support interpreting in-air hand gesturing as another input option.

 

Apple's invention describes techniques for interacting with an HMD device using eye gaze. According to some embodiments, a user uses their eyes to interact with user interface objects displayed on the device's display. The techniques provide a more natural and efficient interface by, in some exemplary embodiments, allowing a user to operate the device primarily with eye gazes and eye gestures (e.g., eye movement, blinks, and stares).
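The excerpt above doesn't spell out how a "stare" becomes a command, but a common way to build gaze-only interaction is dwell selection: if the gaze rests on the same object for longer than a threshold, that object is activated. Below is a minimal sketch of that idea in Swift; the DwellSelector type, the 0.8-second threshold and the string object identifiers are illustrative assumptions, not Apple's implementation.

```swift
import Foundation

/// Hypothetical dwell-based selector: fires when the gaze rests on the
/// same object for longer than `dwellThreshold`. Illustrative only.
struct DwellSelector {
    var dwellThreshold: TimeInterval = 0.8   // assumed value, not from the patent
    private var currentTarget: String?       // identifier of the currently gazed object
    private var dwellStart: TimeInterval = 0

    /// Feed one gaze sample; returns the selected object id when the dwell completes.
    mutating func update(gazedObject: String?, timestamp: TimeInterval) -> String? {
        guard let target = gazedObject else {
            currentTarget = nil
            return nil
        }
        if target != currentTarget {
            // Gaze moved to a new object: restart the dwell timer.
            currentTarget = target
            dwellStart = timestamp
            return nil
        }
        // Still staring at the same object: check the elapsed dwell time.
        if timestamp - dwellStart >= dwellThreshold {
            currentTarget = nil   // require a fresh dwell before the next selection
            return target
        }
        return nil
    }
}

// Usage: feed per-frame gaze hit-test results.
var selector = DwellSelector()
_ = selector.update(gazedObject: "playButton", timestamp: 0.00)
if let picked = selector.update(gazedObject: "playButton", timestamp: 0.85) {
    print("Selected \(picked) by staring at it")
}
```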

 

The techniques can be applied to conventional user interfaces on devices such as desktop computers, laptops, tablets, and smartphones. The techniques are also advantageous for virtual reality, augmented reality, and mixed reality devices and applications.

 

The patent notes that some of the processing is handled by an iPhone, iPad or MacBook, while other functions are handled by the HMD itself.

 

In some embodiments, HMD system #100 includes touch-sensitive surface(s) for receiving user inputs, such as tap inputs and swipe inputs. In some embodiments, the display(s) and touch-sensitive surface(s) form touch-sensitive display(s).

 

The HMD system includes image sensor(s), which optionally include one or more visible light image sensors, such as charge-coupled device (CCD) sensors and/or complementary metal-oxide-semiconductor (CMOS) sensors, operable to obtain images of physical objects from the real environment.

 

Image sensor(s) also optionally include one or more infrared (IR) sensor(s), such as a passive IR sensor or an active IR sensor, for detecting infrared light from the real environment. For example, an active IR sensor includes an IR emitter, such as an IR dot emitter, for emitting infrared light into the real environment. Image sensor(s) also optionally include one or more event camera(s) configured to capture movement of physical objects in the real environment.

 

Image sensor(s) also optionally include one or more depth sensor(s) configured to detect the distance of physical objects from the HMD system. In some embodiments, the system uses CCD sensors, event cameras, and depth sensors in combination to detect the physical environment around the system.

 

In some embodiments, the image sensor(s) include a first image sensor and a second image sensor. The first image sensor and the second image sensor are optionally configured to capture images of physical objects in the real environment from two distinct perspectives. 

 

In some embodiments, the system uses image sensor(s) to receive user inputs, such as hand gestures. In some embodiments, the system uses image sensor(s) to detect the position and orientation of the system and/or display(s) in the real environment. For example, the system uses image sensor(s) to track the position and orientation of the display(s) relative to one or more fixed objects in the real environment.

 

Apple's patent FIG. 2 below depicts a top view of the user (#200) whose gaze is focused on object #210.  The user's gaze is defined by the visual axes of each of the user's eyes.  The direction of the visual axes defines the user's gaze direction, and the distance at which the axes converge defines the gaze depth.

 

The gaze direction can also be referred to as the gaze vector or line-of-sight.  In FIG. 2, the gaze direction is in the direction of object #210 and the gaze depth is the distance d, relative to the user.

 

(Apple's patent FIGS. 1H and 2: the gaze-controlled HMD invention.)

 

In some embodiments, the center of the user's cornea, the center of the user's pupil, and/or the center of rotation of the user's eyeball are used to determine the position of the visual axis of the user's eye, and can therefore be used to determine the user's gaze direction and/or gaze depth.

 

In some embodiments, gaze depth is determined based on a point of convergence of the visual axes of the user's eyes (or a location of minimum distance between the visual axes of the user's eyes) or some other measurement of the focus of a user's eye(s). Optionally, the gaze depth is used to estimate the distance at which the user's eyes are focused.
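Putting those passages together: gaze direction comes from the visual axes, and gaze depth comes from where those axes converge or pass closest to each other. A rough Swift/simd sketch of that geometry is below, assuming each eye's visual axis is available as an origin plus a direction vector; the types, the midpoint averaging and the parallel-axis fallback are my assumptions, not the patent's stated method.

```swift
import simd

/// One eye's visual axis: where the axis starts and which way it points.
struct VisualAxis {
    var origin: simd_float3      // e.g. the center of the eyeball or cornea
    var direction: simd_float3   // direction along the visual axis (normalized below)
}

/// Combined gaze estimate: a direction and the depth at which the axes converge.
struct GazeEstimate {
    var direction: simd_float3
    var depth: Float
}

/// Estimates gaze direction and depth from the two visual axes. Depth is taken
/// at the point of closest approach between the axes, which also handles axes
/// that never exactly intersect.
func estimateGaze(left: VisualAxis, right: VisualAxis) -> GazeEstimate {
    let d1 = simd_normalize(left.direction)
    let d2 = simd_normalize(right.direction)
    let w0 = left.origin - right.origin

    // Standard closest-approach solution between two lines p1 + t*d1 and p2 + s*d2.
    let a = simd_dot(d1, d1), b = simd_dot(d1, d2), c = simd_dot(d2, d2)
    let d = simd_dot(d1, w0), e = simd_dot(d2, w0)
    let denom = a * c - b * b
    // Nearly parallel axes: push the convergence point far away instead of dividing by ~0.
    let t: Float = denom > 1e-6 ? (b * e - c * d) / denom : 1000
    let s: Float = denom > 1e-6 ? (a * e - b * d) / denom : 1000

    let nearestOnLeft  = left.origin  + t * d1
    let nearestOnRight = right.origin + s * d2
    let convergence = (nearestOnLeft + nearestOnRight) * 0.5   // point the eyes converge on
    let eyeMidpoint = (left.origin + right.origin) * 0.5

    return GazeEstimate(direction: simd_normalize(convergence - eyeMidpoint),
                        depth: simd_length(convergence - eyeMidpoint))
}

// Usage: eyes ~6.4 cm apart, both converging on a point 1 m ahead.
let leftEye  = VisualAxis(origin: simd_float3(-0.032, 0, 0),
                          direction: simd_float3(0.032, 0, -1.0))
let rightEye = VisualAxis(origin: simd_float3( 0.032, 0, 0),
                          direction: simd_float3(-0.032, 0, -1.0))
let gaze = estimateGaze(left: leftEye, right: rightEye)
print(gaze.depth)   // ≈ 1.0
```

Measuring depth at the point of closest approach rather than requiring an exact intersection matters in practice, because noisy eye tracking rarely yields two rays that truly cross.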

 

In Apple's patent FIG. 9 below, the radius of the cylinder surrounding gaze position #908 represents the angular resolution of the gaze direction, and the length of the cylinder represents the depth resolution of the gaze depth (e.g., the uncertainty in the gaze depth). Based on the gaze direction, angular resolution, gaze depth, and depth resolution, the HMD determines whether object #904 and/or object #906 correspond to the gaze position. 
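In other words, the headset treats the gaze not as an exact ray but as a cone of uncertainty (the angular resolution) paired with a band of plausible depths (the depth resolution), and any object falling inside that volume is a candidate target. A hedged Swift sketch of such a hit test follows; the SceneObject type, the parameter names and the threshold values are assumptions for illustration only.

```swift
import Foundation
import simd

/// Hypothetical scene object positioned in the same coordinate space as the gaze.
struct SceneObject {
    var id: String
    var position: simd_float3
}

/// Returns true if the object falls inside the gaze uncertainty volume:
/// within `angularResolution` radians of the gaze direction and within
/// `depthResolution` of the estimated gaze depth.
func matchesGaze(_ object: SceneObject,
                 gazeOrigin: simd_float3,
                 gazeDirection: simd_float3,   // unit vector
                 gazeDepth: Float,
                 angularResolution: Float,     // a couple of degrees, in radians (assumed)
                 depthResolution: Float) -> Bool {
    let toObject = object.position - gazeOrigin
    let distance = simd_length(toObject)
    guard distance > 0 else { return true }

    // Angle between the gaze direction and the direction toward the object.
    let cosAngle = simd_dot(simd_normalize(toObject), gazeDirection)
    let angle = acos(min(max(cosAngle, -1), 1))   // clamp to avoid NaN from rounding

    let withinCone  = angle <= angularResolution
    let withinDepth = abs(distance - gazeDepth) <= depthResolution
    return withinCone && withinDepth
}

// Usage: check whether an object sitting 1 m ahead matches the current gaze.
let target = SceneObject(id: "object904", position: simd_float3(0.02, 0, -1.0))
print(matchesGaze(target,
                  gazeOrigin: simd_float3(0, 0, 0),
                  gazeDirection: simd_float3(0, 0, -1),
                  gazeDepth: 1.0,
                  angularResolution: 0.035,    // ~2 degrees
                  depthResolution: 0.2))
```

Both object #904 and object #906 can pass a test like this at the same time, which is exactly the ambiguity that FIGS. 10 through 12 then resolve by enhancing or enlarging the candidates.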

 

In Apple's patent FIG. 10 below, object #906 is enhanced by making the object brighter relative to object #904 (e.g., by increasing the brightness of affordance #906, decreasing the brightness of affordance #904, or a combination of both).

 

In some embodiments, enhancing an object includes altering the visual appearance of the object itself (e.g., by making the object brighter or changing the color of the object).

 

In some embodiments, enhancing an object includes degrading the visual appearance of other aspects of an environment (e.g., by making another object or the surrounding environment appear blurry). Similarly, in a 2D representation of a 3D environment, a smaller object or an object that has a greater depth value in the 3D environment is optionally enhanced.
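Read simply, this amounts to a per-object presentation adjustment: brighten the candidate being gazed at and dim or blur everything else. A small illustrative Swift sketch follows; the property names and the specific factors are assumptions, not values from the patent.

```swift
/// Hypothetical per-object rendering parameters.
struct ObjectAppearance {
    var id: String
    var brightness: Float = 1.0
    var blurRadius: Float = 0.0
}

/// Enhances the gazed object and degrades the rest, echoing FIG. 10:
/// brighten the target, dim and blur everything else.
func applyGazeEnhancement(objects: inout [ObjectAppearance], gazedID: String) {
    for i in objects.indices {
        if objects[i].id == gazedID {
            objects[i].brightness = 1.3   // assumed enhancement factor
            objects[i].blurRadius = 0.0
        } else {
            objects[i].brightness = 0.7   // assumed dimming factor
            objects[i].blurRadius = 2.0   // assumed blur amount, in points
        }
    }
}

// Usage: object906 is gazed at; object904 ends up dimmed and slightly blurred.
var scene = [ObjectAppearance(id: "object904"), ObjectAppearance(id: "object906")]
applyGazeEnhancement(objects: &scene, gazedID: "object906")
```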

 

(Apple's patent FIGS. 9, 10, 11 and 12: HMD gaze controls.)

 

Apple's patent FIG. 11 above depicts an embodiment in which object #904 and object #906 are enlarged (e.g., moved closer to the user) while maintaining their relative size and position.

 

Apple's FIG. 12 above depicts an embodiment in which object #904 and object #906 are enlarged and re-positioned relative to each other such that the objects are displayed side by side at the same depth.

 

Apple's patent application, published last week by the European Patent Office, was originally filed back in Q3 2018. Considering that this is a patent application, the timing of such a product coming to market is unknown at this time.

 

Two of the three inventors listed on the patent application are Michael Kuhn, a Software Development Manager who came to Apple via Metaio and who appears to be behind many of Apple's HMD device patents, and Justin Stoyles, who was Sr. Engineering Program Manager (ARKit Augmented Reality, Animoji and Memoji) and has recently left Apple to work at Roku Inc.

 

The industry trend of developing head mounted displays is running hot as far as intellectual property goes. This report marks the eleventh HMD-related patent published since March of this year, and a new Google patent filing surfaced this month for an advanced HMD that also relies on gaze as a vital technology, as noted below.

 

(Google's patent FIGS. 4b and 9b: Google's HMD.)

 

Google's patent notes that "the detected position and orientation of the HMD may allow the system to detect and track the user's head gaze direction and movement."

 

In some implementations, the HMD may also include a gaze tracking device to detect and track an eye gaze of the user. The gaze tracking device may include, for example, one or more image sensors positioned to capture images of the user's eyes. These images may be used, for example, to detect and track direction and movement of the user's pupils.

 

In some implementations, the HMD may be configured so that the detected gaze is processed as a user input to be translated into a corresponding interaction in the AR experience. The full Patently Mobile report on Google's HMD can be reviewed here.

 
