Apple Wins Patent for Advanced 3D Eye/Head Tracking System Supporting Apple's 3D Camera
The U.S. Patent and Trademark Office officially published a series of 46 newly granted patents for Apple Inc. today. In this particular report we cover another interesting 3D-related invention that could come to life once Apple's 3D camera comes to market next month with the iPhone 7. Apple's granted patent covers an advanced eye-tracking and/or head-tracking system, working with a camera, that allows users to look around an image to get different visual perspectives. Apple has been working on such technology since 2009. Our cover graphic is from another of Apple's earlier inventions regarding head tracking.
Apple's newly granted patent covers their invention relating to rendering dynamic three-dimensional imagery, and more particularly to a system and method for rendering dynamic three-dimensional appearing imagery on a two-dimensional user interface screen.
Once Apple introduces their new iPhone 7 with a dual-lens 3D camera, it'll only be a matter of time until it also works its way into Macs. Apple will be able to introduce apps that take advantage of head tracking and/or advanced eye tracking so that users will be able to view a photo from one perspective and then move their head to look around the image and see another aspect of the 3D scene.
Apple's granted patent is titled "System and method for rendering dynamic three-dimensional appearing imagery on a two-dimensional user interface." For the record, this invention was first filed for in September 2008, eight years ago.
Apple was granted their first patent for this invention in October 2014. In today's granted patent we're able to see that Apple has refined their patent claims, describing the invention in a slightly different way.
In one example of the technology described in the patent, Apple notes that "Infrared (IR) data can be inverted, and then the pupils will stand out as very bright circles. The sensing can be done by a camera or other image sensing device. Camera based eye trackers typically use the corneal reflection (also known as the first Purkinje image) and the center of the pupil as features to track over time.
A more sensitive type of eye tracker, the dual-Purkinje eye tracker, uses reflections from the front of the cornea (first Purkinje image) and the back of the lens (the fourth Purkinje image) as features to track.
An even more sensitive method of eye tracking is to sense image features within the eye, such as the retinal blood vessels, and follow these features as the eye rotates. However, any method for tracking head position and eye position of a person using a personal computing device is contemplated as within the scope of this invention."
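To make the pupil-finding idea concrete, here is a minimal sketch in Swift of the inversion-and-threshold step the patent quote describes: invert an IR grayscale frame so that the dark pupils become bright, keep the pixels that clear a threshold, and take their centroid as a rough pupil estimate. The frame layout, threshold value and function names are illustrative assumptions for this sketch, not anything specified in Apple's patent.

```swift
import Foundation

// Hypothetical 8-bit grayscale IR frame, row-major layout (an assumption for this sketch).
struct GrayscaleFrame {
    let width: Int
    let height: Int
    let pixels: [UInt8]
}

// Invert the IR intensities so the dark pupils become the brightest pixels,
// then return the centroid of the pixels that clear the threshold.
func estimatePupilCenter(in frame: GrayscaleFrame,
                         threshold: UInt8 = 230) -> (x: Double, y: Double)? {
    var sumX = 0.0, sumY = 0.0, count = 0.0
    for y in 0..<frame.height {
        for x in 0..<frame.width {
            let inverted = 255 - frame.pixels[y * frame.width + x]
            if inverted >= threshold {
                sumX += Double(x)
                sumY += Double(y)
                count += 1
            }
        }
    }
    guard count > 0 else { return nil }   // no pupil-like region found
    return (sumX / count, sumY / count)   // rough pupil center in pixel coordinates
}
```

A real tracker would of course refine an estimate like this with the corneal reflection (the first Purkinje image) to separate eye rotation from head movement, as the patent text explains.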
Apple's patent FIG. 4 presented above illustrates another method embodiment for rendering three-dimensional appearing imagery. The system determines if the camera and IR (infrared emitters) are active (@402). If not, the system disables the rendering of three-dimensional appearing imagery (@412) and displays two-dimensional appearing imagery.
If the camera and IR are active, the system determines if there is a likely target (@404). If not, the system disables the rendering of three-dimensional appearing imagery (@412) and displays two-dimensional appearing imagery.
If there is a likely target, the system determines if the naked eye HPT (head position target) analyzer is able to determine the position of the user's eyes relative to the camera (@406). If not, an alternate signature target can be used and calculated (@408). This failure to determine the eyes' relative position can occur when the user's pupils are obscured (for example, if the user is wearing reflective glasses). The system will usually be able to continue rendering the three-dimensional appearing imagery based on this alternate signature target found using the alternate HPT analyzer (@408).
If the naked eye HPT analyzer is able to determine the position of the user's eyes relative to the camera, the system constructs a virtual scene (@410). The system then places a virtual camera (@414), which determines the point of view that should be displayed to the user to render the three-dimensional appearing imagery. The system then renders the scene (@416). Finally, the system determines if the HPT has been lost (@418). If the HPT has not been lost, the system refreshes and goes back for another cycle of the naked eye HPT analyzer.
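The branching just described can be summarized in code. Below is a hypothetical Swift sketch of the FIG. 4 decision flow, under the assumption that the analyzers and renderer are exposed through simple protocols; the names HPTAnalyzer, SceneRenderer and HeadPositionTarget are illustrative inventions for this sketch, since the patent describes the logic but not an API.

```swift
// Relative eye position as the naked-eye (or alternate) analyzer would report it.
struct HeadPositionTarget {
    let eyeOffsetX: Double
    let eyeOffsetY: Double
}

protocol HPTAnalyzer {
    // Returns the eyes' position relative to the camera, or nil if it cannot be
    // determined (e.g. the pupils are obscured by reflective glasses).
    func locateTarget() -> HeadPositionTarget?
}

protocol SceneRenderer {
    func constructVirtualScene()                              // step 410
    func placeVirtualCamera(for target: HeadPositionTarget)   // step 414
    func renderScene()                                        // step 416
    func renderFlatImagery()                                  // step 412 fallback
}

func renderingCycle(cameraAndIRActive: Bool,
                    likelyTargetPresent: Bool,
                    nakedEyeAnalyzer: HPTAnalyzer,
                    alternateAnalyzer: HPTAnalyzer,
                    renderer: SceneRenderer) {
    // Steps 402/404: without an active camera plus IR and a likely target,
    // fall back to ordinary two-dimensional imagery (step 412).
    guard cameraAndIRActive, likelyTargetPresent else {
        renderer.renderFlatImagery()
        return
    }
    // Step 406: try the naked-eye analyzer first; step 408: fall back to an
    // alternate signature target if the pupils cannot be located.
    guard let target = nakedEyeAnalyzer.locateTarget()
            ?? alternateAnalyzer.locateTarget() else {
        renderer.renderFlatImagery()
        return
    }
    // Steps 410-416: build the virtual scene, place the virtual camera at the
    // user's point of view, and render the 3D-appearing imagery.
    renderer.constructVirtualScene()
    renderer.placeVirtualCamera(for: target)
    renderer.renderScene()
}
```

In this sketch, a caller would re-run renderingCycle on each refresh and stop once the HPT is lost, mirroring the loop at step 418.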
Apple's granted patent 9,423,873 was originally filed in Q3 2008. It was refiled as a continuation patent in September 2014 and published today by the U.S. Patent and Trademark Office. To review this more in-depth, check out Apple's granted patent here.
Patently Apple presents only a brief summary of granted patents with associated graphics for journalistic news purposes as each Granted Patent is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any Granted Patent should be read in its entirety for full details.