New Apple patents cover Eye-Tracking, In-Air Gesture Commands and a way to capture a user's Tattoo as an Apple Watch Face
Late last month the US Patent & Trademark Office published two patent applications from Apple relating to eye-tracking. In one, Apple describes the use of in-air hand gestures as a means of controlling activity on a display for a Mac and/or a future headset. The second patent is the one that caught my eye this past weekend because it was so odd: it covers a possible next-gen Apple Watch or new wrist-band device that could capture a tattoo on the user's wrist, arm or ankle. When the watch covers the tattoo, the display would not only present the tattoo on the watch face but could also, as an option, enhance it with added color or animation.
With a future Apple Watch, users who happen to have a small tattoo on their wrist won't have to cover it up, because the watch will capture the design and then reproduce it on the watch interface in whole or in part. Users will be able to add color or an animation. And should the user's tattoo reside on the side of their arm or its anterior part, no problem: after the watch captures the tattoo, it could reposition it so that it resides on the face of the watch in a natural top-of-wrist position, as captured in Apple's patent FIG. 7E below.
When the user removes the watch or wrist device, the tattoo blurs and then the display shuts off, as noted in FIG. 7J above.
That part of the invention is rather cool if you have tattoos. Then the patent gets a little strange, though interesting.
The principle of the patent isn't restricted to tattoos on a watch. For instance, if a user puts on Apple's future mixed reality headset, the internal camera(s) could capture the user's eyes, nose, eyebrows, hair, glasses, freckles and other facial features and present them on the face of the device. It does nothing for the user, of course, but those around them will be able to distinguish who has the headset on. Once you visualize that scenario, it becomes rather ingenious: a user could walk down the street with the headset on, and the public would still be able to see the part of the face hidden behind the headset's face plate.
Apple's patent states: "Updating a display to show a portion of a user's body that is hidden behind the electronic device provides a more intuitive way for the user, and others in the surrounding environment who view the display, to know where the electronic device is located relative to the user's body, even when that portion of the user's body is covered and not visible to the user (or others in the surrounding environment). This improves the experience of interacting with wearable devices by decreasing the cognitive burden on the user, and others in the surrounding environment, to imagine or guess what is behind the electronic device."
For those wanting to dive further into the details of this invention, review Apple's patent application 20220100271.
Methods for Navigating User Interfaces with In-Air Gesture Commands
In this second patent, Apple describes using eye-tracking and/or gaze tracking in sync with in-air hand gestures that could include pinch, scrolling, tapping and other motions to control a user interface for a Mac, iDevice or future mixed reality headset where it could really be beneficial.
Apple specifically describes movement of the user's eyes and hand in space relative to the GUI or the user's body as captured by cameras and other movement sensors, and voice inputs as captured by one or more audio input devices.
Apple adds: "In some embodiments, the functions performed through the interactions optionally include image editing, drawing, presenting, word processing, spreadsheet making, game playing, telephoning, video conferencing, e-mailing, instant messaging, workout support, digital photographing, digital videoing, web browsing, digital music playing, note taking, and/or digital video playing."
Apple's patent FIG. 4 below is a block diagram of a hand tracking unit of a computer system configured to capture in-air gesture inputs from the user; FIG. 5 is a block diagram of an eye tracking unit of a computer system configured to capture the user's gaze inputs; and FIG. 7A illustrates the electronic device detecting the user's gaze and, via animation, showing what the gaze has chosen to activate, move or delete on a user interface.
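To make the interaction model concrete — the gaze picks a UI target, and an in-air gesture picks the operation performed on it — here is a minimal sketch in Python. The class names, gesture labels and dwell-time threshold are my own illustrative assumptions, not taken from Apple's filing.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    target_id: str   # UI element the eye tracker reports the user is looking at
    dwell_ms: int    # how long the gaze has rested on that element

@dataclass
class HandGesture:
    kind: str        # e.g. "pinch", "scroll", "tap" (names are illustrative)

def resolve_action(gaze: GazeSample, gesture: HandGesture,
                   dwell_threshold_ms: int = 150):
    """Combine a gaze target with an in-air gesture (hypothetical logic):
    the gaze selects the element, the gesture selects the operation."""
    if gaze.dwell_ms < dwell_threshold_ms:
        return None  # gaze hasn't settled on a target; ignore the gesture
    if gesture.kind == "pinch":
        return ("activate", gaze.target_id)
    if gesture.kind == "scroll":
        return ("scroll", gaze.target_id)
    return None

# Example: the user looks at an icon for 200 ms, then pinches in the air
print(resolve_action(GazeSample("photos_icon", 200), HandGesture("pinch")))
```

The dwell threshold is the key design choice in a scheme like this: without it, a stray glance at the moment of a pinch would activate the wrong element.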
As always, Apple's patent application (number 20220100270) is a detailed patent that you could review in full here.
Considering that this is a patent application, the timing of such a product to market is unknown at this time.