Apple Patent shows us that Apple's TrueDepth Camera for Accurate Facial Expression via Animoji is only the beginning
On Monday Patently Apple posted a report titled "Apple's PrimeSense Team Leader to deliver a Session at a Conference in Israel regarding Apple's TrueDepth Camera." It's one of the most advanced inventions for the iPhone, the one that made Face ID and Animoji possible. If you look at some of the original patent figures covering the breakthrough 3D depth camera, like those in this report that show it in the context of a gaming device, you would never have dreamt that it would end up miniaturized in the iPhone X's notch.
So when you see today's patent figure (above or below), understand that it illustrates one rudimentary fact: in-air gesturing is for real and is likely to surface as one of the next levels for their TrueDepth camera.
Now that the camera has proven it can accurately interpret human facial expressions, in-air hand or finger gestures used to control on-screen or TV elements are logically one of the next phases for the TrueDepth camera.
So even though the patent figure looks ancient, what matters here is the technology and the end result.
Apple's PrimeSense team in Israel worked on in-air gesturing for years with Microsoft, which used their technology for Xbox TV and exercise games, etc. It's simply a fact that it works, and by now it's probably highly refined.
Apple's patent FIG. 6 presented above is a view of portions of a system operating under remote control of a user. In this particular figure, a user is controlling an alphabet arc to spell out the word "Invention" using only in-air gestures to choose each letter. Apple officially refers to in-air gesturing as "remote input."
Of course, today you could simply ask Siri for the title of a movie and it's done, so this particular example is now outdated. The underlying concept isn't, however: a user could one day soon control part or all of a future interface, be it a specific app or parts of macOS or iOS.
While the user will be able to use 'remote input' to make in-air gestures to zoom in or out of a photo or element on the screen, Apple notes that "Remote input may be provided for interaction with a remote device such as a gaming console, an interactive television, a computerized cellular phone, or a computer."
Apple further notes that "In the context of the present application and claims, the term 'remote device' herein refers to any remotely governable device containing a processing unit. A sensing device may be used to detect a virtual control, such as a virtual keyboard."
Elsewhere Apple notes that the sensing device is typically a three-dimensional camera that detects information that includes the position of a body (or at least parts of the body) of the user or other tangible entities wielded or operated by the user for interacting with a computer application running on the remote device, all of which are sometimes referred to herein for convenience as 'control entities.'
The sensing device detects the presence and changes of position of a control entity, i.e. its speed and direction. The remote device interprets movements detected by the sensing device.
Apple notes that on-screen virtual controls could be presented to the user. "The sensing device detects the movements of the control entity in a three-dimensional space, such as a user's hand manipulating the virtual control, and translates them into commands for the remote device." You could also imagine a gaming scenario in which a user wields a sword to kill a dragon or engages in a Star Wars-type battle.
Later Apple notes that, as an example, "movement of the control entity using a circular gesture may be interpreted by the remote device as a command to adjust a magnification (or zoom) level of a remote information input interface comprising the user interface elements on the display. 'Magnification' in this context is not limited to simple visual magnification." The 'magnification' could also be a circular control for volume in Apple Music or another value.
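The patent doesn't disclose how Apple would actually decode such a gesture, but the idea of turning a circular hand motion into a zoom or volume adjustment can be illustrated with a minimal, hypothetical sketch: sum the signed angle that a tracked hand position sweeps around the path's center, then scale it into an adjustment value. The function name and scaling factor below are assumptions for illustration only, not anything from Apple's filing.

```python
import math

def circular_gesture_to_zoom(points, gain=0.1):
    """Hypothetical sketch: map a tracked hand path (list of (x, y)
    positions from a depth camera) to a zoom/volume delta by summing
    the signed angle swept around the path's centroid.
    Counter-clockwise motion yields a positive delta (zoom in / volume up);
    clockwise motion yields a negative delta."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    angles = [math.atan2(y - cy, x - cx) for x, y in points]
    total = 0.0
    for a0, a1 in zip(angles, angles[1:]):
        d = a1 - a0
        # Unwrap jumps across the ±pi boundary so a full circle sums to 2*pi
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        total += d
    return total * gain

# A counter-clockwise circle sampled at 90-degree steps gives a positive delta
circle = [(1, 0), (0, 1), (-1, 0), (0, -1), (1, 0)]
print(circular_gesture_to_zoom(circle))
```

Reversing the point order (a clockwise circle) flips the sign, which is how one continuous gesture could drive a dial in either direction.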
The possibilities are endless, but until Apple provides more specific examples, the rest would be speculative. Will it accept thumbs up or down in a game for instant voting? Will wave-gestures be used to flip through movie options or an online manual?
For now, Apple continues their work on this project, which we hope will make it to market. But as with all patent filings, some will be adopted and many will not.
The good news is that the first part of the invention, used for Animoji, has proven that Apple has nailed facial expression recognition with this same TrueDepth camera, and it appears that Apple has a few more ideas for the camera that no longer sound farfetched.
Apple's latest patent application for this was filed back in November 2017. While the PrimeSense patents go back to 2009 when the project began, this is the first with this particular patent figure under Apple Inc. Other patents on this theme can be found in our 3D archives. As always, considering it's a patent application, the timing of such a product coming to market remains unknown.
Patently Apple presents a detailed summary of patent applications with associated graphics for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any patent application should be read in its entirety for full and accurate details.