Acquired Patents show us Two Applications that could be coming to Future AR Glasses, such as Super Zoom and …
Today the US Patent & Trademark Office published two patent applications that Apple may have acquired from Metaio back in 2015, as two former Metaio engineers are listed on today's patents. The first is Daniel Kurz, who was Metaio's head of Advanced Technologies, Computer Vision and Augmented Reality; at Apple, Kurz is a Senior Machine Learning Manager. The other former Metaio engineer, Ryan Burgoyne, worked on the depth-camera tracking used in the Metaio SDK; at Apple he's a Software Development Engineer who describes his work as building "awesome software." The acquisition history of the patents is well hidden by Apple legal.
The two patent applications give us a peek at two applications that could be used by Apple's future HMD, likely in the form of glasses, though the filings steer away from using the term "glasses" anywhere.
To make it simple, the first patent application covers some form of HMD, likely glasses, with AR/MR capabilities. Patent figures 2A and 2B below show a super-zoom application with before and after images. In FIG. 2A, a home in the country is seen in the distance (#210). In FIG. 2B, the glasses have allowed the user to zoom in on the home, making it appear much larger.
Another example, not shown here, illustrates a user seeing a deer in the distance; the super zoom then magnifies the deer so that the user could, in effect, take a photo of it at that closer range.
The zoom control could be set manually via a touch-sensitive surface, a hand gesture or a simple Siri command. The glasses will know what to zoom in on by tracking the user's eye gaze.
To the user, the zoom effect will appear as though they were looking through a telescope, as they'll see a circular zoom effect play out. Apple notes that the "position changes smoothly from the first position to the second position by moving forward from the first position and orientation."
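The quoted passage suggests the zoom is presented as a smooth virtual-camera move from the user's viewpoint toward the target rather than an instant cut. A minimal sketch of that idea, using a standard smoothstep easing curve to interpolate the viewpoint (the function names and positions here are illustrative assumptions, not taken from the patent):

```python
def smoothstep(t: float) -> float:
    """Ease-in/ease-out curve: 0 at t=0, 1 at t=1, with zero slope at both ends."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

def interpolate_position(start, end, t):
    """Move the virtual viewpoint smoothly from `start` toward `end` (t in [0, 1])."""
    s = smoothstep(t)
    return tuple(a + (b - a) * s for a, b in zip(start, end))

# Example: the viewpoint glides from the user's position toward the distant home.
user_pos, target_pos = (0.0, 1.6, 0.0), (0.0, 1.6, -50.0)
waypoints = [interpolate_position(user_pos, target_pos, t / 10) for t in range(11)]
```

Because the easing curve has zero slope at both ends, the motion starts and stops gently, which is what makes the transition read as "smooth" rather than as a jump.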
This same feature could, to a certain extent, also be made available on future iPhone and iPad devices. This is part of patent application 20200273146, titled "Movement within an Environment."
The second patent application is titled "Rendering Object to Match Camera Noise." Patent FIG. 4 above provides a sense of what the patent is about. On the left, the user sees a real-world kitchen appliance as the camera captures it. When the user films that object, the idea is to render the accompanying virtual content so that its noise characteristics, such as the graininess that comes with poor lighting, match those of the camera's video.
More specifically, Apple notes that some augmented reality (AR) systems capture a video stream and combine images of that stream with virtual content. The images of the video stream can be very noisy, especially in low-light conditions in which high ISO settings are used to boost image brightness.
In contrast, since the virtual content renderer doesn't suffer from the physical limitations of the image capture device, there is little or no noise in the virtual content. In AR content that combines images of real content with virtual content, noise can be seen on the real objects but is generally missing from the virtual ones. For instance, if you click on the image to magnify the patent figure, you'll see that the time on the appliance in the noisy real-world image on the left can't be made out, while in the virtual version on the right the time is clearly in view, showing 11:32.
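One straightforward way to make the two halves of such a composite match is to estimate the noise present in the camera frames and inject comparable noise into the clean rendered content before compositing. A minimal grayscale sketch of that general idea, assuming a simple zero-mean Gaussian noise model (the noise model, sample values and function names are my assumptions, not the specific processes described in Apple's filing):

```python
import random
import statistics

def estimate_noise_sigma(flat_patch):
    """Estimate camera noise strength from a patch of pixels that should be uniform."""
    return statistics.pstdev(flat_patch)

def add_matching_noise(virtual_pixels, sigma, rng=None):
    """Add zero-mean Gaussian noise of the estimated strength to rendered pixels."""
    rng = rng or random.Random()
    return [min(255, max(0, round(p + rng.gauss(0.0, sigma)))) for p in virtual_pixels]

# Example: a grainy "flat" patch from the camera frame vs. a perfectly clean render.
camera_patch = [118, 124, 121, 115, 126, 119, 123, 117]  # noisy wall pixels
clean_render = [200] * 8                                  # noiseless virtual object
sigma = estimate_noise_sigma(camera_patch)
noisy_render = add_matching_noise(clean_render, sigma, random.Random(0))
```

After this step the virtual object carries grain of roughly the same strength as the real scene, so it no longer stands out as unnaturally clean in the combined AR frame.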
Beyond this simplified explanation, if you happen to be a budding VR/AR/MR developer, you're really going to enjoy reading this patent application and the journey it takes you on, describing the various processes used to produce AR content that is smooth and free of artifacts and noise. For more on this, check out Apple's patent application 20200273212.