Apple envisions an iPhone being used as a 3D Controller for VR games on a future Headset
Today the US Patent & Trademark Office published an Apple patent that covers how a future iPhone may be used as a 3D controller for playing VR games when wearing a future VR headset. The invention also covers an iPhone being used as a 3D pointer and more.
In their patent background, Apple notes that to enable user interactions with electronic content, it may be desirable to let a user provide input via a separate real-world device, such as an iPhone's (or iPad's) touch screen. However, existing systems are unable to adequately track the location of a smartphone or tablet relative to the content in a computer-generated reality (CGR) environment. This is the problem Apple's invention solves.
Apple's patent covers devices, systems, and methods that provide improved user interfaces for interacting with electronic content using multiple electronic devices.
Some implementations involve a first device (e.g., a head-mounted device (HMD)) that has an image sensor (e.g., a camera) and one or more displays, as well as a second device (e.g., an iPhone) that has a display.
Apple's patent FIG. 2 illustrates a user being able to use an iPhone as an input device in a CGR environment while wearing a Mixed Reality Headset.
Apple's patent FIG. 3A illustrates a marker (#310) displayed by the physical second device (#130, an iPhone) noted in FIG. 2. In some implementations, the user is unable to view the physical display of their iPhone because they are immersed in the virtual scene (#205). Accordingly, in some implementations, the iPhone displays a marker (#310) on its physical display to facilitate tracking of the iPhone by the HMD.
In some implementations, the marker serves as a reference point for the HMD to accurately track the location and rotation of the iPhone.
In some implementations, the marker (#310) is an image containing texture/information that allows the image to be detected and makes it possible to determine the image's pose with respect to a camera.
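Strictly as an illustration of how this kind of marker tracking could work in practice (not code from the patent), the short Swift sketch below shows how a headset-side app could register the image shown on the phone's screen as a detection image with ARKit and read the phone's pose back from the resulting image anchor. The class name, method names, and the physical-width parameter are hypothetical.

```swift
import ARKit

// Illustrative sketch only: treat the marker shown on the phone's display as an
// ARKit detection image, then read the phone's pose from the ARImageAnchor that
// ARKit creates when it recognizes the marker.
final class MarkerTrackingDelegate: NSObject, ARSessionDelegate {

    // Build a world-tracking configuration that looks for the marker image.
    // `physicalWidthMeters` is the on-screen width of the marker as displayed
    // on the phone (an assumption of this sketch, not a value from the patent).
    func makeConfiguration(markerImage: CGImage,
                           physicalWidthMeters: CGFloat) -> ARWorldTrackingConfiguration {
        let reference = ARReferenceImage(markerImage,
                                         orientation: .up,
                                         physicalWidth: physicalWidthMeters)
        let config = ARWorldTrackingConfiguration()
        config.detectionImages = [reference]
        config.maximumNumberOfTrackedImages = 1
        return config
    }

    // Called by ARKit as anchors update; the image anchor's transform is the
    // marker's (and hence the phone's) position and rotation in the session's
    // world coordinate space.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let imageAnchor = anchor as? ARImageAnchor, imageAnchor.isTracked else { continue }
            let phonePose: simd_float4x4 = imageAnchor.transform
            print("Phone pose relative to the world origin:", phonePose)
        }
    }
}
```

The reason a textured marker works is that its known appearance and physical size let the camera recover not just where the phone is but how it is rotated, which is exactly the "location and rotation" tracking described above.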
Today Apple updated this patent/invention with 20 new patent claims that place emphasis on points not previously covered in order to strengthen the patent against patent trolls and competitor challenges.
There are 12 new patent claims supporting "A Method." Below are just a few examples:
New Patent Claim #1: "A method comprising: at a first device comprising a processor, a computer-readable storage medium, an image sensor, and a first display: obtaining an image of a physical environment using the image sensor, the physical environment comprising a second device comprising a sensor configured to track a position and orientation of the second device; receiving data corresponding to the tracked position or orientation of the second device from the second device; determining a relative position and orientation of the second device to the first device based on the received data; and generating a control signal based on the relative position and orientation of the second device."
New Patent Claim #2: "The method of claim 1, wherein the generated control signal is based on input on the second device, wherein the first device uses the relative position and orientation of the second device to enable the second device to be used as a three-dimensional (3D) controller, a 3D pointer, or a user interface input device."
New Patent Claim #7: "The method of claim 1, wherein the sensor is an Inertial Measurement Unit (IMU)."
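Claim 1 reads like a compact sensor-fusion recipe: the headset images the phone, receives the phone's IMU-tracked position and orientation, computes the phone's pose relative to the headset, and turns that relative pose into a control signal. Purely as a hypothetical sketch of that last step (the types and function names below are illustrative, not from the patent), the relative-pose and pointer-ray math could look like this:

```swift
import simd

// Illustrative sketch: express the phone's pose in the headset's coordinate
// frame, then derive a simple "3D pointer" ray as the control signal.
struct ControlSignal {
    let rayOrigin: SIMD3<Float>     // phone position relative to the headset
    let rayDirection: SIMD3<Float>  // direction the phone is pointing
}

// Both poses are 4x4 transforms in a shared world frame (e.g., one derived from
// the marker image plus the phone's reported IMU orientation).
func relativePose(headsetWorldPose: simd_float4x4,
                  phoneWorldPose: simd_float4x4) -> simd_float4x4 {
    return headsetWorldPose.inverse * phoneWorldPose
}

func controlSignal(from phoneInHeadsetFrame: simd_float4x4) -> ControlSignal {
    // The fourth column of the transform holds the translation (position).
    let origin = SIMD3<Float>(phoneInHeadsetFrame.columns.3.x,
                              phoneInHeadsetFrame.columns.3.y,
                              phoneInHeadsetFrame.columns.3.z)
    // Use the phone's local -Z axis (third column, negated) as the pointing direction.
    let forward = -SIMD3<Float>(phoneInHeadsetFrame.columns.2.x,
                                phoneInHeadsetFrame.columns.2.y,
                                phoneInHeadsetFrame.columns.2.z)
    return ControlSignal(rayOrigin: origin, rayDirection: simd_normalize(forward))
}
```

A headset app could then intersect that ray with virtual objects in the scene, which is how the "3D controller" and "3D pointer" roles in claim 2 would surface to the user.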
To review this patent in general and the new patent claims specifically, see Apple's continuation patent 20210263584.
Apple's patent lists a single inventor, Selim BenHimane, a Senior Engineering Manager – Computer Vision & Machine Learning. BenHimane leads teams developing 3D computer vision and machine learning algorithms in the field of augmented reality and virtual reality. He previously worked at Intel for 4.5 years and at Metaio GmbH for 4.5 years; Apple acquired Metaio back in June 2015.