Apple Patent describes Smartglasses and an MR Headset using advanced Eye Tracking for use in VR Gaming, Flight Simulator Training+
Today the US Patent & Trademark Office published a patent application from Apple that relates to a method for providing information about a user's behavior with regard to at least one reference object, especially a virtual reference object, transmitted via a network from smartglasses or a Mixed Reality HMD to a second device such as a Mac, iPhone or iPad. The invention also covers a system for providing information about a user's behavior in a particular virtual scene. The invention applies to Mixed Reality environments that could be used in gaming, flight simulation training and much more.
Transmitting Mobile HMD Gaze Data
Apple's invention applies especially to the field of virtual reality and eye tracking systems. Virtual reality can advantageously be used for a great variety of applications.
Apart from games and entertainment, virtual reality, especially in combination with eye tracking, can also be used for market research, scientific research, personnel training, and so forth. Eye tracking data can, for example, reveal where a user who is currently experiencing a virtual environment is looking within that environment. For market research, a virtual environment combined with eye tracking can be used to analyze which objects, presented as virtual objects within the environment, e.g. a virtual supermarket, attract more or less of the user's attention.
The combination of a virtual environment and eye tracking can also be used for training purposes, e.g. by simulating a virtual training situation in the form of a flight simulator or vehicle simulator and using the captured eye tracking data to analyze whether the user looked at the correct objects or important instruments, and whether the user was attentive or tired. Especially in such situations, it would be very desirable to share the virtual reality experience with third parties such as an observer, instructor or supervisor who wants to observe or analyze the user's behavior and interaction with the virtual environment, or to give instructions, advice or recommendations to the user currently experiencing it.
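The kind of training analysis described above is commonly done by accumulating gaze dwell time over areas of interest (AOIs), e.g. individual cockpit instruments. The following is a minimal sketch of that idea; the data layout and all names are illustrative assumptions, not details from Apple's patent.

```python
# Hypothetical sketch: total how long a trainee's gaze dwelt on each
# area of interest (AOI) during a simulator session.
from dataclasses import dataclass

@dataclass
class GazeSample:
    t: float   # timestamp in seconds
    x: float   # normalized gaze x (0..1)
    y: float   # normalized gaze y (0..1)

@dataclass
class AOI:
    name: str
    x0: float
    y0: float
    x1: float
    y1: float  # bounding box in the same normalized coordinates

    def contains(self, s: GazeSample) -> bool:
        return self.x0 <= s.x <= self.x1 and self.y0 <= s.y <= self.y1

def dwell_times(samples, aois):
    """Seconds of gaze time spent inside each AOI.

    Each inter-sample interval is credited to the AOI (if any)
    containing the earlier sample.
    """
    totals = {a.name: 0.0 for a in aois}
    for prev, cur in zip(samples, samples[1:]):
        dt = cur.t - prev.t
        for a in aois:
            if a.contains(prev):
                totals[a.name] += dt
    return totals
```

With this, an instructor could flag a session where, say, the altimeter AOI received no dwell time during a phase of flight where it should have been checked.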
Apple's patent FIG. 2 below is a schematic illustration of a system for providing information about a user's behavior with regard to a reference object via a network from a first device to a second device.
In one example, Apple notes that when the user associated with the first device (#14, smartglasses) moves and interacts with a known virtual environment displayed in the form of the virtual scene VRS, e.g. when playing a game or walking through a virtual supermarket, it is only necessary to make information about the user's current state available on the second device (#16, a Mac or other computer) to recreate the user experience there. The recreation may also be intentionally altered, e.g. by upscaling or downscaling the resolution in the region of the virtual scene VRS that contains the user's current gaze point. In both static and interactive virtual environments, the unknown component is how the user moves and interacts with the environment, while the known component is the virtual environment itself.
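Since the observer's machine already holds the same virtual scene, the headset only needs to transmit a tiny per-frame state message (head pose plus gaze point) rather than rendered video, and the receiver can re-render locally with extra detail near the gaze point. The message format and the foveation rule below are a sketch under assumed conventions, not Apple's actual protocol.

```python
# Hypothetical sketch: share a VR session by sending user state, not frames.
import json

def encode_user_state(head_pos, head_rot, gaze_point):
    """Pack the per-frame user state into a small JSON payload."""
    return json.dumps({
        "head_pos": head_pos,   # (x, y, z) in scene coordinates
        "head_rot": head_rot,   # orientation quaternion (w, x, y, z)
        "gaze": gaze_point,     # gaze point in normalized view coords (0..1)
    }).encode("utf-8")

def decode_user_state(payload):
    """Recover the state dict on the observer's machine."""
    return json.loads(payload.decode("utf-8"))

def resolution_scale(pixel, gaze, full=1.0, periphery=0.25, radius=0.2):
    """Gaze-dependent render scale for the recreation on the second
    device: full resolution near the gaze point, reduced resolution in
    the periphery (the 'upscaling or downscaling' the patent mentions)."""
    dx, dy = pixel[0] - gaze[0], pixel[1] - gaze[1]
    dist = (dx * dx + dy * dy) ** 0.5
    return full if dist <= radius else periphery
```

A payload like this is a few dozen bytes per frame, which is why recreating a known scene from user state is so much cheaper than streaming video of it.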
For more details, review Apple's patent application number US 20220404916 A1.
- Tom Sengelaub: Senior Engineering Manager - Computer Vision
- Julia Benndorf: Software Engineer
- Marvin (Vogel) Klinkhardt: Computer Vision Engineer
All three inventors joined Apple through its 2017 acquisition of SensoMotoric Instruments GmbH (SMI), a global leader in eye tracking technology. In 2015, SMI presented its first smartglasses with eye tracking at SIGGRAPH, as shown in the video below.