
Apple's Third Patent Surfaces Relating to a Future Mixed Reality Windshield for Turn-by-Turn Directions and more

(Cover image: Smart MR Windshield)

 

Two years ago, Patently Apple posted a patent report titled "Apple Invents an Augmented Reality Windshield that will even Support FaceTime Calls between Different Vehicles." It was the first patent covering this aspect of Apple's Project Titan. Today the US Patent & Trademark Office published a patent application from Apple that relates to a future mixed reality display system which, in most of the patent's examples, is the windshield of a mobile machine that Apple classifies as an autonomous, semi-autonomous, or non-autonomous vehicle.

 

While the focus of the patent is a vehicle's windshield, the system will also work with future iDevices like an iPhone or iPad when a user is walking along a street or path.

 

In the patent's background, Apple states that when traveling in an autonomous computing machine, including robots, aerial vehicles, aerospace vehicles, submersible vehicles, automobiles and other ground vehicles, users often reference their surroundings when developing plans or requesting information about the world around them.

 

However, it is often difficult to present the requested information to users of the autonomous machine in an intuitive manner, i.e., presenting information about a three-dimensional space on a two-dimensional display.

 

In addition, the cues users give when referencing objects in the world around the machine are often imprecise, especially while the machine is moving through the environment, making it difficult to accurately determine which object in the world the user is asking about.

 

Apple's patent aims to address these shortcomings with new methods, systems and/or computing devices for receiving, through an input system, a non-tactile selection of a real-world object in the environment external to the device; estimating an approximate geographic location of the device; and obtaining, from a database of real-world objects, a subset of object indicators for the objects located within that external environment at the estimated location.
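To make that claimed flow a little more concrete, here is a minimal Swift sketch of the idea: estimate where the device is, then pull only the nearby object indicators out of a larger database. This is our own illustration under assumed names (RealWorldObject, nearbyObjects) and is not code from the patent.

import CoreLocation

// Hypothetical type for illustration only; not Apple's implementation.
struct RealWorldObject {
    let name: String
    let coordinate: CLLocationCoordinate2D
}

// Return the subset of database objects within a radius of the device's estimated location.
func nearbyObjects(around estimate: CLLocation,
                   in database: [RealWorldObject],
                   withinMeters radius: CLLocationDistance) -> [RealWorldObject] {
    database.filter { object in
        let location = CLLocation(latitude: object.coordinate.latitude,
                                  longitude: object.coordinate.longitude)
        return location.distance(from: estimate) <= radius
    }
}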

 

Apple's patent FIG. 4 below illustrates a first embodiment for displaying a representation of spatial objects relative to a machine in a user interface. In this example, the external environment, a main street, is visible through the display portion #104, which is the windshield of a vehicle. AR is used to identify stores, streets and more.

 

In another embodiment, the user may provide a verbal description of an object. For example, the user may describe "a building ahead on the left." In response, the system may select the potential objects intended by the user and present the selected objects on the windshield.
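As a rough illustration of that kind of disambiguation, the sketch below (our own, with hypothetical names) filters candidate objects by whether they sit ahead of the vehicle and on the side the spoken phrase mentions.

import Foundation

// Each candidate carries its bearing relative to the vehicle's heading:
// 0 degrees is straight ahead, negative values are to the left, positive to the right.
struct Candidate {
    let name: String
    let relativeBearing: Double
}

// Coarsely match a phrase such as "a building ahead on the left" to candidates.
func matches(phrase: String, candidates: [Candidate]) -> [Candidate] {
    let wantsLeft = phrase.contains("left")
    let wantsRight = phrase.contains("right")
    return candidates.filter { candidate in
        let ahead = abs(candidate.relativeBearing) < 90
        if wantsLeft { return ahead && candidate.relativeBearing < 0 }
        if wantsRight { return ahead && candidate.relativeBearing > 0 }
        return ahead
    }
}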

 

(Patent FIG. 4: smart windshield display)

 

In patent FIG. 6 below, a vehicle is traveling down a road. Selected objects may be located on the right-hand or left-hand side of the road. The inclusion of the road indicator #604 in the visual representation of the external environment #602 indicates which side of the road the selected objects are located on relative to the position of the vehicle.

 

(Patent FIG. 6: smart windshield display)
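One simple way to make that left/right determination, sketched below purely as an assumption on our part rather than anything spelled out in the patent, is to take the sign of the 2D cross product between the vehicle's heading vector and the vector from the vehicle to the selected object.

import Foundation

// Simple ground-plane coordinates; names are illustrative.
struct Point { var x: Double; var y: Double }

enum RoadSide { case left, right, ahead }

// A positive cross product means the object lies counterclockwise from the heading, i.e. to the left.
func side(of object: Point, from vehicle: Point, heading: Point) -> RoadSide {
    let toObject = Point(x: object.x - vehicle.x, y: object.y - vehicle.y)
    let cross = heading.x * toObject.y - heading.y * toObject.x
    if cross > 0 { return .left }
    if cross < 0 { return .right }
    return .ahead
}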

 

Elsewhere in the patent Apple notes that the user or driver can make a gesture in the direction of a building, such as pointing or waving at it. In another example, the gesture may be a gaze at the building or its general vicinity to select or indicate it. The gesture may be detected by a visual sensor, such as an infrared or RGB camera, which captures the user's movements or gaze. The visual sensor may be located within the vehicle and configured to detect passengers' movements or gazes toward an object that is external to the vehicle.
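As a hedged sketch of how such a selection might be resolved (again our own illustration, with assumed names), the pointing or gaze direction can be reduced to a bearing relative to the vehicle's heading and matched against the candidate object whose own bearing is closest within a tolerance.

import Foundation

// Hypothetical candidate representation; bearing is in degrees relative to the vehicle's heading.
struct ExternalObject {
    let name: String
    let bearing: Double
}

// Pick the candidate closest to the pointing/gaze bearing, if any falls within the tolerance.
func selectObject(gestureBearing: Double,
                  candidates: [ExternalObject],
                  toleranceDegrees: Double = 10) -> ExternalObject? {
    candidates
        .filter { abs($0.bearing - gestureBearing) <= toleranceDegrees }
        .min { abs($0.bearing - gestureBearing) < abs($1.bearing - gestureBearing) }
}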

 

Lastly, the movement of the indicators on the windshield may be determined by a system that calculates an estimated location of the vehicle and of the surrounding buildings.

 

In one implementation, the location of the vehicle is determined through a GPS device and the locations of the buildings are obtained from a database (likely iCloud for Maps) of known object locations.
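A rough sketch of how an indicator could track a building as the vehicle moves, once more our own assumption rather than the patent's method, is to recompute the building's bearing from the vehicle's GPS position and heading and then map that angle onto a horizontal position on the windshield display.

import CoreLocation
import Foundation

// Great-circle bearing from the vehicle to the building, in degrees clockwise from north.
func bearing(from vehicle: CLLocationCoordinate2D, to building: CLLocationCoordinate2D) -> Double {
    let lat1 = vehicle.latitude * .pi / 180, lon1 = vehicle.longitude * .pi / 180
    let lat2 = building.latitude * .pi / 180, lon2 = building.longitude * .pi / 180
    let dLon = lon2 - lon1
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    return atan2(y, x) * 180 / .pi
}

// Map the building's bearing, relative to the vehicle's heading, to a horizontal display
// position; returns nil when the building falls outside the assumed field of view.
func indicatorX(vehicle: CLLocationCoordinate2D, headingDegrees: Double,
                building: CLLocationCoordinate2D,
                displayWidth: Double, fieldOfViewDegrees: Double = 60) -> Double? {
    var relative = bearing(from: vehicle, to: building) - headingDegrees
    relative = (relative + 540).truncatingRemainder(dividingBy: 360) - 180   // normalize to -180...180
    guard abs(relative) <= fieldOfViewDegrees / 2 else { return nil }
    return (relative / fieldOfViewDegrees + 0.5) * displayWidth
}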

 

Although the filing is described as a continuation patent, it's a little more complicated in this case because Apple has combined two patents into one under a different title, so we're treating it as a new patent application. Patent application 20200233212, published today by the U.S. Patent Office, was mixed with granted patent 10,558,037, titled "Systems and methods for determining an object through a disambiguated selection in a mixed reality interface."

 

Considering that this is a continuation patent, the timing of such a product coming to market is unknown at this time.

 

