
Apple Reveals Inventions that Cover LIDAR Detection and a Realtime Heads-Up Display for Future Vehicles

(Cover graphic: Apple heads-up display and system patent)

 

Apple's initial work on Project Titan finally began trickling out in published patents in 2018. Patently Apple recognized this early on and opened an archive dedicated to autonomous vehicle technology. We've covered six patents to date, and today we'll be covering Apple's seventh and eighth patent filings related to autonomous vehicles and the use of real-time augmented reality with a heads-up display.

 

When I originally thought about autonomous vehicles, I mainly thought of cabs in the inner city. I rarely use cruise control on the highway because it means giving up some measure of control over my vehicle. At the mere thought of anyone actually trusting an autonomous car at highway speeds, with twists and turns and crazy traffic cutting in and out, I just shake my head. Hearing about crashes involving such vehicles only confirms my greatest fears. These vehicles are not ready for prime time.

 

While others like Tesla are racing to get autonomous vehicles onto the market, it's great to see that Apple is taking their time to work out a million issues. Who cares if it takes a decade or more to come to market? What's the rush?

 

Many of Apple's patents thus far clearly show the dangers that they're working to alleviate, from daytime glare interfering with a vehicle's visual systems through to driving in the dark at night or in a tunnel during the day.

 

Apple is thinking in new ways about incorporating heads-up displays and a custom gesturing system, right through to autonomous vehicles being windowless for safety reasons, where users would need an advanced mixed reality headset just to see the outside world while in such a vehicle. They're also thinking about how to entertain passengers, including with first-person VR experiences. Now on to the new patent filings.

 

Patent Application:

Enabling LIDAR Detection

 

In another Apple patent application, published by the U.S. Patent Office on March 22, Apple describes systems for enabling lidar detection on a vehicle.

 

Apple notes that roads and road signs include reflective materials, such as reflective paint or attachments, to improve their optical visibility by reflecting light. Lane markers generally include reflective paint in addition to physical bumps to ensure that drivers are aware of a lane's outer bounds even in low-light situations. License plates on vehicles also include reflective materials so that the text on the plate remains visible to other drivers, including police officers.

 

Autonomous vehicles include numerous sensors configured to detect obstacles that may appear while driving. These obstacles may include other vehicles driving along the same road.

 

Vehicles on the road may be detected by the sensors, such as a light detection and ranging (lidar) sensor or a radar sensor. The sensors may generally be able to detect a vehicle by determining that a lidar signal or a radar signal has been reflected by the vehicle.

 

The sensors may not necessarily be able to determine that an obstacle is a vehicle simply because signals were reflected from it. Detectability of other vehicles on the road can therefore be improved by making the sensors more effective, that is, by improving the usability of the signals the sensors detect.

 

Apple's invention addresses this issue with systems and methods for enabling lidar detection on a vehicle that may include a light source configured to emit a light signal, a receiver sensor configured to receive a reflected light signal based at least in part on the light signal reflected from a plurality of reflectors, and a controller. The controller may be configured to identify an arrangement pattern of the plurality of reflectors based at least in part on the reflected light signal, and to determine that the plurality of reflectors are coupled to another vehicle based at least in part on an identification of the arrangement pattern.
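To make that controller logic concrete, here's a minimal sketch in Python of the kind of matching the filing describes: the spacings between detected reflector returns are compared against a catalogue of known arrangement patterns, and a match is treated as evidence that the reflectors are coupled to another vehicle. The pattern catalogue, tolerance, and names like ReflectorReturn and identify_pattern are illustrative assumptions, not details from Apple's filing.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ReflectorReturn:
    x: float          # lateral position of the reflection, in metres
    y: float          # distance from the sensor, in metres
    intensity: float  # strength of the reflected light signal

# Assumed catalogue of arrangement patterns, expressed as spacings (metres)
# between adjacent reflectors. Purely illustrative values.
KNOWN_PATTERNS = {
    "vehicle_rear": [0.4, 0.8, 0.4],
    "vehicle_side": [0.6, 0.6, 0.6, 0.6],
}

def identify_pattern(returns: List[ReflectorReturn],
                     tolerance: float = 0.1) -> Optional[str]:
    """Return the name of the matching arrangement pattern, if any."""
    if len(returns) < 2:
        return None
    ordered = sorted(returns, key=lambda r: r.x)
    spacings = [b.x - a.x for a, b in zip(ordered, ordered[1:])]
    for name, template in KNOWN_PATTERNS.items():
        if len(template) == len(spacings) and all(
                abs(s - t) <= tolerance for s, t in zip(spacings, template)):
            return name
    return None

def is_another_vehicle(returns: List[ReflectorReturn]) -> bool:
    """Reflectors arranged in a known pattern are treated as being
    coupled to another vehicle."""
    return identify_pattern(returns) is not None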

 

Apple's patent FIG. 1 illustrates a block diagram of a vehicle having one or more sensors configured to detect another vehicle.

 

(Patent figures: Apple lidar sensors)

Apple's patent FIG. 2a illustrates a side view of a sensor configured to send a signal to a plurality of reflectors embedded in a vehicle; FIGS. 2b to 2d illustrate various types of reflectors configured to reflect signals; FIGS. 3a to 3c illustrate block diagrams of a vehicle having multiple patterns of pluralities of reflectors identifying multiple orientations of the vehicle, according to some embodiments.

 

Apple's patent application 20180081058 was published on March 22, 2018 and originally filed with the U.S. Patent Office in Q3 2017.

 

Patent Application:

Adaptive Vehicle AR Display using 3D Imagery

 

In another Apple patent application published last week, mainly related to autonomous vehicle systems, we learn that augmented reality (AR) systems may utilize remote sensing devices to provide depth information about objects in an environment.

 

In some scenarios, laser-based sensing technologies, such as light detection and ranging (LiDAR), can provide high-resolution environmental data, such as depth maps, which may indicate the proximity of different objects to the LiDAR sensor.

 

Real-time augmented reality faces a variety of challenges when it is the primary display technology in a vehicle traveling at various speeds and angles through ever-changing environments. Weather conditions, sunlight, and vehicle kinematics are just a few of the elements that may impact the rendering but also limit a system's overall capabilities. This is especially true since on-board sensors have a fixed range and often require algorithms for optimizing queries, which impact overall quality and response time.

 

Apple's invention covers methods and systems to be used in augmented reality (AR) displays in vehicles – more commonly known as a heads-up display. Embodiments of an AR system are described that leverage a pre-generated stereographic reconstruction or 3D model of the world to aid in anchoring and to improve rendering of an AR scene. By leveraging the stereographic reconstruction of the world, embodiments of the AR system may use a variety of techniques to enhance the rendering capabilities of the system. In embodiments, an AR system may obtain pre-generated 3D data (e.g., 3D tiles) from a stereographic reconstruction of the world generated using real-world images collected from a large number of sources over time, and may use this pre-generated 3D data (e.g., a combination of 3D mesh, textures, and other geometry information) to determine much more information about a scene than is available from local sources (e.g., a point cloud of data collected by vehicle sensors), which AR rendering can benefit from.
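As a rough illustration of that idea, the sketch below merges long-range, pre-generated 3D tiles with the vehicle's own short-range point cloud into a single scene model that AR rendering could anchor against. The fetch_3d_tiles() helper and the data shapes are stand-ins assumed for the example, not Apple's implementation or any real mapping API.

from typing import Dict, List, Tuple

Point = Tuple[float, float, float]  # x, y, z in a world coordinate frame

def fetch_3d_tiles(lat: float, lon: float, radius_m: float) -> List[dict]:
    """Hypothetical stand-in for downloading pre-generated 3D tiles
    (mesh, textures, geometry) around the vehicle's position. A real
    system would query a mapping backend; here we return nothing."""
    return []

def build_scene_model(lat: float, lon: float,
                      local_point_cloud: List[Point]) -> Dict[str, object]:
    """Combine pre-generated geometry (far, occluded, beyond-horizon areas)
    with live sensor data (near, high-confidence areas)."""
    tiles = fetch_3d_tiles(lat, lon, radius_m=2000.0)  # well beyond sensor range
    return {
        "far_geometry": tiles,               # filled in from the 3D reconstruction
        "near_geometry": local_point_cloud,  # live lidar/camera data wins up close
    }

# Example: an empty point cloud still yields a usable (tile-only) scene model.
scene = build_scene_model(37.33, -122.01, local_point_cloud=[])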

 

Embodiments of an AR system are described that may use three-dimensional (3D) mesh map data (e.g., 3D tiles reconstructed from aerial/street photography) to augment or complement vehicle sensor (e.g., LiDAR or camera) information on a heads-up display.

 

The 3D tiles can be used to fill in for limitations of the sensors (e.g., areas of the real environment that are occluded by buildings or terrain, or are out of range) to extend the AR into the full real environment in front of the vehicle (i.e., within the driver's field of vision).

 

For example, a route may be displayed, including parts of the route that are occluded by objects or terrain in the real environment.

 

The pre-generated 3D mesh map data may be available for the entire real environment, 360 degrees around the vehicle, behind occlusions, and beyond the horizon. Thus, in some embodiments, the 3D mesh map data may be leveraged to provide information about the environment, including objects that are not visible, to the sides and behind the vehicle.

 

In some embodiments, the 3D mesh map data may be used by the AR system in poor/limited visibility driving conditions, e.g. heavy fog, snow, curvy mountain roads, etc., in which the sensor range may be limited, for example to project the route in front of the vehicle onto the AR display. For example, the 3D mesh map data may be used to augment sensor data by showing upcoming curves or intersections.
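One way to picture how the map data backs up the sensors is sketched below: each point along the route gets a display style depending on whether it is within sensor range, occluded by known geometry, or so distant that only the pre-generated mesh can place it. The range thresholds and the mesh_occludes() visibility test are assumptions made for the example, not details from the filing.

from typing import List, Tuple

RoutePoint = Tuple[float, float, float]  # x, y, z in vehicle coordinates (metres)

SENSOR_RANGE_M = 120.0   # assumed useful lidar/camera range
DISTANT_M = 500.0        # beyond this, placement relies purely on 3D tiles

def mesh_occludes(point: RoutePoint, mesh_tiles: List[dict]) -> bool:
    """Hypothetical visibility test: does pre-generated geometry block the
    line of sight from the driver's eye point to this route point?
    Always returns False here so the sketch runs end to end."""
    return False

def classify_route(route: List[RoutePoint],
                   mesh_tiles: List[dict]) -> List[Tuple[RoutePoint, str]]:
    styled = []
    for p in route:
        distance = (p[0] ** 2 + p[1] ** 2) ** 0.5
        if mesh_occludes(p, mesh_tiles):
            style = "occluded"        # drawn ghosted behind buildings/terrain
        elif distance > DISTANT_M:
            style = "distant"         # positioned from map data only
        elif distance > SENSOR_RANGE_M:
            style = "beyond_sensors"  # map data fills in for fog, snow, curves
        else:
            style = "visible"         # anchored with live sensor data
        styled.append((p, style))
    return styled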

 

Apple's patent FIG. 2 below illustrates an adaptive augmented reality (AR) system and display (heads-up display); FIG. 6 illustrates an example adaptive AR display (heads-up display).

 

(Patent figures: Apple heads-up display)

Apple's patent FIG. 5 below illustrates a 3D mesh. FIG. 5 also shows virtual content (a route ribbon) overlaid on the mesh that includes a nearby visible portion, an occluded portion, and a distant portion.

 

(Patent figure: Apple AR system)

Apple's patent application #20180089899 was filed back in Q3 2017 and published last week by the U.S. Patent Office. Considering that this is a patent application, the timing of such a product to market is unknown at this time.

 


Patently Apple presents a detailed summary of patent applications with associated graphics for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any patent application should be read in its entirety for full and accurate details. About Making Comments on our Site: Patently Apple reserves the right to post, dismiss or edit any comments. Those using abusive language or exhibiting negative behavior will be blacklisted on Disqus.
