Apple Reveals Adding the Kalman Filter to advance 'Maps' accuracy for iDevices & especially moving vehicles with CarPlay
Yesterday the U.S. Patent Office published a patent application from Apple that generally relates to estimating the position and/or orientation of a device, which could provide users with more accurate mapping information on a device like the iPhone and, more importantly, on CarPlay in vehicles (car, truck, motorcycle, watercraft, aircraft, etc.). It could track the movement of people and vehicles more accurately while traveling.
Apple's system will use traditional technologies for their mapping system like GPS/GNSS, a visual inertial odometry module and an inertial measurement unit (IMU), as they have in the past for Maps. What appears to be new for Maps is the use of a Kalman filter.
Apple's patent application states that "In one or more implementations, the architecture may provide for improved estimates of device position, for example, for use by an augmented reality application [likely Maps and Maps on CarPlay].
The improved estimates may be used in presenting digital content (e.g., visual, audio and/or tactile feedback) in images of a real-world environment (e.g., as being captured by the image sensor). For example, the timing and/or positioning of notifications (e.g., prompts, overlays, audio cues, tactile feedback and the like) may be based on the estimates provided by the architecture.
One example architecture may include a GNSS receiver and the visual inertial odometry module, presented in patent FIG. 3 below, used for estimating device position and/or orientation.
In addition, the architecture may include an extended Kalman filter. As shown in FIG. 5 below, the extended Kalman filter may receive signals from the GNSS receiver and the visual inertial odometry module as input, and may provide an estimated device position as output."
In the big picture, a Kalman filter is an optimal estimation algorithm used to estimate states of a system from indirect and uncertain measurements.
Common Kalman filter applications include guidance and navigation systems, computer vision systems and signal processing. These are the applications that Apple will use the Kalman filter for in future mapping systems and applications. One of the very first applications of the Kalman filter was NASA's Apollo project.
Below is a brief video about the Kalman filter. It's a simple way of understanding how it works. While the instructor's voice is somewhat irritating, she delivers the basics of the Kalman filter well. The video is set to begin where I think it's relevant to understanding Apple's application of the Kalman filter.
There are a few other videos available on the Kalman filter that "techies" may review. A basics video can be found here. For engineering types, you may prefer these videos: 01 and 02.
For more details on the AR application and image data, review Apple's patent application 20200348143, which was originally filed in Q3 2019. The filing is supported by Apple's provisional patent (not public) filed in Q2 2019.
Technically speaking, Patently Apple posted a report in February 2020 covering the addition of machine learning to combat urban canyon issues. Going back to the original patent filing 20200049837 this morning, I was able to see that this is where the "Kalman filter" was first introduced, though we didn't highlight it in our February report. You can tell it's part of the same project, as the two patent filings share a common patent FIG. 1.
With that said, yesterday's patent filing focuses on an "Extended Kalman Filter" (as shown in patent FIG. 5), which takes the filter to the next level by focusing on moving vehicles. The February patent covers its application to the iPhone (or iOS).