A Project Titan Patent reveals Autonomous Vehicles that can communicate with surrounding Vehicles & Pedestrians
The European Patent Office published another Project Titan patent application from Apple that relates to communicating the intended actions of an autonomous vehicle to the public, such as pedestrians at a crosswalk or vehicles behind it.
Apple notes that in a vehicle operated by a human driver, the driver's intentions may be conveyed to other individuals, such as other drivers and pedestrians, through a combination of driver-directed vehicular signals (e.g., horn, turn indicator, flashing headlights) and physical signals such as hand gestures or eye contact.
However, in a semi- or fully-autonomous vehicle, in which the driver's attention may not be fully engaged in the operation of the vehicle, other vehicles and pedestrians may lack awareness of the intended actions of the autonomous vehicle. Apple's invention is aimed at correcting this.
Last week Patently Apple posted a video from Drive.ai, a company that Apple acquired. The video shows how an autonomous shuttle-service vehicle used external signage to communicate with the public.

A second video, shown below, demonstrates how signage on the rear of the autonomous vehicle tells the vehicle behind it that the shuttle has stopped to allow a pedestrian to cross the street. Our cover graphic shows that digital sign prominently.

The Drive.ai system was part of a retrofit kit sold for existing shuttle vehicles. Being a retrofit, the digital signage was boxy and attached to the exterior of the vehicle. Apple's solution would be built right into all sides of the vehicle. Still, the Drive.ai videos help a reader understand Apple's latest patent filing regarding a new external communications system.
In order for autonomous or semi-autonomous vehicles to maneuver accurately anywhere in the US, Apple revealed during its WWDC 2019 keynote last week that it has been remapping the entire US in finer detail and will be finished by the end of 2019. This is needed to support a future autonomous vehicle from Apple, which has to rely on accurate in-house mapping rather than a third-party mapping service.
Apple's patent FIG. 1 below illustrates vehicles and extra-vehicular objects in a transportation system.
More specifically, Apple's patent FIG. 1 illustrates a transportation system #100 that includes a vehicle transportation network #110 and a vehicle #120. The vehicle transportation network may include paths, routes, roads, streets, highways, thoroughfares, railways, bridges, overpasses, or any surface that may be traversed by a vehicle such as the vehicle #120. In some embodiments, the vehicle may be autonomous or self-driving and may include a controller apparatus #122 that may incorporate or be associated with a sensor #124.
The sensor may generate sensor data by detecting the presence, state, or condition of a portion of the transportation system including the vehicle transportation network, the vehicle, or extra-vehicular objects such as a vehicle #130, a vehicle #132, or a building #134.
As an example, the sensor may include sensors such as an accelerometer, a gyroscope, a still image camera, a video camera, an infrared sensor, a light detection and ranging (LIDAR) system, a radar system, a sonar system, a thermometer, a barometer, a moisture sensor, a vibration sensor, a capacitive input sensor, or a resistive input sensor.
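To make the relationship between that sensor suite and the controller apparatus a little more concrete, here is a minimal sketch of how such sensor readings could be modeled. It is purely illustrative: the type names and fields are assumptions for this article, not structures from Apple's filing.

```swift
import Foundation

// Illustrative model of the sensor suite described for FIG. 1.
// `SensorKind` and `SensorReading` are hypothetical names, not from the patent.
enum SensorKind {
    case accelerometer, gyroscope, stillCamera, videoCamera, infrared
    case lidar, radar, sonar, thermometer, barometer
    case moisture, vibration, capacitiveInput, resistiveInput
}

// One detection of the presence, state, or condition of part of the
// transportation system: the network, the vehicle, or an extra-vehicular object.
struct SensorReading {
    let kind: SensorKind
    let timestamp: Date
    let value: Double        // e.g. range in meters for LIDAR, degrees C for the thermometer
    let objectID: String?    // the extra-vehicular object this reading refers to, if any
}
```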
Autonomous Vehicles Equipped with External Communications
Further, the controller apparatus may generate external communications (not shown) directed at extra-vehicular objects, including communications based on the sensor data from the sensor.
Apple's patent FIG. 2 below is a diagram illustrating a controller apparatus generating an external communication for extra-vehicular objects in a transportation system. As you can see in the patent figure, sound waves are shown to the right, left and rear of the vehicle.

The autonomous vehicle, equipped with sensors and cameras, will be able to communicate messages to surrounding vehicles via audio played through external speakers and/or via digital signage.
Apple notes that "the external communication by the [autonomous] vehicle may be accompanied by a visual communication such as a blinking light or the written message 'attention, moving vehicle on your right,' that may be displayed on one or more of the externally visible displays on the vehicle."
As the [autonomous] vehicle #220 approaches the traffic intersection and decelerates in order to come to a stop beside the stop sign #226, the controller apparatus may generate an audible communication such as "vehicle slowing down" to indicate the reduction in the velocity of the vehicle #220.
In another example, to indicate a reduction in the velocity of the [autonomous] vehicle, the controller apparatus may generate a visual indication, such as the written message "vehicle slowing down," or a real-time display of the vehicle velocity, such as an indication of kilometers per hour, on one of the externally visible displays on the vehicle.
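As a rough illustration of that idea, the sketch below pairs the spoken phrase with text for an externally visible display whenever the vehicle is decelerating. The function name, the simple deceleration check and the exact wording are assumptions made for this example, not details from the patent.

```swift
// Minimal sketch: pair an audible announcement with display text when speed drops.
// The wording and the "is the speed falling?" check are illustrative assumptions.
struct DecelerationAnnouncement {
    let spokenPhrase: String   // played through the external speakers
    let displayText: String    // shown on an externally visible display
}

func announcementForDeceleration(currentKPH: Double, previousKPH: Double) -> DecelerationAnnouncement? {
    // Only announce when the vehicle is actually slowing down.
    guard currentKPH < previousKPH else { return nil }
    return DecelerationAnnouncement(
        spokenPhrase: "vehicle slowing down",
        displayText: "Vehicle slowing down: \(Int(currentKPH)) km/h"
    )
}
```

Calling `announcementForDeceleration(currentKPH: 12, previousKPH: 30)`, for instance, would yield both the audible phrase and a live speed readout for a rear-facing display.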
Zoning Data
Apple adds that the controller apparatus may retrieve zoning data corresponding to the geographic location of the vehicle #120, and the external communication may be further based on the zoning data. The zoning data may include an indication of the way that a geographic area is zoned, such as a school zone, a residential zone, or an industrial zone. The controller apparatus may determine the communication type or the communication magnitude based on the zoning data. In an implementation, an audible communication or a visual communication generated in a school zone may use simpler language better suited for children.
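A hedged sketch of that zoning idea follows; the zone names and the phrasing are assumptions chosen only to show how wording might be simplified in a school zone.

```swift
// Illustrative only: pick simpler wording for a crossing message in a school zone.
enum Zone { case school, residential, industrial }

func crossingMessage(for zone: Zone) -> String {
    switch zone {
    case .school:
        return "Car stopping. You can cross now."          // simpler language for children
    case .residential, .industrial:
        return "Vehicle yielding. Pedestrians may proceed."
    }
}
```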
Hand Gestures
The controller apparatus can also generate a secondary external communication in response to the extra-vehicular response. As an example, after providing an external communication that the vehicle intends to move forward and responsive to detecting that an extra-vehicular object such as a pedestrian has stopped at an intersection and is providing feedback in the form of a hand gesture to indicate that the vehicle should move forward, the controller apparatus may generate a visual communication that displays "thank you" on a display portion of the vehicle that is visible to the pedestrian. In this way, the extra-vehicular object receives an acknowledgment of the extra-vehicular object's response to the external communication that was initially generated by the controller apparatus.
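That acknowledgment loop could look something like the sketch below, where a detected wave-through gesture triggers the secondary "thank you" display. The gesture categories and the API shape are hypothetical; the patent does not spell out an interface.

```swift
// Hypothetical sketch of the secondary-communication step after pedestrian feedback.
enum PedestrianFeedback { case waveThrough, handRaisedStop, none }

func secondaryCommunication(for feedback: PedestrianFeedback) -> String? {
    switch feedback {
    case .waveThrough:    return "Thank you"         // acknowledge the go-ahead gesture
    case .handRaisedStop: return "Waiting for you"   // assumed response to a stop gesture
    case .none:           return nil                 // nothing to acknowledge
    }
}
```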
Communication Modes Change when Urgent
The controller apparatus may determine a communication magnitude for the external communication based on the communication factors. The controller apparatus may adjust a communication magnitude by modifying a frequency or an intensity of the external communication. In an implementation, the adjustment to the communication magnitude by the controller apparatus may include the following (a brief sketch follows the list):
- Changing the volume or pitch of an auditory communication
- Changing the content of an auditory communication to include more urgent language
- Changing the intensity or color of a light; changing the frequency at which a light blinks or pulsates
- Changing the severity or urgency of a graphical display or textual message
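Taken together, those four adjustments could be modeled roughly as below. The 0-to-1 urgency scale, the decibel and blink-rate numbers, and the styling strings are all assumptions for illustration, not values from the filing.

```swift
// Rough sketch of scaling a communication's magnitude with urgency.
// All numeric values and styles here are illustrative assumptions.
struct ExternalCommunication {
    var volumeDB: Double       // loudness of the auditory communication
    var message: String        // spoken or displayed text
    var lightBlinkHz: Double   // how fast an external light blinks or pulsates
    var displayStyle: String   // severity of the graphical treatment
}

func escalate(_ comm: ExternalCommunication, urgency: Double) -> ExternalCommunication {
    var out = comm
    out.volumeDB += 20 * urgency                 // louder audio as urgency rises
    out.lightBlinkHz = 1 + 4 * urgency           // faster blinking light
    if urgency > 0.7 {
        out.message = "Warning: " + comm.message // more urgent language
        out.displayStyle = "flashing red banner" // more severe graphical display
    }
    return out
}
```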
Adjusting Communications in Neighborhoods
In an implementation, the communication magnitude may be based on the time of day or the date so that the volume of an audible communication may be reduced during evening hours or on a Sunday.
In an implementation, the communication type or the communication magnitude may be based on the ambient sound level. For example, a lower ambient sound level, such as on an empty rural road at night, may result in a lower volume for an audible communication than when a higher ambient sound level is detected, such as on a busy city street at midday.
In an embodiment, when the ambient sound level is determined to be at a high level, an audible communication may be determined to be less effective, and another type of communication such as a visual communication may be generated. As an example, on a busy city street with many vehicles using horns, generating a visual communication such as a flashing light may be determined to be more effective.
The communication type or the communication magnitude may be based on whether the forward-facing sides of the extra-vehicular objects are oriented towards the vehicle. For example, if some of the extra-vehicular objects are determined to be pedestrians and the pedestrians are facing away from the vehicle, then a visual communication will not be seen by the pedestrians. As such, an audible communication type, such as a horn, may be used to attract the attention of the pedestrians.
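The sketch below combines those last two ideas, picking a communication type from the ambient noise level and from whether nearby pedestrians are facing the vehicle. The 80 dB threshold and the decision order are assumptions, not figures from the patent.

```swift
// Illustrative choice of communication type from ambient noise and pedestrian orientation.
enum CommunicationType { case audible, visual }

func chooseType(ambientSoundDB: Double, pedestrianFacingVehicle: Bool) -> CommunicationType {
    if !pedestrianFacingVehicle {
        return .audible        // a display would not be seen, so use a horn or chime
    }
    if ambientSoundDB > 80 {
        return .visual         // too noisy for audio to be effective; flash a light instead
    }
    return .audible            // quiet surroundings: a modest audible cue suffices
}
```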
Apple's patent FIG. 8 below is a flow chart of a method for external communication when an extra-vehicular object is on an extra-vehicular path that may intercept a vehicle's path.
Apple notes that their location component used in their autonomous vehicles may generate navigation data or geolocation data that may be used to determine a velocity, an orientation, a latitude, a longitude, or an altitude for the vehicle. The location component may include one or more navigation devices that are able to use navigational systems such as GPS, the long range navigation system (LORAN), the Wide Area Augmentation System (WAAS), or the global navigation satellite system (GLONASS).
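For context, the navigation data described there could be as simple as the following record; the type and field names are assumptions made for illustration.

```swift
// Hypothetical shape of a single fix produced by the location component.
struct NavigationFix {
    let latitude: Double
    let longitude: Double
    let altitudeMeters: Double
    let headingDegrees: Double   // orientation
    let speedKPH: Double         // velocity derived from successive fixes
}
```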
Apple's patent application was filed in Europe in October 2017 and published on June 5, 2019. Considering that this is a patent application, the timing of such a product to market is unknown at this time.