A new patent application from Apple was published in Europe today. Apple's 120-page document, which includes 54 detailed graphics, describes how Apple's Mapping App will be able to function with varying types of vehicle infotainment systems. The system will also work when an iPhone or iPad is plugged into a vehicle's dashboard display, and will function with both touchscreen and non-touchscreen systems. The Mapping App will adapt to whatever type of system it's connected to: it will recognize whether your iDevice is connected to a low- or high-quality vehicle infotainment display and whether the system uses dials or a keypad. We know that Apple is working with dozens of car manufacturers to integrate CarPlay, yet Apple's invention gives us the impression that they're working on ways to make CarPlay running on your iDevice work with less sophisticated in-vehicle infotainment systems.
Apple's Patent Background
Portable media devices, such as smartphones, have the capability to run advanced mapping and navigation applications (e.g., Apple Maps, which operates on the iPhone, iPad, and iPad Mini). Some of these mapping and navigation applications include turn-by-turn navigation features, which can be helpful while driving; interacting with a mapping and navigation application while driving, however, is another matter, and may prove difficult due to the small display size of many mobile devices.
In addition, many vehicles include in-car navigation systems. These in-car navigation systems operate independently of any of the driver's other devices and offer a larger, conveniently positioned screen. However, these in-car navigation systems generally provide a more limited experience than the more robust mapping applications of the mobile device due to the inherent limitations of the vehicle.
Apple Invention: Mapping Application with Several User Interfaces
Apple's invention relates to an application that generates multiple user interfaces for display on multiple devices at the same time. In some embodiments, the application is an integrated mapping and navigation application that runs on a mobile device (e.g., a smart phone, tablet computer, media player, etc.) and generates both (i) a user interface for display on the mobile device and (ii) a user interface for display on a screen of a vehicle to which the mobile device connects.
The integrated mapping and navigation application (simply referred to below as a mapping application) generates both user interfaces simultaneously for simultaneous output and display.
In addition, the mapping application of some embodiments generates different user interfaces for display on the screens of different types of vehicles. Some embodiments generate different user interfaces for each different individual vehicle. On the other hand, some embodiments generate different user interfaces for categories of vehicle screens, such as high quality touchscreens, low-quality touchscreens, and non-touch screens (with which a user interacts via separate controls built into the vehicle).
The mapping application of some embodiments, when connected to a vehicle, identifies the type of display screen built into the vehicle, and automatically outputs the correct user interface for the vehicle.
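The screen-type identification described above can be sketched as a simple capability check. This is a minimal illustration, not Apple's actual implementation; the category names and capability flags are assumptions.

```python
from dataclasses import dataclass

# Hypothetical capability report a vehicle system might provide on connection.
@dataclass
class VehicleDisplay:
    has_touch: bool
    supports_gestures: bool  # e.g., multi-touch / swipe recognition

def classify_display(display: VehicleDisplay) -> str:
    """Map a vehicle display's capabilities to one of the three UI
    categories the patent describes: high-quality touchscreen,
    low-quality touchscreen, or non-touch screen."""
    if not display.has_touch:
        return "non-touch"
    return "high-quality touch" if display.supports_gestures else "low-quality touch"
```

Under this sketch, a touchscreen that cannot interpret gestures would be classified as "low-quality touch" and receive the corresponding interface.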
A user of the mapping application may interact with the application via the vehicle interface as well as the mobile device interface (a touchscreen interface in some embodiments).
Because of the different capabilities of the different interfaces, as well as the differences in likely user behavior for interactions with the different interfaces, the same operation or type of operation may be implemented differently between the mobile device interface and the vehicle interface. For instance, the mobile device may have the capability to interpret multi-touch gestures (e.g., a pinch gesture to zoom in or out), whereas the vehicle interface may not have multi-touch capability (or any touch capability), and therefore requires different user interaction to zoom (e.g., selection of zoom in and zoom out buttons, either on the touchscreen or the vehicle interface).
Furthermore, because of the different capabilities of the different types of display screens, a user may interact differently with the application user interfaces displayed on high-quality touchscreens, low-quality touchscreens, and non-touchscreens. For instance, scrolling through a map on a vehicle touchscreen may involve a swiping gesture similar to the one used to scroll through the map on a mobile device. However, a low-quality touchscreen may not have the ability to interpret such gestural input, and therefore the user interface for low-quality touchscreens includes selectable (e.g., via a tap input) arrows for scrolling in different directions. The non-touchscreen vehicle interface, of course, will require input through other controls (e.g., a joystick, buttons, etc.) that are built into the vehicle.
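The per-category input handling described above can be illustrated with a small sketch. The function, event fields, and fixed scroll step below are hypothetical, assumed only for illustration.

```python
def scroll_offset(category: str, event: dict, step: float = 50.0):
    """Translate a user input event into a (dx, dy) map scroll offset,
    depending on the display category. A high-quality touchscreen
    passes the swipe delta through; a low-quality touchscreen maps
    arrow-button taps to fixed steps; a non-touch screen maps
    built-in vehicle controls (joystick, buttons) to the same steps."""
    fixed = {"up": (0, -step), "down": (0, step),
             "left": (-step, 0), "right": (step, 0)}
    if category == "high-quality touch":
        # The swipe gesture carries its own delta.
        return (event["dx"], event["dy"])
    if category == "low-quality touch":
        # A tap on a directional arrow scrolls by a fixed step.
        return fixed[event["arrow"]]
    # Non-touch: the vehicle's built-in controls report a direction.
    return fixed[event["direction"]]
```

The point of the sketch is that the same logical operation (scrolling the map) is driven by entirely different input events per category, which is why the patent describes generating distinct user interfaces.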
Beyond simply exploring a map (e.g., by scrolling and zooming), the vehicle interface output by the mapping application provides additional features in some embodiments. In some embodiments, the vehicle screen interface for the mapping application is geared towards identifying a destination for a user and entering a navigation mode for a route to that destination, with as little user interaction as possible (because the user is often the driver). For example, through the vehicle interface, a user (e.g., the driver of the vehicle, a passenger of the vehicle, etc.) may search for destinations on the map. The user may search for a specific address, a specific place name, a generic type of place name, etc. In some embodiments, the user searches through the vehicle interface via voice interaction (i.e., dictating a search into a microphone of either the mobile device or the vehicle). The user can scroll through these results in the vehicle interface (through touchscreen interactions, built-in vehicle control interactions, etc.), and choose to enter a navigation mode with a search result as a destination.
In addition, the mapping application of some embodiments offers a predictive routing feature accessible through the vehicle user interface. While driving, the user can select an option to enter the predictive routing mode, in which the mapping application presents various likely routes to the user for navigation. The mapping application may generate the likely routes based on a variety of factors, including upcoming appointments or events on a calendar or other scheduling application that runs on the mobile device, and analysis of routes taken in the past by the mobile device (e.g., a route from a user's home to work). The predictive routing feature may additionally factor in traffic to identify potential difficulties in a usual route or in reaching a location on time.
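The factors the patent lists (calendar events, past trips, traffic) suggest a ranking scheme along these lines. The weights, field names, and scoring formula are purely illustrative assumptions.

```python
def rank_likely_routes(candidates):
    """Rank candidate destinations by a hypothetical likelihood score
    combining the signals the patent mentions: an upcoming calendar
    event at the destination, frequency of past trips there, and the
    current traffic delay along the usual route."""
    def score(c):
        s = 0.0
        if c.get("calendar_event"):          # upcoming appointment at this place
            s += 2.0
        s += 0.1 * c.get("past_trips", 0)    # habitual routes (e.g., home to work)
        s -= 0.05 * c.get("traffic_delay_min", 0)  # penalize congested routes
        return s
    return sorted(candidates, key=score, reverse=True)
```

A habitual commute with moderate traffic could still outrank a one-off calendar appointment in this toy scoring, which mirrors the patent's idea of weighing several signals rather than any single one.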
The mapping application of some embodiments also offers a recent locations feature that provides a user with recent destinations, results of recent searches, etc. Some embodiments provide search results exclusively from recent searches entered or destinations navigated to through the vehicle interface. On the other hand, some embodiments additionally include search results from recent searches made through the device, even before the connection of the device to the vehicle interface. Thus, if a user searches for a particular destination on her mobile device while walking to her car, then enters the car and connects her device to the car interface, the particular destination will appear as a recent and easily selectable search, without requiring the user to re-enter the search.
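The recency behavior described here amounts to merging two search histories, newest first. This sketch assumes simple timestamped entries; the data shape is an assumption for illustration.

```python
def recent_destinations(vehicle_history, device_history, limit=5):
    """Combine searches made through the vehicle interface with searches
    made earlier on the device itself (as in the walking-to-the-car
    example), newest first, de-duplicated by destination name."""
    merged = sorted(vehicle_history + device_history,
                    key=lambda e: e["time"], reverse=True)
    seen, out = set(), []
    for entry in merged:
        if entry["name"] not in seen:
            seen.add(entry["name"])
            out.append(entry["name"])
        if len(out) == limit:
            break
    return out
```

In the patent's example, a destination searched on the phone moments before entering the car would surface at the top of the vehicle interface's recents list without re-entry.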
Once the user selects a destination, the mapping application enters a turn-by-turn navigation mode in some embodiments. In this mode, some embodiments output different displays to the vehicle display and the mobile device display. The vehicle display, in some embodiments, displays the user's location and the upcoming route, in either a two dimensional mode or a three dimensional mode. The application of some embodiments generates this display from a perspective rendering position within a three dimensional navigation scene, though the view may be shown from directly above the scene so as to render a two dimensional view. The user can interact with the vehicle user interface to, e.g., view a list of maneuvers to make for the route (e.g., a right turn onto a particular street), change between two and three dimensions, and perform other interactions.
Furthermore, in some embodiments, when the vehicle reaches a location within a particular threshold of the next maneuver, a portion of the vehicle screen displays a representation for the maneuver (e.g., an intersection with an arrow that represents the vehicle's path through the intersection, as well as text directions for the maneuver). Once the vehicle has passed through the intersection, the representation of the maneuver disappears from the display screen of the vehicle. While the vehicle display shows the upcoming route on a map, the mobile device display of some embodiments subsequently shows a representation for the upcoming maneuver, irrespective of the distance for the vehicle to travel before making the maneuver.
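The two-display behavior described above reduces to a simple visibility rule. This is a sketch only; the 250 m threshold and the function shape are assumed values, not figures from the patent.

```python
def maneuver_visible(screen: str, distance_m: float, passed: bool,
                     threshold_m: float = 250.0) -> bool:
    """Decide whether the upcoming-maneuver representation is shown.
    Per the description: the vehicle screen shows it only within a
    threshold distance and hides it once the maneuver is passed, while
    the mobile device shows it irrespective of distance."""
    if passed:
        return False  # representation disappears after the intersection
    if screen == "vehicle":
        return distance_m <= threshold_m
    return True  # mobile device: shown regardless of remaining distance
```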
The preceding Summary is intended to serve as a brief introduction to some embodiments of the invention. It is not meant to be an introduction or overview of all inventive subject matter disclosed in this document. The Detailed Description that follows and the Drawings that are referred to in the Detailed Description will further describe the embodiments described in the Summary as well as other embodiments. Accordingly, to understand all the embodiments described by this document, a full review of the Summary, Detailed Description and the Drawings is needed.
Apple's patent Figure 2 illustrates an example of a mobile device connected to the interface of a vehicle system. A mapping application operates on the mobile device (an iPhone #200), and outputs both a first user interface for the mobile device display screen and a second user interface #210 for the vehicle dashboard display screen.
Apple's patent Figure 3 conceptually illustrates a simplified software architecture for a mapping and navigation application #300.
Apple's patent Figure 12 conceptually illustrates a region representing the map view area of the low-quality touchscreen user interface.
Apple's patent Figure 13 conceptually illustrates a process performed by the mapping application of some embodiments in order to translate a selection input into a scroll of the map display for a low-quality touchscreen vehicle interface.
Apple's patent Figure 18 illustrates an example of map exploration in a non-touchscreen vehicle user interface of some embodiments.
Apple's patent Figure 39 illustrates the vehicle display screen over four stages in which a user activates a messaging function and dictates a message to a recipient - via Siri.
Apple's patent application was discovered today in a European patent database. It was published today, with the original filing in Europe noted as March 12, 2014. Emanuele Vulcano is noted as the lead engineer for this application.
A Note for Tech Sites covering our Report: We ask tech sites covering our report to kindly limit the use of our graphics to one image. Thanking you in advance for your cooperation.
Patently Apple presents a detailed summary of patent applications with associated graphics for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any patent application should be read in its entirety for full and accurate details. About Making Comments on our Site: Patently Apple reserves the right to post, dismiss or edit any comments. Comments are reviewed daily from 4am to 8pm MST and sporadically over the weekend.