Back in late March, Apple acquired the indoor-GPS company WifiSLAM, signaling that Apple could be entering the indoor mobile location services business sometime in the future. WifiSLAM's software allows a user's smartphone to pinpoint its location (and the location of friends) in real time to 2.5-meter accuracy using only the ambient WiFi signals already present in buildings. Today, a key patent application from Apple titled "3D Position Tracking for Panoramic Imagery Navigation" was published by the US Patent and Trademark Office. On one hand, the patent is about bringing detailed street views to Apple's Maps application. On the other, it gives us a smidgen of understanding as to why Apple acquired WifiSLAM. Apple's patent filing is stingy with details about this second half of the invention, stating only that "In some implementations, forward and backward translation enables the user to enter an indoor panorama of a structure (e.g., a commercial venue)," such as a store for making a purchase. Although Apple has recently taken a bruising over its Maps application's turn-by-turn navigation inaccuracies, it's crystal clear that the company is charging ahead with determination to bring newly advanced services to Maps in the future.
Apple's Patent Background
Street-level imaging software provides panoramic views from various positions along streets throughout the world. Conventional street-level viewing applications or Web-based street-level viewing services allow a user to rotate within a panoramic "bubble" to view a particular street location from all directions. The user can rotate in the bubble using a navigation control and an input device (e.g., a mouse) or finger. To turn a street corner and enter another street (e.g., at a street intersection), the user has to "jump" to a panoramic "bubble" at the intersection and then pan in the bubble to face in the direction of the target street. This can be a tedious experience for a user of a handheld device who needs to navigate the streets of a neighborhood quickly.
Apple Invents a Wild Position Tracking Subsystem Supporting Indoor or Outdoor Panoramic Imagery
Apple's invention is about position tracking subsystems and onboard sensors that will enable a mobile device to navigate virtually through a location in panoramic imagery. Physically moving the device through space provides translation data that can be used to move up or down a virtual street or perform other navigation actions. In some implementations, forward and backward translation enables the user to enter an indoor panorama of a structure (e.g., a commercial venue). When the observer is inside the structure, forward/backward translation could perform other actions, such as selecting an object for purchase.
In some implementations, forward/backward translation enables the user to enter an intersection and navigate a turn onto another street at the intersection. In some implementations, information or an information layer can be displayed when translating. In some implementations, distance data can be used to move up or down a street a particular distance. Distance data can be obtained by integrating acceleration readings from a motion sensor (e.g., an accelerometer) onboard the device. Distance data can also be obtained using an onboard camera by measuring translation of the device from image sensor data. For both motion and image sensors, the distance can be relative or absolute depending on the output of the motion or image sensors. The distance data can be scaled to a virtual distance in the panoramic scene. Alternatively, optical flow can be used to determine distance data.
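The accelerometer path the filing describes, integrating acceleration twice to get a distance and then scaling it to a virtual distance in the panorama, can be sketched roughly as follows. This is a minimal illustration, not Apple's implementation; the function names, trapezoidal integration, and scale factor are all assumptions on our part.

```python
def distance_from_acceleration(accel_samples, dt):
    """Estimate displacement along one axis by integrating acceleration
    readings twice (trapezoidal rule for accel -> velocity, then a
    simple sum for velocity -> distance).

    accel_samples: accelerations in m/s^2, sampled every dt seconds.
    """
    velocity = 0.0
    distance = 0.0
    prev_a = accel_samples[0]
    for a in accel_samples[1:]:
        velocity += 0.5 * (prev_a + a) * dt   # integrate acceleration
        distance += velocity * dt             # integrate velocity
        prev_a = a
    return distance


def scale_to_virtual(distance_m, meters_per_virtual_meter=0.5):
    """Scale a physical translation to a virtual distance in the
    panoramic scene (the scale factor here is purely illustrative)."""
    return distance_m / meters_per_virtual_meter
```

With a constant 1 m/s² acceleration sampled every 0.1 s for one second, the double integration yields a displacement close to the analytic 0.5 m, and the scale factor then maps that small physical motion onto a larger virtual step down the street.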
Apple Working on Bringing Street-View to Maps
Apple's patent FIG. 1A illustrates an exemplary GUI for navigating panoramic imagery based on sensed linear motion (translation) of a mobile device such as an iPhone or iPad.
In the example shown, the iPhone has been rotated by a user into a landscape orientation. The user has entered a street-level view at an intersection of Broadway and Main Street in a fictitious city. The user can enter the street-level view in a variety of ways. For example, the user could click an icon (e.g., a pushpin) on a map to enter a street-level view at the location of the icon on the map. The user could also enter street-level view automatically by zooming into a particular location on a map or satellite image.
Apple's patent FIG. 1B illustrates the result of the user's +Y translation of the device. The +Y translation resulted in the observer automatically navigating the corner of South Main Street and West Broadway to face in the direction of West Broadway. The user can now move the device forward or backward to move the observer up and down West Broadway. When moving forward or backward, information 103a can be displayed in the user interface.
In this example, a bubble was displayed for identifying a building (e.g., identifying the post office) on West Broadway. To prevent information clutter in GUI 101, information can be displayed or hidden as the observer moves up or down the street based on the observer's location and perspective in the panoramic imagery. In some implementations, information is displayed after a period of time has elapsed without the observer moving.
Information can be aggregated into information layers. When an observer is at a particular location on the street or has a particular perspective in the panoramic imagery, an information layer containing information of an information type (e.g., business information) can be displayed over the panoramic imagery.
In Apple's patent FIG. 1C, as the observer moves down West Broadway resulting from a forward translation from an original position (indicated by the dashed line), information 103b (e.g., identifying a hospital) is displayed, since the observer has moved closer to the hospital. In some implementations, a threshold can be set by a user or application based on the distance between the observer and a structure or object in the panoramic imagery. When the threshold distance is reached or exceeded, information or an information layer can be displayed or hidden.
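The distance-threshold behavior described above, showing or hiding an information layer as the observer nears a point of interest, can be sketched as a simple check. The coordinate scheme, the 50-meter default threshold, and the layer format are hypothetical choices for illustration, not details from the patent.

```python
import math

def should_show_info(observer_xy, poi_xy, threshold_m=50.0):
    """Return True when the observer is within the threshold distance
    of a point of interest, so its information bubble should display."""
    dx = observer_xy[0] - poi_xy[0]
    dy = observer_xy[1] - poi_xy[1]
    return math.hypot(dx, dy) <= threshold_m


def visible_entries(observer_xy, layer, threshold_m=50.0):
    """Filter an information layer, here a list of (label, (x, y))
    entries, down to the entries near enough to display. This keeps
    the GUI uncluttered as the observer moves up or down the street."""
    return [label for label, xy in layer
            if should_show_info(observer_xy, xy, threshold_m)]
```

As the observer's position updates with each forward or backward translation, re-running the filter naturally reveals the hospital's bubble as it comes within range and hides bubbles left behind.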
Apple credits iOS engineer Patrick Piemonte and web production designer Billy Chen as the inventors of this patent application, which was originally filed in Q3 2011. To review Apple's patent claims and details, see patent application 20130083055. Considering that this is a patent application, the timing to market of such an Apple product is unknown.
A Note for Tech Sites Covering our Report: We ask tech sites covering our report to kindly limit the use of our graphics to one image. Thanking you in advance for your cooperation.
Patently Apple presents a detailed summary of patent applications with associated graphics for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any patent application should be read in its entirety for full and accurate details. Revelations found in patent applications shouldn't be interpreted as rumor or fast-tracked according to rumor timetables. About Comments: Patently Apple reserves the right to post, dismiss or edit comments.