Apple Invents an Advanced Navigation Guidance System for the Blind that Could Replace the Cane or Guide Dog
Last week the US Patent & Trademark Office published a patent application from Apple titled "Guidance Device for the Sensory Impaired." The inventor is listed as Chananiel Weinraub of Israel. Apart from this invention, which clearly points to Apple's iDevices being instrumental, Weinraub has no official connection to Apple at present, either on LinkedIn or on previous inventions; more than half of his patents on record were assigned to Texas Instruments.
This particular invention relates to guidance devices for the sensory impaired. One guidance device described in the filing takes the form of smart clothing, a new category of technology that Apple is investigating, as noted in our archives.
People use a variety of senses to navigate and interact with the various environments they encounter on a daily basis. For example, people use their senses of sight and sound to navigate in their homes, on the street, through workplaces and shopping centers, and so on. Such environments may be designed and configured under the assumption that people will be able to use senses such as sight and sound for navigation.
However, many people are sensory impaired in one way or another. People may be deaf or at least partially auditorily impaired, blind or at least partially visually impaired, and so on. By way of example, the World Health Organization estimated in April of 2012 that 285 million people were visually impaired. Of these 285 million people, 246 million were estimated as having low vision and 39 million were estimated to be blind. Navigation through environments designed and configured for those lacking sensory impairment may be challenging or difficult for the sensory impaired.
Some sensory impaired people use guidance devices or relationships to assist them in navigating and interacting with their environments. For example, some blind people may use a cane in order to navigate and interact with an environment. Others may use a guide animal.
Apple's Invention: Guidance Devices
Apple's invention relates to guidance devices for sensory impaired users. Sensor data may be obtained regarding an environment. A model of the environment may be generated and the model may be mapped at least to an input/output touch surface. Tactile output and/or other output may be provided to a user based at least on the mapping. In this way, a sensory impaired user may be able to navigate and/or interact with an environment utilizing the guidance device.
The invention is designed to permit a sensory-impaired user to interact with his or her environment quickly and efficiently. The device communicates information about the environment to aid the user's interaction, going far beyond what a mere cane can convey.
In the big picture, the invention is a guidance device designed to provide better assistance than, and/or take the place of, a cane, a guide animal, and/or other guidance devices and/or relationships.
The device may detect information about the environment, model the environment based on the information, and present guidance output based on the model in a fashion detectable by the user. Such guidance output may be tactile so the user can quickly and efficiently "feel" the guidance output while interacting with the environment. This device may enable the sensory-impaired user to more quickly and efficiently interact with his or her environment than is possible with existing sensory-impaired guidance devices such as canes.
In various embodiments, a guidance device for a sensory impaired user may include an input/output touch surface, a sensor data component that obtains data regarding an environment around the guidance device, and a processing unit coupled to the input/output touch surface and the sensor data component.
The processing unit may generate a model of the environment based at least on the data, map the model to the input/output touch surface and provide tactile output to a user based at least on the mapping via the input/output touch surface.
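The sense-model-map-output loop described above can be sketched in simplified form. All class names, grid dimensions, and coordinate conventions below are illustrative assumptions, not details from the patent:

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str
    x: float       # metres to the right of the user
    y: float       # metres ahead of the user
    height: float  # metres

def build_model(sensor_readings):
    """Generate a simple environment model: a list of detected objects."""
    return [DetectedObject(**r) for r in sensor_readings]

def map_to_surface(model, cols=8, rows=4, max_range=5.0):
    """Map the model onto a coarse grid of raisable bumps.

    Each cell is 1 (bump raised) if an object falls in that region, else 0.
    """
    grid = [[0] * cols for _ in range(rows)]
    for obj in model:
        if 0 <= obj.y < max_range and -max_range / 2 <= obj.x < max_range / 2:
            col = int((obj.x + max_range / 2) / max_range * cols)
            row = int(obj.y / max_range * rows)
            grid[row][col] = 1
    return grid

# Example: a truck 2 m ahead and slightly to the right of the user
readings = [{"label": "truck", "x": 0.5, "y": 2.0, "height": 3.0}]
bumps = map_to_surface(build_model(readings))
```

A real device would refresh this grid continuously as the sensors report new data, so the bumps move under the user's fingers as the environment changes.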
In some examples, the tactile output may be an arrangement of raised portions of the input/output touch surface or other tactile feedback configured to produce a tactile sensation of bumps.
In various examples, the processing unit may provide at least one audio notification based at least on the model via an audio component of the guidance device or another electronic device.
In some examples, the tactile output may include an indication of a height of an object in the environment. In various examples, the tactile output may include an indication that the object is traveling in a course that will connect with a user (which may be determined using real time calculations). In some examples, the tactile output may include an indication that the user is approaching the object and the object is below a head height of the user.
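A real-time check for whether an object is "traveling in a course that will connect with a user" could, in a simplified 2-D form, compute the object's point of closest approach from its relative position and velocity. The patent does not specify the math; this is one plausible sketch:

```python
import math

def on_collision_course(rel_pos, rel_vel, radius=0.5):
    """Return True if the object's closest approach to the user
    falls within `radius` metres at some future time.

    rel_pos: (x, y) object position relative to the user, in metres
    rel_vel: (vx, vy) object velocity relative to the user, in m/s
    """
    px, py = rel_pos
    vx, vy = rel_vel
    speed_sq = vx * vx + vy * vy
    if speed_sq == 0:
        return math.hypot(px, py) <= radius  # stationary relative to user
    # Time of closest approach: minimise |p + v*t| over t
    t = -(px * vx + py * vy) / speed_sq
    if t < 0:
        return False  # object is moving away from the user
    cx, cy = px + vx * t, py + vy * t
    return math.hypot(cx, cy) <= radius

# A vehicle 10 m ahead, heading straight at the user at 5 m/s
heading_at_user = on_collision_course((0.0, 10.0), (0.0, -5.0))
```

If the check returns true, the device could raise a distinctive bump pattern or escalate to an audio warning.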
In some examples, the processing unit may provide an audio notification via an audio component upon determining that the assistance device experiences a fall event during use.
In various embodiments beyond known iDevices, an environmental exploration device may include a cylindrical housing, a processing unit located within the cylindrical housing, a touch sensing device coupled to the processing unit and positioned over the cylindrical housing, a haptic device (such as one or more piezoelectric cells) coupled to the processing unit and positioned adjacent to the touch sensing device, and an image sensor coupled to the processing unit that detects image data about an area around the cylindrical housing, as noted in patent FIG. 4A below.
Another guidance device design that goes beyond known iDevices takes the form of a smart garment. Apple's patent FIG. 9 presented above illustrates the smart garment, which includes one or more cameras (#913 and #914) or other sensors, an input/output touch surface #911, and/or various other components.
The item of apparel may provide guidance to a user by performing a method. As shown, the input/output touch surface may be in contact with a user's back when the item of apparel is worn. Thus, the user may feel tactile output related to guidance provided by the item of apparel without other people being able to visibly detect that the user is receiving guidance.
The processing unit may analyze the image data using image recognition to identify an object (and/or analyze data from one or more depth sensors to determine the distance to and/or speed of moving objects), create an output image representing the object and its position in the area, map the output image to the haptic device, and provide the output image as tactile output to a user via the haptic device.
In various examples, the processing unit may determine details of a hand of the user that is touching the touch sensing device and map the output image to the haptic device in accordance with whether the hand is a left hand of the user, a right hand of the user, has a large palm size, has a small palm size, has less than four fingers, or does not have a thumb.
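Adapting the output image to the sensed hand could amount to simple transforms of the bump grid, for example mirroring it for a left hand or cropping it for a small palm. The patent leaves the actual mapping unspecified, so the scheme below is purely a guess at one plausible approach:

```python
def adapt_for_hand(grid, left_hand=False, small_palm=False):
    """Transform a 2-D bump grid (list of lists of 0/1) for the sensed hand."""
    out = [row[:] for row in grid]
    if left_hand:
        # Mirror horizontally so the layout matches the mirrored hand pose
        out = [list(reversed(row)) for row in out]
    if small_palm:
        # Keep only the central columns a smaller palm can comfortably cover
        cols = len(out[0])
        lo, hi = cols // 4, cols - cols // 4
        out = [row[lo:hi] for row in out]
    return out

grid = [[1, 0, 0, 0],
        [0, 0, 0, 1]]
mirrored = adapt_for_hand(grid, left_hand=True)
```

Handling fewer than four fingers or a missing thumb would presumably follow the same pattern: detect the hand's geometry via the touch layer, then restrict or reshape the output region accordingly.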
Apple's patent FIGS. 1 and 2 illustrate users navigating example environments using a guidance device.
In patent FIG. 1 above, the input/output touch surface may provide tactile output via raised bumps to indicate the shapes or textures of objects in the environment. For example, the raised bumps indicate the shapes of the truck #107 and the traffic signal #106 in the environment #100 of FIG. 1. The user may feel these shapes on the device and understand that the truck and traffic signal are present.
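Conveying an object's shape, rather than just its position, could be done by rasterising the object's outline onto the bump grid. As an illustrative sketch (not Apple's method), a rectangular footprint such as a truck might be rendered as a raised border:

```python
def raster_outline(rows, cols, top, left, bottom, right):
    """Raise bumps along the border of a rectangular object footprint."""
    grid = [[0] * cols for _ in range(rows)]
    for r in range(top, bottom + 1):
        for c in range(left, right + 1):
            on_border = r in (top, bottom) or c in (left, right)
            grid[r][c] = 1 if on_border else 0
    return grid

# A truck occupying rows 1-3, columns 2-6 of a 5x8 bump surface
truck = raster_outline(5, 8, top=1, left=2, bottom=3, right=6)
```

Tracing the raised border with a fingertip would give the user the object's rough outline, much as the figure describes for the truck and traffic signal.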
Overview of Guidance System
Apple's FIG. 4B presented below is a block diagram illustrating the functional relationships of example components of the guidance device #101 of FIG. 4A. As shown, in various example implementations the guidance device may include one or more processing units #424, batteries #423, communication units #425, positional sensors #426, speakers #427, microphones #428, navigation systems #429, image sensors #413 and #414, tactile input/output surfaces #411, and so on.
The guidance device may also include one or more additional components not shown, such as one or more non-transitory storage media (which may take the form of, but is not limited to, a magnetic storage medium; optical storage medium; magneto-optical storage medium; read only memory; random access memory; erasable programmable memory; flash memory; and so on).
Apple's patent FIG. 4C presented above shows us a diagram illustrating an example configuration of the input/output touch surface #411. As shown, the input/output touch surface may be positioned on the housing #410. The input/output touch surface may include a number of layers such as a tactile feedback layer #411B (such as piezoelectric cells, vibration actuators, and so on) and a touch layer #411C (such as a capacitive touch sensing layer, a resistive touch sensing layer, and so on).
The input/output touch surface may also include a coating (which may be formed of plastic or other material that may be more flexible than materials such as glass), which may function to protect the input/output touch surface.
Privacy and Biometrics
Lastly, Apple notes that their invention also contemplates embodiments in which users selectively block the use of, or access to, personal information data, including biometric data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data.
For example, in the case of biometric authentication methods, the present technology can be configured to allow users to optionally bypass biometric authentication steps by providing secure information such as passwords, personal identification numbers (PINs), touch gestures, or other authentication methods, alone or in combination, known to those of skill in the art.
In another example, users can select to remove, disable, or restrict access to certain health-related applications collecting users' personal health or fitness data.
Apple's patent application was filed back in Q1 2018. Considering that this is a patent application, the timing of any such product coming to market is unknown at this time.
In the bigger picture, Chananiel Weinraub's patent archive shows that this invention was first filed in the U.S. in July 2015.
Patently Apple presents a detailed summary of patent applications and/or granted patents with associated graphics for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any patent application should be read in its entirety for full and accurate details. About Making Comments on our Site: Patently Apple reserves the right to post, dismiss or edit any comments. Those who use abusive language or engage in negative behavior will be blacklisted on Disqus.