Apple to Expand VoiceOver & Contextual Voice Commands in iOS
The US Patent and Trademark Office officially published over 40 new Apple patent applications today. One of the themes that emerged was that of expanding Apple's VoiceOver capabilities to assist those who are visually impaired, along with the expansion of contextual voice commands that could find their way into future iOS-based apps like GPS and video games.
Patent: Graphical User Interfaces for Accessibility Using a Touch-Sensitive Surface
"To make it easier for the blind and those with low vision to use a computer, Apple has built a solution into every Mac, called VoiceOver. It's reliable, simple to learn, and enjoyable to use," states Apple's webpage dedicated to the subject of Accessibility.
Three of Apple's latest patent applications covering the technology and methodology behind their iPhone Accessibility program were published today. These in-depth filings cover every detail you could imagine and more.
Below are just three of the interfaces that are revealed in Apple's patents. The first illustration is that of Apple's Accessibility User Interface 500JJ (FIG. 5JJ). It depicts that in response to three-finger double tap gesture 580, the accessibility user interface has magnified the user interface so that application icons Stocks 149-2, Voice Memo 142, Alarm 149-4, and Dictionary 149-5 are larger, and other portions of the user interface are now not within the displayed portion of the user interface.
The second User Interface 500KK (FIG. 5KK) depicts that in response to three-finger movement gesture 582, the accessibility user interface has panned so that instead of displaying user interface application icons Stocks 149-2, Voice Memo 142, Alarm 149-4, and Dictionary 149-5, user interface application icons Photos 144 and Camera 143 are now visible in the upper portion of the display, and Stocks 149-2 and Voice Memo 142, which were in the upper portion of the display, are now in the lower portion of the display.
Additionally, User Interface 500KK also depicts use of a three-finger zoom gesture 584 to further magnify the user interface. Here, the gesture 584 includes a three-finger double tap on the touch-sensitive surface with three initial points of contact 584-1, 584-2, and 584-3. The second tap remains in contact with the touch-sensitive surface, and then moves 584-4, 584-5, and 584-6, towards the top of the touch-sensitive surface until the desired zoom level is reached.
Lastly, the third User Interface 500LL (FIG. 5LL) depicts that after three-finger zoom gesture 584, user interface application icon Voice Memo 142 is further magnified and now occupies most of the display.
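The three figures above describe a small vocabulary of accessibility gestures: a three-finger double tap to toggle magnification, a three-finger drag to pan the magnified view, and a three-finger double tap whose second tap is held and dragged to adjust the zoom level. A minimal sketch of how such gestures might map to zoom and pan state is shown below. This is purely illustrative: the gesture names, the `AccessibilityZoom` class, and the numeric values are all assumptions, not Apple's implementation.

```python
class AccessibilityZoom:
    """Toy model of the magnification state driven by the gestures above."""

    def __init__(self):
        self.zoom_level = 1.0   # 1.0 = no magnification
        self.offset = (0, 0)    # top-left corner of the visible region

    def handle(self, gesture, delta=(0, 0)):
        if gesture == "three_finger_double_tap":
            # Toggle between the normal and magnified views (FIG. 5JJ).
            self.zoom_level = 1.0 if self.zoom_level > 1.0 else 2.0
        elif gesture == "three_finger_pan":
            # Pan the magnified view (FIG. 5KK); only meaningful when zoomed.
            if self.zoom_level > 1.0:
                dx, dy = delta
                self.offset = (self.offset[0] + dx, self.offset[1] + dy)
        elif gesture == "three_finger_tap_hold_drag":
            # The held second tap dragged toward the top of the screen
            # (gesture 584) increases magnification; delta[1] < 0 means "up".
            self.zoom_level = max(1.0, self.zoom_level - delta[1] / 100.0)
        return self.zoom_level, self.offset
```

Walking through the figures with this sketch: a double tap magnifies the screen, a drag pans it so different icons become visible, and the tap-hold-drag keeps magnifying until the desired zoom level is reached.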
Apple credits Christopher Fleizach and Eric Seymour as the inventors of patent applications 20100309147 and 20100309148. They are also listed on the third patent application, 20100313125, together with fellow engineer Reginald Hudson. All three applications were originally filed in Q3 2009.
Patent: Contextual Voice Commands
One of Apple's Life on the Go Features for the iPhone is about Voice Memos. Apple's webpage states "Capture a thought, a memo, a meeting, or any audio recording with Voice Memos. When you're done, edit your recording – then send it via email or MMS." Voice commands are also part of the iPod touch to control your music. In Apple's latest patent on the subject, we're able to see that they're thinking of expanding voice commands into apps like video games, Safari, GPS and more. Unfortunately, Apple doesn't provide any video gaming examples, the one area that most of us would like to know more about.
Apple's patent discusses example processes for implementing a best-assumption or learning model. Examples of learning models could include, but are not limited to, machine learning models such as support vector machines (SVMs), inductive inference models, concept learning, decision tree learning, Bayesian learning, and others. A machine learning model could be used to teach the iPhone to improve its performance based on accumulated data received through an input unit or stored in a database. Such models could then automatically produce a desired result based on rules and patterns derived from that accumulated data.
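To make the "improve with accumulated data" idea concrete, here is a deliberately tiny sketch of a best-assumption model: it counts which interpretation the user has confirmed for an ambiguous phrase in the past and guesses the most frequent one next time. The class name, method names, and example phrases are all hypothetical; the patent describes the approach only abstractly (SVMs, Bayesian learning, decision trees and the like).

```python
from collections import Counter, defaultdict

class BestAssumption:
    """Picks the most likely intended command for an ambiguous input,
    based on counts of the user's past confirmed choices."""

    def __init__(self):
        # Accumulated data: heard phrase -> counts of confirmed meanings.
        self.history = defaultdict(Counter)

    def record(self, heard, confirmed):
        """Store one observation of what was heard vs. what the user meant."""
        self.history[heard][confirmed] += 1

    def best_guess(self, heard, default=None):
        """Return the most frequently confirmed meaning, or a default."""
        counts = self.history.get(heard)
        if not counts:
            return default
        return counts.most_common(1)[0][0]
```

A frequency count is of course far simpler than the models the patent names, but it illustrates the same loop: accumulate data, derive a pattern, and use it to produce the most likely result automatically.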
In Apple's patent FIG. 8b we see an iPhone accessing a knowledge base to identify the best assumption or most likely choice of what the user intended. The knowledge base could include a local knowledge base 812 stored within the iPhone and an external knowledge base 814 located external to the data processing device 102. For example, the iPhone could access an external knowledge base stored on the Internet.
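The two-tier lookup in FIG. 8b can be sketched in a few lines: consult the on-device knowledge base first and fall back to the external one over the network only when needed. The function name, the caching step, and the example phrases are assumptions for illustration, not details from the patent.

```python
def resolve_command(phrase, local_kb, fetch_external):
    """Return the best interpretation of `phrase`, preferring local data.

    local_kb:       dict acting as the local knowledge base (812)
    fetch_external: callable that queries the external knowledge base (814)
    """
    if phrase in local_kb:            # hit in the on-device store
        return local_kb[phrase]
    result = fetch_external(phrase)   # fall back to the network
    if result is not None:
        local_kb[phrase] = result     # cache so future lookups stay local
    return result
```

Caching the external answer locally is one plausible reason for the split: common commands resolve instantly on the device, while rarer ones can still be answered from the larger store on the Internet.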
It has been said that Apple is reserving the instruction "Beam Me Up" for their 2030 iPhone. Okay, I just wanted to know if you were still paying attention – ha!
Apple credits Marcel Van Os, Gregory Novick and Scott Herz as the inventors of patent application 20100312547, originally filed in Q2 2009.
Another Noteworthy Patent Application Published Today
A 2008 patent of Apple's surfaced again under continuation patent 20100312920. Although we covered Apple's granted patent on this subject under 7,589,629 in September 2009, there were a few new images worth noting as shown below.
The heart of the patent is that Apple has a method of determining whether a returned product, or one under warranty, represents a legitimate claim. More importantly, the method could determine whether the damage was brought about by consumer abuse. Although the patent's illustrations show an iPod or iPhone, Apple's detection system applies to all of their hardware, including the iMac, Mac Pro, MacBook and now even the iPad. This is likely one of the reasons for the patent's continuation. The patent provides quite a detailed overview of how they determine abuse, if that holds any interest for you.
Notice: Patently Apple presents only a brief summary of patents with associated graphic(s) for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any patent application should be read in its entirety for further details. For additional information on any patent reviewed here today, simply feed the individual patent number(s) noted in this report into this search engine. About Comments: Patently Apple reserves the right to post, dismiss or edit comments.
EB, just click and hold the middle clicker button, if you have your headphones in. When the Voice Command interface chirps at you, ask it "What time is it?" and it will tell you.
Posted by: RBM | December 10, 2010 at 11:23 AM
@EB-"What time is it?" is and has been a voice command.
Posted by: AP | December 10, 2010 at 11:19 AM
Would they just add "What time is it?" already? I can't tell you how much I hate to have to pull my phone out of my pocket and turn the screen on when I just want to squeeze my mic and say that. Seems so simple.
Posted by: EB | December 09, 2010 at 03:09 PM