
Cool New Finger Swiping Camera Controls coming to iPhone & iPad

1 - Cover swipe finger gesture - future iPhone
On February 25, 2010, the US Patent & Trademark Office published a patent application from Apple that reveals one of the next chapters for Apple's iPhone. The filing describes another innovative concept designed to help users control incoming calls and voicemail by simply swiping a finger over the external camera lens, for example to rewind or fast forward a voicemail message. In addition, the new methodology would enhance one-handed navigation of web pages, documents, a contact list or your iTunes library through different combinations of swiping motions over the lens. In the future, the iPad may be able to take advantage of this feature if its camera is positioned correctly. This would theoretically allow a user to simply flick a finger over the camera lens to turn the page of a book or scroll a webpage without ever having to move a hand. This is an excellent idea on several fronts that will have Apple's competition on the run, again.


Control iPhone Calls & Voicemail via Camera Finger-Swipes


Apple's patent FIG. 1 shows an iPhone 100 with a built-in digital camera in use for a telephone call or voicemail retrieval. Technically, the device could be any multi-functional portable device with a built-in digital camera (an iPod touch or iPad, for example), but for the sake of clarity and simplicity the following text treats device 100 as an iPhone.



The built-in digital camera includes a lens 105 located on the back face of the iPhone. The camera can capture digital images of the scene before the lens, process those images, and also detect finger swipes across the lens.


As seen in FIG. 1, a user is holding his iPhone to his ear so that he can hear the voices in a call or voicemail message. The user could easily reach the lens with a finger of the same hand that holds the iPhone. A voicemail command can be controlled by a swipe of the finger in the direction of the arrow 120 over the lens.


For example, during access of a voice mailbox, the user may swipe his finger in the direction of the arrow 120 over the lens to rewind the playback of a voicemail message. The user may swipe his finger over the lens in the direction opposite the arrow to fast forward the playback of a voicemail message.


In general, there could be one or more different directional swipes defined as corresponding to respective voicemail commands. Note that the user need not actually touch the camera lens when sliding his finger across the lens.


Alternative Tap Control: In another embodiment of the invention, the user could use his finger to tap the iPhone so as to pause playback of a voicemail message, or stop the rewind or fast forward of a message. While voicemail message playback is paused, the user may tap their iPhone to resume playback of the voicemail message. These voicemail functions can be performed without having to move the iPhone from its position over the user's ear as depicted in FIG. 1, to in front of the user's face. Therefore, the user is able to control voicemail playback while their iPhone is up against their ear, and without having to look for any buttons. Also, only one hand is required to implement these functions.


A similar concept may be applied to control call functions during a telephone call conversation with a called or calling party. The motions described above, namely sliding a finger in various directions across the lens and/or tapping the iPhone could also control call functions, such as merging multiple calls, setting a call on hold/unhold, and switching between or among multiple simultaneous calls. The setting of these motions to correspond with call control or voicemail control commands may be put into place by Apple or customized by the user.


Control iPhone Display Navigation via Camera Finger-Swipes


Apple's patent FIG. 2 depicts another use for camera finger-swipes, this time for navigating the iPhone's interface. The user holds the iPhone in the palm of their hand so that they can view the display screen 200. To scroll, to navigate a handle location, or to move the view of the display screen, the user moves a finger over the camera lens in a direction analogous to the one in which they wish to move the handle on the screen. FIG. 2 depicts the user using an index finger to swipe across the camera lens, but depending on the hand that holds the device (i.e., the right or left hand) and the location of the camera lens, a user may choose another finger, such as a middle or ring finger, if it would be more comfortable.


3 - Control iPhone Navigation via Finger Swiping Camera 

Consider the following example of navigating the display screen. If the user wishes to scroll down on a webpage or text document, the user would simply move a finger across the camera lens in an upward direction (i.e., towards the top of the screen 200). This is consistent with moving the page "up" so as to bring a bottom portion of the page into view. To move the page down (and thereby bring a top portion of the page into view), the reverse occurs, i.e., the user swipes across the lens in a downward direction. Note that navigation on the display screen (using a finger swipe across the camera lens) need not be limited to straight up and down, but could also be performed in other or additional directions (e.g., left and right). Now that Apple is introducing iBooks, think of flipping the page of a book using this method so that you don't even have to move your hands from the iPhone or a future camera-equipped iPad.


In another embodiment, a finger swipe over the camera lens corresponds to the movement of one or more scroll bars for the page on the screen. For example, swiping a finger in the downward direction would move an up-down scroll bar downwards, bringing the bottom portion of a page into view. Likewise, swiping a finger in the left or right direction would move a left-right scroll bar left or right, respectively (bringing the left or right portion of the page into view).
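This scroll-bar embodiment can be sketched as a simple mapping from swipe direction to scroll-bar displacement. The function name, step size, and coordinate convention (y grows downward) below are illustrative assumptions, not details from the patent:

```python
# Minimal sketch of the scroll-bar embodiment: each swipe direction moves
# the corresponding scroll bar by a fixed step. Step size and the
# y-grows-downward convention are assumptions for illustration.

def move_scroll_bars(direction, x, y, step=40):
    """Return new (x, y) scroll-bar positions after one swipe."""
    deltas = {
        "down":  (0,  step),   # vertical bar moves down: bottom of page appears
        "up":    (0, -step),
        "right": ( step, 0),   # horizontal bar moves right: right side appears
        "left":  (-step, 0),
    }
    dx, dy = deltas.get(direction, (0, 0))  # unrecognized swipes do nothing
    return x + dx, y + dy
```

A swipe the detector cannot classify simply leaves both scroll bars where they are, which is the safe default for a noisy camera-based input.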


Overview of the Finger Swiping System


Apple's patent FIG. 3A is a block diagram of an iPhone and/or a portable handheld device 100 showing several of its components that enable the enhanced telephone and voicemail capabilities described above.



The Gesture Mapper: One of the most interesting new components for this future iPhone shown in FIG. 3A is the gesture mapper 328, which translates the direction of motion into a predefined voicemail or call control command that is then passed to a telephony component 340. The gesture mapper may translate the direction of motion into a command by looking up the direction in a database or list and identifying the corresponding command. The telephony component 340 then implements the command, e.g., invokes a call feature during an ongoing phone call, or a voicemail review feature.
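The lookup step the patent describes amounts to a small table from detected motion to command. The table contents and names below are hypothetical, chosen only to mirror the examples in the text:

```python
# Hypothetical sketch of the gesture mapper's lookup: translate a detected
# motion direction into a telephony command string. The directions and
# command names are illustrative assumptions, not Apple's actual values.

GESTURE_COMMANDS = {
    "swipe_forward":  "rewind_voicemail",       # arrow-120 direction in FIG. 1
    "swipe_backward": "fast_forward_voicemail",
    "swipe_up":       "merge_calls",
}

def map_gesture(direction):
    """Look up the command for a detected motion; None if unmapped."""
    return GESTURE_COMMANDS.get(direction)
```

The telephony component would then receive the returned command and act on it; since the mapping is just data, it could be preset by Apple or customized by the user, as the patent suggests.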


The iPhone/device 100 may be trained in an explicit or implicit manner by capturing and storing (in a library) groups of image sequences of different instances of the same swipe by the user, and then analyzing the groups of image sequences to learn the variations of what is otherwise the same finger swipe. Such training may improve the accuracy of future attempts by the finger swipe detector, by adjusting to particular motions that a specific user repeats.
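A toy illustration of this training idea, reducing each swipe to a (dx, dy) motion vector rather than a full image sequence: store several labeled examples per swipe, then classify a new swipe by its nearest stored example. The nearest-neighbor scheme is an assumption; the patent only says the stored sequences are analyzed to learn per-user variations.

```python
# Toy sketch of per-user swipe training: a library of labeled motion
# vectors, classified by nearest stored example (an assumed scheme).
import math

swipe_library = {}  # label -> list of (dx, dy) motion vectors

def train(label, dx, dy):
    """Record one observed instance of a labeled swipe."""
    swipe_library.setdefault(label, []).append((dx, dy))

def classify(dx, dy):
    """Return the label of the closest stored example, or None if untrained."""
    best_label, best_dist = None, float("inf")
    for label, examples in swipe_library.items():
        for ex, ey in examples:
            dist = math.hypot(dx - ex, dy - ey)
            if dist < best_dist:
                best_label, best_dist = label, dist
    return best_label
```

The more instances of each swipe the user provides, the better the library covers that user's natural variation, which is the accuracy gain the patent is after.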


Tap Detector: The second interesting component shown in patent FIG. 3A relates to the new tap detector. When a user taps the phone with his finger or other similarly sized object (e.g., a pen or stylus), an accelerometer 330 detects the tapping and provides such information to a tap detector 332. The tap detector 332 analyzes the tapping and determines the type of tapping. For example, the tap detector 332 may determine that a single or double tap was made. The tap detector 332 transmits this information to the gesture mapper 328. The gesture mapper 328 translates the detected tapping information into its corresponding command that relates to a call feature or a voicemail review feature (the gesture mapper 328 may use the same lookup routine described above). For example, a single tap may translate into merging two calls and a double tap (two consecutive taps, one quickly following the other, for example) may translate into putting a current call on hold and answering another call on hold. The gesture mapper 328 sends the resulting command (e.g., merge two calls) to the telephony component 340 which implements the command.
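One simple way a tap detector could distinguish single from double taps is by the interval between accelerometer tap events. The 0.4-second window below is an assumed threshold, not a value from the patent:

```python
# Sketch of single- vs. double-tap classification from accelerometer
# event timestamps. The 0.4 s pairing window is an assumption.

DOUBLE_TAP_WINDOW = 0.4  # max seconds between taps to count as a double tap

def classify_taps(timestamps):
    """Group a sorted list of tap times into 'single'/'double' events."""
    events = []
    i = 0
    while i < len(timestamps):
        if (i + 1 < len(timestamps)
                and timestamps[i + 1] - timestamps[i] <= DOUBLE_TAP_WINDOW):
            events.append("double")   # e.g. hold current call, answer other
            i += 2
        else:
            events.append("single")   # e.g. merge two calls
            i += 1
    return events
```

Each classified event would then go through the same gesture-mapper lookup as a swipe before reaching the telephony component.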


Enhanced Document Navigation


Apple's patent FIG. 4 is a block diagram covering enhanced document navigating capabilities. Rather than touching a touch screen or pressing/scrolling buttons to navigate the view in a display, a user could now move their finger over the camera lens of the device in the direction that they wish to move the view of the display screen 200.


In this embodiment, the camera component 320 and finger swipe detector 326 may work in the same manner described above in connection with FIG. 3A, to detect the user's finger swipe motions across the camera lens. The gesture mapper 328 in this case translates the detected motion into a graphical user interface command, i.e., a corresponding movement of a handle location or scrolling operation for the display screen/document. The movement of the user's finger may be proportional to the movement of the handle location or scrolling operation of the screen. A user interface component 410 implements the requested movement of the handle location or scrolling operation on a display screen. Thus, a user could easily and quickly access and navigate all areas of the screen with the same hand that holds the device.
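The proportional mapping described above can be sketched as a gain applied to the finger's travel, clamped so the view cannot scroll past the document. The gain and geometry values are illustrative assumptions:

```python
# Sketch of proportional camera-swipe scrolling: finger travel over the
# lens moves the view by a scaled amount, clamped to the document bounds.
# The gain, document height, and view height are assumed values.

def scroll_view(view_top, finger_dy, gain=2.0,
                doc_height=2000, view_height=480):
    """Return the new top offset of the visible view after a finger move."""
    new_top = view_top + finger_dy * gain   # proportional response
    max_top = doc_height - view_height      # lowest legal view position
    return max(0, min(new_top, max_top))    # clamp to the document
```

A larger gain would let a short flick traverse more of the page, which is the kind of tuning Apple would presumably expose or calibrate per device.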


While the emphasis in the examples above is on a document, the method also applies to navigating web pages, an iTunes list, a contact list, etc.


Camera Finger Swiping Flow Charts

5 - System Flow Chart & Architecture 

For the record, Apple does refer to a patent FIG. 7 covering a solid state image sensor but doesn't actually provide the graphic.


Apple credits Chad Seguin, Justin Gregg and Michael Lee as the inventors of patent application 20100048241, originally filed in Q3 2008.


Notice: Patently Apple presents only a brief summary of patents with associated graphic(s) for journalistic news purposes as each such patent application and/or grant is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any patent application and/or grant should be read in its entirety for further details. For additional information on any patent reviewed here today, simply feed the individual patent number(s) noted in this report into this search engine.



As an engineer I like to keep myself updated, and as the 3D gesture recognition of Project Natal has been in the news for quite some time, this idea didn't seem like anything new to me.

Another point to note: the new Xbox 360 3D gesture recognition has an additional laser depth scanner, on top of a stereoscopic camera arrangement, to facilitate nontrivial gesture recognition. Even then it consumes 10% of the Xbox 360's CPU resources; as we all know, DIP and computer vision are processor-intensive workloads. A 10% load on the Xenon CPU translates to a considerable overhead for any smartphone CPU. Unless it is handled by a separate processor, this sensing-visualization feedback loop may bring the phone to its knees even in the most trivial scenarios. And an additional processor means less battery life.

So it's going to be interesting to see how the problem is tackled in a real-life product.

While the concept is really clever, I'm more worried about battery life here. Won't this whole idea just drain the battery? I suppose the camera has to be on the whole time to "capture" the gestures.

To the Redmond Fanboy: No, I don't see a resemblance to anything Redmond. Ha!

Guys,before hailing all your praises over the idea don't you think that the concept has an uncanny resemblance to a certain project Natal that has been developed by a company in Redmond for its game console?

I like how Apple is all about trying different concepts. I need to make the switch from blackberry to iphone for that reason alone.

I currently do not have an iphone. I was going to go out today and upgrade to it. Anybody know when a new iphone will be coming out? I'm afraid to go out and get one then only for a new one to be released. That seems to be my luck. If not for bad luck, I'd have no luck at all. If one is not due to come out in the near future, then I'll go get the 3gs. However, if something newer will be released, I could wait. Any ideas??? Thanks.

Shall we predict that the new design will have a bigger camera surface area? I think the present size of the camera is too small to support multiple slide detections.

Michael, while your observation is noted, FIG. 2 is showing a rendering of the finger behind the phone itself. Also, take a look at your phone now (if it has a camera) and you will notice that the camera is on the back left side, which is standard. That's because if you hold your phone to your ear, your finger is always on the top right of the phone no matter which hand you use. The idea is to keep the oil from your fingers off the lens as much as possible. So the question now becomes how Apple has decided to implement the concept of gestures into a usable, second-nature process.

Although a little off topic, you make a good point Steve. I'm a fan of useful features myself...and we sure get a lot of them from Apple.

Maybe someone at Apple will read your comment and get cracking on that Steve. You never know.


Hey Michael, I was able to do the swipe perfectly fine with my thumb. I don't know if I'm extraordinarily unusual or what, but cheers.

Seems like a waste to me. I'd rather they work it out so that when you're reading an email and select an embedded web link, closing the webpage automatically returns you to your email. Having to select email again seems repetitive. These are USEFUL features.

This is Apple... take a simple concept which was never thought of before... and make it into an extraordinary feature...

No words... I am dumbfounded.

FWIW, I generally hold my phone with my left hand, leaving my right hand free to write things down or use my computer. The camera is in the right place for a left hand index finger.

That's an interesting observation, Michael. In all practicality, you're right, and I'm sure Apple must have thought this out. If this actually comes to market, I'm sure that Apple will realize the camera has to be moved. In fact, a few recent patents talk about video calling and point to a camera that is centered on the phone. Lastly, you only have to wave your finger over the camera, not touch it, so a center-positioned camera would work. But for now, Mike, you found a flaw in the graphic. Good catch!


Interesting idea. But the camera is on the wrong side of the phone for this to be comfortable for right-handed users. Hold your phone in your palm and try it out. Indeed, it's not physically possible to hold the phone in your palm as shown in the drawings above. Your hands don't bend like that.

Amazing, I love how Apple breaks all limitations and boundaries. As cliché as it may sound, "Think Different" is definitely a mantra that Apple lives up to.
