On May 24, 2012, the US Patent & Trademark Office published a patent application from Apple that sheds more light on their future iPen and a related graphics program. Apple continues to spend time and R&D funding on a future iPen device that's being designed to work with all of Apple's tablet-styled iDevices. The focus of today's invention is twofold. Firstly, Apple is thinking of adding advanced haptics to the iPen so that the end user will be able to feel brush strokes and/or line thicknesses, for example. Secondly, Apple is designing the iPen with a built-in mini speaker so as to provide users with various forms of audio feedback. To make all of this interesting and relevant, Apple sheds a little light on how their iPen will work with either a new graphics/paint program of their own and/or with known apps such as Autodesk's SketchBook and Microsoft's Paint.
Apple's Patent Background
Existing touch-based user interface devices typically have a touch panel and a visual display component. The touch panel may include a touch sensitive surface that, in response to detecting a touch event, generates a signal that could be processed and utilized by other components of an electronic device. The touch sensitive surface may be separate from the display component, such as in the case of a trackpad, or may be integrated into or positioned in front of a display screen, such as in the case of a display touch screen.
Display touch screens may show textual and/or graphical display elements representing selectable virtual buttons or icons, and the touch sensitive surface may allow a user to navigate the content displayed on the display screen. Typically, a user may move one or more objects, such as a finger or a stylus, across the touch sensitive surface in a pattern that the device translates into an input command. As an example, some electronic devices allow the user to select a virtual button by tapping a portion of the touch sensitive surface corresponding to the virtual button. Some electronic devices may even detect multiple simultaneous touch events in different locations on the touch screen.
Generally, existing input devices don't provide haptic feedback in response to user interactions. The user can typically feel only the rigid surface of the touch screen, making it difficult to find icons, hyperlinks, text boxes, or other user-selectable input elements on the display. An input device capable of generating haptic feedback may help a user navigate content displayed on the display screen, and may further serve to enhance the content of various applications by creating a more appealing and realistic user interface. "Haptic feedback" may be any tactile feedback, such as forces, vibrations, and/or motions that may be sensed by the user.
Apple's invention generally relates to haptic input devices that could receive an input from a user and provide haptic feedback based on the input from the user. In some embodiments, the haptic input device may be configured to interface with a touch-based user interface device, such as a touch screen. The touch-based user interface device may further include one or more input sensors, such as force sensors or position sensors that are configured to sense one or more characteristics of a haptic input device as it engages the touch screen.
For example, the one or more characteristics may include a position of the device relative to the touch screen, a pressure being applied on the touch screen surface by the haptic input device, an angle of the input device relative to the touch screen, and the like. The touch-based user interface device may determine a haptic response based on the one or more characteristics and transmit the haptic response to the haptic input device. The haptic input device may include a haptic actuator that generates haptic feedback based on the received haptic response. The haptic response may take the form of a control signal that drives a haptic actuator or a look-up value that corresponds to a control signal stored in a look-up table. In some embodiments, the haptic input device may also include additional sensors configured to sense one or more characteristics of the haptic input device, such as the orientation of the haptic input device, the acceleration of the device relative to the touch screen surface, and so on.
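Since the patent describes the haptic response arriving either as a direct control signal or as a look-up value that keys into a stored table, a minimal Python sketch can illustrate the idea. All names, waveform values and the table contents below are illustrative assumptions for this sketch, not Apple's implementation.

```python
# Hypothetical sketch: a haptic response may be an explicit control
# tuple (amplitude, duration in ms) or a look-up value that maps to
# one stored in a table, as the patent describes.

# Assumed look-up table mapping response codes to actuator waveforms.
HAPTIC_LOOKUP = {
    "tap_confirm": (0.8, 30),
    "boundary_cross": (0.4, 15),
    "texture_rough": (0.6, 120),
}

def resolve_haptic_response(response):
    """Return an (amplitude, duration_ms) control tuple.

    `response` may be a look-up key (str) or an explicit
    (amplitude, duration_ms) tuple, mirroring the two forms
    the patent mentions.
    """
    if isinstance(response, str):
        return HAPTIC_LOOKUP[response]
    return response

print(resolve_haptic_response("tap_confirm"))  # look-up value form
print(resolve_haptic_response((0.5, 50)))      # direct control signal form
```

Either way, the iPen's actuator would end up driven by the same kind of control tuple, which keeps the over-the-air message short when a look-up value suffices.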
Apple iPen: Haptic Input Device
Apple states that one embodiment of the invention may take the form of a haptic input device that includes: a receiver configured to receive a first signal from a touch-based user interface device; a decoder coupled to the receiver and configured to extract an input signal from the first signal; a controller coupled to the decoder and configured to receive the input signal from the decoder, further configured to generate a control signal based on the input signal; a haptic actuator coupled to the controller and configured to actuate in response to the input signal; at least one sensor configured to determine at least one characteristic of the haptic input device; a transmitter coupled to the at least one sensor.
Apple's patent FIG. 1 illustrates a tablet or iPad incorporating a haptic input device 101. The touch-based user interface device 103 may include a touch screen surface 105 and one or more transmitters 107 configured to wirelessly transmit signals to a receiver of the iPen.
The iPen may include a tapered or pointed tip that is configured to contact the touch screen surface. The tip may be capacitive in order to permit registration of the contact on the touch screen surface. The iPen may alternatively have a blunt tip or may take the form of a ball.
The iPen may be configured to provide haptic feedback to a user. This haptic feedback may be any type of tactile feedback that takes advantage of a user's sense of touch and/or sight, for example, by creating forces, vibrations, and/or motions that may be perceived by the user.
For example, haptic feedback may confirm the user's selection of a particular item, such as a virtual icon or a button, or may be provided when the user's iPen is positioned over a selectable item. The iPen may also provide a haptic output when the device is over, near or passes the boundary of a window or application shown on a display, or when the device is over, near or passes a graphic item having a particular texture.
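The hover and boundary cases described above amount to a hit test of the pen's position against on-screen elements. Here's a hedged sketch of that logic in Python; the element rectangles, tolerance value and return labels are invented for illustration.

```python
# Illustrative position-based feedback: trigger a haptic label when
# the pen is over a selectable element, or near/crossing its boundary.

def feedback_for_position(x, y, elements, edge_tolerance=2.0):
    """Return a (feedback, element_name) pair for a pen position.

    `elements` maps names to (left, top, right, bottom) rectangles.
    """
    for name, (l, t, r, b) in elements.items():
        inside = l <= x <= r and t <= y <= b
        # Near the edge of an element: a boundary-crossing cue.
        if inside and min(x - l, r - x, y - t, b - y) <= edge_tolerance:
            return ("boundary", name)
        if inside:
            return ("hover", name)
    return (None, None)

ui = {"save_icon": (10, 10, 50, 40)}
print(feedback_for_position(30, 25, ui))  # → ('hover', 'save_icon')
print(feedback_for_position(11, 25, ui))  # → ('boundary', 'save_icon')
```

The same test could feed different haptic responses for windows, textured graphics, or buttons, per the patent's examples.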
The iPen to Work with iPhone, iPad & beyond
Some of the examples of touch-based user interface devices incorporating touch screen surfaces and future haptics include Apple's iPhone, iPad, iPod touch and beyond. The touch-based user interface device 103 senses various touch-based input gestures, such as swiping, tapping, scrolling, and so on, applied across the touch screen surface 105. The input sensors may include one or more capacitive sensors, optical sensors, acoustic sensors, force sensors, and so on. Touch-based input may be applied by moving the iPen.
For example, a tap onto the touch screen surface may be associated with a selection, while sliding the object along the touch screen surface in a particular manner may be associated with scrolling, enlarging, shrinking, and so on. In some embodiments, a combination of gestures by a finger or other object and the haptic input device may be interpreted together to provide a particular haptic feedback to a user through the iPen.
The iPen will Work with Apple's Own Graphics Program or Others
Another aspect of Apple's invention includes position-based feedback. The iPen may provide haptic, audible and/or visual feedback when the device passes over a selectable button or icon being displayed by the touch screen. For example, in one embodiment, the processing device of the touch-based user interface device 103 may run a graphics editing program that allows the user to create an image by moving the iPen across the touch screen to manipulate a cursor to draw or otherwise interact with graphical elements.
In this embodiment, the iPen may be configured to provide haptic/visual/audible feedback when the user selects the drawing cursor using the iPen. The graphics editing program may be similar to various commercial off-the-shelf programs, such as Autodesk, Inc.'s SketchBook, KikiPixel's Inspire Pro, Microsoft Corporation's MS Paint, and so on. Apple has provided a number of hints regarding a future graphics and/or paint program over the years (one, two) using the Magic Mouse. Extending its functionality to tablets via the evolution of the iPen would only be natural.
In other embodiments, the iPen may be configured to provide haptic, audible and/or visual feedback based on the amount of pressure applied by the iPen to the touch screen surface. In one embodiment, the haptic-sensitive tablet may provide haptic, audible and/or visual feedback when the pressure applied by the iPen exceeds a predetermined threshold. In other embodiments, the iPen may be configured to provide such feedback whenever the iPen and/or the touch-based tablet detects any pressure being applied to the surface, regardless of the amount.
With respect to one embodiment in which the touch-based tablet is running a graphics editing program, the iPen may allow the user to "draw" an image only if the touch-based tablet and/or the iPen determine that the user is applying sufficient pressure onto the touch screen via the iPen.
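The pressure-gated drawing described above can be sketched as a simple classifier over pressure samples. The threshold value and the normalized 0..1 pressure scale are assumptions made for this illustration.

```python
# Sketch of pressure-gated drawing: a stroke only registers as "draw"
# once pressure exceeds a threshold; lighter contact can still trigger
# feedback without painting. Threshold and units are invented here.

PRESSURE_THRESHOLD = 0.15  # assumed normalized scale, 0..1

def stroke_events(samples, threshold=PRESSURE_THRESHOLD):
    """Classify pressure samples into 'draw', 'contact', or 'none'."""
    events = []
    for p in samples:
        if p >= threshold:
            events.append("draw")
        elif p > 0:
            events.append("contact")  # touching, but too light to paint
        else:
            events.append("none")
    return events

print(stroke_events([0.0, 0.05, 0.2, 0.5]))
# → ['none', 'contact', 'draw', 'draw']
```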
With respect to embodiments in which the tablet is running a graphics editing program, the iPen may allow the user to "draw" an image only if it's positioned over a "paintable" portion of the touch screen surface. Certain embodiments may also provide haptic feedback that varies with a distance to a user interface element, such as a selectable icon and the like.
As one example of how an output may be adjusted, the width of the line created by the graphics editing program may be adjusted according to the tilt of the haptic input device 101 relative to the touch screen 105 to simulate writing with a calligraphy pen or painting with a paint brush. Additionally, the angle and/or thickness of the line may be adjusted according to the tilt of the haptic input device 101 relative to the touch screen 105, with a higher tilt corresponding to the creation of a more slanted, thicker or angled line, for example. (Alternative embodiments may vary the effect of the haptic input device's tilt angle on an output generated by the user input device.) Thus, a single haptic device may be used to create a line of varying thickness or depth of color in a single stroke, or another output that varies physically or temporally in response to changes in pressure, capacitance, angle and the like during a continuous input.
Apple notes that the iPen may have one or more orientation sensors, such as a multi-axis accelerometer, that may determine the axial orientation of the haptic device. Thus, the orientation sensor may detect when the iPen rotates and a line may be made thicker or thinner, darker or lighter in a graphics program by this action. As another example, rotating the iPen in one direction may increase an audio volume from an associated device, while rotating the haptic device in another direction may decrease the audio volume.
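To make the tilt and rotation mappings concrete, here's a hedged sketch of how such readings might map to stroke width and audio volume. The linear mappings, ranges and step sizes are assumptions for illustration, not Apple's method.

```python
# Illustrative mappings from orientation-sensor readings to outputs,
# per the patent's calligraphy-pen and volume-knob examples.

def line_width_from_tilt(tilt_deg, min_w=1.0, max_w=12.0):
    """More tilt → wider line, clamped to [min_w, max_w].

    tilt_deg: 0 (perpendicular to the screen) .. 90 (flat on it).
    """
    t = max(0.0, min(90.0, tilt_deg)) / 90.0
    return min_w + t * (max_w - min_w)

def volume_from_rotation(volume, rotation_deg, step_per_deg=0.005):
    """Rotating one way raises volume; the other way lowers it."""
    return max(0.0, min(1.0, volume + rotation_deg * step_per_deg))

print(line_width_from_tilt(0))    # → 1.0
print(line_width_from_tilt(45))   # → 6.5
print(volume_from_rotation(0.5, 20))   # clockwise twist raises volume
print(volume_from_rotation(0.5, -20))  # counter-clockwise lowers it
```

Because the mapping is continuous, a single stroke can vary in width as the user tilts the pen mid-stroke, matching the "single stroke, varying thickness" behavior described above.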
The iPen's Speaker
In another embodiment, the iPen may further include an optional audio transmitter, such as a speaker, that is communicatively coupled to the controller 113. The controller may transmit control commands to the speaker based on information received from the sensors 116 and/or the one or more transmitters on the touch-based tablet. The output of the speaker may vary based on the activity being simulated, as well as the user's manipulation of the iPen. For example, in one embodiment, the speaker may simulate the sound of moving a pen or a paintbrush across a piece of paper or a canvas, with the speaker emitting different sounds for emulating a pen or a paintbrush. In another embodiment, the volume of the speaker may be adjusted based on the amount of pressure being applied to the touch screen surface 105. For example, the volume may be gradually increased as the input device 101 applies more pressure to the touch screen surface 105. In other embodiments, the volume and/or sound may be adjusted according to the position of the input device 101 relative to the touch screen surface 105.
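The speaker behavior above combines two ideas: the emitted sound depends on the simulated tool, and volume scales with pen pressure. A short sketch, with invented sound names and an assumed linear scaling:

```python
# Hypothetical sketch of the iPen speaker logic: pick a sound for the
# simulated tool, then scale volume with applied pressure (0..1).

TOOL_SOUNDS = {"pen": "scratch.wav", "paintbrush": "brush.wav"}

def speaker_output(tool, pressure, base_volume=0.2):
    """Return (sound_file, volume) for the given tool and pressure."""
    sound = TOOL_SOUNDS.get(tool, "default.wav")
    volume = min(1.0, base_volume + 0.8 * pressure)
    return sound, volume

print(speaker_output("paintbrush", 0.5))  # harder press → louder brush sound
```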
The Basic iPen System
In Apple's patent FIG. 2 shown below we see one embodiment of a touch-based user input device that could be used in conjunction with the iPen system. As shown in FIG. 2, we see a processing device that may be any known processing device, including, but not limited to, a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a microcontroller, a graphics processing unit (GPU), software or firmware configured to execute instructions in memory to perform various processing functions, and so on and so forth.
In one embodiment, the storage device 162 may store operating system software that includes a set of instructions that are executable on the processing device. The operating system software may also provide a menu-based operating system that could be navigated by the user through a graphical user interface displayed or presented to the user on their tablet's touch screen.
The Tip of Apple's iPen
As shown in patent FIG. 3 below, the iPen may include one or more engagement portions or tips configured to contact (and to register contact on) the touch screen surface, a receiver 119, a decoder 112, a controller 113, one or more haptic actuators 114, one or more optional sensors 116 configured to sense various characteristics of the user's manipulation of the iPen, and an optional transmitter 118.
As shown in FIG. 3, the iPen may also include a power source 115 configured to supply power to the controller and/or the haptic actuator. The power source may be a battery or some other type of power supply, such as a power adapter, an electromechanical system such as a generator or an alternator, a solar power cell, and so on and so forth.
In one embodiment, the tip may be formed from a conductive material, such as metal, or from a non-metallic conductive material, such as graphite, various salts, plasmas, and so on. In other embodiments, the tip may be formed from a nonconductive material. The tip may include a portion configured to contact the touch screen surface which may be pointed or blunt. In another embodiment, the tip may be configured as a ball that is configured to roll along the touch screen surface so that different portions of the tip may contact the touch screen surface.
The tip may be communicatively coupled to a receiver 119. The receiver may be any type of wireless or wired receiver that is configured to receive signals from the one or more transmitters 107 of the touch-based tablet. The wireless signals may be transmitted using any type of wireless transmission medium, including, but not limited to, Wi-Fi, Bluetooth, IR, RF, and so on and so forth.
Simple Overview of One iPen Operation Scenario
Apple's patent FIG. 7 is a schematic diagram of one possible operation of the iPen system. As shown in FIG. 7, the touch-based tablet may run an operating system supporting one or more software or firmware applications. For example, an active application may be a game, a graphics editing program, a word processing program, and so on.
Haptics for Multiplayer Gaming
And lastly, Apple points to a gaming scenario. Apple states that in one embodiment it may permit communication between multiple haptic input devices and/or multiple user input devices. For example, four people may each have their own haptic input device and touch screen/user input device. Each user action with his or her user input device may cause one or more of the other persons' haptic input devices to produce a haptic output. Such embodiments may also be useful in multi-person activities such as gaming.
Apple's patent application was originally filed in Q4 2010 by inventors Aleksandar Pance and Omar Leung and published by the US Patent and Trademark Office today.
Also see a Secondary Report on this Topic Titled: "Apple Points to Writing on Future iPad with Optical iPen"
Sites Covering our Original Report
MacSurfer, Twitter, Facebook, Apple Investor News, Google Reader, Macnews, Graphics Software Updates, iGeneration France, Now News China, iPhone World Canada, Vision2Mobile, Ameblo Japan, Techmeme, MacDailyNews, MacTechNews Germany, Wall St. Cheat Sheet, CNET, Business Insider, CNET Update Video, TechOrange China, Actualidad Spanish, Mobilsiden Denmark, CNET France, phones review UK, CNET Japan, MacWorld UK, Gizmodo UK, Yahoo Taiwan, CG Channel, CNET UK, Techline Hungary, Cult of Mac, Apple Caffé Italy, PCWorld, MyApp Taiwan, Applesfera Spanish, Italiamac Italy, SmartOffice Australia, T3 UK, Inforbae Spanish, Informaticien Belgium, SmartHouse Australia, The iPad Fan, and more.
Join in the Conversation!
The sites that we link to above offer you an avenue to make your comments about this report in other languages. These great community sites also provide our guests with varying takes on Apple's latest invention. Whether they're pro or con, you may find them to be interesting, fun or feisty. If you have the time, join in!