
Apple Points to Writing on Future iPad with Optical iPen

1. Apple Points to Writing on Future iPad with Optical Pen
On May 24, 2012, the US Patent & Trademark Office published two major patent applications from Apple relating to a future iPen. In our first report published this morning titled "Apple Sheds More Light on their iPen & Graphics Program," we covered Apple's push into specialized haptics for a future iPen. In our second report, we focus on Apple's consideration of using an optical based iPen. The unique angle taken by Apple's optical pen is a fascinating approach to determine a pen's location on a tablet surface. One of the secrets utilized in this approach uses invisible indicia.  

 

Apple Invents an Optical Stylus

 

Apple's patent relates to a stylus that is provided with an optical sensor, such as a camera, that is used in determining a location and movement of the stylus relative to a touch screen display of a computing device and/or tablet. The optical stylus may be configured to transmit the location and movement to the computing device. In some embodiments, the optical stylus may be configured to process and/or filter the location and movement information prior to transmission, whereas in other embodiments, raw data may be transmitted.
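
To picture the two reporting modes Apple describes, here's a minimal, purely illustrative Swift sketch of a stylus report that is either processed on the pen or forwarded as raw data; the type and field names are our own assumptions, not anything taken from Apple's filing.

```swift
import Foundation

// Hypothetical sketch of the two reporting modes the filing describes: the
// stylus may send a fully processed location/movement report, or it may
// forward raw sensor data for the host to interpret.
enum StylusReport {
    case processed(x: Double, y: Double, dx: Double, dy: Double) // filtered on the pen
    case raw(frame: [UInt8])                                     // unprocessed image data
}

// A stand-in for whatever wireless link (RF, IR, etc.) carries the report.
protocol StylusLink {
    func send(_ report: StylusReport)
}

struct LoggingLink: StylusLink {
    func send(_ report: StylusReport) {
        switch report {
        case let .processed(x, y, dx, dy):
            print("position (\(x), \(y)), movement (\(dx), \(dy))")
        case let .raw(frame):
            print("raw frame, \(frame.count) bytes")
        }
    }
}

LoggingLink().send(.processed(x: 102.4, y: 58.7, dx: 0.3, dy: -0.1))
```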

 

On-Screen iPen Positioning via Invisible Indicia

 

In some embodiments, the relative position of the optical stylus may be determined based on indicia detectable by the optical stylus. The indicia may further be used in determining the movement of the optical stylus. The indicia may include pixel dependent indicia that are communicated via the pixels displayed by the touch screen or physical or permanent indicia that are physically present on or in the screen or otherwise positioned such that the optical stylus may detect them. Generally, the indicia are imperceptible to the human eye. As such, the touch screen may be encoded without diminishing or otherwise interfering with images displayed on the touch screen.
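
As one illustration of what pixel-dependent indicia could look like in practice, the sketch below hides a code in the least significant bit of a colour channel, a change far too small for the eye to notice. This is merely a conceivable encoding of our own, not the scheme Apple describes.

```swift
import Foundation

// One conceivable way to hide pixel-dependent indicia: nudge the least
// significant bit of a colour channel so a camera can read a position code
// the eye cannot see. Purely an illustration of the idea.
func embedIndicia(pixels: [UInt8], codeBits: [Bool]) -> [UInt8] {
    var out = pixels
    for (i, bit) in codeBits.enumerated() where i < out.count {
        out[i] = (out[i] & 0b1111_1110) | (bit ? 1 : 0) // shift of one level: imperceptible
    }
    return out
}

let encoded = embedIndicia(pixels: [200, 200, 201, 199], codeBits: [true, false, true, true])
print(encoded) // [201, 200, 201, 199]
```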

 

Writing on an iPad

 

Finally, Apple presents a simple case for an optical stylus that allows for position-sensed writing on a touch screen display. The contact of the optical stylus with the touch screen may be differentiated from input from a hand or finger. The optical stylus works in addition to capacitive sensing, may provide sub-millimeter scale resolution and, in some embodiments, may have a pointed tip to provide precision input. Additionally, the stylus operates independently of the orientation or rotation of the stylus and/or screen. In some embodiments, the stylus may be pressure sensitive and may communicate data to a host device wirelessly.

 

Apple states that the optical stylus is provided with an optical sensor or camera that may determine a relative location, angle and/or movement of the stylus with respect to a display of a computing device. The optical stylus may be configured to transmit the location, angle and movement to the computing device.

 

An Overview of the Optical iPen

 

In patent FIG. 1 below, we're able to see one end cover 14, which is configured to allow for communicative transmissions to a host device, while the other end cover 16 is configured to contact the host device, such as a tablet.

 

The body of the optical iPen could be made of a suitable material such as a metal, alloy, plastic or composite. Specifically, end cover 14 is configured to allow for communicative transmissions and, in some embodiments, may be made of a radio frequency (RF) transparent material, such as a plastic. In other embodiments, end cover 14 could be configured as a signal diffuser to diffuse an infrared (IR) signal, for example. End cover 16 may be made of a plastic or another material that will generally not be likely to scratch or damage a display screen of the tablet.

 

The iPen's Camera

 

Apple's patent FIG. 2 illustrates an example cross-sectional view of the stylus 10 of FIG. 1. As shown in this example, a camera 18 may be positioned within the body 12 of the stylus. The camera may be a suitable light sensor for the purposes described in the invention. In some embodiments, the camera includes a charge-coupled device (CCD) sensor. Additionally, in some embodiments, a light source 20 may be provided and include one or more light emitting diodes. In some embodiments, the light source may emit in a non-visible portion of the electromagnetic spectrum, such as the IR spectrum.

 

2. Overview of the Optical iPen System

 

In some embodiments, a pressure sensor 22 may be located under, adjacent to or coupled with the end cover 16.

 

In some embodiments, the pressure sensor may serve as a trigger for the capture, processing and/or transmission of location indicative images by the camera of the stylus. That is, the camera in the iPen may take images of the pattern(s) on the touch screen in response to the pressure sensor registering an indication of pressure being applied to the tip of the iPen.
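
Read as a sequence, that trigger behaviour is easy to picture in code. Here's a minimal sketch, assuming a hypothetical pressure threshold and camera/link API of our own naming rather than anything from the filing.

```swift
import Foundation

// Minimal sketch of the tip-pressure trigger described above, with assumed
// names and values: when pressure registers at the tip, capture a
// location-indicative image and hand it off for transmission.
struct PressureSensor { var reading: Double }           // 0.0 ... 1.0
struct Camera { func captureFrame() -> [UInt8] { [0x2A] } }

func tick(sensor: PressureSensor, camera: Camera, send: ([UInt8]) -> Void) {
    let contactThreshold = 0.05 // assumed value; the filing gives none
    guard sensor.reading > contactThreshold else { return } // tip not pressed: stay idle
    send(camera.captureFrame()) // tip pressed: capture and transmit
}

tick(sensor: PressureSensor(reading: 0.2), camera: Camera()) { frame in
    print("sending \(frame.count)-byte frame to host")
}
```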

 

3. Another block diagram of the Optical iPen System

 

Apple's Fascinating Methodology

 

How the iPen may actually function with an iPad is simply fascinating. According to Apple, a tablet like the iPad may have a touch sensitive display that provides a visual output to a user and may receive input from the user. Generally, the touch sensitive display doesn't detect contact by the iPen. In particular, contact by the iPen on the surface of the display may not register as an input to the iPad. Rather, the iPen may provide input to the tablet via a wireless communication channel, such as an RF channel, an IR channel, or the like.
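
From the host's side, the split described above might look something like the following sketch, where finger contact arrives through the capacitive touch sensor while pen input arrives as decoded packets from a wireless channel; the event types and names here are our own illustration, not Apple's.

```swift
import Foundation

// Host-side sketch of the split described above: finger contact comes from
// the touch sensor, while the iPen's input comes over a wireless channel
// (RF, IR, or similar). The event types are hypothetical.
enum HostInput {
    case touch(x: Double, y: Double)          // capacitive sensing: hand or finger
    case stylusPacket(x: Double, y: Double)   // decoded from the pen's wireless report
}

func handle(_ input: HostInput) {
    switch input {
    case let .touch(x, y):        print("finger input at (\(x), \(y))")
    case let .stylusPacket(x, y): print("pen input at (\(x), \(y))")
    }
}

handle(.stylusPacket(x: 12.5, y: 40.0))
```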

 

As mentioned earlier in the report, patterns encoded on the surface of the iPad's display, or generated by the display itself, may be used to determine the location of any contact the iPen makes with the display. In particular, the iPen may capture images of the encoded patterns and either determine its relative location on the display itself or transmit the image data to the tablet so that the device may determine the location of the stylus contact.

 

A pattern, design and/or code may be provided that isn't visible to the human eye but may be captured by the camera of the iPen to determine the location of the iPen relative to the tablet's display. In some embodiments, the pattern may take the form of a QR code, a 2D code, a bar code and/or one or more other suitable encoding formats (see patent FIG. 6 below).
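
To give a flavour of how such a code could carry location, here's an illustrative Swift decoder that reads a 32-bit pattern as a pair of 16-bit screen coordinates; the layout is an assumption made for this example only, as the filing doesn't spell out the actual format.

```swift
import Foundation

// Illustrative decoder: the camera sees a small 2D code whose bits spell out
// the absolute screen coordinates of that spot. The 16-bits-per-axis layout
// is an assumption, not the patent's format.
func decodePositionCode(bits: [Bool]) -> (x: Int, y: Int)? {
    guard bits.count == 32 else { return nil }
    let x = bits[0..<16].reduce(0) { ($0 << 1) | ($1 ? 1 : 0) }  // first 16 bits, MSB first
    let y = bits[16..<32].reduce(0) { ($0 << 1) | ($1 ? 1 : 0) } // last 16 bits, MSB first
    return (x, y)
}

// Example: x = 3, y = 5 encoded most-significant bit first.
var bits = [Bool](repeating: false, count: 32)
bits[14] = true; bits[15] = true        // x = 0b11  = 3
bits[29] = true; bits[31] = true        // y = 0b101 = 5
if let p = decodePositionCode(bits: bits) {
    print("decoded position: (\(p.x), \(p.y))") // (3, 5)
}
```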

 

4. Encoded Patterns & the iPen's Ability to Emit a Monochrome light

 

In some embodiments, the pattern may be physically present in a layer positioned on or over the display. For example, the pattern may be created by lasers making dots or surface blemishes below a certain size that cannot be seen (e.g., approximately five microns in diameter). In some embodiments, an acid, chemical etch, or chemical deposition may be used to create the pattern.

 

In still other embodiments, the pattern may be created in a wavelength of light imperceptible to humans, with the camera in the iPen configured to capture light in a range that includes that wavelength. For example, the patterns may be printed with IR ink on glass. In some embodiments, the pattern may be created using a micropolarizer, with the camera having a polarizing filter to detect the pattern. In yet other embodiments, the polarization may be achieved using a patterned birefringent indium tin oxide (ITO) layer. In some embodiments, near field optics may be implemented to encode the display surface. In particular, small focal length optics near the pixels may be used.

 

Apple's patent FIG. 9 above illustrates a side-view of the optical iPen of FIG. 1 with the host computing system of FIG. 4. And finally, in patent FIG. 10, Apple illustrates that an iPen could emit a monochromatic light directed at the surface of a tablet having a birefringent ITO layer that reflects light back toward the stylus. The reflected light is refracted so that it has a different wavelength than the monochromatic light that was transmitted. The wavelengths of light captured by the iPen may be used to determine the distance of the stylus from the surface of the tablet and/or the angle of the iPen relative to the tablet.
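
The filing doesn't give the mapping from reflected wavelength to distance, but one plausible, purely hypothetical way to use such a signal is a pre-measured calibration table, as in the sketch below; every number in it is an invented placeholder.

```swift
import Foundation

// Heavily hedged sketch: assume a pre-measured calibration table relating
// the reflected wavelength to the pen tip's height above the glass, then
// linearly interpolate between its entries. All values are placeholders.
let calibration: [(wavelengthNm: Double, distanceMm: Double)] = [
    (850.0, 0.0), (855.0, 1.0), (860.0, 2.0), (865.0, 4.0)
]

func estimateDistance(wavelengthNm: Double) -> Double? {
    let table = calibration.sorted { $0.wavelengthNm < $1.wavelengthNm }
    guard let first = table.first, let last = table.last,
          wavelengthNm >= first.wavelengthNm, wavelengthNm <= last.wavelengthNm else { return nil }
    for (a, b) in zip(table, table.dropFirst()) where wavelengthNm <= b.wavelengthNm {
        let t = (wavelengthNm - a.wavelengthNm) / (b.wavelengthNm - a.wavelengthNm)
        return a.distanceMm + t * (b.distanceMm - a.distanceMm)
    }
    return nil
}

print(estimateDistance(wavelengthNm: 857.5) ?? -1) // 1.5 mm under the assumed table
```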

 

Patent Credits

 

Apple's patent application was originally filed in Q4 2010 by inventors David Amm, David Simon and Omar Leung and published today by the US Patent and Trademark Office.

 


 


 


 
