Apple Granted a Patent for Samsung-Like Beam eDoc Transfers
On April 9, 2013, the US Patent & Trademark Office published a newly granted patent from Apple that reveals a feature that has yet to come to market. As such, Patently Apple will present this granted patent in the form of a patent report that goes deeper into the technology presented. Apple has been razzed for a while now for lacking an iPhone technology similar to Samsung's S Beam, which lets users share large files from one smartphone to another using NFC. Apple's newly granted patent provides such interactions and much more. Apple's solution works between a Mac and an iPhone and provides editing software to crop, scale and adjust images. Apple had the technology mapped out in January 2010, roughly 30 months prior to Samsung's public release of this feature on the Galaxy SIII.
Apple's Patent Background
Powered by recent advances in digital media technology, there is a rapid increase in the variety of ways of interacting with digital media content, such as images (e.g., photos), text, audio items (e.g., audio files, including music or songs), or videos (e.g., movies). In the past, consumers were constrained to interacting with digital media content at their desktop or in the living room of their home. Today, portability lets people enjoy digital media content at any time and in any place. Further, today e-mail may be sent from anywhere, to anywhere.
While such portability of media content provides some advantages, some challenges still remain. Interacting with portable media content may be limited, may require too many manual steps, or may not always be intuitive, easy or convenient for users.
Thus, there is a need for improved techniques for interacting with media content.
Insertion or Importation of Media Content into Electronic Documents
Apple's invention pertains to improved techniques for interacting with one or more "handheld carriers," which Apple later identifies as iDevices such as an iPhone or iPad.
The media activity provided by the computing device may involve creating or editing an electronic document. The integration of the media content into operation of the media activity may involve insertion or importation of the media content into the electronic document.
The media content can be digital media content, such as images (e.g., photos), text, audio items (e.g., audio files, including music or songs), or videos (e.g., movies).
Technically speaking, the invention can be implemented in various different embodiments. According to one embodiment, an electronic image resident on a portable electronic device can be transferred and inserted into an existing electronic document in use on a computing device. According to another embodiment, a hand-drawn diagram can be inserted into an existing electronic document in use on a computing device.
Apple's patent FIG. 1 shown below is a block diagram of a system (#100) for interacting with an iDevice (#110) hosting media content. The system can also include a non-handheld base unit (#130) such as an iMac or MacBook that has one or more sensors (#150). The non-handheld base (or Mac) unit can interact with the iDevice, typically in a wireless manner. The Mac can include a media application framework (#160). The media application framework may be employed for providing media application services and/or functionality to a plurality of media activities (#122). The media application framework may control the plurality of media activities.
A user interface (#120) may be provided by the iDevice for controlling the operation of one or more of a plurality of media activities. In various different media activities, using the user interface (#120), a user may experience and manipulate media content in various different ways, or may experience and manipulate media content of different types or combinations thereof.
As shown in FIG. 1 below, the media application framework may be coupled with the one or more sensors for integrating at least the portion of the media content (#112) into the operation of at least one of the plurality of media activities, upon recognizing the media content and at least one of the media activities. The media application framework may also include or be coupled to control logic (#140). The control logic can be coupled to the one or more sensors. Further, the control logic may be coupled with the one or more sensors for media content recognition (using a media content recognition component #142), media activity recognition (using a media activity recognition component #144) or media content integration (using a media content integration component #146). In particular, the control logic may be coupled with the one or more sensors for integrating at least a portion of the media content into the operation of the media activity, upon recognizing the media activity and the media content.
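To make the FIG. 1 arrangement concrete, here is a minimal sketch of how the control logic (#140) might coordinate its three components: media content recognition (#142), media activity recognition (#144) and media content integration (#146). The class and parameter names are our own illustrative choices, not Apple's; the patent only describes the components and the rule that integration happens once both the content and the activity are recognized.

```python
class ControlLogic:
    """Sketch of control logic (#140): three pluggable components, wired to
    sensor events. All names here are illustrative, not from the patent."""

    def __init__(self, content_recognizer, activity_recognizer, integrator):
        self.recognize_content = content_recognizer    # component #142
        self.recognize_activity = activity_recognizer  # component #144
        self.integrate = integrator                    # component #146

    def on_sensor_event(self, sensor_data):
        # Integration occurs only upon recognizing BOTH the media content
        # and the media activity, as the patent describes.
        content = self.recognize_content(sensor_data)
        activity = self.recognize_activity(sensor_data)
        if content is not None and activity is not None:
            return self.integrate(content, activity)
        return None
```

With stub recognizers, `on_sensor_event` returns the integrated result only when a sensor reading yields both a recognized content item and a recognized activity; otherwise nothing happens.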
About the New Sensors
Apple's patent FIG. 2 noted above illustrates a block diagram of several examples of sensors, which may comprise one or more of a proximity sensor (#202), near field communication electronics (#204), short range wireless electronics (#206, Bluetooth), or an optical device (#212).
The sensors may comprise a software sensor (#216) for sensing the media content of the iDevice that may have an active window that highlights particular media content (e.g., a photograph from a photograph library, or a photograph that was taken by a camera or camera functionality of the iDevice). One or more of the software sensors may sense particular or highlighted media content of the iDevice, or may sense an active window having media content.
Further, the sensors may comprise a software sensor for sensing the media activity, or an active display window of the media activity. For example, an editing software sensor may be employed for sensing a media editing activity. An editing software sensor may be for sensing electronic document editing activity, or for sensing where the media content is to be inserted in an electronic document.
The Associated eDocument Editing User Interface
Apple's patent FIG. 8 shown below is an exemplary screenshot display of a user interface controlling a media activity. The user interface can comprise an electronic document editing interface for controlling operation of an electronic document editing activity. For example, an electronic document (e.g., email message) may be edited as shown in FIG. 8.
Media content, for example, a hand-drawn picture, may be inserted into the electronic document at the desired location. A mouse click (or suitable selection method) may be employed by a user, and a selectable menu option, for example, "Insert Hand-Drawn Pic", of a pop-up menu may be displayed adjacent to the desired location. Such menu selections, as well as other machine state aspects of the media activity may be sensed by one or more software sensors.
In response to one or more software sensors and the media activity, the media activity recognition component of the control logic may perceive the media activity (or may perceive an active display window providing the media activity). In particular, in response to an editing software sensor and the media activity, the media activity recognition component of the control logic may perceive a media editing activity (or may perceive an electronic document editing activity, and/or may perceive where the media content is to be inserted into an electronic document).
In perceiving and/or recognizing the media activity, the media activity recognition component (#144) may (i) receive sensor data from one or more of the software sensors; (ii) compare such sensor data to stored data; and (iii) match such sensor data to particular stored data, which is indicative of a particular media activity.
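The three-step receive/compare/match process can be sketched as a simple lookup against stored activity signatures. This is our own minimal reading of the patent's description, with hypothetical signature keys (e.g., a sensed menu selection like "Insert Hand-Drawn Pic" from FIG. 8); the patent does not specify a data format.

```python
def recognize_activity(sensor_data, stored_signatures):
    """Steps (i)-(iii): sensor_data has been received from the software
    sensors; compare it against each stored signature; a full match is
    indicative of the corresponding media activity."""
    for activity, signature in stored_signatures.items():
        # Match when every key/value in the stored signature appears
        # in the sensed machine state (dict views support subset tests).
        if signature.items() <= sensor_data.items():
            return activity
    return None
```

For example, a sensed "Insert Hand-Drawn Pic" menu selection would match a stored signature for an email-editing activity, while unrelated sensor data would match nothing.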
Apple's patent FIG. 9 noted above illustrates a simplified diagram of sensing an iDevice hosting media content with an iMac (#930). Remember, this isn't a design patent, so don't get caught up in the iMac's design in this illustration as it's only there as a base illustration to convey a visual of a desktop.
The iMac can also include at least one sensor (#950, illustrated above as the iSight camera) that may sense the media content shown on an approaching iDevice. The media content recognition component will take the image and transfer it to the email as illustrated above.
Elsewhere in the patent filing, Apple notes that the user will have access to cropping tools so that an image can be straightened in the eDocument. Further, the media recognition component may include handwriting recognition software for recognizing handwritten text in the media content that is transferred.
Apple's patent FIG. 10 noted above is an exemplary screenshot display illustrating integration of at least a portion of media content into operation of a media activity, upon recognizing the media activity and the media content.
Apple notes that Graphical user interface controls for adjusting contrast or scale (such as a "contrast bar" or "scale bar", not shown in FIG. 10) may be used to adjust visual contrast or scale of the portion of the media content shown in FIG. 10, for more clearly perceiving the media content and/or for scaling the media content to the media activity.
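As a rough illustration of what a "scale bar" ultimately computes, a uniform scale factor can fit the transferred image into the media activity's target region without distorting it. This is a generic sketch, assuming a fit-within-bounds behavior the patent does not spell out; a contrast bar would map pixel values analogously.

```python
def scale_to_fit(width, height, max_width, max_height):
    """Compute a uniform scale factor so an image of (width, height) fits
    within (max_width, max_height), preserving aspect ratio and never
    enlarging beyond the original size. Names are illustrative."""
    factor = min(max_width / width, max_height / height, 1.0)
    return round(width * factor), round(height * factor)
```

A 2000x1000 photo dropped into a 500x500 region would be scaled uniformly to 500x250 rather than squashed to 500x500.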
Haptic Notification
And lastly, the new user interface may also comprise a haptic notification for notifying the user of the iDevice. Once the iMac has obtained the intended graphic that the user wants in their eDocument, it will send a message to the iDevice to initiate a haptic sensation to let the user know that the transfer has taken place.
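The round trip described above, where the Mac confirms the transfer and the iDevice fires a haptic, could be sketched as a tiny acknowledgement message. The message format, field names and "tap" pattern here are entirely hypothetical; the patent describes only the notification behavior, not a wire protocol.

```python
import json

def haptic_ack(transfer_id):
    """Hypothetical acknowledgement the Mac could send to the iDevice once
    the graphic has been inserted into the eDocument."""
    return json.dumps({"type": "haptic_ack", "transfer": transfer_id, "pattern": "tap"})

def handle_message(raw, trigger_haptic):
    """iDevice-side handler: on a haptic acknowledgement, fire the haptic
    so the user knows the transfer has taken place."""
    msg = json.loads(raw)
    if msg.get("type") == "haptic_ack":
        trigger_haptic(msg.get("pattern", "tap"))
        return True
    return False
```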
Apple credits Douglas Weber as the sole inventor of this granted patent, which was issued today by the US Patent and Trademark Office. Considering that the patent was originally filed in January 2010, the application would have been made public in early to mid-2011. Translation: it would appear that Samsung may have "quickly" copied Apple's idea and brought it to market faster. So on paper at least, this would still appear to be a typical Samsung copycat move to me. How about you?
A Note for Tech Sites Covering our Report: We ask tech sites covering our report to kindly limit the use of our graphics to one image. Thanking you in advance for your cooperation.
Patently Apple presents a detailed summary of patent applications with associated graphics for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trade Office. Readers are cautioned that the full text of any patent application should be read in its entirety for full and accurate details. Revelations found in patent applications shouldn't be interpreted as rumor or fast-tracked according to rumor timetables. About Comments: Patently Apple reserves the right to post, dismiss or edit comments.
Spirol, the S3 was introduced in May 2012 not 2011. Check out the Wikipedia link below.
http://en.wikipedia.org/wiki/Samsung_Galaxy_S_III
Posted by: Jack Purcher | April 10, 2013 at 12:47 PM
That doesn't make any sense no matter how convoluted the logic you present. Given that the GS3 was revealed on 3 May 2011 and it would have taken months to develop, test and implement S-Beam on the GS3 prior to the reveal on May - it would be impossible for samsung to have copied Apple.
Posted by: spirol | April 09, 2013 at 10:43 PM