On October 13, 2011, the US Patent & Trademark Office published two patent applications from Apple that reveal a very detailed overview of an advanced avatar editing app. Creating and using avatars has been a growing trend emanating from Apple's engineering labs. The year began with Apple introducing us to the "Personalized Shopping Avatar" concept; Apple has since followed through with an avatar creation tool for tablets in July and even mentioned avatars being used in a mobile clubbing app just last week. In the bigger picture, Apple envisions the day when avatars and related environments will be so rich that we'll be able to shop online in an avatar-centric Apple Store environment. When you consider the power of Apple's new Siri, an online Apple rep with Siri's brain would seem very natural indeed. But for today, it's all about advanced creation and editing tools for avatars. In the future, Apple envisions avatars working with a wide variety of apps, including but not limited to address books, chat sessions, video conferencing, email, games or any other application that could support an animated avatar. Apple is even working on a few humorous animated avatars that could quickly replace antiquated emoticons. Apple obviously sees avatars playing an expanding role in our online experiences, even if most of us just don't see it yet.
Using Avatars: Basic Overview
An avatar is a representation of a user or their alter ego. An avatar is often in the form of a three-dimensional (3D) model used in computer games or a two-dimensional (2D) icon or picture used on Internet forums, social networks and other communities. Avatars could also be used in video games, including online interactive gaming environments.
Avatars in video games are the player's physical representation in the game world. Online games often provide means for creating varied and sophisticated avatars. In some online games, players could construct a customized avatar by selecting from a number of preset facial structures, hairstyles, skin tones, clothes, accessories, etc. (collectively referred to as "elements"). Once the preset elements are selected, however, there is typically no facility that allows users to manually adjust those elements (e.g., resize or reposition them).
Overview of Apple's Proposed Avatar Editing System
Apple's patent is about an avatar editing environment that will allow users to create custom avatars for use in online games and other applications. Starting with a blank face, the user could add, rescale and position different elements on that face, including but not limited to different eyes, ears, mouth (including teeth and smile), nose, eyebrows, hair, beard, moustache, glasses, earrings, hats, and other elements that are associated with physical characteristics of humans and fashion. The user could also change the shape of the avatar's face, the avatar's skin color and the color of all the elements.
In some implementations, touch input and gestures could be used to edit the avatar. Various controls could be used to create the avatar, such as controls for resizing, rotating, positioning, etc. The user could choose between manual and automatic avatar creation.
In some implementations, the avatar editing environment could be part of a framework that is available to applications, such as address books, text messaging, chat sessions, e-mail, games or any other applications.
In some implementations, one or more elements of the avatar could be animated. For example, the avatar's eyes could be animated to track an object in a user interface or to indicate direction. In some implementations avatar data could be stored on a network so that the avatar could be used in online applications or downloaded to a variety of user devices at different user locations.
In some implementations, a computer implemented method includes: presenting an avatar editing environment on a display of a device; displaying a three-dimensional (3D) avatar model in the avatar editing environment; receiving a first input selecting an avatar element category; receiving a second input selecting an avatar element from the selected category; rendering the selected avatar element on the 3D avatar model; and receiving a third input for manually editing the avatar element.
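The claimed three-input sequence can be sketched as a minimal state machine. This is purely illustrative: the class, category names and element names below are hypothetical, not Apple's actual API or the patent's claim language.

```python
# Hypothetical sketch of the claimed editing flow: select a category,
# select an element from it, then manually edit that element.

class AvatarEditor:
    CATEGORIES = {"eyes": ["round", "narrow"], "hat": ["cap", "beanie"]}

    def __init__(self):
        self.model = {}              # category -> chosen element state
        self.selected_category = None

    def select_category(self, category):               # first input
        assert category in self.CATEGORIES
        self.selected_category = category

    def select_element(self, element):                 # second input
        assert element in self.CATEGORIES[self.selected_category]
        self.model[self.selected_category] = {
            "name": element, "scale": 1.0, "position": (0.0, 0.0)}

    def edit_element(self, scale=None, position=None):  # third input
        element = self.model[self.selected_category]
        if scale is not None:
            element["scale"] = scale
        if position is not None:
            element["position"] = position

editor = AvatarEditor()
editor.select_category("eyes")
editor.select_element("round")
editor.edit_element(scale=1.2, position=(0.0, 0.1))
print(editor.model["eyes"]["scale"])  # 1.2
```

The rendering step onto the 3D model is omitted; the sketch only shows how the three inputs build up the avatar's element state.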
Exemplary Avatar Editing Environment
Apple's patent FIGS. 1A through 1E shown below illustrate an exemplary avatar editing environment for creating custom avatars. A user will access the editing environment by pressing an associated icon on a user's iOS device (and/or Mac via mouse). In some implementations, the avatar editing environment could be presented in a web page displayed in a browser of the portable device.
Apple's editing environment as shown above will provide a series of category pickers to assist users in setting up an avatar's nose, mouth, eyes, hair and accessories like a hat.
Exemplary Avatar Element & Color Picker Menus
Apple's avatar editing environment will provide users with Element and Color Picker menus. In the example above the Element menu is illustrating a selection of hats and the ability to choose and change colors in a secondary menu. The same applies to hair styles and colors and over time additional menus could cover such things as facial hair options for men or clothing options or accessories for women.
Manual Option for Editing Avatars
Apple's patent FIGS. 3A to 3C illustrate exemplary processes for manually editing avatar elements. After the user has created a custom avatar by selecting and adding elements, the user could manually edit those elements in user interface 104. In the contrast between FIGS. 3A and 3B you see that the user will be able to edit the distance between the avatar's eyes, for example.
As illustrated in patent FIG. 3C there will be a zoom control in the form of a magnifying glass tool.
Apple's patent FIGS. 5A to 5C shown below illustrate editing regions for editing avatar elements in the avatar editing environment. In some implementations, manual edits made by a user to an element could be restricted to defined editing regions. Using touch input or gestures, for example, the user could resize, stretch or move elements within the editing region.
In some implementations, if the user resizes, stretches or moves an element out of the editing region, the element will "snap back" to the editing region. Alternatively, the element could bounce off a virtual wall or bumper defining the boundary of the editing region when the user attempts to resize, stretch or move the element outside the editing region.
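The "snap back" behavior described above is essentially a clamp: after a drag or resize, the element's bounds are forced back inside its editing region. Here is a minimal sketch of that idea, with rectangles represented as simple tuples; the function name and geometry model are assumptions for illustration only.

```python
# Hypothetical sketch of "snap back": clamp an element's rectangle so it
# lies entirely within its editing region after a drag or resize.

def snap_back(element, region):
    """element and region are (x, y, width, height) rectangles."""
    ex, ey, ew, eh = element
    rx, ry, rw, rh = region
    # Shrink the element if it was stretched larger than the region.
    ew, eh = min(ew, rw), min(eh, rh)
    # Clamp the position so the element sits fully inside the region.
    ex = max(rx, min(ex, rx + rw - ew))
    ey = max(ry, min(ey, ry + rh - eh))
    return (ex, ey, ew, eh)

# Dragging an element past the right edge snaps it back to the boundary.
print(snap_back((120, 40, 30, 20), region=(0, 0, 100, 100)))
# (70, 40, 30, 20)
```

The "bounce off a virtual wall" alternative would apply the same boundary test but animate a small rebound instead of an instant clamp.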
Apple's patent FIGS. 4A and 4B illustrate an alternative element picker for selecting avatar elements from a category of avatar elements. For instance, when a user wants to edit a particular element of the avatar, the user will select a corresponding zone containing the element. On an iOS device, the user will be able to touch any portion of the zone to activate the zone. In the example shown, the user has activated zone 400 containing the hat. Upon activation, buttons 408a, 408b could be displayed for selecting different hats. When a left or right button 408 is touched, a new hat slides in from the left or right of the display, respectively. The idea, as illustrated below, carries through to choosing your Avatar's eyes, nose, mouth, chin and beyond.
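The left/right buttons in this zone-based picker amount to cycling through a category's elements with wrap-around. A minimal sketch of that behavior follows; the class and the sample hat names are hypothetical stand-ins.

```python
# Hypothetical sketch of the zone picker: left/right buttons cycle
# through a category's elements, wrapping at either end of the list.

class ZonePicker:
    def __init__(self, elements):
        self.elements = elements
        self.index = 0

    @property
    def current(self):
        return self.elements[self.index]

    def right(self):  # e.g., button 408b: next element slides in
        self.index = (self.index + 1) % len(self.elements)
        return self.current

    def left(self):   # e.g., button 408a: previous element slides in
        self.index = (self.index - 1) % len(self.elements)
        return self.current

hats = ZonePicker(["top hat", "cap", "beanie", "fedora"])
print(hats.right())  # cap
print(hats.left())   # top hat
print(hats.left())   # fedora (wraps around)
```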
Many of you may have seen this kind of identity-compositing approach on a few TV cop shows, where the victim of a crime works with a sketch artist who uses an app with choices like the ones you see below.
Entertaining Animating Avatar Elements
Apple's patent FIGS. 6A and 6B, as shown above, illustrate animating avatar elements to track objects in a user interface. In some implementations, elements added to an avatar could be animated. For example, elements (e.g., eyes, mouths, ears, eyebrows) could be animated to simulate human facial expressions, such as happiness, sadness, anger, surprise, boredom, contemplation or any other human facial expression. Animations could also be applied to avatar body parts (e.g., legs, arms, head) to allow the avatar to express itself through full body movements (e.g., a dancing avatar).
In some implementations, animations for elements could be selected and previewed in the avatar editing environment. In some implementations, the user could select (e.g., select from a menu) a particular animation for a particular element. In other implementations, the user could set the animations to trigger in response to various trigger events.
Some examples of trigger events could be user actions or context. In an email or text messaging application, if the user is waiting for a response from another user, their avatar could be animated to appear to be waiting or sleeping. For example, the avatar's eyes could be closed and the chest animated to contract and expand to simulate the slow, deep breathing associated with sleeping. With a full body avatar, the avatar could be animated to tap its foot (perhaps with its arms crossed as well) to simulate waiting or impatience. The mention of a "full body avatar" suggests that we're only seeing a tiny portion of this editing app and that we'll be given far more options in context with a full body avatar image.
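The trigger mechanism described above can be pictured as a simple mapping from application events to avatar animations. The event names and animation names below are hypothetical examples, not anything named in the patent filing.

```python
# Hypothetical sketch of animation triggers: application events map to
# lists of avatar animations, as in the "waiting for a reply" example.

TRIGGER_ANIMATIONS = {
    "awaiting_reply": ["close_eyes", "breathe_slowly"],      # sleeping
    "awaiting_reply_full_body": ["cross_arms", "tap_foot"],  # impatience
    "message_received": ["open_eyes", "smile"],
}

def on_event(event, play):
    """Play every animation registered for the given trigger event."""
    for animation in TRIGGER_ANIMATIONS.get(event, []):
        play(animation)

played = []
on_event("awaiting_reply", played.append)
print(played)  # ['close_eyes', 'breathe_slowly']
```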
Avatar animations could be used in a variety of applications, including but not limited to address books, chat sessions, video conferencing, email, games or any other application that could support an animated avatar.
In an address book application, when a user receives an avatar with a virtual contact card (vCard) from another individual, the avatar could "come alive" and follow the movement of a cursor with its eyes, head and/or body when the vCard is opened.
In a video chat environment, each party could be represented by an avatar rather than a digital image. Each party could use the avatar to track the other party's movement by controlling their respective avatar's eyes, head and body to follow the other party's avatar in a video chat room. In some implementations, an avatar viewing angle could mimic camera position.
Color Picking Tools
From a secondary patent application on avatars, we present you with patent FIG. 2C below, which shows another exemplary color picker in the form of a color wheel 204. The color wheel could show a number of discrete colors, each color occupying one slice or one region of the color wheel. The colors shown in the color wheel could be limited based on the particular element category that is currently selected. In some implementations, the color wheel could include a continuous gradient of colors, where the transition of colors from one region to an adjacent region is continuous.
Apple's patent FIG. 2G shows an exemplary color picker interface 218 for choosing colors for a multi-color element. In this example, the color picker is in the form of a crayon box. In some implementations, animation could be implemented to show the coloring process, e.g., the crayon being pulled out of the box and used to paint the selected region of the element.
An Example Game Environment
Apple's patent FIG. 3 shows an exemplary game selection interface 300 in a game environment on device 100. Exemplary game selection interface 300 allows the user to select a game from a collection of games that have been installed on the user's device or otherwise made available to the user (e.g., via download or online access). Once the user has selected a game from game selection interface 300, the user interfaces related to the selected game can be presented. In some implementations, a general application selection interface can be implemented for the user to select any application from a collection of applications that have been installed on the user's device or otherwise made available to the user (e.g., via download or online access).
Apple is serious about avatars for gaming, as they discuss providing APIs for game developers to take advantage of the avatar editing environment. This could be interesting over time.
Apple's main patent application was originally filed in April of this year by inventors Marcel van OS, Thomas Goossens, Laurent Baumann, Michael Lampell and Alexandre Carlhian. The filing was only six months ago, so this appears to be on the fast track. The secondary patent notes Marcel van OS as the sole inventor.
Notice: Patently Apple presents a detailed summary of patent applications with associated graphics for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trade Office. Readers are cautioned that the full text of any patent application should be read in its entirety for full and accurate details. Revelations found in patent applications shouldn't be interpreted as rumor or fast-tracked according to rumor timetables. Apple's patent applications have provided the Mac community with a clear heads-up on some of Apple's greatest product trends including the iPod, iPhone, iPad, iOS cameras, LED displays, iCloud services for iTunes and more. About Comments: Patently Apple reserves the right to post, dismiss or edit comments.