Apple has long been researching avatars in the context of next generation shopping experiences. Their initial research popped up in 2008 in connection with a future avatar-centric Apple Store. A year later, Apple introduced us to a head-tracking system that could match an avatar's movements to our real-world movements, and we also learned about the future of 3D internet experiences. While the first generation of avatar creation was rather hokey, like the avatars associated with Microsoft's Xbox, next generation avatars show promising realism. It's that realism that will give life to Apple's new personalized shopping avatar application. It will allow users to visualize what clothing, jewelry or accessories will look like on them before purchasing the items from a virtual retailer. The technology could also eventually help users redesign their living rooms with new furniture they're thinking of buying, or assist doctors in communicating with patients over the net.
Many users shop for clothing, accessories, or other wearable items in physical stores, where they can actually try on the items before purchasing. Other users shop for the same items remotely, for example by catalog or over the Internet, and purchase items without first trying them on. Once the users receive the items from the retailer, they can try the items on and decide whether or not to keep them. To entice users to purchase items remotely, many retailers or providers pay return shipping for goods that are not of interest.
Purchasing goods remotely, however, may not be convenient or easy for the user. In particular, a user may be required to guess or know what size to buy, or may need to buy each item in two or more sizes to ensure that the proper size is purchased. This, in turn, will force the user to return at least some of the purchased items. In addition, a user may be forced to rely on static images of the clothing or accessories, or images of the clothing or accessories on models selected by the retailer. In particular, there is no easy way for the user to see what an article of clothing or an accessory will look like on the user's own body.
Apple's solution is a new invention covering systems, methods and computer-readable media for providing a personalized avatar for previewing articles wearable by the avatar. The avatar's dimensions, shape and general appearance could be defined to substantially match those of a user, such that articles of clothing, accessories or other objects placed on the avatar could substantially represent how the articles would appear on the user.
For example, the avatar could be defined to have the user's skin tone, facial features (e.g., nose shape and size, eye position, hair color), and body measurements (e.g., hips, waist, and shoulders), such that the avatar provides a substantially accurate graphical representation of the user's body. The avatar could be defined using any suitable approach. For example, the user could provide measurements of the user's body to an electronic device defining the avatar.
As another example, the user could provide one or more pictures of the user from which an electronic device could extract relevant measurements and define the avatar. As still another example, the avatar could be defined from clothing and other accessories that the user owns and wears. In particular, the electronic device could retrieve metadata associated with the user's clothing to identify measurements of the user.
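The patent doesn't spell out how clothing metadata would be turned into body measurements, but the idea can be sketched: each owned garment carries size metadata, and the device averages the body measurements implied across the wardrobe. The garment records and field names below are invented for illustration.

```python
# Speculative sketch of the "define the avatar from clothes you own" idea.
# Each owned garment carries size metadata; the device averages the
# implied measurements across all garments that report them.
closet = [
    {"type": "shirt", "chest_cm": 92, "sleeve_cm": 58},
    {"type": "shirt", "chest_cm": 90, "sleeve_cm": 59},
    {"type": "jeans", "waist_cm": 72},
]

def infer_measurements(closet):
    """Average each measurement over the garments that provide it."""
    totals, counts = {}, {}
    for garment in closet:
        for key, value in garment.items():
            if key == "type":
                continue
            totals[key] = totals.get(key, 0) + value
            counts[key] = counts.get(key, 0) + 1
    return {key: totals[key] / counts[key] for key in totals}

print(infer_measurements(closet))  # {'chest_cm': 91.0, 'sleeve_cm': 58.5, 'waist_cm': 72.0}
```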
To shop remotely, the user could direct an electronic device to apply selected articles of clothing, accessories, or other items to the user's avatar. Using metadata and fixed points of the avatar and objects, the electronic device could place the user's selected objects on the avatar. The material used for the selected objects could stretch or sag in a manner representative of the manner the object may be worn by the user. For example, metadata associated with the selected item could specify the manner in which the item is worn. In some embodiments, a user could simultaneously select several items to wear such that the several items are overlaid on the avatar (e.g., clothing metadata defines how a shirt worn over a t-shirt will appear).
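The "metadata and fixed points" mechanism above can be sketched as follows. The patent gives no implementation, so the anchor names, coordinates, and metadata fields here are purely illustrative: a garment's metadata names the fixed points it attaches to, and placement maps each one onto the matching point on the avatar.

```python
# Illustrative sketch (not the patent's actual method): garments carry
# metadata naming fixed anchor points, and placement maps each garment
# anchor onto the matching avatar anchor.
avatar_anchors = {
    "left_shoulder": (42, 120),
    "right_shoulder": (88, 120),
    "waist": (65, 200),
}

shirt_metadata = {
    "anchors": ["left_shoulder", "right_shoulder", "waist"],
    "layer": 2,           # drawn over layer-1 items such as a t-shirt
    "drape": "stretch",   # how the material hangs on the body
}

def place_garment(avatar_anchors, garment):
    """Return the avatar coordinates where each garment anchor attaches."""
    return {name: avatar_anchors[name] for name in garment["anchors"]}

placement = place_garment(avatar_anchors, shirt_metadata)
print(placement)  # {'left_shoulder': (42, 120), 'right_shoulder': (88, 120), 'waist': (65, 200)}
```

The `layer` field hints at how overlaid items (a shirt over a t-shirt) could be ordered for drawing, per the patent's layering example.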
In some embodiments, an electronic device could recommend different clothing items or other objects based on the user's body type. For example, the electronic device could access a database of clothing items purchased by users having different avatars, and recommend clothing based on what users having similar avatars purchased. In addition, one or more retailers could provide recommendations to a user based on the user's avatar. Once a user has applied one or more objects to an avatar, the user could share the avatar, with the objects, to other users or friends to receive the friend's recommendations or comments regarding the proposed outfit or look.
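The "users with similar avatars" recommendation could work as a nearest-neighbor lookup over body measurements, though the patent doesn't specify the matching method. The sketch below (with invented purchase data) finds the stored avatar closest to the user's measurements and recommends what that user bought.

```python
import math

# Hypothetical sketch: stored avatars keyed by (height, waist, hip)
# measurements in cm, mapped to what those users purchased.
purchases = {
    (168, 71, 96): ["wrap dress", "ankle boots"],
    (182, 86, 100): ["slim-fit blazer"],
    (158, 66, 90): ["cropped jacket"],
}

def recommend(measurements):
    """Recommend items bought by the user with the most similar avatar."""
    nearest = min(purchases, key=lambda m: math.dist(m, measurements))
    return purchases[nearest]

print(recommend((166, 72, 95)))  # closest stored profile is (168, 71, 96)
```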
For the system to work, the user will be required to use computer hardware or devices that incorporate a camera and have a connection to the Internet. Primarily this would mean an iPhone, iPad (3G) or one of Apple's MacBooks or desktops. The iPod touch would work via WiFi.
Customizing Your Avatar
In Apple's patent FIG. 2 shown below we see a schematic view of an illustrative avatar displayed by an electronic device such as an iPhone. Avatar 200 could include face 210, upper body 220 and lower body 230. The user could spin, rotate or move the avatar to view any perspective by simply flicking the iPhone's touchscreen. The avatar could include any suitable clothing or other accessories, such as a shirt, pants or shoes. Additionally, the avatar could include any other suitable clothing item, garment, accessory, or other wearable object, including for example jewelry, bags, scarves, watches, glasses or other eyewear, hats, or any other object.
Apple's patent FIG. 3 is a schematic view of an illustrative display for building an avatar based on a user's personal measurements. Display 300 noted above could include a listing of measurements and other information that the user could provide to define the user's body. For example, the user could specify whether the user is male or female, the user's skin tone (e.g., by selecting a tone from a palette of available colors), and body measurements such as height, weight, neck, chest, shirt sleeve, waist, bust length, hip, thigh, and calf measurements. The electronic device can direct the user to provide other measurements instead of or in addition to the measurements listed above (e.g., back width, back length, waist to underarm, wrist to underarm, armhole depth, one shoulder).
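The measurement profile described in FIG. 3 maps naturally onto a simple record type. This is a minimal sketch of what the device might store; the field names and units are illustrative, not Apple's.

```python
from dataclasses import dataclass

# Hypothetical data model for the FIG. 3 measurement form; field names
# and metric units are assumptions for illustration.
@dataclass
class BodyMeasurements:
    sex: str         # "male" or "female"
    skin_tone: str   # e.g. a color chosen from a palette
    height_cm: float
    weight_kg: float
    neck_cm: float
    chest_cm: float
    sleeve_cm: float
    waist_cm: float
    hip_cm: float
    thigh_cm: float
    calf_cm: float

profile = BodyMeasurements(
    sex="female", skin_tone="#c68642",
    height_cm=168, weight_kg=61, neck_cm=34, chest_cm=90,
    sleeve_cm=58, waist_cm=71, hip_cm=96, thigh_cm=55, calf_cm=36)
print(profile.waist_cm)  # 71
```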
Setting the Avatar in Motion
In some embodiments, the objects that a user may apply to the avatar could include metadata or other information defining the manner in which the objects will move and change when the avatar moves. For example, a user may direct the avatar to sit down, walk, run, bend, move arms or legs, or move in any other way to preview the appearance of objects as the user moves.
Apple's patent FIG. 4 is a schematic view of an avatar as the avatar moves in accordance with one embodiment of the invention. The user could direct the avatar to move in any suitable manner, including for example by selecting and moving (e.g., dragging) particular elements of the avatar (e.g., drag an arm or a leg) using an input interface (e.g., a touchscreen).
In some embodiments, display 400 could include an animation of the avatar moving from the positions shown (e.g., starting seated with avatar 460 and ending as the avatar walks with avatar 410). The user could rotate, tilt, pan, or otherwise move the display to see the avatar under a different configuration than that initially shown (e.g., see the avatar from the front or back, instead of from the side).
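An animation like the sit-to-walk sequence above is commonly built by interpolating between keyframe poses; the patent describes the result but not the method, so the sketch below (with invented joint names and angles) simply blends two poses linearly.

```python
# Minimal sketch of animating between two avatar poses (e.g. seated to
# walking) by linearly interpolating joint angles. Joint names and
# angle values are illustrative only.
seated  = {"hip": 90.0, "knee": 90.0, "elbow": 70.0}
walking = {"hip": 170.0, "knee": 160.0, "elbow": 150.0}

def interpolate(pose_a, pose_b, t):
    """Blend two poses; t=0 gives pose_a, t=1 gives pose_b."""
    return {j: pose_a[j] + t * (pose_b[j] - pose_a[j]) for j in pose_a}

mid_pose = interpolate(seated, walking, 0.5)  # halfway through the animation
print(mid_pose)  # {'hip': 130.0, 'knee': 125.0, 'elbow': 110.0}
```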
Social Networking Tie-In
Apple's patent states that the avatar app could be combined with social networking apps like the one they proposed in 2010. For example, a user could send friends their avatar wearing a new outfit for comments or recommendations. This app would work very much like Apple's proposed virtual closet app that was published in 2010.
Other Applications: Medical, Furniture Layout, Pet Accessories
Although the preceding discussion describes avatars in the shapes of human beings, it will be understood that the principles of the embodiments described above can be applied to any three-dimensional object that the user would like to preview with an overlaid or applied object.
For example, some embodiments could be applied in the context of medical fields, for example to preview the appearance of prosthesis on a user's avatar, or to preview the result of inserting a surgical object within a user (e.g., preview the appearance of a metal plate in a shoulder).
As another example, an avatar could be created for inanimate objects (e.g., furniture) to preview the appearance of the object in a particular location, or to preview a second object being placed on or in the object (e.g., preview a couch when a person sits on it, or a cover for a sofa).
And finally, it will be understood that an avatar could also be created for a pet or other animal (e.g., to preview a pet collar or pet clothing).
Apple credits Cindy Lawrence, Victor Tiscareno and Stanley Ng as the inventors of patent application 20110022965, originally filed in Q3 2009.
Notice: Patently Apple presents only a brief summary of patents with associated graphic(s) for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any patent application should be read in its entirety for further details. Patents shouldn't be digested as rumors or fast-tracked according to rumor time tables. Apple patents represent true research that could lead to future products and should be understood in that light.