On July 7, 2011, the US Patent & Trademark Office published a patent application from Apple that reveals various concepts behind advanced 3D gesturing that will apply to CAD applications for product and game developers as well as for consumers. According to Apple, next-generation iPad and/or other iOS device displays will allow consumers to create avatars for 3D environments, assist homeowners in designing new landscapes, and more, all by using simple 3D gesturing. The new 3D gesturing will control color and texture while allowing users to rotate objects to gain different perspectives on their designs. This is wild stuff that is bound to give Apple's competitors another huge headache.
Overview: Who will benefit from 3D Gesturing?
As a basic overview, Apple introduces us to various new gesture inputs that will be used to render complex 3D objects. For example, a product design house will be able to use this futuristic iPad (device 100) to quickly generate 3D models of consumer products. Video game developers could use it to quickly generate 3D models of figures in video games. Users could use it to quickly generate avatars for use in video conferencing applications. Users of a virtual 3D environment could quickly generate or modify avatars or objects in the virtual 3D environment. Homeowners could generate 3D models of their houses based on aerial photographs, and add the 3D models to a map application.
By providing a convenient and intuitive way to generate and modify 3D objects based on 2D objects in photographs, a 3D model of a community or an entire city could be generated through the cooperation of the residents of the community or city, each individual using the device to modify computer-generated 3D models of buildings or landscapes that the individual is familiar with.
Apple Introduces 3D Gesturing Inputs
In this patent, Apple introduces us to 3D gesturing inputs. Apple's patent FIG.1 illustrates an iPad (device 100) which includes a touch-sensitive display that is responsive to touch inputs, 2D and 3D gesture inputs. An application program, such as a CAD program, can be executed on device 100 to enable a user to generate and manipulate 2D and 3D objects.
3D gesture inputs are described in terms of the movements of the user's fingers. The user could also provide 3D gesture inputs using other pointing devices, such as styluses, or a combination of fingers and pointing devices. For example, the user may use the left hand fingers in combination with a stylus held in the right hand to provide 3D gesture inputs.
The device, such as an iPad, is intuitive to use because objects can be shown in drafting area 104, and the user can touch and manipulate the objects directly on the display (as compared to indirectly interacting with a separate touch-pad). In some implementations, the CAD program allows the user to generate 3D objects from 2D objects. For example, a user could generate a 2D object using a multi-touch input and then lift the fingers simultaneously to extrude the 2D object into a 3D object.
Generating, Modifying, and Manipulating 3D Objects Using 3D Gesture Inputs
Referring to FIGS. 2A and 2B shown above, one type of 3D gesture input could include touching display surface 118 at multiple touch points 160 and pulling up 162 the fingers for a distance. Referring to FIG. 2C, the 3D gesture input shown in FIGS. 2A and 2B will be represented by double circles 164 indicating touch points on display 102 and dashed lines 166 indicating movements of the fingers or pointing devices.
Now referring to FIGS. 3A and 3B shown below, a user could generate triangular prism 128 based on triangle 122. Apple's patent FIG. 3A shows a sequence of finger movements that define a 3D gesture input for generating the triangular prism. Patent FIG. 3B shows graph 132 that includes triangle 122 and graph 134 that includes the triangular prism which the user sees on the display. The user could generate a triangle by using three fingers to touch display surface 118 at three touch points 120a, 120b, and 120c. The user lifts or pulls up 124 the three fingers substantially perpendicular to the surface at substantially the same time to a distance from display surface 118, and pauses 126 for at least a predetermined time (for example, one second). These movements indicate a 3D gesture input that is associated with extrusion of triangle 122, resulting in a triangular prism having a cross section that corresponds to triangle 122.
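The lift-and-pause extrusion described above can be sketched in code. This is a minimal illustration, not Apple's actual implementation: the function names, data shapes, and the one-second pause threshold are assumptions drawn only from the patent's description of FIGS. 3A and 3B.

```python
# Hypothetical sketch of the lift-and-pause extrusion gesture: three touch
# points define a triangle; lifting all fingers roughly perpendicular to the
# display and pausing for at least ~1 second extrudes the 2D cross section
# into a prism. All names and thresholds here are illustrative assumptions.

PAUSE_THRESHOLD_S = 1.0  # minimum hover pause to confirm the extrusion

def extrude_polygon(points_2d, height):
    """Extrude a 2D polygon (list of (x, y)) into prism vertices."""
    base = [(x, y, 0.0) for x, y in points_2d]   # original cross section
    top = [(x, y, height) for x, y in points_2d] # lifted copy at the pull height
    return base + top

def interpret_gesture(points_2d, lift_height, pause_seconds):
    """Return prism vertices if the lift-and-pause gesture completed."""
    if lift_height > 0 and pause_seconds >= PAUSE_THRESHOLD_S:
        return extrude_polygon(points_2d, lift_height)
    return None  # gesture not recognized; no extrusion

# A triangle lifted 2.0 units with a 1.2 s pause yields a triangular prism.
prism = interpret_gesture([(0, 0), (4, 0), (2, 3)], 2.0, 1.2)
```

Pausing for less than the threshold would leave the shape as a flat triangle, which matches the patent's use of the pause to disambiguate extrusion from an ordinary lift-off.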
Apple's patent FIGS. 4A and 4B illustrate the generation of a cube or a rectangular prism by extruding a square or a rectangle, respectively. FIGS. 5A and 5B illustrate the generation of a cylinder by extruding a circle, with FIG. 5A showing the sequence of finger movements that defines the 3D gesture input. In FIGS. 7A and 7B, a user could generate a pyramid with a triangular base from a triangle. And finally, in FIGS. 9A and 9B we see that a user will be able to generate a cone from a circle.
Apple Introduces the Pinch-and-Pull Gesture
Next up, Apple introduces us to the Pinch-and-Pull gesture. Apple's Patent FIG. 12 shows us a sequence of gesture inputs for generating a raised portion on a surface. Assume that surface 280 represented by a mesh has been generated and a top view of that surface 280 is shown on the display. The user could apply pinch-and-pull gesture input 282 to location 284 on the surface to cause location 284 to be "pulled up" to form raised portion 286, in which location 284 becomes the highest point of raised portion 286.
Surface 280 initially could be either a 2D object or a 3D object. If surface 280 is initially a 2D object, when the pinch-and-pull gesture input is applied to the 2D surface, the 2D surface is transformed into a 3D surface having a raised portion.
The width of raised portion 286 could be defined by a sliding ruler, or by another gesture input. For example, the user could use the left hand to provide a touch input that includes two touch points in input area 108 (FIG. 1). The distance between the two touch points defines the width at half height of the raised portion. For example, if the height of the raised portion is H, then the width of the raised portion at height H/2 will be equal to the distance between the two touch points in the input area. The raised portion could have a mathematically defined surface profile, such as having a cross-sectional profile (for example, along the x-z or y-z plane) resembling a Gaussian curve or other curves.
For example, the user could change the distance between the two touch points in the input area while pulling up the fingers in the pinch-and-pull gesture input to modify the cross-sectional profile (along the x-y plane) of the raised portion at various heights.
Apple Introduces the Rotation Gesture
Next up, Apple introduces us to new rotation gestures. According to Apple's document, when the user pulls up their fingers, the user will initially see a top view of surface 280, including the raised portion. The user could then apply rotation gesture input 288 (as shown in graph 290) to rotate the surface about an axis that is perpendicular to the display surface. Here, the z-axis is perpendicular to the display surface, so the rotation gesture input causes the surface to rotate about the z-axis. The rotation gesture input includes touching the display surface at two touch points 292a and 292b and sliding the fingertips in a circular motion 294. The user sees the rotated surface 280 as shown in graph 296.
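The two-finger rotation can be sketched as follows. This is a hedged illustration of the geometry only, not the patent's implementation: the rotation angle is taken from how much the line joining the two touch points turns during the circular slide, and that angle is then applied about the z-axis.

```python
import math

# Illustrative sketch of the two-finger rotation gesture: the angle swept
# by the line joining touch points 292a and 292b during the circular slide
# is treated as a rotation about the z-axis (perpendicular to the display).
# All names are assumptions for this sketch.

def rotation_angle(p1_start, p2_start, p1_end, p2_end):
    """Signed rotation angle (radians) implied by two sliding touch points."""
    a0 = math.atan2(p2_start[1] - p1_start[1], p2_start[0] - p1_start[0])
    a1 = math.atan2(p2_end[1] - p1_end[1], p2_end[0] - p1_end[0])
    d = a1 - a0
    # Normalize to (-pi, pi] so a small clockwise slide is a small negative angle.
    return math.atan2(math.sin(d), math.cos(d))

def rotate_about_z(vertex, angle):
    """Rotate an (x, y, z) vertex about the z-axis by angle radians."""
    x, y, z = vertex
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y, s * x + c * y, z)
```

Note that the z-coordinate is untouched, which is what makes this the top-view rotation the patent describes rather than a tilt.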
Applying Color & Texture to Surfaces
In another aspect of the patent, Apple states that a CAD program may allow the user to apply color and texture to surfaces of objects. The proximity sensor of the device may be used to allow the user to conveniently select different mixtures of color or texture components by adjusting the distances of different fingers relative to the display surface.
For example, referring to FIG. 14, the CAD program may designate regions 376a, 376b, and 376c in input area 108 for controlling red, green, and blue colors, respectively. The user may provide touch input 372 to surface 378 of object 374 shown in the drafting area to select the surface, and then place three fingers above regions 376a-376c to control the color of the surface. The relative heights of fingertips 370a, 370b, and 370c above regions 376a-376c respectively indicate the relative weights of the red, green, and blue colors in the color of the surface. For example, pulling up fingertip 370b will increase the green component in the color of the surface, and pushing down fingertip 370c will decrease the blue component in the color of the surface.
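This height-to-color mapping can be sketched as a simple normalization. The patent only says that relative fingertip heights indicate relative color weights, so the particular clamping scheme and maximum-height parameter below are assumptions for illustration.

```python
# Hypothetical sketch of the proximity-based color mixing in FIG. 14:
# the heights of three fingertips above regions 376a-376c set the red,
# green, and blue weights. The clamp-and-scale mapping and max_height
# parameter are assumptions; the patent specifies only relative weights.

def color_from_finger_heights(h_red, h_green, h_blue, max_height=1.0):
    """Map three fingertip heights to an (r, g, b) tuple in [0, 1]."""
    clamp = lambda h: min(max(h, 0.0), max_height)
    return tuple(clamp(h) / max_height for h in (h_red, h_green, h_blue))

# Raising the "green" fingertip increases the green component; lowering
# the "blue" fingertip decreases the blue component, as in the patent.
```

Under this sketch, lifting fingertip 370b simply raises `h_green`, which is the behavior the patent describes.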
Virtual Slide Bars to Control Color, Brightness & Contrast
According to Apple, controlling the relative weights or portions of the red, green, and blue colors could also be achieved by using three slide bars with each slide-bar controlling one of the red, green, and blue colors. The advantage of using the technique shown in FIG. 14 is that the area occupied by regions 376a, 376b, and 376c could be made smaller than the area needed for three slide bars. This is useful when the screen size is small, such as when the display is part of a portable device, such as a mobile phone, personal digital assistant, game console, or digital camera.
Apple's patent FIG. 15A shows an example in which the CAD program designates regions 380a and 380b in input area 108 for use in controlling relative weight of a first texture and a second texture applied to a selected surface of an object. FIG. 15B shows an example in which the CAD program designates regions 390a and 390b in input area 108 for use in controlling brightness and contrast, respectively, of a selected region or surface of an object. FIG. 16 shows an example in which the CAD program accepts 3D gesture inputs for controlling hue, saturation, and brightness. Slide bars 400a, 400b, and 400c are provided for controlling the red, green, and blue color components, thereby controlling hue. FIG. 17 shows a sequence of gesture inputs that can be used for generating a 3D dice.
Apple's patent application 20110164029 was originally filed in Q1 2010 by inventors Nicholas King and Todd Benjamin.
Notice: Patently Apple presents a brief yet detailed summary of patent applications with associated graphic(s) for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any patent application should be read in its entirety for full and accurate details. Revelations found in patent applications shouldn't be interpreted as rumor or fast-tracked according to rumor timetables. Apple's patent applications have provided the Mac community with a clear heads-up on some of Apple's greatest products including the iPod, iPhone, the iPad, iOS cameras, LED displays, iCloud services for iTunes and more.