Apple Moves OS X 3D Closer with Unique Ambient Light Feature
Apple has been working towards a 3D version of OS X for some time. In late 2008 we first learned about a "multi-dimensional" OS, and in December 2009 Apple revealed a 3D version of OS X using head-tracking technology. Today's patent shows another angle that Apple is pursuing, one that relates to the 2008 patent yet stands on its own. As the graphic above illustrates, Apple would employ advanced ambient light sensors on its hardware to respond to surrounding light: in the graphic, an in-home lamp reflects on the laptop screen and casts shadows on elements of the display. This is further evidence that Apple is on track for an advanced 3D OS as hardware gains more cores. Intel forecast at CES this year that Sandy Bridge-based systems would kick-start the 3D internet and advanced avatar-creation software in 2011.
It's All About Ambient Light Sensors & OS X
Apple states that embedded light sensors capable of sensing ambient light conditions are typically used to make brightness adjustments to the display based on the sensed ambient light level. For example, a device may automatically increase the brightness level of an associated display screen in a strong, bright ambient light environment. Similarly, the same device may automatically lower the brightness level of the display screen when a dim ambient light environment is detected. Adjusting the brightness of a display screen in this manner based on ambient light intensity is intended to result in a better viewing experience for the user, and it can also provide power-saving benefits, for instance when power to the display is decreased during dimming.
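The auto-brightness behavior described above can be sketched in a few lines. The sketch below maps a sensed ambient light level to a display brightness level; the lux thresholds, the log-scale ramp, and the function name are illustrative assumptions, not details from Apple's patent.

```python
import math

def auto_brightness(lux, lo_lux=10.0, hi_lux=1000.0,
                    min_level=0.2, max_level=1.0):
    """Map a sensed ambient light level (in lux) to a display brightness
    level in [min_level, max_level].  A linear ramp on a log scale is a
    common choice because human brightness perception is roughly
    logarithmic; all thresholds here are hypothetical.
    """
    if lux <= lo_lux:          # very dim room: floor the backlight
        return min_level
    if lux >= hi_lux:          # very bright room: full brightness
        return max_level
    t = (math.log10(lux) - math.log10(lo_lux)) / \
        (math.log10(hi_lux) - math.log10(lo_lux))
    return min_level + t * (max_level - min_level)
```

A dim room (5 lux) yields the minimum level, direct bright light (2000 lux) the maximum, and everything between ramps smoothly, which is also where the power-saving benefit of dimming comes from.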
Apple's patent notes that imagery is presented on display screens of all types. To make those images more engaging, it has also been recognized that if more specific sensed ambient lighting conditions (direction, intensity, color and the like) could be applied to the imagery, the images would appear more realistic, and therefore more natural and pleasing to the viewer. Among other enhancements, characteristics of the imagery could be altered in view of the actual ambient lighting conditions around the device. For example, the shading or brightness of the imagery could be made to correspond to the ambient light characteristics around the display screen.
As an example, if a user is looking at a display of a notebook computer positioned on a desk and a lamp is positioned to the right of the notebook (see FIG. 6A below as an illustrative example), it would be much more realistic if imagery on the display such as icons, windows and other graphical user interface elements were adapted to appear as though that shining lamp was affecting their appearance. In this context, the icons, windows and other graphical user interface elements would appear as though the light from the lamp on the desk was actually also shining on them. This could be accomplished by, among other things, altering the shading of the images to add shadow effects away from the light source and to add brightness effects toward the light source.
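The shading adjustment described above can be sketched with a simple Lambertian model: brighten an element in proportion to how directly it faces the light, and cast its drop shadow away from the light. The function name, the shading model, and all parameter values below are illustrative assumptions, not Apple's claimed method.

```python
import math

def shade_element(light_dir, surface_normal=(0.0, 0.0, 1.0),
                  ambient=0.2, shadow_length=12.0):
    """Sketch of light-aware UI shading (hypothetical helper).

    light_dir: unit vector pointing from the element toward the light.
    Returns a brightness factor (Lambertian term over an ambient floor)
    and a 2D drop-shadow offset in pixels, cast away from the light.
    """
    # Lambertian term: brightness rises as the surface faces the light.
    dot = sum(l * n for l, n in zip(light_dir, surface_normal))
    brightness = ambient + (1.0 - ambient) * max(dot, 0.0)
    # The shadow falls on the side opposite the light source.
    shadow_offset = (-light_dir[0] * shadow_length,
                     -light_dir[1] * shadow_length)
    return brightness, shadow_offset

# A lamp up and to the right of the screen, as in the desk-lamp example:
# the element brightens and its shadow falls down and to the left.
norm = math.sqrt(0.5**2 + 0.5**2 + 0.7**2)
b, (dx, dy) = shade_element((0.5 / norm, 0.5 / norm, 0.7 / norm))
```

With the lamp to the upper right, both shadow-offset components come out negative, i.e. the shadow extends toward the lower left, matching the behavior the patent describes.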
Apple's patent capitalizes on these naturally occurring lighting effects that users have come to expect in the real world by adjusting the presentation of displayed images depending on the sensed ambient lighting conditions around the display screen. Among others, the sensed lighting characteristics could include the direction, magnitude and color of multiple light sources that would affect the appearance of objects located where the display screen is positioned.
Apple's patent FIG. 6A illustrates a device displaying ambient light affected-appearing-imagery in a scene. A notebook computer 604 is shown. A lamp 602 projects ambient light towards the notebook computer. The notebook features a first light sensor 614, a second light sensor 616, a third light sensor 618, and a fourth light sensor 620. The light sensors sense ambient light generated by lamp 602. The display shows a scene which includes three-dimensional objects. Specifically, the three-dimensional objects include a sphere 610 and a cube 612. In this embodiment, the three-dimensional objects sphere 610 and cube 612 are stored and displayed in the scene as three-dimensional models. The first light sensor 614, the second light sensor 616, the third light sensor 618, and the fourth light sensor 620 detect ambient light, including that from the lamp 602, and the computer displays an ambient light affected image of the constructed scene on the display.
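One way multiple bezel sensors like 614-620 could yield a light direction is an intensity-weighted centroid: each sensor's position pulls the estimate toward it in proportion to how brightly it reads. This is an illustrative sketch under that assumption, not the estimation method claimed in the patent; the sensor layout and function name are hypothetical.

```python
def estimate_light_direction(readings):
    """Estimate the dominant ambient-light direction from bezel sensors.

    readings: dict mapping (x, y) sensor positions on the display bezel
    (normalized to [-1, 1]) to measured light intensities.  A brighter
    corner pulls the estimate toward it (intensity-weighted centroid).
    """
    total = sum(readings.values())
    if total == 0:
        return (0.0, 0.0)  # no detectable light: assume head-on
    x = sum(pos[0] * i for pos, i in readings.items()) / total
    y = sum(pos[1] * i for pos, i in readings.items()) / total
    return (x, y)

# Four corner sensors; a lamp to the right (as in FIG. 6A) makes the
# right-hand sensors read brighter, so the estimate points right.
sensors = {(-1, 1): 80, (1, 1): 220, (-1, -1): 60, (1, -1): 200}
direction = estimate_light_direction(sensors)
```

The resulting direction could then feed a renderer that shades three-dimensional models such as the sphere 610 and cube 612, which is where the GPU-accelerated rendering mentioned below comes in.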
Many modern systems have "GPU" (Graphics Processing Unit) components that, together with software, are used to do accelerated 3D scene rendering as is the typical case for games. This same technology is commonly used in modern graphical environments, such as OS X, from Apple Inc., to provide the rendering of the display for applications and visual effects of the environment and within applications.
Apple credits Gregor Purdy as the sole inventor of patent application 20100103172, originally filed in Q4 2008. For more information on 3D related Apple patents, check out our Tech:3D section.
Notice: Patently Apple presents only a brief summary of patents with associated graphic(s) for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any patent application should be read in its entirety for further details. For additional information on any patent reviewed here today, simply feed the individual patent number(s) noted in this report into this search engine. About Comments: Patently Apple reserves the right to post, dismiss or edit comments.
Update May 4, 2010 9 AM: It appears that Google is taking the news of a future Apple 3D OS pretty seriously, considering that it just acquired BumpTop on May 2, 2010. Here's a video of it if you've never heard of BumpTop before.
As usual, Apple gets distracted with shiny - instead of delivering real added value.
If apple wants to really leverage 3D in the uxp then there is one simple place to start with: Hyperbolic Tree
... and when driven by some serious RDF semantic technology (DAML+OIL driving the W3C SPARQL query system) then Apple would really have something to write home about! That would be a real game-changer ...
but alas, Apple has abandoned OS X on the Mac :-(
Posted by: Zahadum | April 30, 2010 at 12:06 AM