
Intel Patent Reveals RealSense Camera System adding a New Dimension to Gaming


 

(Cover graphic: Intel RealSense camera system for gaming)

 

This will be the year that we begin to see Intel's RealSense camera come to market in a wide range of products from PCs to smartphones and tablets. Intel introduced their 3D camera back at CES 2014, and since then has shown prototypes for smartphones and tablets. The sexy Asus iMac-like desktop and HP's 34" desktop both feature Intel's RealSense camera, which recognizes in-air hand gestures and enables instant login with 3D facial recognition. Intel continues to advance their 3D camera technology with an emphasis on in-air gestures for video gaming, as revealed in a patent filing that surfaced at the U.S. Patent and Trademark Office last week.

 

For Apple fans it's interesting to follow the progress of Intel's RealSense technology because Apple could at any time add a similar dimension to their iSight camera for both Macs and/or iDevices via technology that they acquired from PrimeSense. In fact we're likely to see Apple introduce PrimeSense technology with their all-new dual lens camera coming to the iPhone 7 Plus in September.

(Image: Asus desktop with RealSense camera)

According to Intel's patent filing, "The three-dimensional space can be characterized by targets, obstructions, and fields in, for example, a computer gaming environment in which, due to the physics characteristics of those objects, they interact with user gestures that are applied to virtual objects. Three-dimensional physics effects can be represented in this three-dimensional space. In this three-dimensional space, games and other applications can combine forces from targets, obstructions, and fields with forces from air gestures to provide a more complex, interactive, or realistic interaction with a user."
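To make the idea concrete, here's a minimal sketch in Python (not Intel's code) of a thrown virtual object whose path combines the velocity imparted by an air gesture with an attractive force from an object in the scene, in the spirit of the passage above. The field constants, positions and function names are invented for illustration.

```python
# Hedged sketch: a gesture "throw" sets the initial velocity, then forces from
# the 3D game space (here, one gravitational "moon" field) bend its path.
import numpy as np

MOON_POSITION = np.array([4.0, 0.0, -2.0])   # hypothetical field source in scene space
FIELD_STRENGTH = 50.0                        # illustrative constant, not from the patent

def field_force(position):
    """Inverse-square style pull toward the moon acting on the thrown object."""
    offset = MOON_POSITION - position
    r = np.linalg.norm(offset)
    return FIELD_STRENGTH * offset / (r ** 3)

def simulate_throw(start, gesture_velocity, steps=120, dt=1 / 60):
    """Integrate the object's path after a pinch-release 'throw' gesture."""
    position = np.array(start, float)
    velocity = np.array(gesture_velocity, float)   # imparted by the air gesture
    path = [position.copy()]
    for _ in range(steps):
        velocity += field_force(position) * dt     # forces from the 3D space
        position += velocity * dt                  # gesture-imparted motion continues
        path.append(position.copy())
    return path
```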

 

In Intel's patent FIG. 1A noted below we're able to see a diagram of an air gesture system having a display coupled to an array of cameras #103 and an array of microphones #105. In the illustrated example, there are two cameras and two microphones – though a larger or smaller number of cameras or microphones may be used for more or less accurate sensing of position and direction.

 

The display may be a direct view or projection display using any type of display technology. As shown, the camera and microphone arrays are positioned over and attached to the display; however, any other position may be used. The cameras and microphones may be positioned apart from each other and apart from the display. The arrays can be calibrated for or configured with knowledge of the position of the display in order to compensate for offset positions. The display may be part of a portable computer, a gaming console, a handheld smartphone, personal digital assistant, or media player. Alternatively, the display may be a large flat panel television display or computer monitor.

 

In the example shown, the display shows three submarines #109 from a side view progressing through an undersea environment. A user shown as a hand #107 performs air gestures to direct torpedoes #111 at the displayed submarines. The user air gestures are detected by the cameras to execute a command to fire torpedoes. The system uses a gesture library for the undersea environment that contains possible gestures. When the hand performs a gesture, the system compares the observed gesture to the gesture library, finds the closest gesture, then looks up the associated command, such as fire torpedoes.
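A rough sketch of that lookup step might look like the following, assuming a simple nearest-match over feature vectors; the gesture representation and the library contents here are hypothetical, since the filing doesn't specify one.

```python
# Hedged sketch: compare an observed hand pose to a per-environment gesture
# library, pick the closest entry, and return its associated command.
import numpy as np

UNDERSEA_GESTURE_LIBRARY = {
    "pinch_release": (np.array([0.9, 0.1, 0.0]), "fire_torpedo"),
    "open_palm":     (np.array([0.0, 1.0, 0.2]), "pause_game"),
    "swipe_left":    (np.array([0.1, 0.2, 0.9]), "change_view"),
}

def match_gesture(observed, library=UNDERSEA_GESTURE_LIBRARY):
    """Return the name and command of the library gesture closest to the observed pose."""
    best_name, best_command, best_dist = None, None, float("inf")
    for name, (template, command) in library.items():
        dist = np.linalg.norm(observed - template)
        if dist < best_dist:
            best_name, best_command, best_dist = name, command, dist
    return best_name, best_command

# e.g. match_gesture(np.array([0.85, 0.15, 0.05])) -> ("pinch_release", "fire_torpedo")
```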

 


 

In Intel's patent FIG. 1B noted below we're able to see the same camera and microphone arrays and the same submarines. However, in FIG. 1B the submarines are viewed from the top, for example from the surface of the water or from a shallow depth looking down towards the submarines. The user #107 is performing the same air gesture, which instead results in the release of depth charges #113 down toward the submarines.

 

As can be seen, depending on whether the view of the submarines is from the side as in FIG. 1A or from the top as in FIG. 1B, the same finger pinch-release gesture as illustrated can result in different actions. In the example of FIG. 1A, the user gesturing from the side can make a throw gesture with a pinch and release to cause a torpedo to travel towards the targets. In FIG. 1B the same pinch-release can cause depth charges to be dropped toward the targets on the screen.

 

While the gestures are the same, the system can check whether the current view is from the side or from the top and accordingly interpret the gesture as a release of torpedoes or as a release of depth charges. As a result, a user can use an intuitive gesture that is simple to perform to cause different commands to be executed by the system.
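A minimal sketch of that view-dependent mapping, assuming a simple (gesture, view) lookup table invented for illustration, could be:

```python
# Hedged sketch: the same recognized gesture resolves to different game
# commands depending on the currently active camera view.
VIEW_COMMANDS = {
    ("pinch_release", "side_view"): "fire_torpedo",       # FIG. 1A behavior
    ("pinch_release", "top_view"):  "drop_depth_charge",  # FIG. 1B behavior
}

def resolve_command(gesture_name, current_view):
    """Map a recognized gesture to a command for the active view."""
    return VIEW_COMMANDS.get((gesture_name, current_view), "no_op")
```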

 


In Intel's patent FIG. 1C noted below we're able to see the same two displays side-by-side. In the illustrated example, both displays have a camera and microphone array; however, a single camera and microphone array may be used. These arrays may be connected to either display or located in a different position. In this example, displays #101a and #101b show the same three submarines: one shows the submarines #109a from the side while the other shows the submarines #109b from the top. The user can either throw torpedoes or drop depth charges on the same submarines depending upon which screen is being used, or is current or active, at the time.

 

As shown, the environment presents two displays that present the same three submarines simultaneously. The gesture, such as a pinch-release gesture, does not indicate which display the user intends, so the system does not know whether to produce the torpedo command or the depth charge command. In this example, the camera array on one or both of the screens can determine which screen the user intends. For example, by tracking the user's face, eye focus, or voice direction, the system can determine which screen the user is focusing attention on and then activate the corresponding command for that screen.
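One plausible way to express that selection step, purely as an illustration: pick whichever display the user's gaze ray passes closest to, then resolve the gesture against that screen's view. The screen positions and the gaze-ray test are assumptions; the patent only says face, eye focus, or voice direction may be tracked.

```python
# Hedged sketch: choose the screen the user is attending to, then reuse the
# view-dependent command mapping from the earlier sketch.
import numpy as np

SCREENS = {
    "display_a": {"center": np.array([-0.5, 0.0, 2.0]), "view": "side_view"},
    "display_b": {"center": np.array([ 0.5, 0.0, 2.0]), "view": "top_view"},
}

def active_screen(head_position, gaze_direction):
    """Return the screen whose center lies closest to the user's gaze ray."""
    best_name, best_offset = None, float("inf")
    for name, screen in SCREENS.items():
        to_screen = screen["center"] - head_position
        # perpendicular distance from the screen center to the gaze ray
        offset = np.linalg.norm(np.cross(gaze_direction, to_screen)) / np.linalg.norm(gaze_direction)
        if offset < best_offset:
            best_name, best_offset = name, offset
    return best_name

# Then: resolve_command("pinch_release", SCREENS[active_screen(head, gaze)]["view"])
```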

 


In Intel's patent FIG. 2B noted below we're able to see that an additional screen #131 has been added. This screen is shown as a portable device such as a smartphone or portable gaming system.

 

The smartphone or tablet's smaller display is placed, in this example, in front of the main large display. The system can determine the position of the smaller screen and present the portion of the three-dimensional space that lies in the plane of the small screen. So for example in FIG. 2B, the user has thrown a spaceship #127 toward the planet #121 and in particular at the target #125 on that planet. After the spaceship has been thrown, it first appears on the smaller screen.

 


As shown, we're able to see object #129 on the smaller screen that isn't visible on the main screen. This object #129 takes the form of another moon, which can exert a gravitational or other force on the spaceship #127. As the spaceship continues through the three-dimensional space, it will leave the smaller display and after some time show up on the large display.

 

The addition of the small screen adds a new dimension to this particular type of game play. The main camera array or some other proximity sensing system can determine the position of the small screen in real time. The user can then move the small screen around to see objects that are not displayed on the main screen. As a result, upon throwing a spaceship #127 in the example of FIG. 2A, if the course and velocity of the spaceship are significantly altered, the user can use the small screen to find which objects have influenced its path and compensate accordingly. The small screen can be moved around in different planes along the z-axis to see what is in front of the large screen. A similar approach can be used to see what is beside or behind the large screen.
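As a rough illustration of that plane-slicing idea, here's a hedged sketch that shows only the objects lying near the tracked plane of the handheld screen, so moving the device along the z-axis would reveal objects (like the hidden moon #129) that the main display doesn't show. The object coordinates and slab thickness are assumptions, not values from the patent.

```python
# Hedged sketch: cull the scene to objects within a thin slab around the
# handheld screen's tracked plane.
import numpy as np

SCENE_OBJECTS = {
    "planet":      np.array([0.0, 0.0, -8.0]),
    "hidden_moon": np.array([1.0, 0.5, -1.0]),   # sits in front of the main screen
    "spaceship":   np.array([0.5, 0.2, -1.2]),
}

def visible_on_small_screen(screen_origin, screen_normal, slab=0.5):
    """Return the objects lying within `slab` of the small screen's plane."""
    n = screen_normal / np.linalg.norm(screen_normal)
    visible = []
    for name, pos in SCENE_OBJECTS.items():
        depth = np.dot(pos - screen_origin, n)   # signed distance to the plane
        if abs(depth) <= slab:
            visible.append(name)
    return visible

# Holding the device at z = -1 reveals the moon and the passing spaceship:
# visible_on_small_screen(np.array([0.0, 0.0, -1.0]), np.array([0.0, 0.0, 1.0]))
```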

 

I could definitely see this aspect of the RealSense camera technology adding a very cool dimension to gaming in the future.

 

RealSense Camera System Overview

 

(Image: Overview of Intel's RealSense camera system)

Intel filed their latest patent application in Q4 2015, while their work on this began several years ago. Considering that this is a patent application, it's unknown at this time when Intel's RealSense advancements will reach the market or which gaming companies will be working with Intel to advance gaming.

 
