
Apple Invents a Peripheral Device for a Future Headset that will Enable Users to Move Objects via a Virtual Trackpad

(Cover image: Apple's HMD system)

 

Today the US Patent & Trademark Office published a patent application from Apple that generally relates to computer-generated reality (CGR) environments, and more specifically to techniques for remote touch detection.

 

Apple first lays out what the invention is meant to accomplish. Apple notes that CGR environments are environments where some objects displayed for a user's viewing are generated by a computer. A user can interact with these virtual objects by activating hardware buttons or touching touch-enabled hardware. However, such techniques for interacting with virtual objects can be cumbersome and non-intuitive for a user.

 

Apple's invention covers techniques for remote touch detection using a system of multiple devices, including a peripheral device that is placed on a physical surface such as the top of a table. With these techniques, a user can interact with virtual objects by performing touches on a physical surface.

 

In some embodiments, a method comprises the use of camera sensors to obtain information about a virtual object or an augmented object seen through the head-mounted display (HMD) device. A second method has the system provide a virtual trackpad that allows users to move objects seen in the mixed reality headset and to use gestures like swiping or tapping.
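The patent doesn't include code, but a minimal Swift sketch helps illustrate the trackpad concept: classify a finished stroke of detected touch points as a tap or a swipe. All of the names here (TouchSample, classifyGesture) and the thresholds are hypothetical, not Apple's.

```swift
import Foundation
import CoreGraphics

// Hypothetical touch sample: a 2D point on the tracked surface plus a timestamp.
struct TouchSample {
    let point: CGPoint      // position in surface coordinates
    let time: TimeInterval  // when the touch was detected
}

enum TrackpadGesture {
    case tap
    case swipe(dx: CGFloat, dy: CGFloat)
}

// Classify a finished touch stroke: short and nearly stationary -> tap,
// otherwise report the net displacement as a swipe.
func classifyGesture(_ samples: [TouchSample],
                     maxTapDuration: TimeInterval = 0.3,
                     maxTapDistance: CGFloat = 10) -> TrackpadGesture? {
    guard let first = samples.first, let last = samples.last else { return nil }
    let dx = last.point.x - first.point.x
    let dy = last.point.y - first.point.y
    let distance = (dx * dx + dy * dy).squareRoot()
    let duration = last.time - first.time
    if duration <= maxTapDuration && distance <= maxTapDistance {
        return .tap
    }
    return .swipe(dx: dx, dy: dy)
}
```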

 

Apple could also offer an alternative to a camera system by using infrared sources.

 

In the case of using cameras, Apple notes that system #100, illustrated in the patent figures below, can include one or more visible-light image sensors, such as charge-coupled device (CCD) sensors and/or complementary metal-oxide-semiconductor (CMOS) sensors, operable to obtain images of physical objects from the physical environment.

 

In the case of using infrared (IR) sensor(s), Apple notes that the peripheral device could use a passive IR sensor or an active IR sensor, for detecting infrared light from the physical environment. For example, an active IR sensor includes an IR emitter, such as an IR dot emitter, for emitting infrared light into the physical environment.
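To make the active-IR idea concrete, here's a hedged Swift sketch of one plausible detection step: compare the dots the IR camera actually sees against the pattern the emitter projects, and treat missing dots as candidate touch locations. The dot-grid representation, names, and threshold are assumptions, not details from the filing.

```swift
import CoreGraphics

// Hypothetical detector: the emitter projects a known grid of IR dots onto the
// table; a finger touching the surface occludes the dots beneath it.
struct IRDotPattern {
    let expectedDots: [CGPoint]   // dot positions the emitter projects
}

// Return expected dots that have no observed dot nearby -- likely occluded
// by a fingertip and therefore candidate touch locations.
func occludedDots(expected: IRDotPattern,
                  observed: [CGPoint],
                  matchRadius: CGFloat = 3.0) -> [CGPoint] {
    expected.expectedDots.filter { dot in
        !observed.contains { seen in
            let dx = seen.x - dot.x
            let dy = seen.y - dot.y
            return (dx * dx + dy * dy).squareRoot() <= matchRadius
        }
    }
}
```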

 

In some embodiments, the system uses image sensor(s) to receive user inputs, such as hand gestures. In some embodiments, the system uses image sensor(s) to detect the position and orientation of the system and/or display(s) in the physical environment.

 

For example, the system uses image sensor(s) to track the position and orientation of display(s) relative to one or more fixed objects in the physical environment. In some embodiments, the system uses image sensor(s) that are inward facing (e.g., facing the user) for gaze tracking and/or hand motion tracking, which can be used, for example, to control a user's avatar.
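Apple doesn't say how the hand tracking is implemented, but its shipping Vision framework already exposes hand-pose detection that a system like this could plausibly build on. The sketch below uses the real VNDetectHumanHandPoseRequest API to pull an index-fingertip position from a camera frame; treat it as illustrative rather than as the patent's method.

```swift
import Vision
import CoreVideo
import CoreGraphics

// Extract the index fingertip position (normalized image coordinates) from a
// camera frame using Vision's hand-pose detector.
func indexFingertip(in pixelBuffer: CVPixelBuffer) -> CGPoint? {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    guard (try? handler.perform([request])) != nil,
          let observation = request.results?.first,
          let tip = try? observation.recognizedPoint(.indexTip),
          tip.confidence > 0.5 else {
        return nil
    }
    // Vision reports points in a normalized space with origin at bottom-left.
    return tip.location
}
```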

 

Apple explains that a mixed reality system is one that can include both augmented reality and augmented virtuality.

 

An augmented reality (AR) environment refers to a simulated environment in which one or more virtual objects are superimposed over a physical environment, or a representation thereof. For example, an electronic system for presenting an AR environment may have a transparent or translucent display through which a person may directly view the physical environment.

 

The system may be configured to present virtual objects on the transparent or translucent display, so that a person, using the system, perceives the virtual objects superimposed over the physical environment.

 

Alternatively, a system may have an opaque display and one or more imaging sensors that capture images or video of the physical environment, which are representations of the physical environment. The system composites the images or video with virtual objects, and presents the composition on the opaque display. A person, using the system, indirectly views the physical environment by way of the images or video of the physical environment, and perceives the virtual objects superimposed over the physical environment.
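The compositing step described here is, at heart, an ordinary "source over" alpha blend of rendered virtual content onto the camera frame. A minimal Swift sketch, assuming premultiplied-alpha RGBA float buffers (the buffer layout is my assumption, not the patent's):

```swift
// Composite a rendered virtual layer over a camera (pass-through) frame.
// Both buffers are premultiplied-alpha RGBA, one Float per channel, same size.
func compositeOver(virtualLayer: [Float], cameraFrame: [Float]) -> [Float] {
    precondition(virtualLayer.count == cameraFrame.count)
    var output = [Float](repeating: 0, count: cameraFrame.count)
    let pixels = cameraFrame.count / 4
    for p in 0..<pixels {
        let i = p * 4
        let alpha = virtualLayer[i + 3]          // virtual layer's alpha
        for c in 0..<4 {
            // Standard "source over" blend: virtual + (1 - alpha) * camera.
            output[i + c] = virtualLayer[i + c] + (1 - alpha) * cameraFrame[i + c]
        }
    }
    return output
}
```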

 

Apple further notes that a video of the physical environment shown on an opaque display is called "pass-through video," meaning a system uses one or more image sensor(s) to capture images of the physical environment, and uses those images in presenting the AR environment on the opaque display.

 

In addition, the system may have a projection system that projects virtual objects into the physical environment, for example, as a hologram or on a physical surface, so that a person, using the system, perceives the virtual objects superimposed over the physical environment.

 

There are many different types of electronic systems that enable a person to sense and/or interact with various CGR environments. Examples include head mounted systems, projection-based systems, heads-up displays (HUDs), vehicle windshields having integrated display capability, windows having integrated display capability, displays formed as lenses designed to be placed on a person's eyes (e.g., similar to contact lenses), headphones/earphones, speaker arrays, input systems (e.g., wearable or handheld controllers with or without haptic feedback), smartphones, tablets, and desktop/laptop computers.

 

Apple's patent FIGS. 1A-1B below depict exemplary systems for use in various CGR technologies; FIG. 2 illustrates an example of a system comprising a head-mounted display device and a peripheral device for enabling remote touch detection.

 

(Apple's patent FIGS. 1A, 1B & 2: HMD system)

 

More specifically, Apple's patent FIG. 2 above illustrates system #200, including peripheral device #200A and head-mounted display (HMD) device #200B.

 

The peripheral device includes camera sensor(s) #210 (e.g., image sensor(s) #108) and motion sensor(s) (e.g., orientation sensor(s) #110). Additionally, HMD device #200B can itself be an embodiment of system #100.

 

Apple's system #200, including peripheral device #200A and HMD device #200B, enables accurate remote touch detection on a desktop or general table surface #206 in order to interact with (e.g., control, manipulate, activate, select) displayed UI elements in a CGR environment displayed using the HMD device.

 

In some embodiments, system #200 forms a three-angle system (e.g., one camera sensor on peripheral device #200A and two camera sensors on the HMD).
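The filing doesn't spell out the math, but a multi-camera arrangement like this would typically recover a 3D touch point by triangulating rays from two or more cameras. Here's a hedged Swift sketch using Apple's simd types; the midpoint-of-closest-approach method is a standard computer-vision technique, not necessarily Apple's.

```swift
import simd

// Triangulate a 3D point from two camera rays (origin + direction) by finding
// the midpoint of the shortest segment connecting the two rays.
func triangulate(origin1: SIMD3<Float>, dir1: SIMD3<Float>,
                 origin2: SIMD3<Float>, dir2: SIMD3<Float>) -> SIMD3<Float>? {
    let d1 = simd_normalize(dir1)
    let d2 = simd_normalize(dir2)
    let r = origin1 - origin2
    let a = simd_dot(d1, d1)      // = 1 after normalization
    let b = simd_dot(d1, d2)
    let c = simd_dot(d2, d2)      // = 1 after normalization
    let d = simd_dot(d1, r)
    let e = simd_dot(d2, r)
    let denom = a * c - b * b
    guard abs(denom) > 1e-6 else { return nil }  // rays are parallel
    let t1 = (b * e - c * d) / denom   // parameter along ray 1
    let t2 = (a * e - b * d) / denom   // parameter along ray 2
    let p1 = origin1 + t1 * d1
    let p2 = origin2 + t2 * d2
    return (p1 + p2) / 2               // midpoint of closest approach
}
```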

 

In some embodiments, the system #200 includes peripheral device #200A (e.g., with no camera sensors) that emits infrared light to be detected by camera sensor(s) on the HMD.

 

In some embodiments, the system excludes the peripheral device and relies on the HMD device to perform remote touch detection using depth-sensing technologies.
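In that peripheral-free variant, touch detection essentially reduces to deciding whether a tracked fingertip is close enough to the tabletop plane. A minimal Swift sketch under that assumption; the plane-plus-threshold model and the 1 cm value are mine, not the patent's.

```swift
import simd

// Detect a "touch" by measuring how far a tracked fingertip is above the
// table plane (defined by any point on it and its unit normal).
func isTouching(fingertip: SIMD3<Float>,
                planePoint: SIMD3<Float>,
                planeNormal: SIMD3<Float>,
                threshold: Float = 0.01) -> Bool {   // 1 cm, an assumed value
    let height = simd_dot(fingertip - planePoint, simd_normalize(planeNormal))
    return abs(height) <= threshold
}
```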

 

Apple's patent FIG. 3 below illustrates an example of a notification that prompts the user to correct an error condition; FIGS. 4-5 illustrate an example of remote touch detection in a CGR environment.

 

(Apple's patent FIGS. 3, 4 & 5)

 

Further, Apple's patent FIG. 3 above illustrates the perspective from which the user views the CGR environment via a transparent or translucent display of the HMD. In some embodiments, the display of the HMD is opaque, and the user views the physical environment using pass-through video.

 

Apple's patent FIGS. 6 and 7 below illustrate additional examples of remote touch detection in a CGR environment. Here you can see how the user employs a virtual trackpad to rotate the photo from sideways to upright with a simple hand gesture.
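As a rough illustration of how such a gesture might drive the rotation, here's a hedged Swift sketch that maps a horizontal trackpad swipe to a rotation about the view axis using simd quaternions; the mapping and gain are assumptions for illustration only.

```swift
import simd

// Map a horizontal swipe on the virtual trackpad to a rotation of the photo
// about the view (z) axis. `gain` converts swipe distance to radians.
func rotation(forSwipeDX dx: Float, gain: Float = 0.01) -> simd_quatf {
    simd_quatf(angle: dx * gain, axis: SIMD3<Float>(0, 0, 1))
}

// Example: compose with the photo's current orientation.
// let newOrientation = rotation(forSwipeDX: 157) * currentOrientation
```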

 

(Apple's patent FIGS. 6, 7, 11 & 12)

 

Apple's patent FIGS. 11-12 above illustrate an alternative example of a system comprising a head-mounted display device and a peripheral device that enables remote touch detection via infrared emitters instead of cameras to detect objects.

 

More specifically, Apple notes that "in some embodiments, peripheral device #200A includes infrared emitter(s) #220, which emit infrared light to enable remote touch detection."

 

Apple's patent application, published today by the U.S. Patent & Trademark Office, was filed back in Q3 2019. Considering that this is a patent application, the timing of such a product coming to market is unknown at this time.

 

Some of the inventors include Sam Iglesias, Senior Software Engineer, AR/VR; Rohit Sethi, Engineering Manager specializing in deep learning and computer vision; and Lejing Wang, Tech Lead/Engineering Manager.

 

