
Apple Patent Describes Tangibility Visualization of Virtual Objects within Various Headset Environments

1 Cover Apple patent headset environments


Today the US Patent & Trademark Office published a patent application from Apple that relates to a computer-generated reality environment, and more specifically to techniques for providing tangibility visualization of virtual objects within a computer-generated reality environment presented in a headset. Apple distinguishes between computer-generated reality (CGR) environments such as VR, AR, MR (mixed reality) and AV (augmented virtuality), any of which could be supported in a future Apple headset (or a range of headsets).


Apple begins by noting that computers can completely project, or partially superimpose, computer-generated images on a user's view to provide a computer-generated reality environment that the user can experience. A computer-generated reality environment can be based on different types of realities.


A headset optionally detects the user's real-world movements and projects or simulates those movements within a series of visual images or video of the computer-generated reality environment.


Through these movements projected or simulated within the computer-generated reality environment, the user can interact with objects within the computer-generated reality environment.


Apple's invention covers techniques for providing tangibility visualization of virtual objects within a computer-generated reality (CGR) environment, where the CGR environment provides a user interacting with the CGR environment with a realistic and immersive experience.


Because the experience is realistic and immersive, the user can easily mistake a virtual (and thus intangible) object within the CGR environment for a real, tangible object that exists outside of the CGR environment.


Thus, the described techniques enhance user convenience and safety when interacting with a CGR environment: they enable the user to quickly and easily recognize, visually, whether an object in the CGR environment is an intangible virtual object or corresponds to a real, and thus tangible, object in the real environment.


The techniques are not limited to a particular type of CGR environment; tangibility visualization can be implemented in any type of CGR environment, including environments based on mixed reality and environments based on virtual reality.
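As a loose illustration of the core idea, a sketch (with entirely hypothetical names; the patent discloses no code) might tag each scene object with whether it corresponds to a real physical object and render a different visual cue for intangible ones:

```python
from dataclasses import dataclass

# Hypothetical sketch of tangibility visualization; all names here are
# illustrative and not taken from Apple's patent.
@dataclass
class SceneObject:
    name: str
    tangible: bool  # True if the object maps to a real, physical object

def tangibility_cue(obj: SceneObject) -> str:
    # Give the user a visual cue distinguishing real-backed objects
    # from purely virtual (intangible) ones.
    return f"{obj.name}: {'solid outline' if obj.tangible else 'translucent glow'}"

scene = [SceneObject("table", True), SceneObject("virtual vase", False)]
for obj in scene:
    print(tangibility_cue(obj))
```

In a real renderer the "cue" would of course be a graphical treatment (an outline, glow or badge) rather than a text label; the point is only that the tangibility flag drives a visible distinction.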


A virtual reality (VR) environment (or virtual environment) refers to a simulated environment that is designed to be based entirely on computer-generated sensory inputs for one or more senses. A VR environment comprises a plurality of virtual objects with which a person may sense and/or interact. For example, computer-generated imagery of trees, buildings, and avatars representing people are examples of virtual objects. A person may sense and/or interact with virtual objects in the VR environment through a simulation of the person's presence within the computer-generated environment, and/or through a simulation of a subset of the person's physical movements within the computer-generated environment.


A mixed reality (MR) environment refers to a simulated environment that is designed to incorporate sensory inputs from the physical environment. On a virtuality continuum, a mixed reality environment is anywhere between, but not including, a wholly physical environment at one end and a virtual reality environment at the other end.

Examples of mixed realities include augmented reality and augmented virtuality.


An augmented reality (AR) environment refers to a simulated environment in which one or more virtual objects are superimposed over a physical environment, or a representation thereof. For example, an electronic system for presenting an AR environment may have a transparent or translucent display through which a person may directly view the physical environment. The system may be configured to present virtual objects on the transparent or translucent display, so that a person, using the system, perceives the virtual objects superimposed over the physical environment.


An augmented virtuality (AV) environment refers to a simulated environment in which a virtual or computer generated environment incorporates one or more sensory inputs from the physical environment. The sensory inputs may be representations of one or more characteristics of the physical environment. For example, an AV park may have virtual trees and virtual buildings, but people with faces photorealistically reproduced from images taken of physical people. As another example, a virtual object may adopt a shape or color of a physical article imaged by one or more imaging sensors. As a further example, a virtual object may adopt shadows consistent with the position of the sun in the physical environment.
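The AV examples above (a virtual object adopting a sensed color, or shadows matching the real sun's position) all amount to feeding physical-world sensor readings into virtual-object properties. A minimal, purely hypothetical sketch of that mapping:

```python
# Purely illustrative: a virtual object adopting characteristics sensed
# from the physical environment, as in augmented virtuality (AV).
# None of these names come from Apple's patent.

def apply_sensed_inputs(virtual_obj: dict, sensed: dict) -> dict:
    # Copy selected physical-world measurements onto the virtual object.
    updated = dict(virtual_obj)
    if "rgb" in sensed:
        updated["color"] = sensed["rgb"]  # e.g. sampled by an image sensor
    if "sun_elevation_deg" in sensed:
        # Longer shadows when the (real) sun sits lower in the sky.
        elevation_frac = max(sensed["sun_elevation_deg"] / 90.0, 0.1)
        updated["shadow_length"] = round(1.0 / elevation_frac, 2)
    return updated

tree = {"name": "virtual tree", "color": (0, 128, 0)}
print(apply_sensed_inputs(tree, {"rgb": (90, 110, 60), "sun_elevation_deg": 30}))
```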


There are many different types of electronic systems that enable a person to sense and/or interact with various CGR environments. Examples include head-mounted systems, projection-based systems, heads-up displays (HUDs), vehicle windshields with integrated display capability, windows with integrated display capability, and displays formed as lenses designed to be placed on a person's eyes (similar to contact lenses). A head-mounted system may have one or more speakers and an integrated opaque display.


Rather than an opaque display, a head mounted system may have a transparent or translucent display. The transparent or translucent display may have a medium through which light representative of images is directed to a person's eyes.


The display may utilize digital light projection, OLEDs, LEDs, uLEDs, liquid crystal on silicon, laser scanning light source, or any combination of these technologies. The medium may be an optical waveguide, a hologram medium, an optical combiner, an optical reflector, or any combination thereof. In one example, the transparent or translucent display may be configured to become opaque selectively.


Projection-based systems may employ retinal projection technology that projects graphical images onto a person's retina. Projection systems also may be configured to project virtual objects into the physical environment, for example, as a hologram or on a physical surface.


Apple's patent FIGS. 1A and 1B below depict exemplary system #100 for use in various computer-generated reality technologies.


In some examples, as illustrated in FIG. 1A, system #100 includes device #100a. Device #100a includes various components, such as processor(s) #102, RF circuitry #104, memory #106, image sensor(s) #108, orientation sensor(s) #110, microphone(s) #112, location sensor(s) #116, speaker(s) #118, display(s) #120, and touch-sensitive surface(s) #122. These components optionally communicate over communication bus(es) #150 of device #100a.


2 Apple patent FIGS 1A AND 1B


In some examples, elements of system #100 are implemented in a base station device (e.g., a computing device, such as a remote server, mobile device, or laptop) and other elements of the system #100 are implemented in a head-mounted display (HMD) device designed to be worn by the user, where the HMD device is in communication with the base station device.


As illustrated in FIG. 1B above, in some examples, system #100 includes two (or more) devices in communication, such as through a wired connection or a wireless connection (HMD + MacBook or iPhone or iPad).


As an overview, Apple's patent FIGS. 2A-2D below illustrate an exemplary technique for providing visual feedback indicating tangibility in a CGR environment that includes only virtual objects.


3 Apple patent FIG. 2A, MR patent


Apple's patent FIG. 4A below is a flow diagram illustrating a method for providing visual feedback indicating tangibility within a CGR environment.


4 Apple patent FIG. 4A flow chart for CGR environments


Apple's patent application that was published today by the U.S. Patent Office was filed back in Q2 2019 with work dating back to Q2 2018. Considering that this is a patent application, the timing of such a product to market is unknown at this time.


Apple Inventors


Alexis Palangie: Senior Software Engineer. Palangie previously worked for Oculus VR, Ubisoft and LucasArts.   


Avi Bar-Zeev: Sr. Manager, Prototype Development. Bar-Zeev reportedly co-created Microsoft's HoloLens and worked on other major VR and AR projects. He left Apple in January, according to Variety.



