Apple is working on an XR game-playing system using FaceTime that will allow multiple users to participate using various devices
Last Thursday the US Patent & Trademark Office published a patent application from Apple that relates to using sensor data to provide shared experiences via multiple electronic devices. More specifically, the patent covers playing card, board, or dice games between multiple users online, providing an XR experience on iPhones, iPads, Vision Pro and other projection systems. It appears that the game play may be a part of a future version of FaceTime that Apple refers to as a "communication session."
Tracking Objects with Fiducial Markers in Multiple Environments to Provide Shared Experiences
Apple's patent covers devices, systems, and methods that provide shared extended reality (XR) experiences in which two or more users interact with their own sets of physical objects (e.g., cards, game pieces, dice, chips, etc.) during the shared experiences.
Each user may have multiple physical objects and each of those physical objects may have the same generic shape and size but have a unique fiducial marker. The unique fiducial marker of each physical object can be assigned to represent one of multiple virtual content items and enable physical interaction with virtual content items.
The system may identify generic physical objects that will be used by each of the users and associate those objects with specific virtual content items to enable the users to interact with the virtual content items via the physical objects.
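The assignment step described above can be sketched as a simple lookup table from marker IDs to virtual content. This is an illustrative model only, not code from the patent; the class and method names are assumptions.

```python
# Hypothetical sketch of the marker-to-content assignment the patent
# describes: each generic physical object carries a unique fiducial
# marker ID, and the shared session assigns each ID a virtual content
# item (e.g., a specific playing card).

class SharedSession:
    """Minimal model of a shared XR session (names are assumptions)."""

    def __init__(self):
        self.assignments = {}  # marker_id -> virtual content item

    def assign(self, marker_id, virtual_item):
        """Associate a generic object's marker with a virtual item."""
        self.assignments[marker_id] = virtual_item

    def content_for(self, marker_id):
        # When a sensor detects a marker, look up the virtual item
        # that should be rendered over the physical object.
        return self.assignments.get(marker_id)


session = SharedSession()
session.assign("125a", "ace_of_spades")
session.assign("125b", "king_of_hearts")

print(session.content_for("125a"))  # ace_of_spades
```

Because the physical objects are generic, the same deck of blank marked cards could be reassigned to entirely different virtual content from one game to the next.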
Apple's patent FIGS. 1 and 2 below illustrate electronic devices (iPhones) #105 and #205 that are involved in a communication session (FaceTime) with one another to provide a shared XR experience to the two devices' users #110 & #210.
The Basic Mechanics of the Patent
In Apple's patent FIG. 1 below, the iPhone is operating in physical environment #100, a room that includes a table (#140) and a set of cards (e.g., cards #120a-d and the other cards in the card piles #130a-b). The cards #120a-d and the other cards in the card piles (#130a-b) include fiducial markers which, in this example, are illustrated as unique dot patterns: card #120a includes one or more depictions of a unique fiducial marker #125a, card #120b includes one or more depictions of a unique fiducial marker #125b, card #120c includes marker #125c, card #120d includes marker #125d, and so on.
The unique fiducial markers may be depicted on only one side or on both sides (e.g., front and back) of each card. Any type of fiducial marker may be used including, but not limited to, alphanumeric characters, symbols, patterns, bar codes, or other codes that can be detected via a sensor. Fiducial markers may be visible or invisible (e.g., detectable via IR detection, etc.).
Fiducial markers may be one dimensional, two dimensional, or three dimensional. Fiducial markers may utilize unique colors or color combinations. Fiducial markers may utilize unique shapes, sizes, or other appearance attributes of physical objects. The fiducial markers may include read direction indicators and/or be omnidirectional (e.g., capable of being interpreted without identifying direction/orientation).
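The "omnidirectional" property mentioned above can be illustrated with a small sketch: one well-known way to make a pattern readable from any orientation is to map every rotation of its encoding to a single canonical ID. This is a generic technique for illustration, not the patent's specific encoding.

```python
# Illustrative sketch (not from the patent): a marker is
# "omnidirectional" if every rotation of its encoded pattern decodes
# to the same canonical ID, so read direction never matters.

def canonical_id(bits):
    """Return the lexicographically smallest rotation of a bit string,
    giving one ID for the pattern regardless of scan orientation."""
    rotations = [bits[i:] + bits[:i] for i in range(len(bits))]
    return min(rotations)


# The same physical pattern scanned from two different orientations
# still yields one marker ID:
print(canonical_id("0110"))  # 0011
print(canonical_id("1001"))  # 0011 -> same marker
```

A marker with read-direction indicators would instead include an anchor feature so the decoder can also recover the object's orientation, not just its identity.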
In patent FIG. 1 below, the iPhone includes one or more cameras, microphones, depth sensors, or other sensors that can be used to capture information about and evaluate the physical environment and the objects within it, as well as information about the user of the iPhone. The information about the physical environment and/or user may be used to provide visual and audio content, for example, during a shared extended reality (XR) experience provided during a communication session involving one or more other devices.
For example, a communication session may provide views to one or more participants of a 3D environment that is generated based on camera images and/or depth sensor images of the physical environment #100 as well as representations of user #110 based on camera images and/or depth sensor images of the user. The sensor data may be used to identify fiducial markers and/or the 3D positions of and/or movement of objects that have fiducial markers.
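The tracking step described above can be sketched as folding a stream of per-frame detections into each marker's latest 3D position. The data shapes here are assumptions chosen for illustration; the patent does not specify a representation.

```python
# Hedged sketch of the tracking step: sensor frames report detected
# marker IDs with 3D positions, and the session keeps each object's
# latest pose so its movement can be shared with the other device.

def track(frames):
    """Fold (marker_id, position) detections into the latest known
    3D position per marker."""
    positions = {}
    for marker_id, pos in frames:
        positions[marker_id] = pos  # later frames overwrite earlier ones
    return positions


frames = [
    ("125a", (0.10, 0.00, 0.50)),
    ("125b", (0.20, 0.00, 0.50)),
    ("125a", (0.15, 0.00, 0.45)),  # card #120a was moved
]
print(track(frames)["125a"])  # (0.15, 0.0, 0.45)
```

In a real system each detection would also carry orientation and a timestamp, and positions from both users' environments would be mapped into the shared XR scene.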
The iPhones provide their respective users (#110 & #210) with views of an XR environment. In contrast to a physical environment that people can sense and/or interact with without aid of electronic devices, an extended reality (XR) environment refers to a wholly or partially simulated environment that people sense and/or interact with via an electronic device.
For example, the XR environment may include augmented reality (AR) content, mixed reality (MR) content, virtual reality (VR) content, and/or the like. With an XR system, a subset of a person's physical motions, or representations thereof, are tracked, and, in response, one or more characteristics of one or more virtual objects simulated in the XR environment are adjusted in a manner that comports with at least one law of physics.
As one example, the XR system may detect head movement and, in response, adjust graphical content and an acoustic field presented to the person in a manner similar to how such views and sounds would change in a physical environment.
There are many different types of electronic systems that enable a person to sense and/or interact with various XR environments. Examples include, but are not limited to, smartphones, tablets, desktop/laptop computers, head mountable systems, projection-based systems, heads-up displays (HUDs), vehicle windshields having integrated display capability, windows having integrated display capability, displays formed as lenses designed to be placed on a person's eyes (e.g., similar to contact lenses), headphones/earphones, speaker arrays, and input systems (e.g., wearable or handheld controllers with or without haptic feedback).
A head mountable system may be configured to accept an external opaque display (e.g., a smartphone). The head mountable system may incorporate one or more imaging sensors to capture images or video of the physical environment, and/or one or more microphones to capture audio of the physical environment.
Alternatively, a head mountable system may have a transparent or translucent display with a medium through which light representative of images is directed to a person's eyes. The medium may be an optical waveguide, a hologram medium, an optical combiner, an optical reflector, or any combination thereof. In some implementations, the transparent or translucent display may be configured to become opaque selectively.
Projection-based systems may employ retinal projection technology that projects graphical images onto a person's retina. Projection systems also may be configured to project virtual objects into the physical environment, for example, as a hologram or on a physical surface.
For full details, review Apple's patent application 20230293998.