A Granted U.S. Patent Reveals Microsoft's Secret Project for VR Gaming with Sophisticated Arm-Worn Multi-Faceted Controllers
When it comes to VR gaming accessories, Microsoft promotes the Oculus and HP Reverb headsets and controllers. While The Verge reported in March that VR wasn't a focus for the Xbox team, that hasn't stopped one of Microsoft's engineering teams from patenting next-gen arm-worn controllers that are far more sophisticated than what's on the market today. Microsoft filed the patent back in March 2020, and it has already been granted in 2021. Whether Microsoft is aiming to advance VR gaming first on PCs and then on the Xbox console is unknown at this time.
In the patent's background section, Microsoft notes that in real life, humans tend to use their hands to interact with objects. They reach out for such objects, touch, grasp, manipulate, and release them. In augmented reality (AR) and/or virtual reality (VR), however, such fine-grained interaction with virtual objects is generally not possible today. For instance, AR/VR headsets may track user hand positions, but they cannot provide haptic feedback to the user's hands.
Hand-held controllers have been developed for AR and VR scenarios to mimic real world interactions (e.g., to provide positional information for the user's hand and/or to provide haptic feedback). Hand-held controllers exist in a variety of shapes and can perform a range of functions. While most of them track three-dimensional (3D) motion, simple controllers are designed merely for movement and button-based input.
More advanced controllers can include complex controls and provide output to the user. While most commercial devices provide only vibrotactile feedback, researchers have demonstrated a wide variety of hand-held controllers rendering texture, shape, grasp and squeeze feedback, shifting weight, and haptic behavior for two-handed use. While the capabilities of these controllers vary, an unfortunate commonality is that the user basically has to hold them all the time, or interrupt the AR/VR experience to put them down when not needed and pick them up again when needed.
Thus, one problem with hand-held controllers is that the user must grasp them constantly, thereby impeding the natural use of other objects in the physical world. Particularly in VR, where a virtual environment substitutes one's view of the real world, users must often employ controllers for all virtual interactions. When the real world intrudes, it is slow and cumbersome to repeatedly pick up and put down controllers.
Another popular category is glove-type controllers, but since these are worn, the user cannot easily disengage from them. Glove-type controllers typically render dexterous feedback, including pressure and vibration to the user's fingertips. However, they still constrain motion and hinder the dexterity needed to use real-world tools or to quickly switch to traditional input devices, like a keyboard. The present concepts can address any of these and/or other issues.
Microsoft's invention covers concepts that relate to devices that include deployable controllers that can be employed by a user in various scenarios including AR and VR scenarios, among others.
The deployable controller can allow the user to tactilely engage virtual objects with their hand(s). The device can be secured to a body part of the user near the hand, such as the forearm.
The deployable controller can be deployed from a storage or stowed orientation to an engagement orientation when engagement is desired and returned when engagement ceases.
Securing the device to the forearm can allow the deployable controller to be grounded to impart forces that cannot be imparted with a strictly hand-held controller. Further, storing the deployable controller can allow the user to use his/her hands in a normal unencumbered manner when the deployable controller is not being used.
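The stow/deploy cycle described above can be sketched as a small state machine. This is purely an illustrative sketch; the class and method names are assumptions, not from the patent:

```python
from enum import Enum, auto

class ControllerState(Enum):
    STOWED = auto()     # engagement assembly rests against the forearm
    DEPLOYED = auto()   # engagement assembly is swung into the user's palm

class DeployableController:
    """Hypothetical model of the patent's stow/deploy cycle."""

    def __init__(self):
        # The device starts stowed, leaving the hand free.
        self.state = ControllerState.STOWED

    def deploy(self):
        # Swing the engagement assembly into the user's palm.
        if self.state is ControllerState.STOWED:
            self.state = ControllerState.DEPLOYED

    def stow(self):
        # Return the engagement assembly to the forearm, freeing the hand.
        if self.state is ControllerState.DEPLOYED:
            self.state = ControllerState.STOWED
```

Because the device is grounded to the forearm, either state leaves the controller attached to the user, which is what distinguishes it from a hand-held controller that must be put down.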
Microsoft's patent FIG. 12 shows a basic overview of the system, which may include one or more devices such as the new arm controller #110, a headset #106, and a base station #102, which could be a gaming console, a desktop PC, and/or other devices such as notebooks, smartphones, tablets and more.
In patent FIG. 1A above we can see system #100, which relates to a VR gaming system, though it could alternatively or additionally be implemented in other use-case scenarios.
In some implementations, the headset may include one or more sensors (not shown in FIG. 1A) for providing inputs to the base station and/or the headset. The sensors may include, for example, accelerometers, gyroscopes, cameras, microphones, etc. The headset is designed to detect objects in the user's surroundings, the position of the user's head, the direction the user's head is facing, whether the user's eyes are open or closed, which direction the user's eyes are looking, and the location of user body parts, such as a hand. The headset can also present data, such as audio and/or visual data, to the user #104.
The example system configuration of FIG. 1A above is only one of the contemplated system configurations. For instance, another system configuration can entail a device 110 that works in cooperation with an audio device, such as earphones.
More importantly, the system may further include a deployable controller device #110, which can include a base assembly #112, a deployment assembly #114, and an engagement assembly #116, which together can function as a deployable controller #118.
The base assembly can be secured to a forearm #122 or upper arm #124 of the user. In some cases, the base assembly can be secured one joint above (e.g., toward the torso) the body part that engages the engagement assembly #116. For instance, in the illustrated configuration above, the engagement assembly is configured to be engaged by the user's hand, and the base assembly can be secured above the wrist to the forearm. You can get a better view of the arm controller in Microsoft's patent FIG. 1B below.
Microsoft further notes that the arm controller (device #110) can also include various positional sensors such as six-axis (e.g., 6-DOF) sensors, inertial measurement units (IMUs), etc. The positional sensors can provide data relating to the location of the device in 3D space (e.g., x, y, and z coordinates), the orientation of the device, rotation, acceleration, etc. The positional sensors #138 can be positioned on multiple assemblies or on a single assembly.
Microsoft's patent FIG. 10 (A&B) below shows the engagement assembly #116 split in half so the interior contents are visible. Looking at FIGS. 1A and 1B in combination with FIG. 10, the engagement assembly can be configured to receive tactile input from a hand of the user and/or to deliver tactile output to the hand of the user.
For instance, the engagement assembly can include various input devices #126 (Shown in FIG. 1A) to detect user inputs. Examples of input devices can include pressure sensors, force sensors, such as strain gauges, capacitive touch sensor electrodes and/or user activatable switches (e.g., triggers), among others. In this implementation, there are four capacitive touch sensor electrodes inside the engagement assembly that can function to distinguish different grasps. This data can be utilized to allow the device to predict the user's intentions.
In this case, there is one area of capacitive touch sensor electrodes facing the palm, which comes into contact first; another around the middle finger to detect when the controller is grabbed; and two patches that the thumb can use as a rough position input device.
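As an illustration, a simple rule-based reading of those four electrode regions might look like the following sketch. The function name, normalized readings, threshold, and classification labels are all assumptions for illustration, not details from the patent:

```python
def classify_grasp(palm, middle_finger, thumb_a, thumb_b, threshold=0.5):
    """Hypothetical grasp classifier for the four capacitive touch
    regions described in the patent: a palm patch, a middle-finger
    patch, and two thumb patches. Readings are normalized to 0..1."""
    palm_touch = palm > threshold
    middle_touch = middle_finger > threshold
    thumb_touch = thumb_a > threshold or thumb_b > threshold

    if palm_touch and middle_touch:
        return "full_grasp"   # hand has closed around the controller
    if palm_touch:
        return "approach"     # palm contact typically occurs first
    if thumb_touch:
        return "thumb_input"  # thumb patches used as rough position input
    return "no_contact"
```

Distinguishing grasp stages this way is one plausible path to the intention prediction the patent mentions: detecting palm contact before the fingers close could let the device anticipate a grab.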
In another case, various sensors could be positioned on the base assembly. Some of these sensors could be configured to sense underlying physiological aspects of the user. For instance, the sensors could sense tendons extending from the fingers into the forearm. Information from the sensors could indicate the position of individual fingers, movement of fingers, the direction of that movement, forces such as grasping forces, etc. Alternatively or additionally, the sensors #140 could include cameras, such as IR depth cameras, to provide locational data about the hand/fingers (including the thumb).
Other sensing implementations are contemplated. For instance, the device #110 could sense more user input and utilize this input to inform its haptic behavior. For example, some implementations can integrate finger tracking around the engagement assembly (e.g., through a self-capacitive array or a wearable camera), so the device could approach the user's palm and fingers during interaction and provide haptic response for dexterous input. This could also allow sensing torque on the lever, which would aid the device's ability to simulate gravity and render resistance to heavy objects.
Microsoft's patent FIGS. 2A and 2B below collectively show a technique for controlling deployment of the engagement assembly of device #110. In this case, the controller #142 can cause deployment upon detecting a specific user gesture representing a user command. In this example, the user gesture is manifest as an upward wrist-flip gesture #202 or a downward wrist flip #204. In this example, the controller can obtain data from positional sensors on the base assembly and/or the engagement assembly. The controller can interpret the data to detect a gesture.
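A minimal sketch of how such a wrist-flip gesture could be detected from gyroscope data follows. The axis convention, threshold value, and function name are illustrative assumptions, not specifics from the patent:

```python
def detect_wrist_flip(gyro_samples, threshold=4.0):
    """Hypothetical wrist-flip detector over a short window of gyroscope
    samples (angular velocity in rad/s about the forearm's flexion axis).
    A strong positive spike is read as an upward flip (deploy command);
    a strong negative spike is read as a downward flip (stow command)."""
    peak = max(gyro_samples)
    trough = min(gyro_samples)

    # Prefer whichever spike is larger in magnitude.
    if peak > threshold and peak >= -trough:
        return "deploy"   # upward wrist flip (gesture #202)
    if trough < -threshold:
        return "stow"     # downward wrist flip (gesture #204)
    return None           # no gesture detected in this window
```

In practice a real implementation would likely fuse data from sensors on both the base and engagement assemblies, as the patent suggests, but a thresholded angular-velocity spike conveys the basic idea.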
Microsoft's patent FIGS. 4A and 4B above show device #110 with the engagement assembly in the stowed orientation. In such a case, the user is able to perform manual tasks with the hand associated with the device. For instance, FIG. 4A shows the user typing on keyboard #402 with the same hand #108 that would engage the engagement assembly in the deployed orientation.
Similarly, FIG. 4B shows the user opening a doorknob #404 with this hand. The user experiences normal dexterity with this hand as well as normal touch sensation. In contrast, with a traditional handheld controller, the user would have to set the controller down, put it in a pocket, or take some other action to free up his/her hand. Similarly, if the user was wearing a traditional glove-type controller, the user would have to deal with diminished or changed sensation that occurs when wearing a glove when trying to accomplish these tasks.
Microsoft's patent FIGS. 8A and 8B above collectively show how similar principles can be applied to rendering haptic feedback in response to larger and heavier virtual objects. FIG. 8A shows a visualization #800 of the user lifting a heavy virtual object #802 (e.g. a virtual box). The user's hands are represented by representations #804(1) and #804(2).
Microsoft's patent FIG. 8B shows a user wearing devices #110(1) and #110(2), one on each arm, which can render the corresponding forces on the user's hands, creating the perception of lifting the object against gravity. In this way, wearing a device on each arm can create haptic feedback for bimanual interactions, like lifting heavy objects.
One of the inventors listed on the patent is Mr. Sinclair, who currently works at Microsoft Research. Sinclair does research in bioengineering, electrical engineering, materials engineering, MEMS, mechatronics, haptics, and AR/VR. His current project is designing rich haptic-feedback VR controllers.
In February 2021, Patently Apple posted a report titled "Apple wins a VR Headset patent that describes the use of powerful cameras and connectivity to a Base Station such as a Mac, Game Console +." The report also noted that a 2020 revelation about Apple's secretive headset team described two kinds of wearables: a VR headset and glasses. The former was to be a hugely powerful headset that connected to a Mac or high-end base station, a design that was vetoed by Jony Ive. But with Ive gone, one has to wonder whether the higher-end VR headset could be back in some form. That patent supports a headset taking advantage of higher-end processors in a "base station."
Just like that patent, Microsoft's engineers seem to be thinking PC first and other devices later. Though as these are patents, it's difficult to say how either Microsoft or Apple will execute their VR gaming roadmaps.