Apple wins a major Mixed Reality Glove Patent that uses Multi-Segment Force Sensors and much more
Apple has many VR Glove patents on record to date (01, 02, 03, 04 and 05) and today the U.S. Patent and Trademark Office officially granted Apple another VR/MR Glove patent that covers advanced multi-segment force sensors.
According to Apple, mixed-reality applications, in which the view made visible to a participating individual or user may comprise real-world objects superimposed with virtual objects or supplementary information regarding the real-world objects, are an increasing focus of research and development. Many aspects of mixed-reality applications rely on video data captured using a combination of sensors. For example, data frames representing the scene visible to a user may be captured, together with the direction of the individual's gaze, and the data frames may be analyzed and augmented with virtual objects before being re-displayed to the user.
Vision and hearing-related sensor data for mixed-reality applications can be captured effectively using head-mounted devices and the like. However, data associated with other senses such as touch may also be relevant for at least some types of mixed-reality applications, and it may not be possible to detect such data using video/audio sensors. Capturing and interpreting potentially subtle aspects of touch interactions (e.g., the combination of forces of a grip applied to a real tool or a virtual tool superimposed over a real object, or the relative timing of changes of applied forces) associated with various types of applications remains a challenging technical problem.
Apple's granted patent covers various embodiments of methods and apparatus for detecting and analyzing forces applied by individuals using hand-wearable devices (such as gloves) and other types of wearable devices equipped with multi-segment force sensors.
According to some embodiments, a wearable electronic device may comprise one or more multi-segment force sensors as well as at least one signal aggregator. A given multi-segment force sensor may comprise a plurality of sensor segments connected to a flexible substrate material in various embodiments.
The substrate material may be wrapped at least partially around a portion of the body of an individual. For example, in the case of a hand-wearable device, a first portion of the substrate material and a second portion of the substrate material may collectively be wrapped at least partially around a particular finger of the individual, with a first sensor segment attached to the first portion and a second sensor segment attached to the second portion.
In at least some embodiments, respective multi-segment force sensors may be used for one or more digits (fingers or toes) of a body extremity (a hand or a foot). Individual segments of a hand-wearable device may provide distinct indications of the magnitudes of the forces the individual applies with the different parts of a finger adjacent to each segment (e.g., when the individual presses a finger against a hard or soft surface, rolls or slides a finger along a surface, or holds an object with a combination of fingers and thumbs).
The sensor segments or elements may be made of thin, flexible materials in various embodiments. For example, the flexible circuitry used for the sense and drive electrodes may in turn be attached to a flexible substrate material (with tactile properties similar to those of cloth or leather), enabling a given multi-segment sensor to be wrapped at least partially around a digit and providing a comfortable, natural feel to the wearer of the sensor-equipped device in at least some embodiments.
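To make the arrangement above a little more concrete, here is a minimal sketch in Swift of how per-segment readings from such a sensor might be modeled and bundled by a signal aggregator. All type and property names are illustrative assumptions; the patent does not spell out an implementation.

```swift
import Foundation

/// One flexible sensor segment attached to a portion of the substrate
/// wrapped around a digit (hypothetical model, not Apple's implementation).
struct ForceSensorSegment {
    let digit: String          // e.g. "indexFinger"
    let segmentIndex: Int      // position of the segment along the digit
    var forceNewtons: Double   // magnitude of force currently sensed
}

/// A multi-segment force sensor: several segments on one digit.
struct MultiSegmentForceSensor {
    var segments: [ForceSensorSegment]

    /// Distinct per-segment readings, e.g. to tell a fingertip press
    /// apart from pressure along the middle of the finger.
    var perSegmentForces: [Double] { segments.map { $0.forceNewtons } }
}

/// A signal aggregator that timestamps and bundles readings from all
/// sensors on a hand-wearable device before they are sent on for analysis.
struct SignalAggregator {
    func aggregate(_ sensors: [MultiSegmentForceSensor]) -> (timestamp: Date, samples: [[Double]]) {
        (Date(), sensors.map { $0.perSegmentForces })
    }
}

// Example: an index-finger sensor with three segments being pressed at the tip.
let indexSensor = MultiSegmentForceSensor(segments: [
    ForceSensorSegment(digit: "indexFinger", segmentIndex: 0, forceNewtons: 2.4), // fingertip
    ForceSensorSegment(digit: "indexFinger", segmentIndex: 1, forceNewtons: 0.3),
    ForceSensorSegment(digit: "indexFinger", segmentIndex: 2, forceNewtons: 0.1)
])
let frame = SignalAggregator().aggregate([indexSensor])
print(frame.samples)   // [[2.4, 0.3, 0.1]]
```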
Apple's patent FIG. 1 below illustrates a hand-wearable device equipped with a plurality of multi-segment force sensors whose output may be analyzed at an application processing engine.
Apple's patent FIG. 2 below illustrates an example system environment in which signals captured from a head-mounted device and a pair of hand-wearable devices equipped with multi-segment force sensors may be processed for a mixed-reality application.
Apple further notes that in some embodiments, world sensors #240 may collect additional information about the user environment (e.g., depth information, lighting information, etc.) in addition to video.
Similarly, in some embodiments, user sensors #250 may collect additional information about the individual such as expressions, face gestures, head movements, etc.
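A minimal sketch, with field names of my own choosing rather than Apple's, of how the input streams described for FIG. 2 (world sensors, user sensors and the gloves' force sensors) might be bundled per frame for the mixed-reality application to process:

```swift
import Foundation

/// Illustrative per-frame bundle of the inputs described for FIG. 2;
/// all names and types are assumptions, not taken from the patent.
struct WorldSensorFrame {
    var videoFrame: Data        // scene visible to the user
    var depthMap: [Float]?      // optional depth information
    var ambientLux: Double?     // optional lighting information
}

struct UserSensorFrame {
    var gazeDirection: SIMD3<Float>   // direction of the user's gaze
    var headMovement: SIMD3<Float>    // head motion since the last frame
    var expression: String?           // e.g. a detected facial expression
}

struct MixedRealityInputFrame {
    var world: WorldSensorFrame
    var user: UserSensorFrame
    var leftGloveForces: [[Double]]   // per-finger, per-segment forces
    var rightGloveForces: [[Double]]
    var timestamp: Date
}
```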
A 3D virtual view #204 may comprise a three-dimensional (3D) space including virtual content #210 at different depths that the individual sees when using the mixed reality system of FIG. 2.
In some embodiments, in the 3D virtual view, the virtual content may be overlaid on or composited into a view of the individual's environment with respect to the user's current line of sight as provided by the HMD #202. The HMD may implement any of various types of virtual reality projection technologies in different embodiments.
For example, the HMD may implement a near-eye VR technique that displays left and right images, viewed by the subject, on screens positioned in front of the individual's eyes, such as systems using DLP (digital light processing), LCD (liquid crystal display) or LCoS (liquid crystal on silicon) technology.
As another example, the HMD may comprise a direct retinal projector system that scans left and right images, pixel by pixel, to the subject's eyes. To scan the images, left and right projectors may generate beams that are directed to left and right reflective components (e.g., ellipsoid mirrors) located in front of the individual's eyes; the reflective components may reflect the beams to the eyes. To create a three-dimensional (3D) effect, virtual content #210 at different depths or distances in the 3D virtual view #204 may be shifted left or right in the two images as a function of the triangulation of distance, with nearer objects shifted more than more distant objects.
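As a rough illustration of that depth-dependent shift, the horizontal offset for a virtual object could be approximated with the standard similar-triangles disparity relation; this is a textbook stereo formula, not one quoted from the patent, and the parameter names are illustrative.

```swift
/// Approximate horizontal disparity (in millimetres at the image plane)
/// for a virtual object at a given depth, using the similar-triangles
/// relation: disparity = interpupillary distance * focal length / depth.
/// Parameter names are assumptions; the patent does not specify them.
func disparity(interpupillaryDistanceMM: Double,
               focalLengthMM: Double,
               objectDepthMM: Double) -> Double {
    guard objectDepthMM > 0 else { return 0 }
    return interpupillaryDistanceMM * focalLengthMM / objectDepthMM
}

// Nearer virtual objects are shifted more than distant ones:
let near = disparity(interpupillaryDistanceMM: 63, focalLengthMM: 20, objectDepthMM: 500)   // ~2.52 mm
let far  = disparity(interpupillaryDistanceMM: 63, focalLengthMM: 20, objectDepthMM: 5000)  // ~0.25 mm
print(near > far)   // true
```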
Apple's patent FIG. 6 below illustrates a bottom view and a side view of a finger of a hand-wearable device equipped with a multi-segment force sensor; FIG. 7 illustrates an example positioning of a control and communication unit with respect to a multi-segment force sensor for a finger of a hand-wearable device.
For more details, review Apple's granted patent 11,009,949.