
A Major Apple Patent reviews Apple Vision Pro Hardware, its use of Air Gestures and Enrolling Personalized Accessories

1 cover - Apple Vision Pro GRAPHIC

Today the U.S. Patent and Trademark Office officially published a patent application from Apple titled "Devices, Methods, and Graphical User Interfaces for User Authentication and Device Management." The patent is basically divided into three segments: Hardware, In-Air Gestures and Enrolling Personalized Accessories. The patent also touches on a "guest mode" for Vision Pro that provides the guest with limited access to features and locks out any personal data about the owner.

Apple notes in their patent that while a computer system, like Apple Vision Pro, is in a locked state, the computer system performs a first authentication of a user (e.g., a biometric authentication of the user and/or non-biometric authentication of the user (e.g., passcode and/or password-based authentication)).

If the authentication of the user fails, the computer system determines whether guest mode criteria are satisfied. If guest mode criteria are satisfied, the computer system displays an option that is selectable by a user to operate the computer system in a guest mode.

In some embodiments, the guest mode represents a restricted user experience that allows the user to use the computer system but with fewer features and/or functions available. If the guest mode criteria are not satisfied, the computer system does not display the option to operate the computer system in the guest mode. By selectively displaying the guest mode option only when guest mode criteria are satisfied, the computer system prevents unauthorized users from accessing sensitive data.
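The decision flow described above — attempt authentication, and only offer a guest-mode option when guest-mode criteria are met — can be sketched as a minimal illustration. The function and argument names below are hypothetical, not from the patent:

```python
def handle_unlock_attempt(auth_succeeded: bool, guest_criteria_met: bool) -> str:
    """Decide what a locked device shows after an authentication attempt."""
    if auth_succeeded:
        return "unlock"            # full access for the enrolled owner
    if guest_criteria_met:
        return "offer_guest_mode"  # restricted experience; owner data stays locked
    return "stay_locked"           # guest option is not displayed at all

# Illustrative outcomes:
print(handle_unlock_attempt(True, False))   # unlock
print(handle_unlock_attempt(False, True))   # offer_guest_mode
print(handle_unlock_attempt(False, False))  # stay_locked
```

The key point the patent makes is the middle branch: the guest option only ever appears after a failed authentication and only when the criteria hold.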

Apple filed for this patent in September 2023, three months after introducing Apple Vision Pro at WWDC23. Apple didn't want the patent to surface prior to WWDC23 and spoil the surprise.

The patent figures below are just a few of the detailed schematics relating to the Vision Pro Spatial Computer hardware.

2 Apple Vision Pro patent figs

In some embodiments, the one or more I/O devices and sensors include at least one of an inertial measurement unit (IMU), an accelerometer, a gyroscope, a thermometer, one or more physiological sensors (e.g., blood pressure monitor, heart rate monitor, blood oxygen sensor, blood glucose sensor, etc.), one or more microphones, one or more speakers, a haptics engine, one or more depth sensors (e.g., a structured light, a time-of-flight, or the like), and/or the like.

For the details of the hardware, see Apple's patent application via the link provided at the end of this report.

Part Two: In-Air Gestures

3.  In-Air Gestures Apple Vision Pro

In this second key aspect of the patent, Apple reviews the use of in-air gestures that control the visionOS UI. Apple notes that an air gesture is a gesture that is detected without the user touching an input element that is part of a device. It is based on detected motion of a portion of the user's body (e.g., the head, one or more arms, one or more hands, one or more fingers, and/or one or more legs) through the air, including: motion of the user's body relative to an absolute reference (e.g., an angle of the user's arm relative to the ground, or a distance of the user's hand relative to the ground); motion relative to another portion of the user's body (e.g., movement of a hand relative to a shoulder, movement of one hand relative to the other hand, and/or movement of a finger relative to another finger or portion of a hand); and/or absolute motion of a portion of the user's body (e.g., a tap gesture that includes movement of a hand in a predetermined pose by a predetermined amount and/or speed, or a shake gesture that includes a predetermined speed or amount of rotation of a portion of the user's body).

In some embodiments, input gestures used in the various examples and embodiments described, include air gestures performed by movement of the user's finger(s) relative to other finger(s) (or part(s) of the user's hand) for interacting with an XR environment (e.g., a virtual or mixed-reality environment).

In some embodiments in which the input gesture is an air gesture (e.g., in the absence of physical contact with an input device that provides the computer system with information about which user interface element is the target of the user input, such as contact with a user interface element displayed on a touchscreen, or contact with a mouse or trackpad to move a cursor to the user interface element), the gesture takes into account the user's attention (e.g., gaze) to determine the target of the user input (e.g., for direct inputs, as described below). Thus, in implementations involving air gestures, the input gesture is, for example, detected attention (e.g., gaze) toward the user interface element in combination (e.g., concurrently) with movement of a user's finger(s) and/or hands to perform a pinch and/or tap input.

For a direct input gesture, the user is enabled to direct the user's input to the user interface object by initiating the gesture at, or near, a position corresponding to the displayed position of the user interface object (e.g., within 0.5 cm, 1 cm, 5 cm, or a distance between 0-5 cm, as measured from an outer edge of the option or a center portion of the option).

For an indirect input gesture, the user is enabled to direct the user's input to the user interface object by paying attention to the user interface object (e.g., by gazing at the user interface object) and, while paying attention to the option, the user initiates the input gesture (e.g., at any position that is detectable by the computer system) (e.g., at a position that does not correspond to the displayed position of the user interface object).
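The direct/indirect distinction above boils down to a simple targeting rule: if the hand starts near the object's displayed position, it is direct input; otherwise the user's gaze selects the target. A toy sketch, using the roughly 0.5-5 cm distances quoted above and hypothetical names:

```python
def classify_input(hand_pos, object_pos, gaze_on_object, direct_threshold_cm=5.0):
    """Classify an air gesture as direct or indirect input toward a UI object.

    Direct: the gesture is initiated at or near the object's displayed
    position. Indirect: the hand is elsewhere, so gaze picks the target.
    Positions are (x, y) pairs in centimeters for illustration.
    """
    dx = hand_pos[0] - object_pos[0]
    dy = hand_pos[1] - object_pos[1]
    distance_cm = (dx * dx + dy * dy) ** 0.5
    if distance_cm <= direct_threshold_cm:
        return "direct"       # initiated near the object itself
    return "indirect" if gaze_on_object else "untargeted"
```

For example, a pinch started 10 cm from a button while gazing at it would register as indirect input to that button.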

In some embodiments, input gestures (e.g., air gestures) used in the various examples and embodiments described herein include pinch inputs and tap inputs, for interacting with a virtual or mixed-reality environment, in accordance with some embodiments. For example, the pinch inputs and tap inputs described below are performed as air gestures.

In some embodiments, a pinch input is part of an air gesture that includes one or more of: a pinch gesture, a long pinch gesture, a pinch and drag gesture, or a double pinch gesture. For example, a pinch gesture that is an air gesture includes movement of two or more fingers of a hand to make contact with one another, that is, optionally, followed by an immediate (e.g., within 0-1 seconds) break in contact from each other.

A long pinch gesture that is an air gesture includes movement of two or more fingers of a hand to make contact with one another for at least a threshold amount of time (e.g., at least 1 second), before detecting a break in contact with one another. For example, a long pinch gesture includes the user holding a pinch gesture (e.g., with the two or more fingers making contact), and the long pinch gesture continues until a break in contact between the two or more fingers is detected.

In some embodiments, a double pinch gesture that is an air gesture comprises two (e.g., or more) pinch inputs (e.g., performed by the same hand) detected in immediate (e.g., within a predefined time period) succession of each other. For example, the user performs a first pinch input (e.g., a pinch input or a long pinch input), releases the first pinch input (e.g., breaks contact between the two or more fingers), and performs a second pinch input within a predefined time period (e.g., within 1 second or within 2 seconds) after releasing the first pinch input.
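The pinch variants above differ mainly in contact duration and the gap between contacts, so they could, in principle, be separated by thresholds like the ones quoted (about 1 second to qualify as a long pinch, and roughly 1 second between pinches of a double pinch). A hypothetical classifier sketch, not Apple's implementation:

```python
def classify_pinches(contacts, long_threshold_s=1.0, double_gap_s=1.0):
    """Classify a list of (start, end) finger-contact intervals, in seconds.

    A contact held >= long_threshold_s is a long pinch; two short pinches
    whose gap is < double_gap_s form a double pinch; otherwise, a pinch.
    """
    gestures = []
    i = 0
    while i < len(contacts):
        start, end = contacts[i]
        if end - start >= long_threshold_s:
            gestures.append("long_pinch")
            i += 1
        elif (i + 1 < len(contacts)
              and contacts[i + 1][0] - end < double_gap_s
              and contacts[i + 1][1] - contacts[i + 1][0] < long_threshold_s):
            gestures.append("double_pinch")
            i += 2  # consume both pinches of the pair
        else:
            gestures.append("pinch")
            i += 1
    return gestures
```

So a 0.2 s contact is a pinch, a 1.5 s contact is a long pinch, and two 0.2 s contacts 0.3 s apart form a double pinch.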

In some embodiments, a pinch and drag gesture that is an air gesture includes a pinch gesture (e.g., a pinch gesture or a long pinch gesture) performed in conjunction with (e.g., followed by) a drag input that changes a position of the user's hand from a first position (e.g., a start position of the drag) to a second position (e.g., an end position of the drag).

In some embodiments, the user maintains the pinch gesture while performing the drag input, and releases the pinch gesture (e.g., opens their two or more fingers) to end the drag gesture (e.g., at the second position). In some embodiments, the pinch input and the drag input are performed by the same hand (e.g., the user pinches two or more fingers to make contact with one another and moves the same hand to the second position in the air with the drag gesture).

In some embodiments, the pinch input is performed by a first hand of the user and the drag input is performed by the second hand of the user (e.g., the user's second hand moves from the first position to the second position in the air while the user continues the pinch input with the user's first hand). In some embodiments, an input gesture that is an air gesture includes inputs (e.g., pinch and/or tap inputs) performed using both of the user's two hands. For example, the input gesture includes two (e.g., or more) pinch inputs performed in conjunction with (e.g., concurrently with, or within a predefined time period of) each other. For example, a first pinch gesture performed using a first hand of the user (e.g., a pinch input, a long pinch input, or a pinch and drag input), and, in conjunction with performing the pinch input using the first hand, performing a second pinch input using the other hand (e.g., the second hand of the user's two hands).
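A pinch-and-drag, as described above, amounts to holding the pinch while the hand position changes, with the drag ending where the pinch is released. A minimal sketch over a hypothetical stream of (pinching, position) samples:

```python
def track_pinch_drag(samples):
    """Extract a pinch-and-drag from (pinching, hand_pos) samples.

    Returns (start_pos, end_pos) of the drag, or None if no pinch occurred.
    The drag begins where the pinch starts and ends where it is released.
    """
    start = end = None
    for pinching, pos in samples:
        if pinching:
            if start is None:
                start = pos   # pinch begins: this is the drag's start position
            end = pos         # latest position while the pinch is held
        elif start is not None:
            break             # pinch released: the drag is complete
    return (start, end) if start is not None else None
```

The two-hand variant the patent describes would simply track the pinch on one hand and take the drag positions from the other.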

The patent also goes on to describe two-hand gestures and the grab gesture.

Part Three: Enrolling Personalized Accessories

The third aspect of this patent relates to a computer system, in this case Vision Pro, that determines whether a personalized accessory is connected to the computer system, and if a personalized accessory is connected, the computer system further determines whether it has biometric enrollment data for that accessory.
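The check described above — is a personalized accessory connected, and if so, does enrollment data already exist for it — can be sketched as follows (hypothetical names; the patent does not specify data structures):

```python
def accessory_setup_step(connected_accessory, enrollment_db):
    """Decide the next setup step for a personalized accessory.

    connected_accessory: an accessory ID string, or None if nothing is attached.
    enrollment_db: maps accessory IDs to stored biometric enrollment data.
    """
    if connected_accessory is None:
        return "no_accessory"
    if connected_accessory in enrollment_db:
        return "use_existing_enrollment"   # skip setup; data already on device
    return "start_enrollment"              # prompt the user to enroll it
```

This mirrors the setup flows shown in the patent figures: enrollment screens only appear when the device lacks data for the attached accessory.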

In patent FIGS. 9M, 11H and 15L below, we see the user looking through the Apple Vision Pro display and following the setup process for an accessory device to work with Vision Pro.

4. Apple patent figs for HMD set up personal accessory scan system

In the patent figures below, Apple illustrates another round of views for setting up an accessory device. In these figures, Apple notes that "in FIG. 11B, electronic device #1104 determines that it is being worn by a user. In the depicted embodiments, the electronic device is a wearable smartwatch device, and it is worn on the wrist of the user. In other embodiments, the same electronic device #1104 is a head-mounted system."

An interesting admission was made in today's patent regarding a future Apple Watch. Apple points to feature #1107C, presented in patent FIGS. 11B & 11C below: the Digital Crown, doubling as a camera. That was covered in a Patently Apple 2022 patent report.

5. Digital Crown camera patent

6. Apple Patent figs for setting up personal accessories

In some embodiments, Apple notes that the secure information includes personalized accessory information (e.g., information pertaining to one or more prescription optical lenses (e.g., one or more prescription optical lenses associated with a user of the computer system) and/or one or more personalized controllers or input devices). Displaying information that enables the companion device to retrieve personalized accessory information from the computer system allows for setup of the companion device with fewer user inputs, thereby reducing the number of inputs required to perform an operation.

In some embodiments, the secure information includes biometric information corresponding to one or more users of the computer system (e.g., one or more users registered on the computer system) (e.g., height, eye color, biometric authentication information (e.g., facial scan information, iris scan information, and/or fingerprint scan information)). Displaying information that enables the companion device to retrieve biometric information for one or more users from the computer system allows for setup of the companion device with fewer user inputs, thereby reducing the number of inputs required to perform an operation.

Apple's patent is extremely detailed, with a great number of scenarios that are simply too long and complicated to present in this report. Engineers, developers and tech geeks who would like to see the patent in full can review Apple's patent application 20240020371.

