A European patent filing from Apple describes a future MacBook keyboard that a user could operate in both typing and tracking modes
An Apple patent application filed in Europe earlier this month describes systems, methods, and computer-readable media for using a physical keyboard in multiple input modes, including a typing mode and a tracking mode, without the need for a specialized keyboard.
Vision-based hand tracking may be used to determine whether user input should be interpreted in a typing mode or a tracking mode. In typing mode, the visual information is ignored and input is received from the physical keyboard as usual; for example, a computer system may register input from keys depressed on the keyboard. In tracking mode, visual information about a hand in relation to the keyboard may be used to obtain user input.
According to one or more embodiments, a tracking mode may be triggered based on a user gesture detected by visual means or by user input to the physical keyboard. For example, a particular key or combination of keys on the keyboard may trigger the tracking mode. The key or keys may be physically pressed or determined to be selected based on visual means by tracking a location of the hand in relation to the key or combination of keys. Further, a particular gesture may trigger the tracking mode, such as a touch or swipe on a predetermined part of the physical keyboard.
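To make the two-mode behavior concrete, here is a minimal sketch of the kind of mode-selection logic the filing describes: either a physical trigger key or a visually detected gesture toggles between the two modes. All type and function names here (InputMode, ModeTriggers, resolveMode) are hypothetical illustrations, not an Apple API.

```swift
// Hypothetical sketch of the typing/tracking mode toggle described in the filing.
// None of these types correspond to a real Apple API.

enum InputMode {
    case typing    // visual information ignored; key presses taken as-is
    case tracking  // hand position relative to the keyboard drives the input
}

struct ModeTriggers {
    var triggerKeysPressed: Bool      // e.g. a dedicated key or key combination
    var triggerGestureDetected: Bool  // e.g. a swipe on a predetermined keyboard region
}

func resolveMode(current: InputMode, triggers: ModeTriggers) -> InputMode {
    // Either a physical key press or a visually detected gesture may flip the mode.
    if triggers.triggerKeysPressed || triggers.triggerGestureDetected {
        return current == .typing ? .tracking : .typing
    }
    return current
}
```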
In Apple's patent FIG. 1 below we're shown an example system setup #100 in which techniques for utilizing a tracking mode on a physical keyboard are employed. The user #105 may use the physical keyboard #145 with a touching object, such as the user's finger #125. Based on the configuration of the touching object (e.g., the user's finger), a determination may be made as to whether the physical keyboard should be used in a typing mode or a tracking mode.
A spatial relationship may be determined between the user's hand, such as a finger, and the physical keyboard using image data and other visual information. For example, one or more cameras #180 may be used to obtain image data that includes the user's finger and the physical keyboard. Additional information may be obtained by other sensors #165, such as a depth sensor (camera). The visual data (such as image data and depth data) may be used to determine whether contact occurs between the finger and the physical keyboard.
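As a rough illustration of how depth data might feed into that contact determination, the sketch below compares the fingertip's measured depth to the keyboard surface depth at the same image location and treats small differences as a touch. The structures and the tolerance value are assumptions for illustration only, not details from the patent.

```swift
// Hypothetical illustration of vision-based contact detection:
// compare the fingertip's depth to the keyboard surface depth at the same
// image location, and treat small differences as contact.

struct DepthSample {
    var fingertipDepthMeters: Double       // from the depth sensor, at the fingertip pixel
    var keyboardSurfaceDepthMeters: Double // from a model of the keyboard plane
}

/// Returns true when the fingertip is close enough to the surface to count as a touch.
/// The 5 mm tolerance is an arbitrary example value.
func isContact(_ sample: DepthSample, toleranceMeters: Double = 0.005) -> Bool {
    abs(sample.fingertipDepthMeters - sample.keyboardSurfaceDepthMeters) <= toleranceMeters
}
```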
The user may perform a predetermined movement, such as a swiping action or other gesture #130. The movement may be associated with a touch location #120 (touch bar) on the physical keyboard.
In some embodiments, the gesture and/or the touch location may be utilized to determine whether a tracking criterion is satisfied. The tracking criterion may be a set of characteristics that indicate that the physical keyboard should be used in tracking mode.
Data that may be used to determine whether the tracking criterion is satisfied may include, for example, which finger or part of the hand makes contact with the physical keyboard, detection of a predetermined gesture, and the like.
In some embodiments, the tracking criterion may include non-vision-based information. For example, a particular physical key on the keyboard #145, or a combination of keys, when pressed, may trigger initiation of the tracking mode.
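Pulling those pieces together, the tracking criterion could be modeled as a simple check over both vision-based and keyboard-based signals. This is a speculative sketch; the names and the particular rule are illustrative assumptions, not language from the filing.

```swift
// Hypothetical sketch of a tracking-criterion check combining the signals
// mentioned in the filing: which finger made contact, whether a predetermined
// gesture was seen, and whether a trigger key was physically pressed.

enum Finger { case thumb, index, middle, ring, little }

struct TrackingSignals {
    var contactFinger: Finger?          // finger observed touching the keyboard, if any
    var predeterminedGestureSeen: Bool  // e.g. a swipe over a designated region
    var triggerKeyPressed: Bool         // non-vision signal from the physical keyboard
}

/// Example rule: tracking mode is engaged when the trigger key is pressed,
/// or when the index finger touches the keyboard while performing the gesture.
func trackingCriterionSatisfied(_ s: TrackingSignals) -> Bool {
    if s.triggerKeyPressed { return true }
    return s.contactFinger == .index && s.predeterminedGestureSeen
}
```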
Apple's European patent application was published in early February. The lead inventor listed on the application is Senior Computer Vision Engineer Michele Stoppa.
It should be noted that Apple engineers first explored this idea back in 2014 in the context of an iPad Magic Keyboard. Apple's 2014 patent FIG. 2A below notes that a user may provide input to the iPad by striking the smooth surface overlay (#110) above a particular key or by performing sliding gestures.