An Apple patent reveals a possible future HMD training app teaching users how to navigate a next-gen UI with eye tracking
Yesterday the US Patent & Trademark Office published a patent application from Apple relating to a gaze-tracking testing process designed to help users learn, over time, a next-gen UI associated with a future Mixed Reality Headset and other Apple devices.
For many adults first learning to use a mouse with a Mac back in the 1980s, there was a learning curve because most had never used a GUI before. Likewise, when Apple releases a future Mixed Reality Headset that uses eye or gaze tracking to manipulate an on-screen UI, it won't seem natural at first. So Apple is patenting a testing program that helps users get comfortable with this new method of controlling a menu through a series of trial lessons. Over time, eye tracking could also extend to other devices like the iPhone, iPad, Mac and more.
In some implementations, the device evaluates physiological responses of a user expecting interface feedback following an interaction with the object in a user interface.
The physiological responses may be associated with the user expecting the interface feedback using multiple alternative interface feedback characteristics. In one example, it is desirable to predict a user's intention to select a button prior to the user selecting the button based on the user's pupil dilation when the user intends to select the button.
To facilitate such predictions, it may be desirable to configure the button, including the feedback associated with selecting it, to produce a significant or optimal pre-selection pupil dilation change, e.g., determining that the user's pre-selection pupil dilation is good or optimal when the user intends to make the selection and expects certain feedback afterward.
To do so, the device may evaluate various alternative feedback characteristics (e.g., which of multiple colors a button should change to following a user's click of the button) to identify which are suitable or optimal with respect to pre-selection pupil dilation. The device may assess how much the user's pupils dilate before clicking an object when the user expects each interface feedback characteristic (e.g., the object turning pink after being clicked, the object turning orange after being clicked, etc.).
In some implementations, the device then selects an interface feedback characteristic of the alternative interface feedback characteristics based on evaluating the physiological data. For example, the device may select pink as the ideal interface feedback characteristic based on determining that the user's eye dilates more when expecting the object to turn pink than when expecting the object to turn orange.
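The selection step described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the function and data names are mine, not Apple's): average each feedback condition's pre-selection pupil dilation across trials, then pick the condition with the strongest response.

```python
def select_feedback(dilation_by_condition):
    """Pick the feedback characteristic with the largest average
    pre-selection pupil dilation.

    dilation_by_condition maps a feedback characteristic (e.g. a color
    the button turns after a click) to a list of pre-selection pupil
    dilation measurements, one per trial.
    """
    averages = {
        condition: sum(samples) / len(samples)
        for condition, samples in dilation_by_condition.items()
    }
    # Choose the condition that evoked the strongest expected response.
    return max(averages, key=averages.get)


# Illustrative (made-up) trial data: dilation is stronger when the
# user expects the object to turn pink than orange.
trials = {
    "pink":   [0.42, 0.45, 0.44],
    "orange": [0.31, 0.30, 0.33],
}
print(select_feedback(trials))  # → pink
```

In the patent's example, this is how the device would land on pink as the ideal feedback characteristic.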
Apple's patent FIG. 4 below is a flowchart representation of a method for selecting and displaying interface feedback characteristics to enhance physiological responses associated with a user expecting the selected interface feedback characteristics.
Apple's patent FIG. 5 above illustrates a training process where multiple alternative interface feedbacks are assessed to select an interface feedback. In some implementations, the interface feedback is selected based on an evaluation of the physiological response of the user to the presentation of an object or stimulus (e.g., a button) when the user expects to be presented with the interface feedback upon interacting with the object or stimulus (e.g., clicking the button).
In some implementations, a training process including multiple trials for each interface feedback condition is performed. For example, the training process may include identifying an ideal color for the button to change to after the user clicks it. Here, from Trial 1 to Trial N, the interface feedback condition of the button may be a color change from an initial color (e.g., solid blue) to another color (e.g., solid pink).
As the user interacts with (e.g., clicks) the object (e.g., the button) in each trial (e.g., Trial 1 through Trial N), the user repeatedly experiences the interface feedback condition and develops an expectation of being presented with the interface feedback condition.
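The trial structure above can be sketched as a simple loop. Again a hypothetical illustration with made-up names: each condition is repeated for Trials 1 through N, and a dilation measurement is recorded just before each click so the user's growing expectation of the feedback can be evaluated.

```python
def run_training(conditions, trials_per_condition, measure_dilation):
    """For each feedback condition, run Trial 1..N and record the
    user's pre-selection pupil dilation on every trial.

    measure_dilation(condition, trial) stands in for the real sensor
    reading taken just before the user clicks the object.
    """
    results = {}
    for condition in conditions:
        samples = []
        for trial in range(1, trials_per_condition + 1):
            # Present the button in its initial color (e.g. solid blue),
            # measure dilation just before the click, then show the
            # feedback (e.g. the button turning the condition's color).
            samples.append(measure_dilation(condition, trial))
        results[condition] = samples
    return results


# Simulated measurements for illustration only.
data = run_training(["pink", "orange"], 3,
                    lambda c, t: 0.4 if c == "pink" else 0.3)
print(data["pink"])  # → [0.4, 0.4, 0.4]
```

The per-condition sample lists produced here are exactly the kind of input the selection step would evaluate to pick the winning feedback characteristic.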
Of course, prior to actually seeing this next-gen user interface and its means of operation, it sounds almost as crazy as when someone first explained what a mouse would do before a user even knew what a Graphical User Interface was.
Yet, for those wanting to drill down into Apple's thinking about this testing process in more detail, check out Apple's patent application 20210349536 titled "Biofeedback method of Modulating Digital Content to Invoke Greater Pupil Radius Response."
The Listed Inventors
Sterling Crispin: Neurotechnology Prototyping Researcher.
The news of Apple hiring Crispin was highlighted in a Variety article back in mid-2018. The report noted that Apple hired the developer behind the "Cyber Paint App" for VR headsets. More than likely Apple took that product off the market after hiring Crispin. Crispin previously worked for DAQRI, a Los Angeles-based maker of augmented reality (AR) solutions for industrial applications that was too far ahead of the market and its supporting technology.
Izzet Burak Yildiz: Sr. Research Scientist
Grant Mulliken: Senior Manager, Technology Development Group
Mulliken's specialties include: systems and computational neuroscience (signal processing, machine learning, algorithm design, spiking neuron modeling, audio coding and acoustics); cortical brain-machine interfaces, multi-electrode neurophysiology, and medical device design; and mixed-signal VLSI circuit design, bioinstrumentation and neuromorphic engineering.
Considering that this is a patent application, the timing of such a product to market is unknown at this time.