
Future Apple HMD Device will use an Interaction Interpreter that understands Voice Commands, Hand Gestures & more

(Cover graphic: Apple HMD mixed-reality control modes)

Today the U.S. Patent & Trademark Office published a patent application from Apple that relates to the company's future head-mounted display system, which will allow users to interact with virtual objects through multiple modes of operation, including voice commands, hand gestures and more.

Apple's invention covers devices, systems and methods that provide a computer-generated reality (CGR) environment incorporating virtual objects from one or more apps.

User interactions with the virtual objects are detected and interpreted by an event system that is separate from the apps providing those objects. The event system receives user input via one or more modalities and interprets each interaction as an event.

Such input modalities include text input detected via keyboard, cursor input detected via mouse or trackpad, touch input detected via touch screen, voice input detected via microphone, gaze/eye-based input detected via light or IR sensors, and hand/movement-based input detected via light or IR sensors.
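
To picture how that separation might work, here is a minimal Swift sketch. Every type and function name below is a hypothetical illustration rather than anything named in Apple's filing; the sketch only shows the core idea of one event system normalizing several input modalities into events, independent of the apps that supply the virtual objects.

```swift
/// The input modalities named in the filing (hypothetical modeling).
enum InputModality {
    case keyboard(text: String)
    case cursor(x: Double, y: Double)      // mouse or trackpad
    case touch(x: Double, y: Double)       // touch screen
    case voice(utterance: String)          // microphone
    case gaze(targetID: String)            // light/IR sensors
    case handGesture(name: String)         // light/IR sensors
}

/// A normalized event the event system hands to the CGR environment.
struct ActionEvent {
    let description: String
}

/// The event system: apps only register their virtual objects; all
/// interpretation of raw user input happens here, outside the apps.
final class EventSystem {
    private(set) var virtualObjects: [String] = []

    func register(virtualObject id: String) {
        virtualObjects.append(id)
    }

    /// Turns a raw input into an ActionEvent, independent of any app.
    func interpret(_ input: InputModality) -> ActionEvent {
        switch input {
        case .voice(let utterance):
            return ActionEvent(description: "voice command: \(utterance)")
        case .handGesture(let name):
            return ActionEvent(description: "gesture: \(name)")
        case .gaze(let targetID):
            return ActionEvent(description: "gaze at \(targetID)")
        case .touch(let x, let y), .cursor(let x, let y):
            return ActionEvent(description: "pointer at (\(x), \(y))")
        case .keyboard(let text):
            return ActionEvent(description: "typed: \(text)")
        }
    }
}
```

The design point the patent stresses is exactly this split: apps contribute virtual objects to the shared environment, but they never parse raw input themselves.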


The head-mounted device will also be able to interact with a Mac, whether a desktop or a MacBook.

Apple's patent FIG. 2 below is a block diagram of the device of FIG. 1 including multiple apps that provide virtual objects to the event system for inclusion in the CGR environment.


(Apple patent figures: HMD event system block diagrams)

Apple's patent FIG. 4 above is a block diagram of the event system of FIG. 2 detecting a user interaction to create an action event and change a view of the CGR environment.


More specifically, Apple's head-mounted display system can operate in different interactive modes in the virtual environment. In the first mode, the user (#5) will be able to use hand gestures to move an item in a VR scene from one place to another; in the second mode, voice commands such as "move the table to this location"; and in the third mode, touch input.

Beyond those main modes, the user will be able to use a keyboard (physical or virtual), mouse, joystick, button or dial to interact with a given scene.
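
As a speculative illustration of that interchangeability (again, all names here are hypothetical), the sketch below shows three different input modes resolving to one and the same "move" event, so an app receiving the event never needs to know which modality produced it.

```swift
/// The one event type every mode resolves to.
struct MoveEvent {
    let objectID: String
    let destination: (x: Double, y: Double)
}

/// Three of the modes described above, each already recognized upstream.
enum InteractionMode {
    case handGesture(objectID: String, dropPoint: (Double, Double))
    case voiceCommand(objectID: String, namedLocation: (Double, Double))
    case touchDrag(objectID: String, endPoint: (Double, Double))
}

/// Maps any mode to the same MoveEvent.
func resolve(_ mode: InteractionMode) -> MoveEvent {
    switch mode {
    case .handGesture(let id, let point),
         .voiceCommand(let id, let point),
         .touchDrag(let id, let point):
        return MoveEvent(objectID: id, destination: (x: point.0, y: point.1))
    }
}

// e.g. the spoken command "move the table to this location", already parsed:
let event = resolve(.voiceCommand(objectID: "table", namedLocation: (2.0, 3.5)))
print(event.destination)   // (x: 2.0, y: 3.5)
```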


Some of the sensors of this system will include an inertial measurement unit (IMU), an accelerometer, a magnetometer, a gyroscope, a thermometer, one or more physiological sensors (e.g., blood pressure monitor, heart rate monitor, blood oxygen sensor, blood glucose sensor, etc.), one or more image sensors, one or more microphones, one or more speakers, a haptics engine, one or more depth sensors (e.g., structured light, time-of-flight, or the like), or the like. The one or more image sensors can include one or more RGB cameras (e.g., with a complementary metal-oxide-semiconductor (CMOS) image sensor or a charge-coupled device (CCD) image sensor), monochrome cameras, IR cameras, event-based cameras, or the like.
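
Purely as an illustration of how such a capability list might be modeled (the names below are hypothetical, and the actual hardware configuration is unknown), that suite reads naturally as a small Swift enumeration:

```swift
/// Components enumerated in the filing; the filing lists speakers and a
/// haptics engine alongside the sensors, so they appear here too.
enum Sensor {
    case imu, accelerometer, magnetometer, gyroscope, thermometer
    case physiological(kind: String)   // e.g. heart rate, blood oxygen
    case image(kind: ImageSensorKind)
    case microphone, speaker, hapticsEngine
    case depth(kind: DepthSensorKind)
}

enum ImageSensorKind { case rgbCMOS, rgbCCD, monochrome, infrared, eventBased }
enum DepthSensorKind { case structuredLight, timeOfFlight }

// A device description is then just a list of components:
let hmdSensors: [Sensor] = [
    .imu, .gyroscope, .accelerometer,
    .image(kind: .rgbCMOS), .image(kind: .infrared),
    .depth(kind: .timeOfFlight),
    .microphone, .hapticsEngine,
]
```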


Apple's patent application '0391726, which was published today by the U.S. Patent Office, was filed back in Q2 2019, with some work dating back to Q2 2018. Considering that this is a patent application, the timing of such a product to market is unknown at this time. The depth of Apple's patent base for future smartglasses and head-mounted displays can be viewed in our Special Archives.

Some of Apple's Inventors


Sam Iglesias: Senior Software Engineer, AR/VR

Edwin Iskandar: Software Engineering Manager

Tim Oriol: Software Engineering Manager, Technology Development Group
