Apple Patent Reveals Advances to Siri for Controlling Apps on Apple Devices, Including Its Upcoming Vision Pro Headset
Yesterday the US Patent & Trademark Office published a patent application from Apple that relates to enabling Siri to understand a new set of commands for controlling applications such as a word processor. The invention covers the upcoming Apple Vision Pro, the iPhone and more.
Apple notes that Siri may require training before it can interact with an application or process its commands to perform one or more tasks. This can be cumbersome and time-intensive, creating barriers for developers who wish to integrate their applications with the digital assistant and for users who want the assistant to handle a wider range of tasks.
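To give a sense of what that kind of integration looks like for developers today, below is a minimal Swift sketch using Apple's public App Intents framework, one existing route for exposing an app action to Siri. The patent itself doesn't name App Intents, and the `BoldWordIntent` and `DocumentEditor` names here are hypothetical:

```swift
import AppIntents

// Hypothetical stand-in for a real app's editor model.
final class DocumentEditor {
    static let shared = DocumentEditor()
    func applyBold(to word: String) {
        print("Bolding every occurrence of \"\(word)\"")
    }
}

// A word-processor action exposed to Siri via App Intents.
struct BoldWordIntent: AppIntent {
    static var title: LocalizedStringResource = "Bold a Word"

    @Parameter(title: "Word")
    var word: String

    func perform() async throws -> some IntentResult {
        DocumentEditor.shared.applyBold(to: word)
        return .result()
    }
}
```

The patent's point is that even this kind of per-action wiring takes developer effort, which is the barrier the invention aims to lower.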
Addressing this issue extends to Apple's coming Vision Pro Spatial Computing Headset. Apple notes in patent FIG. 1B below that system #100 includes two (or more) devices in communication, which may be wired or wireless. First device #100b (e.g., a base station device) includes processor(s), RF circuitry and memory. The second device #100c (e.g., a head-mounted device) includes various components, such as processor(s), RF circuitry, memory, image sensor(s), orientation sensor(s), microphone(s), location sensor(s), speaker(s), display(s) and more.
Apple covers AR/VR/MR in eight paragraphs to emphasize that the invention relating to Siri controlling applications definitely extends to its upcoming Apple Vision Pro.
In Apple's patent FIG. 7 below, the Apple Vision Pro may produce a VR environment including one or more virtual objects that Siri may interact with based on user input. In some examples, the headset may generate or receive a view of the virtual environment, including the one or more virtual objects. For example, as shown in FIG. 7, the headset may receive view #700 including a virtual painting #702 and a virtual couch #703.
While the user is interacting with view #700, Siri may receive spoken input #701, "make the couch blue," a command it doesn't natively recognize. Siri then checks whether the command matches an action, a sub-action, or at least a portion of the metadata of a link model to determine which action should be performed.
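For readers curious how that matching step could work in practice, here is an illustrative Swift sketch. The patent describes behavior rather than an API, so every type and name below is an assumption:

```swift
// Hypothetical "link model" entry: an action registered with
// searchable metadata that Siri can compare against spoken words.
struct LinkedAction {
    let name: String           // e.g., "setColor"
    let metadata: Set<String>  // e.g., ["couch", "color", "blue"]
    let perform: () -> Void
}

// Score each action by how many of the command's words appear in its
// metadata, and return the best match (if any overlap exists at all).
func resolve(command: String, in model: [LinkedAction]) -> LinkedAction? {
    let words = Set(command.lowercased().split(separator: " ").map(String.init))
    return model
        .map { (action: $0, score: $0.metadata.intersection(words).count) }
        .filter { $0.score > 0 }
        .max { $0.score < $1.score }?
        .action
}
```

Under this sketch, "make the couch blue" would score highest against an action whose metadata contains "couch" and "blue," which is roughly the outcome the patent's FIG. 7 example describes.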
The patent also covers Siri working with apps on other devices, such as the iPhone, as pictured in patent FIGS. 4 and 6 below, which show examples of input commands being mapped and executed.
As shown in FIG. 4 above, the system may receive the spoken input #404 "bold the word 'Hey!'" Siri may process the spoken input and determine that the command is "bold," yet not know what "bold" means or which action to perform for it. Accordingly, Siri may determine the action to perform for the command "bold" by accessing a "link interface," as noted in patent FIG. 3 below.
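A link interface of that sort could be as simple as a lookup from command names to handlers. Again, this is a hedged sketch; the names are invented and the patent publishes no code:

```swift
// Hypothetical "link interface": maps spoken command names to actions.
let linkInterface: [String: (String) -> Void] = [
    "bold":      { word in print("Apply bold to \"\(word)\"") },
    "underline": { word in print("Apply underline to \"\(word)\"") },
]

// Route a parsed command to its linked action, if one is registered.
func handle(command: String, argument: String) {
    if let action = linkInterface[command.lowercased()] {
        action(argument)  // e.g., handle(command: "bold", argument: "Hey!")
    } else {
        // Unknown command: fall back to matching against the link
        // model's metadata, as in the earlier sketch.
    }
}
```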
Finer details can be found in Apple's 46-page patent application number 20230206912, titled "Digital Assistant Control of Applications."
Some of the Team Members on this Apple Project
- Cédric Bray: AR/VR Engineering Manager
- Helmut Garstenauer: Senior AR/VR Software Engineer
- Tim Oriol: Senior Software Engineering Manager, Technology Development Group
- Kurt Piersol: Lead Engineer
- Jessica Peck: Conversation Designer
- Luca Simonelli: Software Engineering Manager
- Nathan Taylor: App Intents Engineering Manager