Apple Wins a Patent for a Future Version of GarageBand Designed for Virtual Reality
The U.S. Patent and Trademark Office officially published a series of 63 newly granted patents for Apple Inc. today. In this particular report we cover Apple's patent for devices, systems, and methods that predictively quantize user interaction with a virtual musical instrument in computer-generated reality (CGR) environments. Think of it as GarageBand entering the Virtual Age. This will be a dream come true for Air Guitarists.
Music processing systems, including those operating in CGR environments, should ideally improve the temporal precision of musical performances in order to enhance music quality and the user experience. This task is known as quantization.
As noted above, Apple's granted patent covers devices, systems, and methods for predictive quantization of user interaction with a virtual musical instrument in CGR environments.
According to some implementations, the method is performed by a device with one or more processors, non-transitory memory, and one or more user interaction hardware components configured to enable a user to play a virtual instrument in a CGR environment.
The method includes obtaining user movement information that characterizes the user's real-world body pose and trajectory. From that movement information and the predetermined placement of the virtual instrument in the CGR environment, the device generates a predicted virtual instrument interaction time before the interaction actually occurs. The device then determines whether or not the predicted interaction time falls within an acceptable temporal range around one of a plurality of temporal sound markers. If it does, the device quantizes the interaction by presenting play of the virtual instrument to match that particular temporal sound marker.
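To make the quantization step concrete, here's a minimal Swift sketch of that decision logic, assuming a simple beat grid of temporal sound markers. The type and function names and the tolerance value are our own illustrations rather than anything specified in the patent.

```swift
import Foundation

// Hypothetical types and names; the patent describes behavior, not an API.
struct TemporalMarker {
    let time: TimeInterval  // e.g., a beat-grid position in seconds
}

/// Returns the marker the interaction should snap to, or nil if the
/// predicted time falls outside the acceptable range of every marker.
func quantizedMarker(forPredictedTime predicted: TimeInterval,
                     markers: [TemporalMarker],
                     tolerance: TimeInterval) -> TemporalMarker? {
    // Find the marker closest to the predicted interaction time.
    guard let nearest = markers.min(by: {
        abs($0.time - predicted) < abs($1.time - predicted)
    }) else { return nil }

    // Quantize only when the prediction lands within the acceptable range.
    return abs(nearest.time - predicted) <= tolerance ? nearest : nil
}

// Example: a 120 BPM beat grid (a marker every 0.5 s) with a 60 ms window.
let grid = stride(from: 0.0, through: 8.0, by: 0.5).map(TemporalMarker.init)
if let marker = quantizedMarker(forPredictedTime: 2.47,
                                markers: grid,
                                tolerance: 0.06) {
    print("Snap playback to \(marker.time) s")  // prints "Snap playback to 2.5 s"
}
```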
Apple's patent FIG. 1 below is a block diagram of an example operating environment.
Further details regarding patent FIG. 1: In some implementations, the user movement information includes data determined from images captured by image sensors and/or from the output of IMUs, gyroscopes, accelerometers, torque meters, force meters, and/or other sensors, any of which may reside within the HMD (#120) and/or within the hand-held devices (#130A and #130B) shown in FIG. 1.
For example, the user movement information includes data characterizing the force, angle, and position of impact of the user's movement relative to the virtual instrument (e.g., a drum head, keyboard key, guitar string, or the like), drawn from the IMUs, gyroscopes, accelerometers, torque meters, force meters, and/or other sensors of the hand-held devices (#130A and #130B).
In this example, the device uses the motion and the predicted collision to determine the characteristics of the note to be played. For instance, if the user strikes a virtual cymbal at high speed, the device plays a louder note based on the predicted speed or force of impact.
As another example, if the device predicts that the user will strike a certain portion of the virtual cymbal, or that the user's finger will land on a particular key of a virtual piano, the device plays a different sound or a different note based on the predicted position or angle of impact.
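The cymbal and piano examples amount to mapping a predicted collision to the characteristics of the note that gets played. Here's a minimal Swift sketch of one such mapping; the struct fields, scaling factors, and note numbers are illustrative assumptions, not values from the patent.

```swift
// Hypothetical shape for a predicted collision; the patent lists the
// kinds of data (speed, force, position of impact) without a format.
struct PredictedImpact {
    let speed: Float            // predicted hand speed at contact
    let force: Float            // estimated force of impact
    let position: SIMD2<Float>  // normalized hit point on the surface (0...1)
}

struct Note {
    let pitch: UInt8     // MIDI-style note number
    let velocity: UInt8  // MIDI-style loudness, 0...127
}

func note(for impact: PredictedImpact) -> Note {
    // Faster or harder predicted impacts yield a louder note,
    // clamped to the MIDI velocity range. Scaling factors are arbitrary.
    let loudness = max(impact.speed * 40, impact.force * 25)
    let velocity = UInt8(min(127, max(1, Int(loudness))))

    // A different sound depending on where the surface will be struck,
    // e.g., the bell of a cymbal versus its edge.
    let pitch: UInt8 = impact.position.x < 0.3 ? 53 : 49

    return Note(pitch: pitch, velocity: velocity)
}

// Example: a fast strike near the cymbal's edge.
let hit = PredictedImpact(speed: 2.6, force: 1.2, position: SIMD2(0.8, 0.5))
print(note(for: hit))  // Note(pitch: 49, velocity: 104)
```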
Apple's patent FIG. 7 below is a flowchart representation of a method of presenting user play of a virtual musical instrument.
Apple's granted patent 10,782,779 was originally filed in Q3 2019 and published today by the U.S. Patent and Trademark Office. To dive deeper into the patent's details, review the patent here.