On July 1, 2010, the US Patent & Trademark Office published three distinct patent applications detailing the magic behind Apple's GarageBand, a highly popular iApp within Apple's iLife suite. If you're a musician-techie or a programmer who loves music, you'll be able to explore various aspects of today's applications covering an audio processing interface, a method and system for music instruction files and of course the music instruction system behind GarageBand. The music instruction patent covers how video instruction is provided in GarageBand while another patent details various aspects of guitar amplifier effects - like the famous wah-wah pedal effect.
In-person music instruction can be expensive and/or inconvenient because of travel and/or scheduling. Additionally, with group instruction, it can be frustrating to follow a group pace for instruction that may be too fast or too slow for a particular person's skills and abilities. Thus, students, musicians and other music hobbyists are increasingly using computers to improve, expand and strengthen their skills playing a variety of musical instruments. Various conventional computer programs exist to provide musical instruction.
One drawback of conventional music instruction programs is that the displays and/or user interfaces associated with these programs are not intuitive and/or they fail to recreate the visual cues and subtleties that can be critical for learning to play a musical instrument. Another drawback of conventional music instruction programs is that while a user may be able to go through various lessons at his/her own pace, the actual tempo of the music instruction frequently fails to provide adequate flexibility (e.g., accompaniment music may be too fast or too slow or cannot be changed dynamically on the fly). In other words, while these programs may provide convenience and/or cost savings, they ultimately fail to provide the same caliber of instruction that a real person can provide.
Apple's GarageBand comes to the rescue to remedy these problems and inconveniences.
Patent 20100162878: A song audio is played and a graphical representation of a musical instrument associated with the audio is displayed. For example, a song might include a guitar part; thus, a graphical representation of a guitar (or simply a guitar neck) might be displayed. A fingering display is overlaid on the graphical representation of the instrument during the playing of the song. Using the example of a guitar, a fingering of which strings to play with which fingers on which frets is displayed. The strings may also visually vibrate. The fingering display is synchronized to the song audio. During the playing of the song audio, the tempo of the song is adjusted. The pitch of the song is substantially preserved in real-time despite the tempo adjustment. In addition, the synchronization between the fingering display and the audio is maintained in real-time in view of the adjusted tempo.
Patent 20100162880: A graphical view of a file component associated with an audio that is displayed in a graphical user interface (GUI) is automatically switched during playback of the audio. The switching is based on a Musical Instrument Digital Interface (MIDI) view-switching track associated with a project file for the audio. In various embodiments, a musical instrument displayed in the GUI is automatically switched during playback of the audio based on a MIDI instrument-switching track associated with the same project file. Additionally, a metronome beat associated with the audio is automatically switched between on and off during playback of the audio based at least in part on a MIDI metronome-switching track associated with the song audio.
Patent 20100169775: A graphical element resembling an instrument amplifier (e.g., a guitar amplifier) having audio control parameters is displayed through a graphical user interface (GUI). An additional graphical element resembling one or more instrument effects pedals is displayed. Each instrument effects pedal has separate audio control parameters. An audio input is received from an instrument. The audio input is processed serially according to the audio control parameters associated with the one or more instrument effects pedals and the instrument amplifier. The audio resulting from the processing is provided as an output.
The GarageBand System Overview
Apple's patent FIG. 1 is a block diagram illustrating the system behind GarageBand. System 100 includes various components for music instruction. It should be noted, however, that the various components could all be included within processor 110, while certain components could be separate from the processor.
Graphical user interface (GUI) 112, noted above, allows a user (e.g., a music student) to interact with the various components of the GarageBand system via display 102 and input/output 104. Various embodiments herein are described using a guitar as an example of a musical instrument used in teaching/learning. However, one of skill in the art will appreciate that other instruments could be used in various embodiments, including, but not limited to, pianos, drums, brass instruments (trumpet, French horn, etc.), reed instruments (saxophone, clarinet, etc.), etc.
Audio module 118 plays an audio of a song as part of a music lesson in various embodiments. The audio may be retrieved, for example, from a memory 106. The audio may be output via the input/output. Video module 122 displays a video of the song in the user interface (112). The video shows an instructor playing (e.g., a guitar) along with the audio. The video could include various views, angles, and/or perspectives of the instructor and/or the guitar.
Instrument module 126 includes a graphical element that is also displayed in the user interface, the graphical element resembling at least a portion of a real musical instrument (e.g., a guitar) 108. For example, the graphical element of the instrument module might be a depiction of a fret board (including the strings) of a guitar. A fingering module (120) overlays instrument fingering for the song onto the graphical element of the guitar. For example, the fingering module might overlay highlighted icons (e.g., circles) at certain positions on the guitar fret board to indicate on which string and which fret to place a particular finger to play a note or a chord.
The instrument module further includes a vibration component 132 that causes the strings on the graphical element of the guitar fret board to visually vibrate when a note or chord is to be played.
The system's GUI also displays musical notation corresponding to the song audio based on a music notation module 124. Various music notation formats can be used in different embodiments. Some embodiments may use more than one music notation format at a time. Types of music notation used in various embodiments include, but are not limited to, tabbed notation, chord and/or chord grid notation, and classical notation.
The aforementioned modules and their corresponding displays in GUI 112 are synchronized by synchronization module 114. In embodiments where not all modules are in use (e.g., only audio module 118 and video module 122 are in use) only those modules need to be synchronized. As an example of the synchronization performed by the synchronization module, instrument fingering (from fingering module 120) is synchronized with the song audio from audio module 118 such that the instrument fingerings for various notes and/or chords are displayed in user interface at the same time that those same notes and/or chords are played in the song audio. Extending the example further, the visual vibrations of the guitar strings from vibration component 132 may also be synchronized such that the visual vibrations are displayed at the same time that the notes and/or chords associated with those vibrating strings are played in the song audio. In various embodiments, the synchronization occurs automatically without any user input.
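The event-scheduling idea behind this synchronization can be sketched in a few lines. The following is a hypothetical illustration, not code from the patent: fingering events are stored with timestamps in song time, and the display layer asks which events are due at the current audio playback position. All names here are illustrative.

```python
from bisect import bisect_right

class FingeringSync:
    def __init__(self, events):
        # events: list of (time_seconds, (string, fret, finger)) pairs
        self.events = sorted(events)
        self.times = [t for t, _ in self.events]

    def due_events(self, position, window=0.05):
        """Return fingering events whose timestamp falls within `window`
        seconds at or before the current audio position."""
        hi = bisect_right(self.times, position)
        lo = bisect_right(self.times, position - window)
        return [e for _, e in self.events[lo:hi]]

sync = FingeringSync([(0.0, (6, 3, 2)), (0.5, (5, 2, 1)), (1.0, (1, 0, 0))])
print(sync.due_events(0.52))  # fingering shown as the audio reaches 0.5 s
```

A real implementation would poll the audio engine's playback position each display frame, but the lookup itself is this simple.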
In certain embodiments, the views, angles, perspectives, etc. of the video and instrument displays may change (e.g., dynamically) during the playing of a song. Additionally, the type of music notation being displayed may also change (e.g., between tab, chord, chord grid, classical notation, etc.) during the playing of the song.
The GarageBand System 100 also includes a tempo slider 116 that is displayed in the user interface. A user may adjust the tempo of a song being played in real-time using the input/output. For example, if a particular song is recorded/stored at 100 beats per minute (BPM), but the tempo is too fast for the user (e.g., student) to keep pace while playing along, the user can adjust the tempo down to, say, 85 BPM. Some conventional music software provides for tempo adjustment. However, in embodiments described herein, the tempo slider includes a pitch stabilizer 134 to stabilize a pitch of the song audio in real-time in response to a tempo adjustment during playback of a song. Thus, a user can adjust the tempo during playback of a song while maintaining the proper pitch of the song. The synchronization module maintains synchronization of the various synchronized modules in real-time when a tempo adjustment is made.
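One way to picture the resynchronization side of a tempo change: the visual cue timestamps are rescaled by the ratio of the old tempo to the new one, so every cue still lands on the same beat. This is a minimal sketch with illustrative names; real-time pitch preservation itself would be handled by a time-stretching algorithm (e.g., a phase vocoder), which is abstracted away here.

```python
def rescale_events(events, original_bpm, new_bpm):
    """Rescale event timestamps so a song stored at original_bpm plays
    back at new_bpm with every visual cue landing on the same beat."""
    factor = original_bpm / new_bpm  # slower tempo -> larger factor
    return [(t * factor, payload) for t, payload in events]

events = [(0.0, "C chord"), (2.4, "G chord")]  # cue times at 100 BPM
slowed = rescale_events(events, original_bpm=100, new_bpm=85)
# at 85 BPM the G chord cue moves from 2.4 s out to roughly 2.82 s
```

Because all displays derive their timing from the same rescaled schedule, video, notation and fingering stay locked together after the adjustment.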
A tuner 130 is included in the system in various embodiments. The tuner includes a graphical element displayed in the user interface that allows the user to perform various tuning operations. By connecting an external instrument (e.g., guitar) 108 to the system, the user could play a note on the instrument and receive tuning feedback via the user interface. For example, if a user plays an E note that is too low in pitch on a guitar, the tuner will indicate that the string needs to be tightened. Similarly, if the note is too high, tuner 130 will indicate that the string needs to be loosened.
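The feedback logic described here boils down to comparing a detected fundamental frequency against the target pitch for the string. A hedged sketch, using standard-tuning frequencies and illustrative function names (the pitch-detection step itself is assumed to happen upstream):

```python
# Standard-tuning targets for a six-string guitar, low E to high E (Hz)
TARGETS = {"E2": 82.41, "A2": 110.00, "D3": 146.83,
           "G3": 196.00, "B3": 246.94, "E4": 329.63}

def tuning_feedback(detected_hz, string, tolerance_hz=0.5):
    """Tell the user which way to turn the tuning peg."""
    target = TARGETS[string]
    if detected_hz < target - tolerance_hz:
        return "tighten"   # pitch too low
    if detected_hz > target + tolerance_hz:
        return "loosen"    # pitch too high
    return "in tune"

print(tuning_feedback(80.0, "E2"))  # low E played flat
```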
Mixer 128 also includes a graphical element displayed in the user interface, in this case allowing a user to apply configurable mixing parameters to the song audio and/or the external instrument audio. Mixing parameters can include, but are not limited to, master level, track levels, bass, treble, mid-range, and various effects (e.g. sustain, echo, etc.), etc.
GarageBand Project Views
Apple's patent FIG. 2 of patent 20100162880 represents a block diagram illustrating a system that executes files. System 200 could be a computer, a device or musical device, an instrument, etc. capable of providing musical instruction. The system includes at least a processor 202, a memory 204, and a display 206. The processor executes a project file 210, which may be retrieved from memory. The project file includes a variety of tracks that facilitate musical instruction, in part, using a graphical user interface (GUI) in conjunction with a display. In various embodiments, the tracks described herein are maintained as Musical Instrument Digital Interface (MIDI) tracks.
The overall project-views that the user could choose may be automated according to a view switching track 212. In other words, the project-view(s) may change automatically during playback of a song. In various embodiments, the movie/video associated with the song is displayed in all project-views. However, it is not necessary to display the movie/video in all project-views. In one example, a project-view might include both an animated instrument display and a musical score display. In another example, a project-view might include only the animated instrument display or only the musical score display. Thus, a first project-view might be displayed during one part of a song (e.g., a verse of the song) and a second project-view might be displayed during another part of the song (e.g., the chorus of the song). The timing of the automatic switching between project-views during playback of the song is controlled by view switching track 212.
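A view-switching track like this is essentially a sorted list of change points; at any playback position the active project-view is the most recent change point at or before that position. A small illustrative sketch (names and times are made up for the example):

```python
from bisect import bisect_right

def active_view(switch_track, position):
    """switch_track: sorted list of (time_seconds, view_name) change
    points; returns the view active at `position`."""
    times = [t for t, _ in switch_track]
    i = bisect_right(times, position) - 1
    return switch_track[max(i, 0)][1]

track = [(0.0, "score+instrument"),   # verse: both displays
         (30.0, "instrument only"),   # chorus: animated instrument
         (60.0, "score only")]        # bridge: notation
print(active_view(track, 45.0))
```

In the patent the change points live in a MIDI track inside the project file, but the lookup at playback time is the same idea.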
The View-Switching Track: The view-switching track also allows the musical score view to be switched automatically during playback of a song. Controller data on track 212 controls which score view among score tracks 226 will be displayed at which particular time during playback. As discussed previously, score views may include full score, right hand, left hand, simplified versions of full score, right hand, left hand, and various other combinations. Additionally, score views may include song lyrics in various embodiments.
Metronome Track: A metronome associated with the music instruction system may be automated according to a metronome track 214. In general, the metronome is enabled and disabled by the user. However, in some embodiments, the metronome may be switched on or off automatically as controlled by the metronome track 214. A user may still be able to manually override the automated switching of the metronome in various embodiments.
Teacher track: Teacher track 216 provides the needed information for animating the instrument animation display. Teacher track 216 includes, for example, fingering numbers to indicate to the user which fingers to use in playing particular notes/chords. In certain embodiments MIDI channels 1-6 are used to indicate the different strings on a guitar (channel 1 being the lowest string and channel 6 being the highest string). MIDI notes are then used to indicate tab numbers and/or fret positions.
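The channel/note encoding described here could be decoded along these lines. This mapping is a sketch based on the paragraph above, not the patent's exact scheme: MIDI channels 1-6 name the guitar string and the MIDI note value carries the tab/fret number.

```python
def decode_teacher_event(channel, note):
    """channel 1-6 -> guitar string (1 = lowest, 6 = highest);
    note value interpreted directly as the tab/fret number."""
    if not 1 <= channel <= 6:
        raise ValueError("guitar strings map to MIDI channels 1-6")
    return {"string": channel, "fret": note}

print(decode_teacher_event(1, 3))  # lowest string, 3rd fret
```

Fingering numbers would ride alongside as additional controller data in the same track.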
Movie Track: As discussed above, a movie track 218 may be displayed in all project views. Unlike the other tracks (which are maintained as MIDI tracks in various embodiments), movie track 218 is maintained in a separate video file (e.g., a QuickTime file) that is linked to project file 210.
Instrument Tracks: An instrument switching track 220 controls the automated switching of instrument tracks 224 within a project during playback of a song. As with other switching tracks, instrument switching track 220 defines the timing for automatic switching and defines which instruments to switch during playback.
Score Track: Each different type of display is stored as a separate score track. Examples of notations used in score tracks include, but are not limited to, full piano, right hand, left hand, piano chords, guitar grids, tablature (TAB), TAB+notation, and guitar chords. Thus, depending on the notation used, the musical score associated with the song is displayed in sync with the song audio during playback. In other words, the timing of the particular notes/chords displayed in musical score display 116 corresponds to the timing of those notes/chords being played in the song audio.
Language Tracks: Project file 210 also includes multiple language tracks 228 in various embodiments. A user may select a language in which to receive the musical instructions. Based on the selected language, one of the language tracks 228 will be played.
Music Instruction System
Apple's patent FIG. 2 of patent 20100162878 shown below is a block diagram illustrating a graphical user interface (GUI) display in a music instruction system according to various embodiments. For a given song (to be taught/learned), GUI 212 provides a variety of displays to facilitate the musical instruction. Video/movie display 214 displays a video of the song. In various embodiments, the video sets the foundation component to the song instruction. The video itself may be maintained in a single video file (e.g., a QuickTime file). In certain embodiments, audio associated with the video is maintained in a separate audio file. The video includes one or more angles/views of an instructor playing the song on an instrument (e.g., guitar, piano, etc.). The video angle(s)/view(s) may change during playback of the song video. In some embodiments, each video angle is maintained in a separate video file.
In embodiments where the video includes multiple views, one view might be, for example, of the right hand playing the instrument (e.g., piano, guitar, etc.) while another view shown simultaneously might be of the left hand playing the instrument. Other combinations of views including, but not limited to, body position, instrument views and the like are contemplated in various embodiments.
GUI 212 also includes a musical notation display 216. Musical notation 216 displays the musical notation associated with the song. Each different type of display is stored as a separate notation/score track. Examples of notations used in score tracks include, but are not limited to, full piano, right hand, left hand, piano chords, guitar grids, tablature (TAB), TAB+notation, and guitar chords. Thus, depending on the notation used, the musical notation associated with the song is displayed in sync with the song audio during playback. In other words, the timing of the particular notes/chords displayed in musical notation display 216 corresponds to the timing of those notes/chords being played in the song audio.
Instrument animation 218 displays an animated graphical representation of the musical instrument being practiced/learned. For example, if the music instruction is for playing the piano, then an animated graphical representation of a piano keyboard is displayed. A guitar fret board is another example of an instrument animation that can be displayed. In various embodiments (using the example of the piano), the piano keyboard may be animated to show the keys that are to be played during the playback of the corresponding song audio. The instrument animation display may also include a fingering overlay to illustrate the exact fingering that should be used for particular notes, chords, melodies, etc. In embodiments where the instrument animation is of a string instrument (such as a guitar or bass guitar), the animation may cause the strings to visually vibrate corresponding to the notes in the song audio as though the strings were actually plucked by a user.
Control panel 220 may include a variety of user-selectable options (e.g., play, record, tempo adjust, etc.) related to interacting with the music instruction system. Included in the control panel is a metronome 222. The metronome can be turned on and off by a user, who can also adjust the tempo in various embodiments. In addition to the user controls, the metronome may be switched on and off automatically during playback of a song in some embodiments.
Guitar Effects Pedal Module
Apple's patent FIG. 1 of patent 20100169775 shown below represents a block diagram of the GarageBand guitar effects pedal module 122, which could be a single module handling multiple guitar effects pedals or multiple modules, each for a different guitar effects pedal. And while embodiments described herein are directed towards guitars, other musical instruments that make use of effects and/or effects pedals (e.g., bass guitars, etc.) could be used according to various embodiments.
The guitar effects pedal module includes graphical element 124 and control parameters 126. Graphical element 124 is displayed by GUI 112 to resemble one or more real-world guitar effects pedals (also known as "stomp boxes"). In various embodiments, graphical element 124 could be displayed with different views. For example, in response to receiving user input 102 to enter a play mode, graphical element could be displayed such that it appears that one or more guitar effects pedals are sitting on a floor (e.g., a stage floor, garage floor, etc.). Other views or arrangements of the guitar effects pedal(s) could be used to visually represent the play mode selected by the user. As used herein, "play mode" refers to a mode of operation of music software that is used while playing music with an instrument that is directly and/or indirectly coupled to a computer system running the music software.
In response to receiving user input to enter an edit mode, graphical element could be displayed such that it appears that the one or more guitar effects pedals are lifted up off the ground (e.g., for the purpose of being adjusted, changed, etc.). Once again, other views or arrangements of the guitar effects pedal(s) could be used to visually represent the edit mode selected by the user. As used herein, "edit mode" refers to a mode of operation of the music software that is used while editing the configuration, control parameters, settings, etc. of the various modules (e.g., guitar effects pedal module 122, guitar amp module 116, etc.).
Control parameters for the guitar effects pedal module facilitate selection of zero or more effects pedals when the system is in an edit mode. In a play mode, the control parameters directly influence the sound output from the guitar effects pedals. The play mode control parameters could include, but are not limited to, distortion, fuzz, overdrive, chorus, reverberation, wah-wah, flanging, phaser and pitch shifting.
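The serial signal flow the patent describes - input through each pedal in turn, then through the amp - can be sketched like this. The effect functions here are toy stand-ins for illustration, not GarageBand's actual DSP:

```python
def make_gain(g):
    # simple volume/boost stage
    return lambda samples: [s * g for s in samples]

def make_hard_clip(limit):
    # crude "distortion" pedal: clamp the waveform
    return lambda samples: [max(-limit, min(limit, s)) for s in samples]

def process_chain(samples, pedals, amp):
    """Process the instrument input serially through each pedal,
    then through the amplifier, and return the result."""
    for pedal in pedals:
        samples = pedal(samples)
    return amp(samples)

guitar_input = [0.1, 0.8, -0.9]
out = process_chain(guitar_input,
                    pedals=[make_gain(2.0), make_hard_clip(1.0)],
                    amp=make_gain(0.5))
print(out)  # boosted, clipped, then attenuated: [0.1, 0.5, -0.5]
```

Each pedal's control parameters (drive, rate, depth, etc.) would simply parameterize the corresponding stage in the chain.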
The guitar amplifier module 116 also includes a graphical element 118 and control parameters. The graphical element is displayed by the user interface to resemble a real-world guitar amplifier. In various embodiments, the graphical element could be displayed with different views. For example, in response to receiving user input to enter a play mode, a front view of graphical element is displayed to resemble the front of a guitar amplifier. The front view can be a sub-element of graphical element in various embodiments. Other views or arrangements of a guitar amplifier could be used to visually represent the play mode selected by the user in other embodiments.
In response to receiving user input to enter an edit mode, a back view of the graphical element is displayed to resemble the back of the guitar amplifier. Once again, other views or arrangements of the guitar amplifier could be used to visually represent the edit mode selected by the user in other embodiments.
The control parameters for the guitar amplifier module facilitate control of various amp settings when the system is in an edit mode. More than one edit mode could exist in various embodiments. Amp settings may include, but are not limited to, amp model, send amounts for Master Echo and Master Reverb, input source, monitoring settings (e.g., off, on, on with feedback protection, etc.), and recording level. In a play mode, the control parameters directly influence the sound output from the guitar amplifier. The play mode control parameters can include, but are not limited to, gain, bass, mid-range, treble, presence, master, output, reverb, tremolo rate, and tremolo depth.
The GarageBand System also includes an audio input module 114 to receive instrument audio 104. For example, the audio input module might include a standard quarter inch analog plug to serve as the connection point for an analog cable connected to an electric guitar, a digital keyboard or other instrument. Other connection plugs (e.g., RCA plugs, mini RCA plugs, microphone plugs, etc.) could be used to receive the instrument audio.
Various aspects of the GarageBand System that were presented in today's report are further detailed in three distinct patent applications: Audio Processing Interface 20100169775, Method and System for Music Instruction Files 20100162880 and Music Instruction System 20100162878. All were originally filed in late December 2008 – or about 18 months ago.
Other Noteworthy Patents Published Today
Four patent applications relating to Apple's Aperture were published today which include the following: Slide Show Effects Style 20100169784, Framework for Slideshow Object 20100169783, Effects Application Based on Object Clustering 20100169389 and Light Table for Editing Digital Media 20100169777. In Apple's Professional website section, they profile Vincent Laforet, a staffer who left the Times to pursue other projects including commercial photography. Aperture has been a key part of this transition. In this profile, Laforet states that "My clients love to use the Light Table feature. We move pictures around and make our selections that way, very much as we had always done until computers make that a problem." Apple's Light Table patent covers this feature and more.
Notice: Patently Apple presents only a brief summary of patents with associated graphic(s) for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trade Office. Readers are cautioned that the full text of any patent application should be read in its entirety for further details. For additional information on any patent reviewed here today, simply feed the individual patent number(s) noted in this report into this search engine. About Comments: Patently Apple reserves the right to post, dismiss or edit comments.