Apple invents a new MacBook & Magic Mouse to help Engineers & Designers create Virtual 3D Objects & Games
Today, the U.S. Patent and Trademark Office officially published a patent application from Apple that relates to a new input device / next-gen Magic Mouse designed to allow users to work naturally when creating and manipulating 3D content on screen. Apple notes that the input device can manipulate 3D objects in CAD software, in gaming and more. Now that Vision Pro is here, giving users and developers the ability to create and manipulate 3D objects with a next-gen MacBook and Magic Mouse is practically a must.
In Apple's patent background, the company notes that recent advances in computing have enabled immersive user experiences including desktop gaming on personal computers, alternate and virtual reality interactive consoles, three-dimensional (3D) computer-aided design (CAD) software, high-resolution display screens, and so forth. However, the user input devices designed to enable users to manipulate and control displayed objects and visual elements of such systems, including objects represented three-dimensionally on display screens, are limited to inputs such as buttons and knobs that are not intuitive or reflective of the actions being signaled by the user.
For example, in the CAD software used by engineers and designers to build virtual 3D objects, typical input devices such as computer mice and styluses only provide buttons, knobs, and two-dimensional (2D) position sensing to enable manipulation of the objects being designed. Users often find it unintuitive and difficult to manipulate objects in 3D environments using these 2D input devices, since working in 3D space requires additional control functionalities for object translation, zooming, rotating, slicing, and otherwise moving the object in 3D. The limited input and control capabilities of present input devices are therefore inefficient, difficult to learn, burdensome, and insufficient for 3D manipulation. For these and other reasons, there is a persistent need for improvements to 3D input devices.
Apple's patent application covers input devices that can include an input sensor, a housing defining an internal volume, an inertial measurement unit (IMU) sensor disposed in the internal volume, and an ultrasonic speaker disposed in the internal volume. Below are the key features:
- In one example, the input sensor can include a touch detection sensor.
- In one example, the ultrasonic speaker is configured to output sound waves greater than about 20 kHz.
- In one example, the ultrasonic speaker is configured to output sound waves between about 20 kHz and about 80 kHz.
- In one example, the input device further includes a feedback module.
- In one example, the feedback module includes a haptic engine.
- In one example, the feedback module includes a light.
- In one example, the input device further includes an emitter electrically coupled to the IMU sensor.
- In one example, the emitter is configured to send signals comprising information regarding a motion or an orientation of the input device detected by the IMU sensor.
- In at least one example a tracking device includes a display portion secured to a base, the display portion having a display screen, an array of ultrasonic microphones disposed on the display portion, and a sensor (e.g., an IMU or angle sensor) disposed on the display portion and configured to detect an angle of the display screen relative to the base.
- In one example, the ultrasonic microphone array includes three microphones defining a first plane and a fourth microphone disposed out of the first plane.
- In one example, the display screen defines a second plane parallel to the first plane.
- In one example, the display portion is a first portion and the tracking device further includes a second portion rotatably secured to the first portion and the fourth microphone is disposed on the second portion.
- In one example, the second portion includes a keyboard. In one example, the angle includes the angle of the display screen relative to a major plane of the second portion.
- In at least one example, a three-dimensional (3D) control system includes an input device, a computing device, and a tracking assembly. The input device can include an input sensor, an inertial measurement unit (IMU) sensor, and an ultrasonic speaker. The tracking assembly can include three ultrasonic microphones fixed to the computing device, the three ultrasonic microphones configured to receive ultrasonic waves output by the ultrasonic speaker (a rough sketch of this kind of acoustic tracking follows the list below).
- In one example, the three-dimensional control system includes a display portion having a display screen.
- In one example, the IMU sensor is a first IMU sensor and the 3D control system further includes a second IMU sensor secured to the display portion.
- In one example, the tracking assembly includes at least four ultrasonic microphones; a first microphone, a second microphone, and a third microphone of the four ultrasonic microphones define a first plane, and a fourth microphone is disposed out of the first plane.
- In one example, the input device includes an emitter configured to send signals including information regarding motion detected by the IMU sensor to the computing device.
- In one example, the three-dimensional control system includes a laptop computer and the input device is operable as a mouse for the laptop computer.
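To give a sense of how a tracking assembly like this could resolve the mouse's position, here's a minimal, purely illustrative sketch in Swift. It assumes the four microphones' positions are known in the computer's coordinate frame and that each reports the time of flight of an ultrasonic chirp from the mouse's speaker; the function names, values, and solver are our own assumptions, not anything disclosed in the filing.

```swift
import simd

// Minimal, illustrative sketch (not Apple's implementation): estimate the input
// device's 3D position from ultrasonic time-of-flight measurements at four
// microphones with known positions in the laptop's coordinate frame.

let speedOfSound = 343.0  // m/s at roughly 20 °C; a shipping system would calibrate this

struct Mic {
    let position: SIMD3<Double>   // microphone location in meters
    let timeOfFlight: Double      // seconds from ultrasonic chirp to arrival
}

/// Each time of flight gives a sphere |x - p_i| = d_i around microphone i.
/// Subtracting the first sphere's equation from the other three yields a 3x3
/// linear system in x, which has a unique solution only when the fourth
/// microphone sits off the plane defined by the first three.
func locateInputDevice(mics: [Mic]) -> SIMD3<Double>? {
    guard mics.count == 4 else { return nil }
    let p = mics.map { $0.position }
    let d = mics.map { $0.timeOfFlight * speedOfSound }

    var rows: [SIMD3<Double>] = []
    var rhs = SIMD3<Double>(repeating: 0)
    for i in 1...3 {
        rows.append(2 * (p[i] - p[0]))
        rhs[i - 1] = simd_length_squared(p[i]) - simd_length_squared(p[0])
                   - (d[i] * d[i] - d[0] * d[0])
    }
    let a = double3x3(rows: rows)
    guard abs(a.determinant) > 1e-9 else { return nil }  // mics must not be coplanar
    return a.inverse * rhs
}

// Example: three mics along the bottom of the display plus one on the keyboard deck.
let mics = [
    Mic(position: [0.00, 0.00, 0.00], timeOfFlight: 0.000874),
    Mic(position: [0.30, 0.00, 0.00], timeOfFlight: 0.000912),
    Mic(position: [0.15, 0.20, 0.00], timeOfFlight: 0.000941),
    Mic(position: [0.15, 0.00, 0.12], timeOfFlight: 0.000866),
]
if let position = locateInputDevice(mics: mics) {
    print("Estimated input device position:", position)
}
```

Note that the geometry this sketch relies on, three microphones defining a plane plus a fourth out of that plane, is exactly what the claims above call out.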
Apple's patent FIG. 5 below shows a perspective view of an example of a tracking device; FIG. 7 shows a perspective view of an example of a computing device and a tracking assembly fixed to the computing device; FIG. 8 shows an example of an input device detected in 3D space by a computing device as the input device controls a visual object on the display screen of the computing device; FIG. 11 shows a top, perspective view of an example of an input device.
Apple's patent FIG. 12 above shows a cross-sectional view of an example of an input device; FIG. 13B shows a top view of the input device shown in FIG. 13A with contact regions indicating where portions of the user's hand contact the input device; FIG. 14B shows a user controlling a visual object on a display by manipulating the input device above the support surface in 3D space.
In addition, the input sensor or sensor array of the input device can be used to detect hand positions, squeezing forces, or other gestures performed by the user with the input device to expand the control capabilities of the input device when controlling the object on the screen. In one example, the input sensor can detect a magnitude of force with which the user squeezes the input device. When such a force passes a predetermined threshold, the computing device can then begin to manipulate the visual object on the screen as the user manipulates the input device in 3D space. Before this threshold is met, the computing device can ignore the position and orientation of the input device. In this way, the user can decide to “grab” the visual object (using the input device as proxy) by squeezing the input device. This squeezing is akin to the natural action one would take to actually grab the virtual object displayed on the screen. In this way, the action of grabbing and then manipulating the object on the screen is done naturally and intuitively with a physical input device in the hand of the user.
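As a rough illustration of that "squeeze to grab" logic, the hedged sketch below gates object manipulation on a squeeze-force threshold. The GripSample and Pose types, the 2.0 N threshold, and the pose math are hypothetical stand-ins; the patent does not disclose Apple's actual sensor interfaces or values.

```swift
import simd

// Hedged sketch of the "squeeze to grab" behavior described above. GripSample,
// Pose, and the grab threshold are hypothetical stand-ins, not Apple's API.

struct Pose {
    var position: SIMD3<Double>
    var orientation: simd_quatd
}

struct GripSample {
    let squeezeForce: Double   // newtons, reported by the input sensor
    let devicePose: Pose       // fused from the IMU and ultrasonic tracking
}

final class GrabController {
    /// Squeeze force above which the user is considered to have "grabbed" the object.
    /// 2.0 N is an illustrative guess.
    private let grabThreshold = 2.0
    private var lastGrabbedPose: Pose?   // device pose from the previous grabbed frame
    private var objectPose: Pose         // pose of the visual object on screen

    init(objectPose: Pose) { self.objectPose = objectPose }

    /// Returns the object's pose after processing one sensor sample.
    func handle(_ sample: GripSample) -> Pose {
        guard sample.squeezeForce >= grabThreshold else {
            lastGrabbedPose = nil        // below threshold: device motion is ignored
            return objectPose
        }
        if let previous = lastGrabbedPose {
            // While grabbed, apply the device's incremental motion to the object.
            objectPose.position += sample.devicePose.position - previous.position
            objectPose.orientation = sample.devicePose.orientation
                                   * previous.orientation.inverse
                                   * objectPose.orientation
        }
        lastGrabbedPose = sample.devicePose
        return objectPose
    }
}
```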
Additional actions or gestures performed by the user with the input device can also be detected and used to manipulate the object presented on a screen of the computing device in natural and intuitive ways. For example, the user can move the device closer to or further away from the display screen of the computing device to zoom in and out of the displayed object. Other gestures, hand positions, or actions performed with the input device can control the visual object in other ways, including panning left and right, selecting and deselecting objects, and any other useful 3D manipulation control of a 3D object represented on a screen.
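The zoom gesture, for instance, could be as simple as mapping the tracked distance between the input device and the display to a zoom factor. The sketch below uses an exponential mapping so moving in and out feels symmetric; the resting distance and sensitivity constants are illustrative guesses, not values from the patent.

```swift
import Foundation

/// Illustrative zoom mapping: an exponential curve keeps the gesture symmetric,
/// so moving the device 10 cm closer zooms in by the same ratio that moving it
/// 10 cm farther zooms out.
func zoomFactor(deviceDistance: Double,
                restingDistance: Double = 0.35,   // meters; assumed neutral distance
                sensitivity: Double = 4.0) -> Double {
    exp(sensitivity * (restingDistance - deviceDistance))
}

print(zoomFactor(deviceDistance: 0.25))  // ≈ 1.49 (closer to the screen: zoom in)
print(zoomFactor(deviceDistance: 0.45))  // ≈ 0.67 (farther away: zoom out)
```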
Last week Patently Apple posted a report titled "Apple invents a next-gen Magic Mouse with Variable Friction and Multi-Texture capabilities which could be excellent for high-end Gaming+." This morning we posted another input device patent report titled "Apple has invented a Vision Pro Input Device Specifically for Creating Virtual Art."
One of the key inventors on this patent is John Morrell, Director of Engineering, who came to Apple from Yale and was one of the key engineers behind the Segway. His participation in this project signals that it's a serious one.
A second patent application (#20240103643) related to this subject matter can be reviewed here, and a third (#20240103656) here.