Microsoft's hardware division has grown steadily, beginning with the Surface tablet line-up and its protective keyboard cover. They moved on to the inventive Surface Book, with a detachable display that doubles as a standalone tablet. They've since delivered a series of new notebooks and even introduced noise-cancelling over-ear headphones. Then came the big shocker earlier this month when Microsoft introduced Surface Neo, their foldable tablet, and Surface Duo, their foldable smartphone that will be coming to market in Q4 2020.
The leap to foldable devices ahead of Apple showed that they wanted to get out in front and lead the market, not just follow it. It showed that this wasn't some kind of experiment from Microsoft that would eventually die. Microsoft is doing the research and is now willing to help ecosystem partners like HP, Dell, Asus, Lenovo and others break into new categories first, ahead of Apple. It's a new day.
In a recent interview with Marques Brownlee, Microsoft's CEO Satya Nadella quoted the famed Alan Kay, who joined the Xerox PARC research staff in Palo Alto, California, and much later worked at HP. Throughout the 1970s, Kay developed prototypes of networked workstations using the programming language Smalltalk. Those inventions were later commercialized by Apple in the Lisa and Macintosh computers.
Satya stated: "I love that Alan Kay quote that says, 'If you're serious about your software you will take your hardware seriously.'" This is something that Microsoft never really committed to until Satya became CEO.
Satya added that "For us, really going that extra mile, all the way to ensuring that from the silicon to the cloud things are coming together around the experience, has been a very important thing for us." The video below is cued up to start just as Satya makes the quotes we've mentioned.
Beyond foldable devices, Microsoft showed us at their special event that they're following in Apple's footsteps by finally entering the silicon engineering game with partners like AMD and Qualcomm, in order to give their future mobile devices a slightly different experience than other Android devices.
In that vein of wanting to get ahead of the competition, earlier this month the US Patent & Trademark Office published a patent application from Microsoft that reveals that they've invented what could be a next-gen input device for PCs that goes far beyond the two dimensions of a mouse to an input device capable of delivering six degrees of freedom (6DOF).
Microsoft begins their patent by discussing what's available and what could be. They note that input devices may facilitate different types of user interaction with a computing device. As examples, two-dimensional translation of a computer mouse across a surface may cause two-dimensional translation of a cursor on a display, while a handheld controller equipped with an inertial measurement unit may provide three-dimensional input as the controller is manipulated throughout space.
These and other existing input devices may present a variety of issues. First, a typical input device has a form factor that lends itself to being held or otherwise manipulated in particular ways.
Other ways of manipulating the input device may be cumbersome or awkward, and when considered with the constrained nature of human wrist and arm movement, this can limit use of the input device.
Second, typical input devices do not support easy/effective transitions among different paradigms of user interaction. This can hinder or prevent multi-modal interaction, further limiting the usefulness of the input device.
As one example of multi-modal interaction, a user may want to animate a graphical object in three dimensions--itself often a challenging task due to the limitations of existing input devices and the two-dimensional nature of graphical output representing the object and its animation--as well as the ability to supply two-dimensional input to a graphical user interface.
Users are increasingly interested in dynamically engaging in different paradigms of user interaction, particularly as mixed reality and other emerging computing experiences that involve three-dimensional content gain prominence.
Microsoft's invention relates to a six degrees-of-freedom (6DOF) input device. Output from the input device may be used to control an application in a first mode and a second mode. In the first mode, all six degrees of freedom sensed by the input device may be used to control the application, whereas in the second mode, one or more of the six degrees of freedom may not be used to control the application. These and other modes may be switched among in response to detecting various conditions.
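The mode behavior described above can be pictured as a per-mode mask over the six sensed axes. The sketch below is purely illustrative; all names (`SixDofSample`, the mode labels) are our own and not from the patent.

```python
from dataclasses import dataclass

# Hypothetical 6DOF sample: three translational and three rotational axes.
@dataclass
class SixDofSample:
    tx: float; ty: float; tz: float   # translation along x, y, z
    rx: float; ry: float; rz: float   # rotation about x, y, z

# Per-mode mask: True means that axis is passed through to the application.
MODE_MASKS = {
    "full_6dof": dict(tx=True, ty=True, tz=True, rx=True, ry=True, rz=True),
    # In a surface-constrained mode, only planar translation reaches the app.
    "surface_2d": dict(tx=True, ty=True, tz=False, rx=False, ry=False, rz=False),
}

def filter_sample(sample: SixDofSample, mode: str) -> SixDofSample:
    """Zero out any degree of freedom the current mode does not use."""
    mask = MODE_MASKS[mode]
    return SixDofSample(**{k: (getattr(sample, k) if mask[k] else 0.0)
                           for k in mask})
```

Switching modes then amounts to swapping which mask is applied, which matches the patent's description of transitioning among interaction paradigms in response to detected conditions.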
Input devices revealed may be conducive to manipulation with a greater variety of orientations and motions, supporting natural and expressive movement throughout space, to better enable various paradigms of user interaction.
Further, examples revealed in Microsoft's patent application facilitate various types of multi-modal user interaction which may include (1) translational and/or rotational three-dimensional manipulation of an input device throughout space, (2) two-dimensional translation of an input device across a surface, (3) two-dimensional input applied to a touch-sensitive surface, (4) two-dimensional input applied to a graphical user interface, (5) single-axis rotation, and/or (6) gestural input applied to an input device, among others.
Further, the input device and supporting components may be configured to enable seamless switching among these modes to enable dynamic changes in user interaction. Additional examples are described herein that combine multiple input devices to enhance and/or refine user interaction.
Microsoft's patent Figures 1A-1D below illustrate various example modes of user interaction carried out between an input device and a computing device.
Sensor System – Hand Gestures
Microsoft's patent FIG. 2 below is a block diagram of an example input device (#200). The input device (#100) of FIGS. 1A-1D above may implement at least some of the components of input device #200. The input device may include a sensor system (#202) for sensing manipulation of the input device in physical space.
The sensor system may sense motion of input device #200 with six degrees-of-freedom: three degrees of translational freedom (e.g., along orthogonal x, y, z axes) and three degrees of rotational freedom (e.g., about orthogonal x, y, z axes).
The sensor system may assume any suitable form, such as that of an inertial measurement unit (IMU), and may include one or more of an accelerometer, gyroscope, and magnetometer. As another example, sensor system may perform six DOF sensing based on alternating current electromagnetic sensing technology.
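To give a feel for what IMU-based 6DOF sensing involves, here is a deliberately naive dead-reckoning sketch: gyroscope rates integrate into an orientation angle and accelerometer readings integrate into a position. This is our own illustration, not the patent's method; real IMU fusion (complementary or Kalman filtering, magnetometer correction) is far more involved.

```python
import math

def integrate_yaw(yaw: float, gyro_z_rad_s: float, dt: float) -> float:
    """Integrate a gyroscope's z-axis rate into a yaw angle (radians)."""
    return (yaw + gyro_z_rad_s * dt) % (2 * math.pi)

def integrate_position(pos, vel, accel, dt):
    """Integrate (x, y, z) acceleration into velocity, then position.

    pos, vel, accel are (x, y, z) tuples in a world frame; gravity is
    assumed to have already been subtracted from accel.
    """
    new_vel = tuple(v + a * dt for v, a in zip(vel, accel))
    new_pos = tuple(p + v * dt for p, v in zip(pos, new_vel))
    return new_pos, new_vel
```

Drift is the reason naive integration like this degrades quickly, and one reason the patent also mentions alternating-current electromagnetic sensing as an alternative 6DOF technology.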
The input device #200 is operable to sense gestural input. The sensor system may include a gesture sensor for sensing such gestural input. The gesture sensor may include capacitive, resistive, optical, acoustic, and/or any other suitable sensing technologies.
In some examples, the input device #100 may be used in conjunction with hand gestures when interacting with computing device #102. In the examples of FIGS. 1A-1D above, the computing device is a desktop.
To this end, FIG. 1D depicts an example in which hand #108 performs a pinching hand gesture in relation to input device 100 (which may be held with another hand not shown in FIG. 1D).
The hand gesture is started at an initial location #152 proximate to the surface of input device #100. While retaining the pinched posture, the user's hand moves in the positive y direction to a final location #154 away from the input device. A virtual object #156 rendered on the display (#104) is controlled in response to this pinching hand gesture.
As indicated at #158, FIG. 1D shows the virtual object in an initial state prior to the performance of the hand gesture, in which the virtual object exhibits a circular geometry. In response to the hand gesture, the virtual object is extruded by a magnitude proportional to that of the hand gesture, as indicated at #160.
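The proportional mapping in the FIG. 1D example is simple to sketch: the distance the pinched hand travels along y becomes the extrusion depth. The function name and scale parameter below are hypothetical, for illustration only.

```python
def extrusion_depth(start_y: float, end_y: float, scale: float = 1.0) -> float:
    """Map a pinch-drag gesture's y travel to an extrusion depth.

    The depth is proportional to the gesture's magnitude, clamped so a
    drag back toward the device does not produce a negative extrusion.
    """
    return max(0.0, (end_y - start_y) * scale)
```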
Any suitable gesture may be performed in relation to input device #100, in response to which computing device #102 may take any suitable action. A gesture may be a one, two, or three-dimensional gesture.
As another example, the input device (#100) may be used as a proxy for controlling the three-dimensional location and orientation of a virtual object, and hand gestures performed within a threshold distance of the input device may effect various actions applied to the virtual object.
Further, touch input applied to the surface of the input device (#100) may be used as input to the computing device (desktop #102). As one example, a user may apply two-dimensional imagery (e.g., writing, drawings) to a virtual object by tracing the imagery with touch input applied to the input device which may serve as a surrogate for controlling the virtual object. A "gestural input" may refer to both hand gestures performed proximate to an input device as well as touch input applied by contacting the input device.
To enable the detection of gestural input applied to the input device the input device may include a suitable touch/hover sensing system. The sensing system may utilize any suitable sensing technologies, including but not limited to capacitive, resistive, optical, and acoustic sensors.
Alternatively or additionally, an image sensor external to the input device may be used to detect gestural input supplied in relation to the input device.
To this end, Microsoft's patent FIG. 1D shows an image sensor (#162) coupled to computing device (desktop #102) and configured to detect gestural input applied to the input device. The image sensor may assume any suitable form, such as that of a depth camera (e.g., time-of-flight, structured light, stereo camera system).
Future Microsoft Pen with 6DOF & Camera
Microsoft's patent Figure 3 below shows an example input device in the form of a stylus that could use six degrees of freedom (6DOF).
Microsoft's patent Figures 4A and 4B illustrate the control of a virtual camera in a three-dimensional scene by the stylus of FIG. 3.
In Microsoft's patent FIG. 4A above, the globe #402 is viewed from a first perspective that is selected according to the three-dimensional orientation of the stylus #300.
A view frustum #404 is associated with the tip of the stylus such that the perspective from which globe is viewed may be adjusted by varying the orientation of the stylus and thereby the relative orientation between the view frustum and the globe.
To illustrate this variance, Microsoft's patent FIG. 4B shows the globe viewed from a second perspective resulting from the rotation of the stylus and view frustum by 90 degrees about a vertical axis (e.g., extending into the page of FIG. 4B).
A different portion of globe 402 can then be perceived from the second perspective relative to the first perspective. While not illustrated in FIGS. 4A-4B, three-dimensional scene #400 may also support the variation of the perspective of the globe in response to translation of the stylus to thereby enable zoom into/out of the globe.
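The 90-degree change of perspective between FIGS. 4A and 4B is just a rotation of the view frustum's forward direction about the vertical axis. A minimal sketch of that rotation, using our own function name:

```python
import math

def rotate_about_y(vec, angle_rad):
    """Rotate an (x, y, z) direction vector about the vertical y axis."""
    x, y, z = vec
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    # Standard y-axis rotation matrix applied to the vector.
    return (c * x + s * z, y, -s * x + c * z)
```

Rotating a forward vector of (1, 0, 0) by 90 degrees yields a view along the negative z axis, so the camera now faces a different portion of the globe, as in FIG. 4B.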
The camera control enabled by the stylus may utilize any suitable combination of user inputs. In one mode of control, the perspective of the globe may change in real-time as the orientation of the stylus changes.
In another mode of control, the orientation of the stylus may not effect changes to the perspective until a suitable user input is received. In this mode, a user may manipulate the orientation of the stylus until a desired orientation corresponding to a desired perspective of the globe is achieved, and supply the user input to effect viewing from this perspective.
The user input may include a single or double tap of a button #406 provided on the stylus for example.
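The two control modes described above, real-time orientation tracking versus commit-on-button-tap, can be sketched as a small state holder. Class and method names below are hypothetical, not from the patent.

```python
class CameraController:
    """Sketch of live vs. deferred perspective control for the stylus."""

    def __init__(self, mode: str = "live"):
        self.mode = mode
        self.applied = (0.0, 0.0, 0.0)   # orientation used for rendering
        self.pending = (0.0, 0.0, 0.0)   # latest stylus orientation

    def on_orientation(self, euler_xyz):
        self.pending = euler_xyz
        if self.mode == "live":
            self.applied = euler_xyz     # real-time perspective update

    def on_button_tap(self):
        if self.mode == "deferred":
            self.applied = self.pending  # commit the staged orientation
```

In "deferred" mode the user can wave the stylus freely without disturbing the view, then tap the button (#406 in the figure) to adopt the new perspective in one step.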
Microsoft patent FIG. 5 below shows another example input device configured for use with the input device of FIGS. 1A-1D. Input device #500 may provide input detectable by a touch/hover sensor.
Microsoft patent FIG. 6 above shows a flowchart illustrating an example method of controlling an application in different modes using a six DOF input device.
Microsoft's patent application that was published earlier this month was originally filed in Q1 2018.
It appears that Microsoft is willing to take on more risk in order to bring new ideas to market, even ideas that Apple had in patent form for years and just sat on. Did this Apple invention inspire the creation of the Surface Book? Did this Apple invention inspire the soft cover keyboard for Surface tablets?
Lastly, Apple had folding devices on record for years and yet it appears that Microsoft and partners will beat Apple to market. Could the tide be turning?