Apple Previews a Powerful New Apple Watch Feature Called AssistiveTouch That Uses Patented Wrist-Gesture Technology & More
Apple today announced powerful software features designed for people with mobility, vision, hearing, and cognitive disabilities. These next-generation technologies showcase Apple’s belief that accessibility is a human right and advance the company’s long history of delivering industry-leading features that make Apple products customizable for all users.
Introducing AssistiveTouch
Later this year, with software updates across all of Apple’s operating systems, people with limb differences will be able to navigate Apple Watch using AssistiveTouch.
To support users with limited mobility, Apple is introducing a revolutionary new accessibility feature for Apple Watch. AssistiveTouch for watchOS allows users with upper body limb differences to enjoy the benefits of Apple Watch without ever having to touch the display or controls.
Using built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, Apple Watch can detect subtle differences in muscle movement and tendon activity, which lets users navigate a cursor on the display through a series of hand gestures, like a pinch or a clench.
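Apple hasn't published implementation details, but the basic idea of turning motion-sensor readings into discrete gestures can be illustrated with a toy sketch: classify a short window of wrist-sensor samples by its signal energy. Everything below, including the `classify_window` function, the thresholds, and the sample data, is hypothetical and greatly simplified; Apple's actual system uses trained on-device machine learning models over multiple sensor streams.

```python
# Hypothetical sketch: classify a short window of wrist accelerometer
# magnitudes as a "clench", a "pinch", or no gesture, using simple
# signal-energy thresholds. The thresholds and units are invented for
# illustration only.

CLENCH_THRESHOLD = 4.0   # hypothetical energy level for a strong clench
PINCH_THRESHOLD = 1.0    # hypothetical energy level for a brief pinch

def signal_energy(samples):
    """Mean squared deviation from the window's average reading."""
    mean = sum(samples) / len(samples)
    return sum((s - mean) ** 2 for s in samples) / len(samples)

def classify_window(samples):
    """Map a window of sensor samples to a gesture label (or None)."""
    energy = signal_energy(samples)
    if energy >= CLENCH_THRESHOLD:
        return "clench"
    if energy >= PINCH_THRESHOLD:
        return "pinch"
    return None

# Synthetic sample windows (arbitrary units):
still  = [9.8, 9.81, 9.79, 9.8]        # wrist at rest -> no gesture
pinch  = [9.8, 11.5, 8.0, 9.8]         # small, brief spike
clench = [9.8, 14.0, 5.0, 13.5, 9.8]   # large, sustained movement

print(classify_window(still))   # None
print(classify_window(pinch))   # pinch
print(classify_window(clench))  # clench
```

In practice, distinguishing a pinch from a clench from incidental arm movement is far harder than an energy threshold suggests, which is why the feature leans on machine learning rather than fixed rules.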
AssistiveTouch on Apple Watch enables customers who have limb differences to more easily answer incoming calls, control an onscreen motion pointer, and access Notification Center, Control Center, and more.
Apple has been working on this technology for years and last year was granted two patents covering it (01 and 02). Below you'll find a group of patent figures from one of those patents, followed by an Apple video that explains a few of the gestures, including the clenched-fist gesture outlined in Apple's patents.
In addition, later this year the iPad will support third-party eye-tracking hardware for easier control; and for blind and low vision communities, Apple’s industry-leading VoiceOver screen reader will get even smarter using on-device intelligence to explore objects within images.
In support of neurodiversity, Apple is introducing new background sounds to help minimize distractions, and for those who are deaf or hard of hearing, Made for iPhone (MFi) will soon support new bi-directional hearing aids.
Introducing the new SignTime Service
Tomorrow, Apple is launching a new service called SignTime. This enables customers to communicate with AppleCare and Retail Customer Care by using American Sign Language (ASL) in the US, British Sign Language (BSL) in the UK, or French Sign Language (LSF) in France, right in their web browsers.
Customers visiting Apple Store locations can also use SignTime to remotely access a sign language interpreter without booking ahead of time. SignTime will initially launch in the US, UK, and France, with plans to expand to additional countries in the future. For more information, visit apple.com/contact.
Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, said: "At Apple, we’ve long felt that the world’s best technology should respond to everyone’s needs, and our teams work relentlessly to build accessibility into everything we make. With these new features, we’re pushing the boundaries of innovation with next-generation technologies that bring the fun and function of Apple technology to even more people — and we can’t wait to share them with our users."
Read Apple's full press release to learn more about eye-tracking support coming to iPad, exploring images with VoiceOver, Made for iPhone hearing aids and audiogram support, Background Sounds that help users minimize distractions, focus, stay calm, or rest, and much more.
Once again, Apple's patents predicted this technology long ago, and now it's here. Imagine that: the leakers of this world missed this feature coming.