On Wednesday, Apple revealed the first iOS 15 features: new accessibility options designed to make its products easier to interact with. They may well be the standout segment of the iOS 15 and iPadOS 15 presentation at the upcoming WWDC 2021 keynote. Apple timed the announcement to celebrate Global Accessibility Awareness Day (May 20th).

AssistiveTouch is the futuristic new tech arriving on Apple Watch
One of the many new accessibility features Apple announced is AssistiveTouch for Apple Watch. The wearable combines data from sensors such as the accelerometer, gyroscope, and heart rate sensor with on-device machine learning to translate hand gestures into actions on the screen. The feature lets users with limited mobility, who are unable to touch the Watch display, interact with the device. For Apple, however, this “mind-reading” feature is much more than an accessibility tool. It resembles the sci-fi tech that Facebook recently demoed for interacting with devices such as Virtual Reality (VR) headsets.
AssistiveTouch for Apple Watch targets a particular group of users: people with limited mobility and those with upper-body limb differences. According to Apple, the technology works as follows:
“Using built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, Apple Watch can detect subtle differences in muscle movement and tendon activity, which lets users navigate a cursor on the display through a series of hand gestures, like a pinch or a clench. AssistiveTouch on Apple Watch enables customers who have limb differences to more easily answer incoming calls, control an onscreen motion pointer, and access Notification Center, Control Center, and more.”
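Apple has not published the model or APIs behind AssistiveTouch, but the description suggests a familiar pipeline: stream motion-sensor samples, collect them into short windows, and run each window through an on-device classifier. Below is a minimal Swift sketch of that shape using Core Motion; the HandGesture labels, the windowing parameters, and the classify(_:) heuristic are illustrative assumptions, not Apple's implementation.

```swift
import CoreMotion

// Hypothetical gesture labels; Apple's actual model and API are not public.
enum HandGesture {
    case pinch, clench, none
}

final class GestureDetector {
    private let motionManager = CMMotionManager()
    private var window: [CMDeviceMotion] = []  // rolling buffer of samples
    private let windowSize = 50                // ~1 second of data at 50 Hz

    /// Streams accelerometer and gyroscope samples, classifying one window at a time.
    func start(onGesture: @escaping (HandGesture) -> Void) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 50.0
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self = self, let motion = motion else { return }
            self.window.append(motion)
            if self.window.count >= self.windowSize {
                // A production system would feed the window to an on-device
                // ML model; classify(_:) below is only a stand-in heuristic.
                onGesture(self.classify(self.window))
                self.window.removeAll()
            }
        }
    }

    /// Placeholder heuristic: a sharp spike in acceleration plus rotation is
    /// treated as a clench. Purely illustrative, not Apple's algorithm.
    private func classify(_ samples: [CMDeviceMotion]) -> HandGesture {
        let peak = samples
            .map { abs($0.userAcceleration.x) + abs($0.rotationRate.x) }
            .max() ?? 0
        return peak > 2.0 ? .clench : .none
    }
}
```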
The video Apple included in the press release shows all the hand gestures the wearable can detect.
Two months ago, Facebook demoed a similar technology: wrist-worn devices that intercept the neural commands the brain sends to the hand and interpret them as hand gestures. Those gestures would then be turned into actions on the virtual screen of a VR headset. Actions like navigating menus, interacting with UI elements, or typing on a keyboard would all become possible as the wearable interprets the user's gestures and translates them into actions on the display.
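Neither company has described how classified gestures map to interface commands, but the final step in both designs is conceptually a simple dispatch. A purely illustrative sketch, reusing the hypothetical HandGesture type from the earlier example:

```swift
// Hypothetical gesture-to-action dispatch for the "gestures become actions
// on the display" step; the UIAction cases are assumptions for illustration.
enum UIAction {
    case advanceCursor  // move the on-screen pointer to the next element
    case select         // activate the highlighted element
}

func action(for gesture: HandGesture) -> UIAction? {
    switch gesture {
    case .pinch:  return .advanceCursor
    case .clench: return .select
    case .none:   return nil
    }
}
```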
“You have more of your brain dedicated to controlling your wrist than any other part of your body, probably twice as many neurons controlling your wrist and the movement of your hands than is dedicated to your mouth for feeding and speech,” Facebook Reality Labs' Thomas Reardon said during the mid-March briefing.
At the time, Facebook CTO Mike Schroepfer admitted that “it’s hard to predict” the timeline for deploying the concept wearable commercially. “How these things sequence out in the market when they show up — are things I don’t have crisp answers to. What we’re focused on is hardening these technologies,” he said.
Unlike Facebook, Apple already has a powerful wearable to its credit. The Apple Watch is the most popular wearable in the world, and Apple could use its “mind-reading” features to bring the same sort of Minority Report user experience that Facebook is envisioning to future Apple products.
Facebook’s concept involves the user wearing two wrist devices, however, so the VR gadget can register gestures from both hands. Assuming Apple is already working on turning the Apple Watch into a gadget that enhances AR and VR experiences, it is conceivable that Apple could develop a secondary watch-like accessory packed with sensors for the second wrist.
AssistiveTouch on Apple Watch will surely be released later this year, most likely well before Facebook rolls out any “mind-reading” wearable accessory for Oculus VR experiences. Facebook’s mid-March presentation, which shows how its mind-reading tech should work, follows below.