August 9, 2022
Apple announces AssistiveTouch for Apple Watch, eye-tracking features on iPad among other accessibility updates

Apple has announced several accessibility features designed for people with mobility, vision, hearing, and cognitive limitations. These features will be available later this year via a software update. One of the most interesting additions is AssistiveTouch for Apple Watch, which allows people to navigate the watch's interface using hand gestures. iPhone and iPad users will also receive new accessibility-focused features. Additionally, Apple has announced a new sign language interpreter service called SignTime for communicating with AppleCare and retail customer care.

AssistiveTouch on watchOS will allow Apple Watch users to navigate a cursor on the display through a series of hand gestures, such as a pinch or a clench. Apple says the Apple Watch will use its built-in motion sensors, including the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, to detect subtle differences in muscle movement and tendon activity.

New gesture control support through AssistiveTouch will allow people to more easily answer incoming calls, control an onscreen motion pointer, and access Notification Center and Control Center — all on an Apple Watch — without needing to touch the display or the Digital Crown. However, the company hasn't provided any details about which Apple Watch models will be compatible with the new features.

In addition to gesture controls on the Apple Watch, iPadOS will bring support for third-party eye-tracking devices, allowing users to control the iPad with their eyes. Apple says that compatible MFi (Made for iPad) devices will track where someone is looking on the screen so that the pointer can follow the person's gaze. This will let users perform various actions on the iPad, including a tap, without touching the screen.

Apple is also updating its preloaded screen reader — VoiceOver — to let people explore more detail in images. These descriptions will include text, table data, and other objects. People will also be able to add their own descriptions to images using Markup for a personalised experience.


Apple is updating VoiceOver to provide more detail about images
Photo Credit: Apple

For neurodiverse people, or anyone distracted by everyday sounds, Apple is bringing background sounds — balanced, bright, and dark noise, as well as ocean, rain, and stream sounds — to mask unwanted environmental or extraneous noise. These will "help users focus, stay calm or relax", Apple said.

Apple is also bringing support for mouth sounds such as a click, pop, or "ee" sound to replace physical buttons and switches for non-speaking users with limited mobility. Users will also be able to customise display and text size settings for each app individually. Additionally, there will be new Memoji customisations to represent users with oxygen tubes, cochlear implants, and a soft helmet as headwear.


Apple's Memoji customisations will include cochlear implants, oxygen tubes, and a soft helmet as headwear
Photo Credit: Apple

Alongside these software changes, Apple is adding support for new bi-directional hearing aids to its MFi (Made for iPhone) hearing device programme. The next-generation models from MFi partners will be available later this year, the company said.

Apple is also introducing support for recognising audiograms — charts that show hearing test results — in Headphone Accommodations. This will allow users to upload their hearing test results so that soft sounds can be amplified and certain frequencies adjusted to match their hearing.

No concrete timeline has been provided for when users can expect the new features to reach their Apple devices. However, it's safe to expect more details to be announced at the Apple Worldwide Developers Conference (WWDC) next month.

Apple will also launch the SignTime service to let customers communicate with AppleCare and retail customer care directly from a web browser, using American Sign Language (ASL) in the US, British Sign Language (BSL) in the UK, and French Sign Language (LSF) in France. It will also be available at physical Apple Store locations for remote access to a sign language interpreter, without the need for any prior booking.


Apple is launching the SignTime sign language interpreter service for easier communication with service employees
Photo Credit: Apple

The SignTime service will initially be available from May 20 in the US, UK, and France. Apple plans to expand the service to other countries in the future, though details on that front may be revealed at a later stage.

We dive into all things Apple — iPad Pro, iMac, Apple TV 4K, and AirTag — this week on Orbital, the Gadgets 360 podcast. Orbital is available on Apple Podcasts, Google Podcasts, Spotify, and wherever you get your podcasts.
