Apple introduced a new set of accessibility capabilities for its different computing platforms on Tuesday morning, which will be available as software upgrades for the iPhone, iPad, Mac, and Apple Watch later this year.

Apple says it will beta test Live Captions, which can transcribe any English audio across the iPhone, iPad, and Mac, including FaceTime calls, video conferencing apps (with automatic attribution to identify the speaker), streaming video, and in-person conversations. Google’s Live Caption effort began with the release of Android 10 and is now available in English on the Pixel 2 and later devices, as well as “select” other Android phones, with additional languages on the Pixel 6 and Pixel 6 Pro. It’s encouraging to see the Apple ecosystem catch up and reach even more people.

Apple claims that, like Android’s, its captions will be generated on the user’s device, keeping personal information private. The beta will launch later this year in the United States and Canada on the iPhone 11 and later, iPads with the A12 Bionic chip or newer, and Macs with Apple silicon.
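Live Captions itself has no public API, but the on-device processing model Apple describes is already exposed to developers through its Speech framework. A minimal sketch of forcing recognition to stay on the device (the locale and setup here are illustrative assumptions, not Live Captions internals):

```swift
import Speech

// Illustrative sketch: on-device speech recognition with Apple's Speech
// framework, the same privacy-preserving approach Live Captions takes.
let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))!

let request = SFSpeechAudioBufferRecognitionRequest()
// Keep audio and transcripts on the device; nothing is sent to Apple's servers.
request.requiresOnDeviceRecognition = true

if recognizer.supportsOnDeviceRecognition {
    recognizer.recognitionTask(with: request) { result, _ in
        if let result = result {
            print(result.bestTranscription.formattedString)
        }
    }
}
```

Setting `requiresOnDeviceRecognition` trades some accuracy and language coverage for privacy, which matches the tradeoff Apple is describing here.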

Quick Actions, which recognize a double pinch to end a call, dismiss notifications, snap a picture, pause or play media, or start a workout, will be added to the Apple Watch’s AssistiveTouch gesture controls, which launched last year. To learn more about what the gesture controls already do and how they work, see our guide on using your Apple Watch hands-free.


A new mirroring feature that adds remote control from a paired iPhone is making the Apple Watch easier to use for people with physical and motor disabilities. Apple Watch Mirroring builds on AirPlay technology, making the Watch’s unique capabilities accessible without relying solely on taps on its small screen or on voice controls.

With iOS 14, Apple introduced Sound Recognition, which detects specific sounds such as a smoke alarm or running water and alerts users who are deaf or hard of hearing. Sound Recognition will soon support tuning for personalized sound recognition: it can listen for repeated sounds and learn to key on alerts peculiar to the user’s environment, such as an unusual doorbell chime or appliance ding, as illustrated in this screenshot.
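The tuning Apple describes has no public API, but developers can already do general-purpose sound classification on-device with the SoundAnalysis framework. A hedged sketch using the built-in classifier (the observer class and tap setup are illustrative, not Sound Recognition’s implementation):

```swift
import AVFoundation
import SoundAnalysis

// Illustrative sketch: classifying microphone audio with Apple's built-in
// sound classifier. Sound Recognition's custom-alert tuning is not public;
// this only shows the underlying on-device classification idea.
final class AlertObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        print("Heard \(top.identifier) (confidence \(top.confidence))")
    }
}

let engine = AVAudioEngine()
let format = engine.inputNode.outputFormat(forBus: 0)
let analyzer = SNAudioStreamAnalyzer(format: format)
let observer = AlertObserver() // keep a strong reference to the observer

let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
try analyzer.add(request, withObserver: observer)

engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
    analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
}
try engine.start()
```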

Apple’s VoiceOver screen reader, along with the Speak Selection and Speak Screen features, will add support for 20 new “locales and languages,” including Arabic (World), Basque, Bengali (India), Bhojpuri (India), Bulgarian, Catalan, Croatian, Farsi, French (Belgium), Galician, Kannada, Malay, Mandarin (Liaoning, Shaanxi, Sichuan), Marathi, and Shanghainese (China). On the Mac, VoiceOver’s new Text Checker will look for errors such as extra spaces or stray capital letters, while VoiceOver users in Apple Maps can expect improved sound and haptic feedback indicating where to begin walking directions.

For Door Detection, Apple says on-device processing will use the lidar sensor and cameras on an iPhone or iPad. The new iOS feature will help users locate entryways in unfamiliar locations, tell them where the door is, describe whether it opens with a knob or a handle, and say whether it is open or closed.

This is all part of the Detection Mode Apple is adding to Magnifier in iOS, which also gathers existing functions like zooming in on nearby objects and describing them, and recognizing people and alerting the user with sounds, speech, or haptic feedback. Because People Detection and Door Detection require the lidar sensor, they only work on an iPhone Pro or iPad Pro model equipped with one.
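The lidar dependency is visible in Apple’s developer tools: ARKit exposes the same per-pixel depth data this kind of detection relies on, and only on lidar-equipped hardware. A rough sketch (the class and logging are hypothetical; door detection itself is not a public API):

```swift
import ARKit

// Illustrative sketch: reading lidar scene depth with ARKit, the kind of
// on-device sensing Detection Mode's Door and People Detection rely on.
class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is only available on lidar-equipped iPhone Pro / iPad Pro models.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // depthMap is a per-pixel distance buffer (in meters) from the lidar sensor.
        if let depthMap = frame.sceneDepth?.depthMap {
            print("Depth buffer: \(CVPixelBufferGetWidth(depthMap))x\(CVPixelBufferGetHeight(depthMap))")
        }
    }
}
```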

Buddy Controller, which merges two game controllers into one so a friend can help someone play a game, is another new feature on the way, similar to Xbox’s Copilot feature.

Finally, there’s a Voice Control Spelling Mode with letter-by-letter input, settings to adjust how long Siri waits before responding to requests, and further visual adjustments for Apple Books, including the ability to bold text, change themes, and adjust line, character, and word spacing for readability.

The announcements are part of Apple’s celebration of Global Accessibility Awareness Day, which takes place this week on May 19th. Apple also says its Store locations will host live sessions to help customers learn more about existing features, and a new Accessibility Assistant shortcut will come to the Mac and Apple Watch this week to recommend specific features depending on a user’s preferences.
