Apple unveils new accessibility features that rely on ever more artificial intelligence

We’ve all used or at least tried a few of the accessibility features built into iOS, whether it’s flashing the flashlight when a call comes in or double- or triple-tapping the back of the iPhone to bring up an app or setting. But these are only the tip of the gigantic iceberg that is the accessibility settings.

A few days ahead of Global Accessibility Awareness Day, which takes place on Thursday, May 19, Apple has announced a series of new features for people with various disabilities. They will not launch until a little later in the year, but the Californian giant has nevertheless detailed some of them in a press release.

Finding the right door

The first of these is called Door Detection and is intended for people who are blind or have low vision. Its goal is to help users find the right door when they arrive in an unfamiliar environment. An iPhone with the feature enabled will say how far the user is from the door and can even describe it, reading out a door number or symbol if there is one and indicating whether the door is open or closed. The iPhone will also specify what kind of handle the door is fitted with, and whether it needs to be pushed or pulled to open.

To provide this information, the iPhone (or iPad) combines its lidar scanner and camera modules; the data is then processed locally by a machine-learning model.
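
Apple has not published how Door Detection works internally, but the raw signals it relies on are available to developers through public APIs. As a minimal sketch, assuming only ARKit’s lidar-backed scene depth (and not Apple’s actual detection model), this Swift snippet reads the distance to whatever sits at the center of the camera frame, the kind of measurement such a feature would pair with an on-device vision model:

```swift
import ARKit

/// Minimal sketch: read the lidar depth at the center of the camera frame.
/// This is NOT Apple's Door Detection implementation, only the public API
/// (ARKit scene depth) that exposes the same kind of distance data.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth requires a lidar-equipped iPhone or iPad.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        // The depth map is a CVPixelBuffer of 32-bit floats, in meters.
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }

        // Distance, in meters, to whatever is at the center of the frame.
        let row = base.advanced(by: (height / 2) * rowBytes)
        let center = row.assumingMemoryBound(to: Float32.self)[width / 2]
        print(String(format: "Center of frame is %.2f m away", center))
    }
}
```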

The feature will be activated in Detection Mode, within the Magnifier app, where it joins the already available People Detection and Image Descriptions functions. To make it easier for blind and low-vision users to navigate the outside world, Apple Maps will also add sound and haptic feedback for VoiceOver, for example to tell users which way to head when exiting a subway station.

On-the-fly captions on iPhone, iPad, and Mac

The other big new feature announced by Apple is Live Captions, which will caption any content that includes an audio track. It could be a way to follow what others are saying during a FaceTime call, in a videoconference, in a social media app, or while watching a streaming video, even if most platforms now offer their own solution. On the Mac, during a conversation, it will also be possible to type a reply on the keyboard and have the machine read it aloud.
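
That type-to-speak side of the feature resembles what developers can already build with AVFoundation’s public speech synthesizer. Here is a minimal sketch of the idea, assuming nothing about Apple’s own Live Captions code:

```swift
import AVFoundation

// Minimal sketch of "type a reply, have the machine speak it".
// AVSpeechSynthesizer is a public API; Live Captions itself is a
// system feature whose internals Apple has not published.
let synthesizer = AVSpeechSynthesizer()

func speak(_ reply: String) {
    let utterance = AVSpeechUtterance(string: reply)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}

speak("Sorry, I'll join the call in two minutes.")
```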

All of this is handled locally and never passes through the servers of Apple or a partner.
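
The on-device transcription half also has a public counterpart: the Speech framework can be forced to keep recognition on the device. The sketch below illustrates that pattern, again as an approximation rather than Apple’s implementation; it transcribes microphone audio while refusing to use a server:

```swift
import Speech
import AVFoundation

// Sketch of on-device live transcription via the public Speech framework.
// Live Captions is a separate system feature; this only illustrates the
// "processed locally, never sent to a server" pattern described above.
// A real app must first call SFSpeechRecognizer.requestAuthorization(_:)
// and declare microphone/speech usage strings in its Info.plist.
final class LocalTranscriber {
    private let audioEngine = AVAudioEngine()
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private var task: SFSpeechRecognitionTask?

    func start() throws {
        guard let recognizer = recognizer, recognizer.supportsOnDeviceRecognition else {
            print("On-device recognition is not available on this hardware")
            return
        }

        let request = SFSpeechAudioBufferRecognitionRequest()
        request.requiresOnDeviceRecognition = true  // audio never leaves the device
        request.shouldReportPartialResults = true   // captions update as you speak

        // Feed microphone buffers into the recognition request.
        let inputNode = audioEngine.inputNode
        let format = inputNode.outputFormat(forBus: 0)
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        task = recognizer.recognitionTask(with: request) { [weak self] result, error in
            if let result = result {
                print("Caption:", result.bestTranscription.formattedString)
            }
            if error != nil { self?.audioEngine.stop() }
        }
    }
}
```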

One small regret: Live Captions will only arrive in beta later this year, and only in English at first. It will require an iPhone 11 or newer, an iPad with an A12 Bionic SoC or later, or an Apple Silicon Mac.

A Mirrored Watch…

In addition, Apple has unveiled Apple Watch Mirroring, a feature that lets users with motor disabilities display their Watch’s screen on their iPhone. They can then control the watch from the smartphone, by voice in particular, or through a head-movement tracking tool.

According to Apple, this is a way to let these users access the watch’s advanced health functions, such as heart-rate monitoring or blood-oxygen measurement.

The Cupertino giant’s engineers will also introduce new Quick Actions on the watch. A double pinch of the fingers will make it possible to answer or hang up a call, dismiss a notification, or play and pause a song. That is a function that could find fans among all users, not just those with disabilities.

Apple also listed other, smaller additions that could nonetheless change the lives of people with disabilities. Buddy Controller will let a player ask another player for help, by merging the inputs of two different game controllers to drive a single-player game together.

It will also be possible to adjust how long Siri waits for the user to finish speaking before it responds to a request. Lastly, Sound Recognition, which had already been presented in the past, alerts a user when the home’s doorbell rings or a fire alarm goes off.
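
Sound Recognition itself is a system setting, but the same kind of on-device sound classification is exposed to developers through the SoundAnalysis framework and its built-in classifier. As a rough sketch of the technique (not of the Settings feature itself), assuming a hypothetical audio file to scan:

```swift
import SoundAnalysis

// Sketch: classify sounds in an audio file with Apple's built-in,
// on-device sound classifier (SoundAnalysis, iOS 15+ / macOS 12+).
// Sound Recognition in Settings is a separate system feature; this
// only illustrates the same category of on-device detection.
final class SoundReporter: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8 else { return }
        // Labels include everyday sounds such as sirens and door knocks.
        print("Heard \(top.identifier) (confidence \(top.confidence))")
    }
}

func scan(fileAt url: URL) throws {
    let analyzer = try SNAudioFileAnalyzer(url: url)
    let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
    let reporter = SoundReporter()
    try analyzer.add(request, withObserver: reporter)
    analyzer.analyze()  // runs the classifier over the whole file
}
```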

It is quite possible that Tim Cook’s teams will return to these announcements on June 6, during the WWDC 2022 opening keynote, which will be held mostly online.

Source: Apple
