iPhones will finally get this ultra-practical Android feature

Google invests heavily in artificial intelligence, which allows it to offer AI-powered features across all its products. On Android, for example, the Mountain View firm has offered a feature called Live Caption since 2019, which automatically generates subtitles for any audio or video content played on the smartphone.

This is an accessibility feature for people who are deaf or hard of hearing, but it can also be useful in meetings, or when you want to watch a video but can’t use headphones or the speakerphone.

The good news is that iOS users will soon benefit from an equivalent, since Apple has developed its own subtitle generator, which it has likewise called Live Caption.

To mark Global Accessibility Awareness Day, Apple unveiled a series of new features, including its version of Live Caption. It works much like the Android version, except that Apple offers it not only on iPhone but also on iPad and Mac. For FaceTime calls on Mac, in addition to reading captions, users will be able to participate in conversations by typing text.

For now, like the Android version, Live Caption on Apple products only supports English. Nor are all models compatible: the feature requires an iPhone 11 or later, an iPad with an A12 Bionic chip or later, or a Mac with an Apple Silicon chip.

Apple specifies that subtitles are generated on the user’s device, which means the data is never sent to its servers.

A beta will be launched this year, and Live Caption will very likely be among the new features of iOS 16, which Apple will present in June at its WWDC conference.

In any case, Apple is gradually catching up with Google when it comes to features based on artificial intelligence. In 2021, for example, Apple announced the ability to use the Siri assistant without an internet connection. Like Google, Apple succeeded in developing a system capable of processing voice commands on the user’s device, without sending the recordings to servers.

This makes the assistant more convenient and responsive, and also helps ensure better privacy protection.

Apple: a wealth of accessibility features

As mentioned above, Apple has introduced many accessibility features. For blind or visually impaired people, for example, the firm has developed a feature called Door Detection.

It uses artificial intelligence, the camera, and the iPhone’s LiDAR sensor to help users get around. The feature helps them locate doors, tells them whether a door is open or closed, and reads the signs and inscriptions on it.


The Apple Watch, meanwhile, is becoming more accessible to people with motor impairments. Thanks to the Apple Watch Mirroring feature, users can control the Apple Watch through the accessibility functions on their iPhone, instead of using the watch’s touch screen.

And thanks to its sensors, the Apple Watch can also be controlled with hand gestures, without touching the screen. To answer or hang up a call, for example, the user can double-pinch their fingers.

“Apple embeds accessibility into all aspects of our work, and we are committed to designing the best products and services for everyone,” said Sarah Herrlinger, Apple’s senior director of accessibility policy and initiatives. “We are excited to introduce these new features, which combine the innovation and creativity of Apple teams to give users more options to use our products in the way that best suits their needs and lives.”
