Apple will allow you to clone your own voice and use it with text messages

Oliver Thansan
22 May 2023 Monday 13:41

Today is World Accessibility Day. For the upcoming versions of iOS, iPadOS and macOS, Apple is preparing a new set of accessibility features that will make its devices easier to use for people with disabilities. Among the novelties, one standout is a feature called Personal Voice. People at risk of losing the ability to speak can use it to create a synthesized voice that sounds just like their own when they communicate through typed text.

Personal Voice is intended for people with conditions that progressively affect the ability to speak, such as amyotrophic lateral sclerosis (ALS). The trained voice model is kept securely on the phone, although the user can choose to make it available on their other devices. Training the model takes about 15 minutes, during which the user reads a series of text prompts aloud; the process can be paused at any time and resumed until it is finished. In the demonstrations shown by Apple, the synthetic voice closely resembles the original. For those who cannot speak and have no way of recording their own voice, Live Speech will be available, which reads typed text aloud in one of the Siri assistant's voices.

These features will come to Apple's operating systems starting at the end of this year. On June 5, the company will present its new batch of operating systems in Cupertino at its Worldwide Developers Conference (WWDC), although this year the opening keynote is expected to be dominated by the possible unveiling of a new mixed reality (augmented and virtual) device, together with the operating system that integrates it into Apple's ecosystem. This time, Apple's accessibility announcements have arrived ahead of that event.

Another highlight will be Assistive Access, which transforms the iPhone interface into one with high-contrast buttons and large text labels, together with tools that help people with cognitive disabilities and their trusted supporters communicate more easily. Phone and FaceTime are combined into a single Calls app. The other large buttons available are Messages, Camera, Photos, and Music. For those who prefer to communicate visually, the Messages app includes an emoji-only keyboard and the option to record a video message to share. Users can also choose between a grid-based visual layout for the home screen and their apps, or a row-based layout for those who prefer text.

To Magnifier mode, which enlarges any image, Apple has added a feature for people with visual disabilities called Point and Speak, which reads aloud the text the user points at with a finger in the camera image. It is useful, for example, for operating the buttons on household appliances, but Magnifier also detects people and doors and describes images, helping users recognize their surroundings. This feature requires iPhone and iPad models equipped with a LiDAR scanner, which are the higher-end models.

As of today, the SignTime service begins operating in Spain, connecting people with hearing disabilities to Apple Store and Apple Support services through Spanish Sign Language interpreters. Sarah Herrlinger, Apple's senior director of Global Accessibility Policy and Initiatives, said that these features, and others arriving in the coming months, "were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways."