Google brings Gemini and other accessibility tools to Android devices

What you need to know

  • Google is enhancing Android with AI-powered accessibility features, including an improved TalkBack that lets users ask questions about images and what's on their screen.
  • New “Expressive Captions” will use AI to provide real-time captions across apps, conveying not just words but also the speaker’s tone and emotions.
  • Developers now have access to Google’s open-source Project Euphonia repository to build and personalize speech recognition tools for diverse speech patterns.
  • This new version of Expressive Captions is rolling out in English in the U.S., U.K., Canada, and Australia for devices running Android 15 and above.

Each year, Google announces a slew of features for Global Accessibility Awareness Day (GAAD), and this year is no different. Today (May 15), Google announced that it is rolling out new updates to its products across Android and Chrome, and adding new resources for developers who are building speech recognition tools.

Google is highlighting how it's integrating Gemini and AI into its accessibility features to improve usability for users with low vision or hearing loss.

Google brings Gemini-powered updates for GAAD


Google is expanding its existing TalkBack option with the ability for people to ask questions and get responses about an image they were sent or are viewing. That means the next time a friend texts you a photo of their new guitar, you can get a description of the image and ask follow-up questions about the make and color, or even what else is in the picture.
