Google has announced a series of updates to improve the accessibility of its suite of applications. One of them is the release of Action Blocks, which lets users create customizable buttons on the home screen for relatively complex actions, such as playing music or calling someone, that usually require several steps. Such tasks can be difficult for people with reduced mobility or a cognitive disability.
“For people with cognitive disabilities or age-related cognitive impairment, it can be difficult to learn and remember each of these steps. For others, it can be long and tedious, especially if you have reduced mobility,” said Google.
Action Blocks lets users perform any action the Google Assistant can take, such as making calls, sending texts, or controlling devices in a user’s home, with a single tap. Each button can be customized by choosing an image for the Action Block from the phone’s photo gallery; the button is then placed on the user’s home screen for easy access. “We developed Action Blocks specifically for people with cognitive disabilities or age-related cognitive impairment. It is really important to engage with the community,” said Patrick Clary, product manager for AI and accessibility at Google.
Easier communication and travel
Google has also updated its Live Transcribe and Sound Amplifier apps, which were released last year. Live Transcribe uses a phone’s microphone to automatically transcribe speech into captions in real time. With the new updates, Live Transcribe will let users have their phone vibrate whenever someone nearby says their name, making it easier to get the attention of deaf and hard-of-hearing users.
All three applications are available on the Google Play Store. Action Blocks and Live Transcribe require devices running Android 5.0 or later, while Sound Amplifier requires Android 6.0 or later.
Google has also updated its Maps app on Android and iOS to make it easier for users to see information about wheelchair accessibility. By enabling the “Accessible Places” feature, users can now see wheelchair accessibility information at a glance instead of having to open a place’s details. When the feature is enabled, a wheelchair icon indicates an accessible entrance, and users can see whether a place has accessible seating, toilets, or parking.
If it is confirmed that a place has no accessible entrance, Maps will display that information as well, Google said.
Samsung targets visual impairments
Samsung also announced three new accessibility features for its devices – called Quick Reader, Scene Describer, and Color Detector – to make it easier for people with disabilities to use smart devices.
Quick Reader lets users get more information about their surroundings using a smartphone’s camera, reading written text aloud in real time to help users better understand textual information in everyday life. The feature was created because understanding labels and signs is a daily challenge for users with visual impairments, Samsung said. It can also recognize more than 1,000 common objects and items, such as food and vegetables in the kitchen as well as cleaning products.
Samsung has also released Scene Describer, which provides descriptions of images, both captured scenes and downloaded pictures, to help users identify potential obstacles while navigating their environment.
Finally, Color Detector uses the camera to tell users the color of the item in the frame, helping visually impaired people identify materials and clothing.