Google has just expanded the language support for its accessibility app, which enables individuals with speech and mobility impairments to communicate using only their eyes. The software now supports 17 additional languages.
Look to Speak, a Google accessibility app that lets users communicate with their eyes instead of their voice, is now available in a number of additional languages. The app was first unveiled in December 2020 but, as with most of Google’s initial releases, it had so far been available only in English. It is now compatible with a total of 17 more languages.
It is part of Google’s strategy to make its smartphones and other devices accessible to people of all abilities. The American firm, which previously created Relate, a voice assistant that helps people with speech impairments, also recently announced a one-handed mode for large smartphones running Android 12.
The software tracks eye movements, allowing the user to pick pre-written statements and have the phone speak them aloud. It is thus a useful tool for people with conditions that prevent them from using their limbs or their voice.
The application’s operation is straightforward. The interface displays two columns of frequently used words and phrases. To begin, you locate the desired term on the screen, then shift your gaze to the left or right to select it, or build sentences from several words. Once you have chosen a phrase, the phone reads it aloud.
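The two-column selection described above amounts to repeatedly halving the phrase list with each glance until one phrase remains. Here is a minimal sketch of that idea in Python; the function and phrase names are illustrative assumptions, not taken from the actual Look to Speak code.

```python
def select_phrase(phrases, gazes):
    """Narrow a phrase list to a single entry using a sequence of
    'left'/'right' gaze choices, one per screen split."""
    remaining = list(phrases)
    gaze_iter = iter(gazes)
    while len(remaining) > 1:
        # Split the remaining phrases into a left and a right column.
        mid = len(remaining) // 2
        left, right = remaining[:mid], remaining[mid:]
        # The user's glance keeps one column and discards the other.
        remaining = left if next(gaze_iter) == "left" else right
    return remaining[0]

# Hypothetical phrase list: two glances narrow six phrases to one.
phrases = ["Yes", "No", "Thank you", "I'm hungry", "Call the nurse", "Hello"]
print(select_phrase(phrases, ["right", "left"]))  # prints "I'm hungry"
```

Because each glance discards half of the candidates, even a long phrase list can be narrowed down in just a handful of eye movements.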
Your selection is confirmed via your smartphone’s front camera, which records your eye movements and processes them with machine learning algorithms. Not every useful phrase fits in the two columns, of course, but the app can be customized: you can add new sentences to ease conversation with your loved ones and adjust the sensitivity of eye-movement detection. The video included below demonstrates how it works.
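One plausible way to picture the adjustable sensitivity mentioned above is as a threshold on how far the eyes must move before a glance counts as a selection. The sketch below is a hypothetical illustration of that idea; the function name, offset scale, and default threshold are assumptions, not details of Google’s implementation.

```python
def classify_gaze(horizontal_offset, sensitivity=0.3):
    """Map a normalized horizontal eye offset (-1.0 = far left,
    1.0 = far right) to a gaze direction. Offsets smaller in
    magnitude than the sensitivity threshold count as 'center',
    i.e. no selection is triggered."""
    if horizontal_offset <= -sensitivity:
        return "left"
    if horizontal_offset >= sensitivity:
        return "right"
    return "center"

print(classify_gaze(-0.5))                   # prints "left"
print(classify_gaze(0.1))                    # prints "center"
print(classify_gaze(0.1, sensitivity=0.05))  # prints "right"
```

Lowering the threshold makes the app react to subtler eye movements, while raising it filters out small, unintentional glances.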