- Google Assistant is now more accessible through Tobii tablets and apps.
- Eye tracking, touch and scanning make Assistant an option for those with disabilities.
- You can use Tobii to create Action Blocks on Android, too.
Google Assistant can be very helpful for questions and day-to-day tasks, but it can be hard to use if disabilities make speaking or operating a conventional touchscreen difficult. That might not be an issue for much longer, however. Google has partnered with Tobii to improve Assistant’s accessibility and put the AI helper within reach of many more people.
Google Assistant is now integrated with Tobii’s Snap Core First on the company’s Dynavox tablets and mobile apps, making it easy to perform a command using eye tracking, scanning, or touch. You can ask about the weather or turn on lights just by looking at or touching a tile, rather than having to speak a command or navigate a complex interface.
Setup is relatively straightforward. Once you have a Google account, you configure an Assistant-based smart speaker or display in the Google Home app on your phone or tablet. Grant the Snap Core First accessibility app permission, and you can then set tiles to issue Google Assistant commands.
On top of this, Tobii Dynavox can assign its Picture Communication symbols to create Action Blocks buttons on Android devices, bringing that familiar accessibility interface to Google’s platform. People with cognitive disabilities can send texts, play videos and otherwise perform common mobile device tasks without having to re-learn buttons.
Google added that Assistant has “always” been built with accessibility in mind, but these additions could be particularly important. Many people with disabilities don’t have full speech or touch capabilities — this opens assistant technology to them where it might not have been an option before.