

Google Play Services update adds Mobile Vision, Text and Awareness APIs

Google Play Services 9.2 reintroduces the Mobile Vision API, able to detect human faces in photos and videos, and also adds a Text API and Awareness API.
June 28, 2016

The latest Google Play Services update, v9.2, has reintroduced the Mobile Vision API, which was announced last year in v7.8 but subsequently removed. The API allows apps to detect (but not recognize) human faces in images and video footage. Play Services 9.2 also adds a new Text API for optical character recognition, as well as the Awareness API.
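To give a feel for how face detection works in practice, here is a minimal sketch using the Mobile Vision `FaceDetector`. It assumes the `play-services-vision` dependency is on the classpath and that a valid `Context` and `Bitmap` are available; the `FaceCounter` class name is ours, not part of the API.

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.util.SparseArray;

import com.google.android.gms.vision.Frame;
import com.google.android.gms.vision.face.Face;
import com.google.android.gms.vision.face.FaceDetector;

public class FaceCounter {
    // Count faces in a still photo and check each one for a smile.
    public static int countFaces(Context context, Bitmap photo) {
        FaceDetector detector = new FaceDetector.Builder(context)
                .setTrackingEnabled(false) // tracking is for video, not stills
                .setClassificationType(FaceDetector.ALL_CLASSIFICATIONS) // enables smile probability
                .build();

        Frame frame = new Frame.Builder().setBitmap(photo).build();
        SparseArray<Face> faces = detector.detect(frame);

        for (int i = 0; i < faces.size(); i++) {
            Face face = faces.valueAt(i);
            // Returns -1 when the classification could not be computed
            boolean smiling = face.getIsSmilingProbability() > 0.5f;
        }

        int count = faces.size();
        detector.release(); // detectors hold native resources; always release
        return count;
    }
}
```

Note that detection only reports where faces are and attributes like smile probability; as the article says, it does not identify who the faces belong to.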


Besides face and smile detection, the Mobile Vision API also supports barcode recognition, allowing it to scan barcodes in any orientation and scan multiple barcodes at once. The Text API can be used to recognize Latin characters in multiple languages via the camera and convert them to plain text. This comes in handy when translating text, scanning documents, or adding business cards via a card-reader app.
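A short sketch of the Text API side, using the Mobile Vision `TextRecognizer` to pull plain text out of a captured image. As above, this assumes the vision dependency is present; `TextScanner` is an illustrative name of ours.

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.util.SparseArray;

import com.google.android.gms.vision.Frame;
import com.google.android.gms.vision.text.TextBlock;
import com.google.android.gms.vision.text.TextRecognizer;

public class TextScanner {
    // Run OCR over a Bitmap and concatenate the recognized text blocks.
    public static String extractText(Context context, Bitmap image) {
        TextRecognizer recognizer = new TextRecognizer.Builder(context).build();

        // isOperational() is false until Play Services has downloaded
        // the native OCR library, so a real app should handle that case.
        if (!recognizer.isOperational()) {
            return "";
        }

        Frame frame = new Frame.Builder().setBitmap(image).build();
        SparseArray<TextBlock> blocks = recognizer.detect(frame);

        StringBuilder result = new StringBuilder();
        for (int i = 0; i < blocks.size(); i++) {
            result.append(blocks.valueAt(i).getValue()).append('\n');
        }
        recognizer.release();
        return result.toString();
    }
}
```

A card-reader app would run something like this on a camera frame of a business card and then parse names and phone numbers out of the returned string.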

The Google Awareness API outlined at Google I/O 2016 is also contained in the latest Play Services update. The Awareness API allows developers to program contextual awareness into their apps. According to Google, this can all be done with only minimal demands placed on system resources.

The Awareness API works on the basis of two other APIs: Fence and Snapshot. Fence notes when your context changes, so an app could know when you get up and leave the house to go outside, for example. Snapshot lets an app know what you’re doing at a specific time: sitting down texting, jogging and listening to music with headphones in the rain, or walking and taking photos at the beach.
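As a rough illustration of the Fence side, the sketch below registers a combined fence that fires when the user is walking with headphones plugged in, using the `GoogleApiClient`-based Awareness calls that ship with Play Services 9.2. The intent action string and the `WalkingWithHeadphonesFence` class are hypothetical names of ours; a real app would also declare a `BroadcastReceiver` for the `PendingIntent`.

```java
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;

import com.google.android.gms.awareness.Awareness;
import com.google.android.gms.awareness.fence.AwarenessFence;
import com.google.android.gms.awareness.fence.DetectedActivityFence;
import com.google.android.gms.awareness.fence.FenceUpdateRequest;
import com.google.android.gms.awareness.fence.HeadphoneFence;
import com.google.android.gms.awareness.state.HeadphoneState;
import com.google.android.gms.common.api.GoogleApiClient;

public class WalkingWithHeadphonesFence {
    // Registers a fence that triggers while the user is walking
    // AND has headphones plugged in.
    public static void register(Context context, GoogleApiClient client) {
        AwarenessFence fence = AwarenessFence.and(
                DetectedActivityFence.during(DetectedActivityFence.WALKING),
                HeadphoneFence.during(HeadphoneState.PLUGGED_IN));

        PendingIntent pi = PendingIntent.getBroadcast(
                context, 0, new Intent("com.example.FENCE_EVENT"), 0);

        Awareness.FenceApi.updateFences(client,
                new FenceUpdateRequest.Builder()
                        .addFence("walking_headphones", fence, pi)
                        .build());
    }
}
```

Snapshot works the other way around: instead of waiting for a context change, the app asks for the current state on demand, e.g. via `Awareness.SnapshotApi.getDetectedActivity(client)`.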

Now that the API is fully available, it’ll be interesting to see what devs come up with… and whether or not they’ll start abusing it. Fortunately, we will be able to turn off access to the Awareness API “contexts” – things like time, location, weather, and activity – via Android Marshmallow’s granular app permissions if things get out of control.

Have you got the update yet? What would you use the Awareness API for?