At Google I/O back in May, the online search giant announced Google Lens, an impressive feature that provides contextual analysis of images using machine learning. For example, point your camera at a poster for an event and Google Lens can save the date to your calendar.

Although the feature isn’t officially available yet, you can already try it out, after a fashion. Google has released an update to its Photos app (v3.5) that includes a working interface to Google Lens.

The folks over at XDA-Developers figured out that, thanks to the update, they can now send an image to Google Lens, which then returns more info about it. They tested this with an image of the Eiffel Tower, which Google Lens instantly recognized. The second test was a bit harder, as the image contained a book, but the feature still worked as advertised. You can check out the two screenshots below to see what this looks like in action.

The only catch is that trying out Google Lens isn’t exactly straightforward unless you’re comfortable with developer tools. The Photos update added a new intent filter that can accept images sent to it, so to try it out you have to craft the right intent to open an image on your device with Google Lens. Check out XDA-Developers’ post for more detailed instructions.
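To give a rough idea of what "added a new intent filter" means, here is a hypothetical sketch of the kind of manifest declaration involved. The activity name used below is an assumption for illustration only; the real class name inside Google Photos isn’t public, and you’d need XDA-Developers’ instructions to find the actual target.

```xml
<!-- Hypothetical sketch: "LensActivity" and its package path are assumptions,
     not the real class name inside Google Photos. -->
<activity android:name="com.google.android.apps.photos.lens.LensActivity">
    <intent-filter>
        <!-- Accept a single image shared from another app -->
        <action android:name="android.intent.action.SEND" />
        <category android:name="android.intent.category.DEFAULT" />
        <data android:mimeType="image/*" />
    </intent-filter>
</activity>
```

An intent filter like this is what lets another app (or a developer tool) hand an image URI to the activity via a standard `ACTION_SEND` intent, which is essentially what XDA-Developers’ workaround does.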

XDA-Developers also did an APK teardown to learn more about the functionality we can expect to see in Google Lens. The feature is apparently capable of scanning books, movies, landmarks, buildings, and music albums, among other things. It can also detect and open URLs, and lets you quickly save detected email addresses and phone numbers to your device with a simple tap.