Google introduces natural language search and machine recognition for photos
Search is increasingly becoming human. While users of my generation grew up with search operators like “and”, “or” and the plus and minus signs, search queries increasingly resemble natural language. Take, for instance, Facebook’s Graph Search, which launched earlier this year to mixed reception: it lets you use natural language like “photos of my friends taken in Tokyo.”
Google has announced a better way to find photos that uses natural language as well as image recognition. The new feature lets you search for photos within your Google+ network using simple queries. To make search even better, Google is now employing “computer vision and machine learning,” which can recognize even generic images based on their visual characteristics.
With this update, you can simply search for “my photos of flowers” and Google will surface images of flowers from your Google+ photos. You can then add qualifiers to narrow your search: “my photos of flowers in New York,” for instance. This has a few implications, of course. First, Google is moving toward doing away with tags and captions: with the new update, Google’s algorithms (neural networks?) will attempt to identify what a photo depicts from the image itself, rather than from user-supplied metadata.
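To make the idea concrete, here is a minimal, entirely hypothetical sketch of how such a query might be answered: none of these names or structures come from Google's actual system. It assumes each photo already carries machine-generated labels (the output of a vision model) plus location metadata, so a query like “my photos of flowers in New York” can be matched without any user-written tags or captions.

```python
# Hypothetical sketch only -- photo records, labels, and matching logic
# are invented for illustration; Google's real system is not public.

# Each photo carries labels a vision model might emit, plus a location.
PHOTOS = [
    {"id": 1, "labels": {"flower", "garden"}, "location": "new york"},
    {"id": 2, "labels": {"flower", "vase"},   "location": "tokyo"},
    {"id": 3, "labels": {"dog", "park"},      "location": "new york"},
]

# Filler words stripped from the natural-language query.
STOPWORDS = {"my", "photos", "of", "in", "the", "a"}

def search(query):
    """Return ids of photos whose labels/location match every query term."""
    terms = [t for t in query.lower().split() if t not in STOPWORDS]
    results = []
    for photo in PHOTOS:
        def matches(term):
            # Naive plural handling: "flowers" also matches the label "flower".
            if term in photo["labels"] or term.rstrip("s") in photo["labels"]:
                return True
            # Location terms match as substrings ("new" and "york" both hit
            # "new york"), a crude stand-in for real geo matching.
            return term in photo["location"]
        if all(matches(t) for t in terms):
            results.append(photo["id"])
    return results

print(search("my photos of flowers"))
print(search("my photos of flowers in new york"))
```

The point of the sketch is the shift the article describes: the query is resolved against machine-derived labels, so adding a qualifier simply adds another term that each photo must satisfy.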
Product manager Matthew Kulick says the feature is limited to English searches on Google.com (no support for other country domains yet), and only works when you are signed in to Google+.
Going beyond photo search, of course, the bigger implication that comes to mind is how well this improvement will tie in with Google Glass. As Google's image recognition technology gets better at identifying faces and objects, pairing it with the persistent photo and video capture of wearable computers could become a rich source of data and analytics for the search giant.