
Here's how Google brought Pixel 3's Top Shot mode to life

Between machine learning models and human testers, Google's Top Shot is a step above traditional best photo modes.

Published on December 21, 2018

Google reveals the Top Shot feature at its Pixel 3 event.
  • Google has revealed how the Pixel 3’s Top Shot mode was created.
  • The mode takes smiles, lighting, technical data, and other factors into account when choosing the best shot.
  • Humans also rated Top Shot photos in order to further refine the feature.

Best photo modes aren’t new in the smartphone world — we’ve seen everyone from Samsung to Huawei recommend the best snap from a burst. The Pixel 3’s Top Shot mode is more complex than these previous efforts, but just how does it work? Google’s AI Blog has given us the complete rundown.

For the uninitiated, Top Shot essentially saves and analyzes up to 90 images taken just before and just after the shutter key is pressed. Google’s blog post explains that after analyzing these images on-device, the camera recommends the best snap based on a variety of factors. These criteria include smiles, open eyes, lighting, and emotional expressions. But technical information, such as exposure time, gyroscope data, and optical flow are also taken into account.
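The selection step described above can be sketched as scoring each buffered frame on a handful of criteria and keeping the winner. This is purely an illustrative assumption of how such a ranking might look — the field names, weights, and scoring function below are invented, not Google's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    index: int            # position in the ~90-frame buffer
    smile: float          # 0..1, from a face model (hypothetical signal)
    eyes_open: float      # 0..1
    lighting: float       # 0..1, exposure quality
    motion_blur: float    # 0..1, higher means blurrier

def frame_score(f: Frame) -> float:
    # Weighted sum of positive cues minus a blur penalty (weights assumed).
    return (0.4 * f.smile + 0.3 * f.eyes_open
            + 0.2 * f.lighting - 0.5 * f.motion_blur)

def best_frame(frames: list[Frame]) -> Frame:
    # Recommend the highest-scoring frame from the burst buffer.
    return max(frames, key=frame_score)
```

In the real feature all of this runs on-device, so the actual model is far cheaper and more sophisticated than a hand-tuned weighted sum.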

Snapping and analyzing 90 images can be a strain on the phone, but Google says it prioritizes the shutter image first, followed by the best alternative snaps. The company adds that the Pixel Visual Core chip is used to process these top alternative shots into HDR+ images.

The company also used several machine learning models to help Top Shot actually work. A “vanilla MobileNet model” was used to identify blurry subjects, open/closed eyes, and emotional expressions. From here, the company then used a “Generalized Additive Model” to score faces.
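A Generalized Additive Model scores an input as a sum of independent per-feature functions: score = f1(x1) + f2(x2) + …, which keeps each feature's contribution interpretable. The toy shape functions below are assumptions for illustration only, not the ones Google trained on top of its MobileNet outputs:

```python
def f_smile(x: float) -> float:
    # Assumed: more smile raises the score.
    return 2.0 * x

def f_eyes(x: float) -> float:
    # Assumed: open eyes raise the score.
    return 1.5 * x

def f_blur(x: float) -> float:
    # Assumed: blur lowers the score.
    return -3.0 * x

def gam_face_score(smile: float, eyes_open: float, blur: float) -> float:
    # GAM structure: the total score is just the sum of the per-feature terms.
    return f_smile(smile) + f_eyes(eyes_open) + f_blur(blur)
```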


But what if there aren’t any faces in the scene? Fortunately, Google’s frame scoring model was also tuned to check for object motion, motion blur, and auto exposure/white balance/focus.
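A faceless fallback like the one described could be sketched as scoring frames on global cues alone. The signals match those the article lists, but the formula and thresholds are hypothetical:

```python
def faceless_frame_score(object_motion: float, motion_blur: float,
                         converged_3a: bool) -> float:
    # Start from a perfect score and subtract penalties (weights assumed).
    score = 1.0 - 0.5 * motion_blur - 0.3 * object_motion
    if not converged_3a:
        # Penalize frames captured before auto exposure, white balance,
        # and focus have settled.
        score -= 0.4
    return score
```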

Finally, the company asked volunteers to determine which frames were the best. After all, what’s the point of all the machine learning if most humans find the Top Shots to be terrible?

In theory, it all comes together to give the traditional best photo functionality a welcome shot in the arm. And with other features like HDR+, Night Sight, and a better single-camera portrait mode, it’s clear that the Pixel 3 is a photography beast.

What’s your favorite smartphone camera feature right now? Let us know in the comments section!

NEXT: Galaxy S10 could have a low-light camera feature similar to Google’s Night Sight
