
Here's how the Pixel 2 and Pixel 2 XL produce amazing shake-free videos

Google has outlined what makes the Pixel 2 and Pixel 2 XL’s video-recording set-up so special.

Published on November 13, 2017

The Google Pixel 2.

With all the negativity surrounding the Pixel 2 and Pixel 2 XL, it can be easy to forget that Google's latest flagship smartphones have a lot going for them. Amid the many reports of display issues, clicking noises, and non-functional microphones, there's one area where the Pixel 2 devices have received nothing but praise: the cameras.

As we showed in our review, the 12.2 MP sensor on the second-generation Pixels is capable of producing gorgeous photos, making the phones more than worthy of the accolades they've received. But it's not just in the photography stakes that the Pixel 2 camera suite shines.

Before launch, Google talked up both phones’ ability to capture shake-free video thanks to a careful combination of hardware and software. Now, in a Google Research Blog post, the search giant has outlined what exactly makes the Pixel 2’s videography set-up so special, and how the development team achieved such an incredible feat.

The overarching technique behind the Pixel 2's magic is something the company calls Fused Video Stabilization. Google explains that this is essentially a combination of optical image stabilization (OIS) and electronic image stabilization (EIS), but the trick, as with many of Google's recent inventions, is the dose of machine learning that governs the whole system.


The three-stage process begins with OIS, where the lens module inside the camera shifts to compensate for the device's movement in your hands, reducing handshake blur.
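Google's post doesn't include code, but the basic idea of OIS can be sketched as a control loop: a rotation of the phone moves the image by roughly the focal length times the rotation angle, so shifting the lens by the opposite amount keeps the image steady. The focal length, sample rate, and function names below are illustrative assumptions, not Google's implementation.

```python
# Toy sketch of an OIS correction step (all constants are assumptions).
FOCAL_LENGTH_PX = 3000.0   # assumed focal length, expressed in pixels
DT = 0.001                 # assumed gyro sample period (1 kHz)

def ois_lens_shift(angular_velocity_rad_s):
    """Return the lens shift (in pixels) that cancels the image motion
    caused by one gyro sample of hand rotation."""
    # A rotation of theta radians moves the image by ~focal_length * theta,
    # so the lens is shifted by the opposite amount.
    theta = angular_velocity_rad_s * DT
    return -FOCAL_LENGTH_PX * theta

# A small 0.02 rad/s hand tremor over one sample calls for a tiny
# counter-shift of a fraction of a pixel:
shift = ois_lens_shift(0.02)
```

In a real module this loop runs in hardware thousands of times per second, which is why OIS can remove blur *within* a single frame, something software alone cannot do.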

This is simultaneously monitored by the phone’s high-speed gyroscope and run through a “lookahead” filtering algorithm.

According to Google, this temporarily pushes incoming frames into a queue, defers the processing, and enables machine learning and signal processing algorithms to accurately predict the user’s hand movements. In short, the phone will know if you’re travelling along a bumpy road and adapt to the increased handshake to stop severe variations in sharpness.
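The lookahead idea described above can be illustrated with a toy filter: incoming frames wait in a queue until enough "future" motion samples have arrived, and only then is the correction for the oldest frame computed. Here a simple moving average stands in for Google's machine-learning motion predictor; the class and parameter names are hypothetical.

```python
from collections import deque

class LookaheadStabilizer:
    """Toy lookahead filter: buffers incoming per-frame camera angles and
    emits a smoothed angle once `lookahead` future frames are available.
    (A moving average stands in for Google's learned motion predictor.)"""

    def __init__(self, lookahead=5):
        self.lookahead = lookahead
        self.buffer = deque()

    def push(self, angle):
        """Feed one incoming frame's measured camera angle.
        Returns (raw_angle, smoothed_angle) for the oldest queued frame
        once enough future context exists, else None (frame stays queued,
        i.e. processing is deferred)."""
        self.buffer.append(angle)
        if len(self.buffer) < 2 * self.lookahead + 1:
            return None  # not enough lookahead yet; defer processing
        window = list(self.buffer)
        smoothed = sum(window) / len(window)
        raw = self.buffer.popleft()
        return raw, smoothed

# Feed a motion trace with one sharp bump (e.g. a bump in the road):
stab = LookaheadStabilizer(lookahead=2)
for angle in [0.0, 0.1, 0.4, 0.1, 0.0, 0.05]:
    result = stab.push(angle)  # None until the queue holds 5 samples
```

The cost of this approach is a small, fixed output delay, which is acceptable for recorded video but is why the same trick cannot be applied to the live viewfinder.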

The final stage brings together the real and virtual camera motions while removing the rolling shutter and focus breathing distortion. Arguably the most impressive part of the whole system is that it works for recordings at up to 60fps and in 4K resolution with relative ease.
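As a rough illustration of this final stage, each output frame can be warped from the camera's real orientation onto the smoothed "virtual" path, and doing that per scanline rather than per frame also undoes rolling-shutter skew, since each row of the sensor is exposed at a slightly different moment. The function below is a hypothetical one-axis sketch under an assumed focal length, not Google's actual warp.

```python
# Hypothetical sketch of the real-to-virtual camera warp (one axis only).
FOCAL_LENGTH_PX = 3000.0  # assumed focal length in pixels

def row_corrections(real_start, real_end, virtual_angle, num_rows):
    """Per-row horizontal shifts (pixels) mapping a rolling-shutter frame
    onto the virtual camera path. real_start/real_end: real camera angle
    (radians) at the first and last scanline; virtual_angle: the smoothed
    target angle for this frame."""
    shifts = []
    for row in range(num_rows):
        t = row / (num_rows - 1)                         # exposure-time fraction
        real = real_start + t * (real_end - real_start)  # angle when row was read
        # Shift each row from where the real camera pointed to where the
        # virtual (stabilized) camera points:
        shifts.append(FOCAL_LENGTH_PX * (virtual_angle - real))
    return shifts

# A frame that rotated from 0.001 to 0.003 rad during readout, with the
# virtual path sitting at 0.002 rad, needs opposite shifts at top and bottom:
shifts = row_corrections(0.001, 0.003, 0.002, num_rows=5)
```

Because the top and bottom rows get different corrections, the same pass that stabilizes the frame also straightens the skew a rolling shutter introduces during fast motion.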

You can see FVS in action in the video below and read more about the technology in the full in-depth post here.

Have you been impressed with the Pixel 2’s video prowess? Let us know in the comments.
