Dual Pixel autofocus is an increasingly popular smartphone camera feature, particularly at the flagship end of the market. The technology promises much faster focusing for action shots as well as superior focusing in low-light environments. But how does it work?
Dual Pixel autofocus is an extension of Phase Detection autofocus (PDAF), which has featured in smartphone cameras for a number of years. Essentially, PDAF uses dedicated left-looking and right-looking pixels on the image sensor to calculate whether the image is in focus. To bring you up to speed, be sure to check out our detailed guide via the link below.
Essential recap: Phase Detection autofocus (PDAF) explained
What is Dual Pixel autofocus and how does it work?
PDAF is the precursor to Dual Pixel autofocus, so understanding how the former works is essential. PDAF relies on the slightly different images formed by masked “left-looking” and “right-looking” photodiodes embedded in the image sensor’s pixels. The phase difference between these two views is used to calculate how far, and in which direction, the lens needs to move to achieve focus. These phase-detection pixels typically make up only around 5-10% of the image sensor’s pixels, and PDAF becomes more reliable and accurate as more dedicated phase-detection pixel pairs are used.
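To make the idea concrete, here is a minimal sketch of the phase comparison at the heart of PDAF. It is not any manufacturer’s actual algorithm, just an illustration: slide the right-looking signal against the left-looking one and find the shift where they line up best. A zero shift means the subject is in focus; the sign and size of the shift tell the lens which way to move and roughly how far.

```python
import numpy as np

def phase_disparity(left, right, max_shift=8):
    """Estimate the pixel shift between left- and right-looking signals
    by finding the offset with the highest correlation (a toy model of
    PDAF's phase comparison)."""
    best_shift, best_score = 0, -np.inf
    for shift in range(-max_shift, max_shift + 1):
        score = np.dot(left, np.roll(right, shift))
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift

# Toy scene: a single bright feature. The "right" view is the "left"
# view displaced by 3 pixels, as it would be for an out-of-focus subject.
left = np.exp(-((np.arange(64) - 32) ** 2) / 50.0)
right = np.roll(left, 3)
print(phase_disparity(left, right))  # -3: shift needed to realign the views
```

A real sensor does this comparison over many pixel pairs at once and converts the measured disparity into a lens drive distance via calibration, but the core idea is this simple correlation.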
Dual Pixel AF focuses in just milliseconds.
In the move to Dual Pixel AF, every single pixel on the sensor contributes to PDAF, aiding the calculation of phase differences and focus, which greatly improves accuracy and speed over standard PDAF. Each pixel is split into two photodiodes, one left-looking and one right-looking, typically by placing a micro-lens on top of each pixel. When taking a photo, the processor first analyses the focus data from each photodiode, then combines the two signals into the full pixel value used in the final image.
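The split-then-recombine step above can be sketched in a few lines. The readout layout here is hypothetical, purely for illustration: each pixel yields two sub-diode values, the two half-images feed phase detection, and summing them back together produces the normal photo, so no resolution is sacrificed to autofocus.

```python
import numpy as np

# Hypothetical dual-pixel readout: shape (H, W, 2), where
# [..., 0] is the left-looking half and [..., 1] the right-looking half.
rng = np.random.default_rng(0)
raw = rng.random((4, 6, 2))

left_image = raw[..., 0]   # compared against right_image for phase detection
right_image = raw[..., 1]

# For the final photo, the two halves are simply summed back into
# one value per pixel.
full_image = left_image + right_image
print(full_image.shape)  # (4, 6): one value per pixel, full resolution
```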
The diagram above from Samsung’s image sensor team showcases the differences between traditional PDAF and Dual Pixel AF technology. The only real drawback is that implementing these tiny phase-detection photodiodes and micro-lenses isn’t easy or cheap, which becomes an important consideration in very high-resolution sensors.
The 108MP sensor inside the Galaxy S20 Ultra, for example, doesn’t use Dual Pixel technology, while the lower resolution Galaxy S20 models do. The Ultra’s autofocus is worse as a result, but Samsung remedied this issue with an improved image sensor inside the Galaxy S21 Ultra.
How Dual Pixel autofocus improves on PDAF
Despite the shared fundamentals, Dual Pixel technology delivers much faster focusing and a greater ability to hold focus on fast-moving subjects than basic PDAF. This is particularly useful for capturing the perfect action shot, or simply for pulling up your camera quickly and knowing it will be in focus. The Huawei P40, for instance, boasts focusing times of just milliseconds thanks to this technology. You can see this in action in the GIF below.
One of PDAF’s biggest shortcomings is low-light performance. Because a masked phase-detection photodiode captures only half the light of a full pixel, noise makes obtaining accurate phase information difficult in the dark. Dual Pixel improves matters by taking far more readings across the whole sensor, averaging out noise for fast autofocus even in quite dark environments. There are limits, but this is arguably Dual Pixel autofocus’s greatest enhancement.
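The benefit of taking more readings follows directly from statistics: averaging N independent noisy measurements shrinks the error by roughly the square root of N. The numbers below are made up for illustration, but they show why thousands of dual-pixel readings beat a handful of dedicated PDAF pairs in the dark.

```python
import numpy as np

rng = np.random.default_rng(42)
true_disparity = 3.0   # the shift the sensor is trying to measure
noise_std = 2.0        # per-reading noise level (an illustrative figure)

# Sparse PDAF: only a few dedicated pixel pairs contribute readings.
few = true_disparity + rng.normal(0, noise_std, size=10)
# Dual Pixel: every pixel contributes, so thousands of readings.
many = true_disparity + rng.normal(0, noise_std, size=10_000)

# The averaged estimate from many readings sits far closer to the truth.
print(abs(few.mean() - true_disparity))
print(abs(many.mean() - true_disparity))
```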
If you’re a serious mobile photographer, a top camera with Dual Pixel autofocus onboard, be it a smartphone or a DSLR, helps ensure you always capture the sharpest snaps. It’s certainly a feature to keep an eye out for if you want the very best photos from your smartphone.
Up next: The best camera phones you can buy