Samsung Galaxy S20 Ultra vs Apple iPhone Pro Max - PDAF explained

Autofocus technology is one of the key pillars of mobile photography, ensuring crisp clean captures of even the fastest moving subjects. But did you know that autofocusing comes in a variety of types, depending on the sensor inside your smartphone or camera? Today we’re going to dive into Phase Detection autofocus (PDAF), one of the most common types of autofocus.

Phase detection autofocus is found in a lot of modern smartphone cameras. It’s both faster and more accurate than classic contrast detection. Contrast detection is the simplest and cheapest form of autofocus, but also the slowest and least accurate with moving subjects. So what makes PDAF so much better?

What is PDAF and how does it work?

Like all good camera technologies, PDAF traces its roots back to the DSLR. DSLR cameras use a secondary mirror to divert a portion of the incoming light to a dedicated phase detection sensor. Smartphones don't have the same luxury of space to fit all these parts in. Instead, mobile image sensors have dedicated PDAF pixels built directly into the image sensor, an approach borrowed from compact cameras.

The simplest way to understand how PDAF works is to start by thinking about light passing through the camera lens at its very extreme edges. When the image is in perfect focus, light from even these extremes of the lens refracts to meet at an exact point on the camera sensor. A blurry image results when this focus/meeting point falls either in front of or behind the image sensor. Adjusting the lens to move this focal point is exactly how camera focusing works.

Camera Lens Focusing - PDAF explained

In other words, we can tell an image is in focus because light coming from two different points on the lens converges on a single point. DSLR phase detection autofocus cameras use two dedicated PDAF sensors to capture separate images for comparison. Compact cameras and smartphones don't have this luxury. Instead, this dual perspective has to be created using dedicated phase detecting photodiodes on the image sensor itself.

Related: Compact camera vs smartphone shootout

These photodiodes are physically masked so that light from only one side of the lens reaches them. This produces left-looking and right-looking pixels on a single image sensor, giving us two images with which to compare focus. The phase difference between these two images is then used to determine the focus point. Samsung's diagram below offers an intuitive look at this by comparing these left/right pixels to our eyes.

Samsung explains phase detection autofocus PDAF

By obtaining left and right offset images, PDAF works a little like the human eye.


If the image is out of focus, the phase difference data between images is used to calculate how far the lens needs to be moved to bring it into focus. This is what makes PDAF focusing so fast compared to contrast detection. However, with half of the pixel blocked, these photodiodes end up with less light than a regular pixel. This can cause issues with focusing in low light, where traditional contrast detection is still often used as a hybrid solution.

As you can also see, we don’t need to use every pixel on the camera to figure out the focus. Instead, several pixel strips across the sensor will do. Typically only 5 to 10% of sensor pixels are reserved for autofocusing. However, vertical strips mean that cameras can have problems focusing on horizontal lines, so better sensors use cross focus patterns.

PDAF pros and cons

Huawei P40 Pro and P30 Pro comparison rear camera modules - PDAF explained

Compared to traditional contrast autofocus, phase detection autofocus is faster and usually more accurate. Contrast autofocus is slow because it has to scan through potentially its entire range of focal positions to find the sharpest result. It's essentially trial and error. With PDAF, the phase difference is used to almost immediately calculate how far the lens needs to move to achieve focus.
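The speed difference can be illustrated with a toy Python comparison. The `sharpness` function, the search range, and the PDAF `gain` below are all made up for illustration; real focus systems are far more involved. The point is the step count: contrast detection walks through lens positions one by one, while PDAF computes the move in a single step.

```python
def sharpness(lens_pos, in_focus_pos=42):
    """Toy sharpness metric that peaks at the in-focus lens position."""
    return -abs(lens_pos - in_focus_pos)

def contrast_autofocus(lo=0, hi=100):
    """Contrast detection: scan every lens position and keep the sharpest.
    Cost grows with the search range, hence the 'trial and error'."""
    steps, best_pos, best_val = 0, lo, sharpness(lo)
    for pos in range(lo, hi + 1):
        steps += 1
        val = sharpness(pos)
        if val > best_val:
            best_val, best_pos = val, pos
    return best_pos, steps

def pdaf_autofocus(current_pos, phase_shift, gain=1.0):
    """PDAF: the measured phase shift directly gives the size and
    direction of the lens move, so focus is reached in one step."""
    return current_pos + int(gain * phase_shift), 1

print(contrast_autofocus())    # scans the whole range
print(pdaf_autofocus(30, 12))  # jumps straight toward focus
```

Running both shows contrast detection taking over a hundred trial positions to land on the same focus point PDAF reaches in one calculated move.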

Less than 10% of sensor pixels are dedicated to phase detection autofocus.

However, on-sensor PDAF has a few drawbacks compared to DSLR PDAF. The nature of small smartphone sensors and even smaller pixels makes noise an issue, which is problematic in low light situations. Even phase detection autofocus can take several attempts to obtain perfect focus in less than ideal conditions, although using more pairs of detectors helps speed things up. As a result, smartphones sometimes implement a hybrid approach to tackle this shortcoming.

Phase Detection autofocus is a must-have for the serious mobile photographer. Fortunately, you’ll find this technology in all high-end and even most mid-range smartphones launched in the past few years. In fact, high-end smartphone cameras now include much improved Dual Pixel autofocus. Stay tuned for a deeper dive into that very soon.
