
HDR display technology explained: Everything you need to know

HDR, or high dynamic range, is a display feature that can dramatically improve picture quality. Here's how it works.
November 1, 2021
[Image: HDR video playing on a smartphone display. Credit: Robert Triggs / Android Authority]

HDR, or high dynamic range, has become one of the biggest selling points for display manufacturers. Just like in photography, HDR aims to recreate images that are more like what our eyes would perceive in the real world.

Put simply, dynamic range refers to the perceived difference between the darkest and brightest parts of an image. If that sounds a lot like contrast ratio, it is: the two concepts are closely related, and you’ll find that displays with high contrast ratios tend to do an excellent job with HDR as well.
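
To put some rough numbers on that relationship, here’s a small worked example (the luminance values below are illustrative, not measurements of any particular display):

```python
import math

# Illustrative luminance values in nits (cd/m²), not real measurements.
peak_luminance = 1000.0  # a bright specular highlight
black_luminance = 0.05   # near-black on a decent LCD with local dimming

contrast_ratio = peak_luminance / black_luminance
stops = math.log2(contrast_ratio)  # dynamic range in photographic stops

print(f"Contrast ratio: {contrast_ratio:,.0f}:1")  # 20,000:1
print(f"Dynamic range: {stops:.1f} stops")         # ~14.3 stops
```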

Having said that, there’s a lot more that contributes to how you perceive a display’s quality than just dynamic range. So in this article, let’s take a closer look at what HDR means in the context of modern displays. Later, we’ll also talk about how you can ensure you’re getting the best possible HDR experience.

What is HDR?

Besides improving dynamic range, HDR bumps a display’s visual fidelity by making images look more realistic and lifelike. How does it accomplish all of that, you ask? Primarily by improving the display’s color-handling capabilities.

The vast majority of older, non-HDR displays were tuned to cover the sRGB (or Rec. 709) color gamut. The problem, however, is that sRGB is a rather dated standard — originally designed for CRT displays and broadcast television. Consequently, it only covers a small percentage of the visible light spectrum. In other words, sRGB displays can reproduce just 25 to 33% of the colors that our eyes can perceive — decidedly insufficient.

To that end, HDR standards propose that we finally move beyond the limited sRGB color space. The general consensus is that HDR displays and content must at least cover the DCI-P3 gamut. For context, DCI-P3 is a wide color gamut defined by Digital Cinema Initiatives (DCI), the consortium behind digital cinema standards, and it’s the color space most major theatrical releases are mastered in these days.

HDR formats require displays to support a wider color gamut than the decades-old sRGB standard.

DCI-P3 is roughly 25% wider than sRGB, which results in a more vibrant and accurate recreation of colors on the display. Many HDR formats and standards are now also preparing for displays to cover the Rec. 2020 color gamut. It is the widest gamut standardized for consumer video to date and covers an impressive 75% or so of the visible color space.
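
If you’re curious where figures like that 25% come from, you can estimate them yourself. The sketch below uses the published CIE chromaticity coordinates of each gamut’s primaries and compares triangle areas in the CIE 1976 u′v′ diagram, where area ratios track perceived gamut size reasonably well (in the older CIE 1931 xy diagram the ratios come out differently):

```python
# Compare color gamut sizes from the primaries' published CIE 1931 xy
# chromaticity coordinates, converted to the more uniform CIE 1976 u'v'.
GAMUTS = {
    "sRGB / Rec. 709": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":          [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec. 2020":       [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def xy_to_uv(x: float, y: float) -> tuple[float, float]:
    """Convert CIE 1931 xy to CIE 1976 u'v' coordinates."""
    d = -2 * x + 12 * y + 3
    return 4 * x / d, 9 * y / d

def triangle_area(pts) -> float:
    """Shoelace formula for the area of an RGB primary triangle."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

areas = {name: triangle_area([xy_to_uv(x, y) for x, y in prims])
         for name, prims in GAMUTS.items()}

for name, area in areas.items():
    print(f"{name}: {area / areas['sRGB / Rec. 709']:.2f}x sRGB")
# DCI-P3 lands at roughly 1.25x sRGB, matching the ~25% figure above.
```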

Beyond color: Brightness, contrast, and bit-depth

[Image: Dolby Vision HDR demo showing brightness in nits]

There are a few more vital aspects to a good HDR viewing experience, starting with brightness and contrast. Non-HDR, or standard dynamic range (SDR), displays are notorious for failing to accurately reproduce the light and dark sections of an image at the same time. Consequently, the picture ends up looking washed out or lacking in depth.

With a higher contrast ratio (or dynamic range), however, an HDR display can capture the brilliance of bright areas while still retaining detail in darker ones. Similarly, higher peak brightness allows those bright sections to pop from the rest of the image. Rather than making the whole picture brighter, HDR reserves this extra brightness for specular highlights, such as a glint off a reflection or the silver lining of a cloud.

With a high contrast ratio, displays can capture the brilliance of bright areas while still retaining detail in darker areas.

In that vein, many HDR displays these days also feature higher bit-depth panels to boost visual fidelity. Think of bit-depth as the number of shades a display can reproduce for each of its red, green, and blue primary colors. In simpler terms, a standard 8-bit SDR display can reproduce 2⁸ (256) shades per color channel. The move to 10-bit offers 1,024 distinct levels per channel, while 12-bit offers 4,096.

HDR standards require that displays feature native 10-bit panels or approximate 10-bit color through software techniques like dithering. A higher bit-depth is important because it allows the display to transition smoothly between similar colors without visible banding. Most displays that use dithering rely on frame rate control (FRC), which rapidly cycles a pixel between two adjacent shades to create the illusion of an intermediate shade. This practice allows an 8-bit display, for example, to approximate 10-bit color depth.
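
To make that concrete, here’s a minimal sketch of frame rate control (illustrative only; the shade values and frame counts are made up, and real panels implement this in hardware):

```python
# Sketch of frame rate control (FRC), a form of temporal dithering.
# An 8-bit panel can only show levels 100 and 101, but a 10-bit source
# may ask for a shade three quarters of the way between them.

def frc_frames(target: float, frames: int) -> list[int]:
    """Pick the nearest displayable 8-bit level each frame so that the
    running average converges on the requested intermediate shade."""
    output, error = [], 0.0
    for _ in range(frames):
        error += target
        level = round(error)  # nearest whole (displayable) level so far
        error -= level        # carry the leftover to the next frame
        output.append(level)
    return output

frames = frc_frames(100.75, 8)
print(frames)                     # [101, 100, 101, 101, 101, 100, 101, 101]
print(sum(frames) / len(frames))  # 100.75: the eye averages this over time
```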

Finally, it’s worth noting that HDR doesn’t boost the clarity or sharpness of your display’s image. That’s resolution, a separate metric the display industry has raced to improve in recent years. While HDR and resolution can combine to deliver great image quality, they’re largely independent of each other. You can experience great HDR on a low-resolution display and vice versa; it’s just more common to find HDR marketed alongside high resolutions like UHD.

HDR formats explained: HDR10, Dolby Vision, HDR10+, HLG

With HDR, the display industry has once again found itself in the midst of a minor format war. Several different implementations exist today, sometimes with major differences between each one. Thankfully, though, displays and content sources are starting to support multiple HDR formats these days.

HDR10

[Image: HDR10 display on the HUAWEI Mate 10 Pro]

HDR10 was the first standard to hit the market all the way back in 2015. Developed by the Consumer Technology Association, it is completely open and royalty-free. This means that any display manufacturer can adopt the standard and advertise compatibility with HDR10 content. The specification’s name stems from the standard’s recommendation of 10-bit panels.

HDR10 also provides metadata to displays, describing the brightness and color levels for a particular piece of content. However, unlike the more advanced formats we’ll look at shortly, the metadata is static from start to finish. In other words, it’s just a set of maximum and minimum brightness values that’s applied to the entire video file.
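
As an illustration of just how coarse static metadata is, here’s a hypothetical sketch of the values an HDR10 stream hands to the display once, up front. The field names follow the commonly cited MaxCLL/MaxFALL and mastering display terms, but the structure itself is illustrative rather than any real API:

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """Hypothetical container for HDR10's one-shot, whole-stream metadata."""
    max_cll: int            # Maximum Content Light Level: brightest pixel (nits)
    max_fall: int           # Maximum Frame-Average Light Level (nits)
    mastering_peak: float   # peak luminance of the mastering display (nits)
    mastering_black: float  # black level of the mastering display (nits)

# One set of values covers the entire film; every scene, bright or dark,
# gets tone-mapped against these same numbers.
movie = HDR10StaticMetadata(max_cll=1000, max_fall=400,
                            mastering_peak=1000.0, mastering_black=0.005)
```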

HDR10 is the most common HDR standard, in part due to its free and open nature.

Thanks to a long early-mover advantage and relatively low technical requirements, HDR10 has become the de facto baseline standard for both displays and content. The list of HDR10-supported devices includes televisions from nearly all major brands, several UHD Blu-ray releases, streaming services, and even previous-generation gaming consoles.

Dolby Vision

[Image: Qualcomm Snapdragon 865 Dolby Vision presentation slide]

A well-known entity in the cinema and entertainment industry, Dolby has its own standard for HDR. Like Dolby’s Atmos audio technology, however, it is a proprietary offering. This means that display manufacturers looking to include Dolby Vision have to pay the company a licensing and certification fee.

Dolby Vision goes beyond HDR10 in several ways, starting with support for 12 bits of color depth. It also requires content producers to use more precise mastering equipment, with well-defined specifications for brightness and contrast.

Dolby Vision uses dynamic metadata to communicate how each scene should be displayed.

Dolby Vision uses dynamic metadata embedded within the content to communicate how each scene (or even each frame) should look. While early Dolby Vision releases weren’t significantly better than their HDR10 counterparts, that has started to change. The gap may widen even further once mastering studios gain more experience with this workflow.
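
Conceptually, dynamic metadata just means those brightness targets travel with each scene instead of with the whole file. Here’s a toy comparison (the structures and values are hypothetical; Dolby’s actual metadata format is proprietary):

```python
# Toy illustration of static vs. dynamic metadata, with made-up values.
scenes = [
    {"name": "night exterior", "scene_max_nits": 120},
    {"name": "sunlit beach",   "scene_max_nits": 1000},
]

# Static metadata (HDR10): one ceiling for the whole film, so the dim
# night scene is tone-mapped as if it might contain a 1,000-nit highlight.
static_max = max(s["scene_max_nits"] for s in scenes)

# Dynamic metadata (Dolby Vision, HDR10+): each scene carries its own
# target, letting the display spend its full range on what's on screen.
for s in scenes:
    print(f'{s["name"]}: static target {static_max} nits, '
          f'dynamic target {s["scene_max_nits"]} nits')
```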

HDR10+

[Image: Samsung HDR10 demo. Credit: Samsung]

Think of HDR10+ as an incremental update over the HDR10 standard. The primary difference is that HDR10+ includes support for dynamic metadata. This brings it on par with Dolby Vision, while still maintaining the open and royalty-free nature of HDR10.

Unlike Dolby Vision, however, HDR10+ doesn’t go beyond 10-bit color depth. While most consumer displays don’t come close to 12-bit just yet, that may change in the future. And when that happens, Dolby Vision might have the upper hand.

There’s also the HDR10+ Adaptive standard. In a nutshell, it uses sensors to detect the ambient light in the room and adjust the display’s picture settings accordingly. Having said that, it isn’t nearly as common. Only a fraction of high-end displays include the sensors necessary for HDR10+ Adaptive support.

Hybrid Log Gamma (HLG)

[Image: BBC Blue Planet in HDR UHD. Credit: BBC]

HLG, or hybrid log gamma, is a royalty-free HDR standard developed specifically with the constraints of broadcast television in mind.

Unlike the other standards on this list, HLG doesn’t rely on metadata to communicate with the display. This is because over-the-air (OTA) broadcasts are more prone to interference than digital streams delivered over the internet, and metadata risks being lost in transmission. Instead, HLG achieves HDR using a combination of a traditional gamma curve and an additional logarithmic curve embedded within the content itself.

Read more: The importance of gamma

The first part, gamma, is recognized by every display since it’s the standard way of describing the brightness of SDR content. The logarithmic curve, on the other hand, describes higher-than-SDR brightness levels and is read only by HDR-compliant displays. This means HLG is backwards compatible with SDR televisions, which eliminates the need to deliver two separate video streams and saves bandwidth. HLG is also a relative rather than an absolute standard, so it can better adapt itself to displays with varying brightness levels.
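
The BT.2100 specification makes this hybrid explicit: below a cutoff, HLG’s transfer function is a simple square-root (gamma-like) curve that SDR sets read natively, while the region above it uses a logarithmic segment for HDR highlights. Here’s a sketch of the standard HLG OETF:

```python
import math

# BT.2100 HLG OETF constants
A = 0.17883277
B = 1 - 4 * A                  # 0.28466892
C = 0.5 - A * math.log(4 * A)  # 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene light e in [0, 1] to an HLG signal in [0, 1]."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # square-root part: SDR-compatible range
    return A * math.log(12 * e - B) + C  # logarithmic part: HDR highlights

for e in (0.0, 0.01, 1 / 12, 0.5, 1.0):
    print(f"scene light {e:.3f} -> signal {hlg_oetf(e):.3f}")
# An SDR display simply treats the whole signal as gamma; an HDR display
# decodes the logarithmic segment into brighter-than-SDR highlights.
```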

HLG is backwards compatible with SDR televisions, making it an ideal choice for over-the-air television broadcasts.

The downside to HLG is that content may look slightly desaturated on older SDR displays that lack a wide color gamut. The image will also look a bit dimmer than usual on those sets, because HLG places reference white lower than the dominant SDR standard, BT.709. Having said that, you still get a perfectly usable image, and the result is better than watching any other HDR format on an SDR display. The average television viewer will likely not notice a slight loss of saturation or brightness.

The UK’s BBC and Japan’s NHK were the first major broadcasters to adopt HLG for their broadcasts. If and when HDR displays become widespread, it’s likely that this technique will become the dominant standard for television broadcasts.

Where to find HDR content: Movies, streaming, and gaming

As you’d probably expect, simply owning an HDR display won’t make existing content look better. You also need content that has been specifically designed and mastered with HDR in mind.

The good news is that the majority of new content released these days offers a dedicated HDR stream. If you own a display capable of decoding this stream, it will be automatically picked up. Here’s a quick rundown of where you can find HDR content these days:

  • Streaming: Most streaming providers, including Netflix, Amazon Prime Video, Hulu, Disney+, Apple TV+, and Peacock, support HDR. While some only offer a basic HDR10 stream, a handful like Netflix and Apple TV+ also support Dolby Vision. However, not all content will be available in HDR. Older releases and content made exclusively for TV are often not mastered for HDR.
  • Video games: Home gaming consoles like the PlayStation and Xbox have supported HDR10 for years at this point. The Xbox Series X and S also support Dolby Vision. While most AAA titles support HDR, keep in mind that not all games do. The same holds true for PC gaming, provided you own a relatively recent graphics card. Unfortunately, the Nintendo Switch doesn’t support HDR at all.
  • Optical media: While streaming is convenient, enthusiasts have long advocated for optical media, and Blu-ray in particular, for its superior image quality. With HDR, there’s yet another reason to own discs. Ultra HD Blu-rays use HDR10 as the base standard, with select titles also mastered for Dolby Vision and HDR10+. Just keep in mind that both your display and your Blu-ray player must support the same HDR standard.

Buying an HDR display: What to look for

[Image: Quantum dot LED vs conventional LED]

So far, we’ve explored what HDR is and how it has the potential to dramatically improve image quality. However, not all HDR displays offer the same visual fidelity or dynamic range.

Many cheaper displays these days also lack the wide color gamut, brightness, and contrast necessary to properly display HDR content. A combination of these shortcomings can easily lead to a situation where the display accepts an HDR signal but fails to render it properly. In some rare instances, HDR content may even look worse than its SDR equivalent.

All in all, you should stay wary of lower-end HDR displays. Even models that advertise HDR10 support aren’t exactly trustworthy: since HDR10 is an open standard, the label may indicate nothing more than compatibility with 10-bit content, and you’ll have to trust that the display actually meets the recommended guidelines. This is commonly referred to as “fake” HDR, and it can undoubtedly leave a bad first impression of the technology. This quality disparity is why it’s important to look at each parameter individually while shopping for an HDR display.

HDR10 is an open standard and does not guarantee a display’s image quality.

So how does one purchase an HDR display that offers a great viewing experience? The most straightforward way is to delve into the product’s specifications list. Then, just ensure that the display meets or exceeds the following criteria (summarized as a quick code checklist after the list):

  • A wide color gamut — coverage above 80% of the DCI-P3 color space at a minimum. High coverage of the Rec. 2020 / BT. 2020 gamut is an added bonus, but not essential.
  • A brightness rating of at least 500-800 nits. Remember that manufacturers tend to overstate this specification. Higher is always better and you’ll find high-end displays can even exceed 1,000 nits.
  • Backlight features such as global dimming, local dimming, or mini-LED will appreciably improve the display’s contrast ratio. Having said that, some mid-range displays with VA panels omit local dimming because VA technology has an inherent contrast advantage.
  • Format support is also an important point to consider. Some displays only support the basic HDR10 format, while others can also play back HDR10+, Dolby Vision, and HLG content. Which ones matter to you depends entirely on the content sources you pair the display with.

HDR displays should offer exceptional brightness, contrast, and a wide color gamut.
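
As a quick illustration, here’s the checklist above expressed in code. The thresholds mirror this article’s recommendations, and the spec-sheet fields are hypothetical, since real product pages don’t share a common schema:

```python
# Hypothetical spec-sheet fields; thresholds follow the checklist above.
def hdr_checklist_warnings(spec: dict) -> list[str]:
    """Return warnings for specs that fall short of the checklist."""
    warnings = []
    if spec.get("dci_p3_coverage", 0.0) < 0.80:
        warnings.append("DCI-P3 coverage below 80%")
    if spec.get("peak_nits", 0) < 500:
        warnings.append("peak brightness below 500 nits")
    if not (spec.get("local_dimming") or spec.get("panel_type") == "VA"):
        warnings.append("no local dimming or high native contrast panel")
    if "HDR10" not in spec.get("formats", []):
        warnings.append("missing baseline HDR10 support")
    return warnings

tv = {"dci_p3_coverage": 0.95, "peak_nits": 700, "local_dimming": True,
      "formats": ["HDR10", "Dolby Vision", "HLG"]}
print(hdr_checklist_warnings(tv) or "Meets this article's criteria")
```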

Admittedly, some of these specifications aren’t always immediately available or easy to find when shopping for a display. For years, consumers have demanded a universal and regulated standard for HDR, similar to USB or HDMI. Luckily, the VESA interface standards group now offers the DisplayHDR certification that you can use as a reference.

The DisplayHDR standard: A shortcut to good HDR?

While the DisplayHDR standard is far from perfect or all-encompassing, it is still worthy of your attention when picking out an HDR-compatible display. Each product is independently validated, so a manufacturer can only claim it meets the standard if the display passes the test.

The DisplayHDR standard currently has eight performance levels, including three dedicated tiers for emissive displays such as OLED and microLED.

At the lowest end, you have the DisplayHDR 400 certification. As you’d expect, the requirements for it are pretty sparse. Displays only have to include an 8-bit panel and reach a peak brightness of 400 nits. The standard also does not require any meaningful coverage of the DCI-P3 color space.

Unfortunately, those metrics are a notch below what most keen-eyed HDR viewers would expect from a display. While DisplayHDR 400 aims to deliver a better experience than SDR displays, it is ultimately not a meaningful enough upgrade.

DisplayHDR 500 and higher performance levels, on the other hand, are a better gauge of HDR performance. These tiers mandate 10-bit panels, wide color gamut coverage, local dimming for improved contrast, and significantly higher brightness levels than SDR displays offer.

DisplayHDR 500 and higher performance levels offer a decent gauge for a display's HDR performance.

DisplayHDR True Black is a separate set of tiers for emissive displays, currently ranging from 400 to 600. Emissive panels are capable of delivering deeper blacks and impressively high contrast ratios. However, larger OLEDs don’t get as bright as some higher-end LCDs that use a quantum dot layer. As emissive display technology advances, though, we’ll likely see additional levels added to this standard as well.
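
For reference, here are the eight performance levels as they stand, expressed as a simple lookup. The peak brightness in nits is encoded in each tier’s name; consult VESA’s DisplayHDR compliance test specification for the full requirements, which this summary doesn’t attempt to capture:

```python
# The eight DisplayHDR tiers mentioned above. Values are peak brightness
# in nits (taken from each tier's name); the True Black tiers target
# emissive panels such as OLED and microLED.
DISPLAYHDR_TIERS = {
    "DisplayHDR 400": 400,
    "DisplayHDR 500": 500,
    "DisplayHDR 600": 600,
    "DisplayHDR 1000": 1000,
    "DisplayHDR 1400": 1400,
    "DisplayHDR True Black 400": 400,
    "DisplayHDR True Black 500": 500,
    "DisplayHDR True Black 600": 600,
}
```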


And with that, you’re now up to speed on everything there is to know about HDR! For further reading, check out our comprehensive guide to display types and technologies.