
What Is HDR and Is It Worth the Hype?

Updated: December 22, 2022


HDR, or high dynamic range, is one of the most annoyingly omnipresent marketing buzzwords in modern TVs and gaming monitors. Along with 4K, you’ll find it on almost every piece of advertising, promising colors so vivid and lifelike you’ll feel like you’re in the scene. 

But what is HDR, and how does it actually work? Is it truly the next step in the evolution of displays, or merely a handy marketing gimmick? Let’s find out.

HDR Basics 

Before we move on to a more complex breakdown of HDR, let’s first explain some core concepts about the technology. Although the term can technically refer to all sorts of dynamic ranges, including those for radio and audio signals, this article will be concerned with the most common definition: the dynamic range of images and videos.

What Does HDR Do?

If we were to provide an HDR definition in the context mentioned above, it would simply be this: a set of imaging techniques that offer a larger dynamic range for images and videos. By larger, we mean extended compared to the standard dynamic range used on older televisions and computer monitors.

This effectively increases the luminosity range between the image's darkest and brightest spots, offering greater detail and clarity in both brightly lit and shadow-laden areas. The key takeaway is that a true HDR source can tell your monitor or TV precisely which brightness and color values to use, drawn from a much wider range than SDR allows.
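To put a rough number on that idea: dynamic range is often expressed as the ratio between the brightest and darkest levels a panel can reproduce, or as "stops," where each stop is a doubling of brightness. The sketch below uses made-up but plausible luminance figures rather than measurements of any particular display.

```python
import math

def dynamic_range(peak_nits: float, black_nits: float) -> tuple[float, float]:
    """Return (contrast ratio, stops) between the brightest and darkest levels."""
    ratio = peak_nits / black_nits
    stops = math.log2(ratio)  # each stop is a doubling of brightness
    return ratio, stops

# Illustrative figures only: a typical SDR LCD vs. a brighter HDR panel.
for name, peak, black in [("SDR LCD", 300, 0.30), ("HDR panel", 1000, 0.05)]:
    ratio, stops = dynamic_range(peak, black)
    print(f"{name}: {ratio:,.0f}:1 contrast, roughly {stops:.1f} stops")
```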

The quality of the end result will depend on how good the HDR TV or monitor is, which is why there are different HDR certifications, such as VESA's DisplayHDR tiers starting at DisplayHDR 400. The number in each tier refers to the screen's minimum peak brightness, measured in nits.

Although contrast itself isn’t directly linked to HDR, newer high-quality HDR-capable displays typically boast much higher contrast values as well, offering a picture that looks more natural and lifelike. Speaking of natural-looking images, let’s talk about perhaps the defining feature of HDR: the increased color gamut, or palette.

What Is Color Gamut, and Why Is It Important?

If you ask the average user, “What does HDR mean?” chances are they’ll either stare at you blankly or talk about a richer color palette. While a high dynamic range offers more than simply an extended color gamut, this is arguably the most obvious aspect the human eye will pick up on.

The color gamut describes the range of colors a display is capable of producing. The wider the gamut, the more subtle the color variations that can be shown. This adds to the realism and stops similar colors, such as green and blue or red and orange, from blending together and creating a less natural-looking image.

Regarding display technology, we measure the color gamut against various standards such as sRGB, Adobe RGB, DCI-P3, and NTSC. sRGB is a long-standing standard developed by Microsoft and HP in 1996, while Adobe RGB, designed by Adobe Systems, is the one graphic designers and photographers tend to use.

Most screens feature a color gamut between 70% and 75% NTSC, which roughly translates to around 100% of the sRGB gamut. But what is an HDR TV most likely to use for color gamut measurement these days? The newer, wider DCI-P3 standard.

DCI stands for Digital Cinema Initiatives, a consortium of movie industry giants such as Sony Pictures, Warner Bros., Universal, Metro-Goldwyn-Mayer, and Twentieth Century Fox. The standard itself represents a middle ground between sRGB and NTSC, featuring a color gamut roughly 25% larger than the former but about 4% smaller than the latter, and it is now the most commonly used gamut benchmark for HDR monitors and TVs.
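If you're wondering where percentages like these come from, one back-of-the-envelope method is to compare the areas of each standard's color triangle on the CIE 1931 chromaticity diagram, using the published primary coordinates. Bear in mind that the figures quoted by manufacturers are usually calculated differently (for instance, against the full visible spectrum or on the CIE 1976 diagram), so the sketch below is only a ballpark illustration.

```python
# Rough gamut comparison via triangle areas on the CIE 1931 xy diagram.
# Primary chromaticities (x, y) for each standard's red, green, and blue.
PRIMARIES = {
    "sRGB":      [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":    [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "NTSC 1953": [(0.670, 0.330), (0.210, 0.710), (0.140, 0.080)],
}

def triangle_area(pts):
    """Shoelace formula for the area of the triangle spanned by three primaries."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

areas = {name: triangle_area(p) for name, p in PRIMARIES.items()}
for name, area in areas.items():
    print(f"{name}: {area / areas['sRGB']:.0%} of the sRGB triangle's area")
```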

If you want to experience the full HDR effect and get the highest possible image quality for an immersive gaming experience, or your work requires exceptionally accurate color reproduction, you’ll want a high color gamut screen.

Such HDR displays usually cost a pretty penny, but the higher dynamic range combined with a better contrast ratio will make HDR content look phenomenal.

HDR Formats and Standards

Whether you’re slumped in your fancy gaming throne viewing high dynamic range images on an AMOLED display or browsing HDR content on Netflix or other streaming services, you may wonder what Dolby Vision or HDR10+ mean.

Not all HDR TVs support every HDR format out there, so it pays to know more about them and what you’ll be getting with each.

Dolby Vision

This very popular HDR format was designed by Dolby Laboratories. It uses dynamic metadata: tone-mapping information that travels with the video and can change from scene to scene, or even frame to frame. The display uses these instructions to adjust the picture to its own capabilities far more precisely than a single, one-size-fits-all setting would allow.

This makes it excellent for displaying dynamic content, such as video games. Sadly, it’s a proprietary form of HDR, meaning that developers and manufacturers need to pay licensing fees to make use of it.
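Conceptually, dynamic metadata works a little like the sketch below: every scene (or frame) carries its own brightness hints, and the TV tone-maps each one to its own capabilities. This is purely an illustration of the idea, not Dolby Vision's actual data format or tone-mapping algorithm.

```python
# Conceptual illustration of dynamic metadata: each scene carries its own
# brightness hints, so the display can tone-map scene by scene.
# This is NOT Dolby Vision's real data format, just the general idea.
scenes = [
    {"name": "night alley", "scene_max_nits": 120},
    {"name": "sunlit beach", "scene_max_nits": 2000},
]

DISPLAY_PEAK_NITS = 800  # hypothetical TV

for scene in scenes:
    # Compress scenes that exceed the panel's peak brightness; pass dimmer ones through.
    scale = min(1.0, DISPLAY_PEAK_NITS / scene["scene_max_nits"])
    print(f"{scene['name']}: tone-map scale {scale:.2f}")
```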

HDR10

While Dolby Vision is certainly gaining popularity these days, HDR10 is still the most widely used HDR format. The key difference between them is that HDR10 uses static metadata: a single set of values, describing the mastering display and the content's overall brightness, that accompanies the video and applies to the entire film or show. The display uses this information to set up its picture once, rather than adjusting scene by scene.

The advantage of this method is that it's relatively simple to implement and doesn't require any special hardware. Better still, because it's a royalty-free open standard and not proprietary tech, anyone can use it without having to pay licensing fees.

Unfortunately, HDR10 isn’t as flexible as Dolby Vision and doesn’t provide the same amount of brightness. So, what is HDR10 capable of? Its content is typically mastered at around 1,000 nits of peak brightness, while Dolby Vision supports mastering at up to 10,000 nits. That said, very few commercial HDR-capable TVs and monitors have peak brightness over 1,000 nits anyway.
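To give you a feel for just how "static" this really is, an HDR10 stream carries one set of mastering-display values (per SMPTE ST 2086) plus two content light-level figures, and that single set covers the whole film. Here's a minimal sketch limited to the luminance-related fields, with example values that aren't taken from any real title.

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """One set of values for the whole video (luminance fields of SMPTE ST 2086
    plus the two content light levels)."""
    mastering_display_min_nits: float  # black level of the mastering monitor
    mastering_display_max_nits: float  # peak brightness of the mastering monitor
    max_cll: int   # MaxCLL: brightest single pixel anywhere in the content (nits)
    max_fall: int  # MaxFALL: brightest average frame in the content (nits)

# Example values only; real titles vary.
movie = HDR10StaticMetadata(
    mastering_display_min_nits=0.005,
    mastering_display_max_nits=1000.0,
    max_cll=950,
    max_fall=400,
)
print(movie)
```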

HDR10+

So Dolby Vision and HDR10 each have their strengths and weaknesses, but what if we combine them to create something unique? This was Samsung’s idea when creating this advanced HDR format, which builds on traditional HDR10 by adding dynamic metadata, allowing the picture information to change with each individual scene or frame.

HDR10+ ramps the peak brightness up to 4,000 nits and is backward-compatible with HDR10 devices, but the catch is that it has so far been limited only to models by Samsung and Panasonic and hasn’t gotten much commercial support.

HLG

What is HDR, if not an ever-evolving platform? HLG (Hybrid Log-Gamma) is a broadcast-oriented format developed by NHK and the BBC that does away with metadata entirely. Instead, the lower half of the signal uses a conventional gamma curve, just like SDR, while the brighter half uses a logarithmic curve, which is what lets a single HLG signal play on both SDR and HDR screens.

Unfortunately, the fixed curve offers little of the scene-by-scene flexibility of the other formats, and outside of live broadcasting, where it was designed to shine, it has received very little support compared to the more established Dolby Vision and HDR10 standards.
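The "hybrid" in the name is quite literal: below a certain signal level, the curve is gamma-like (a square root), and above it, the curve becomes logarithmic. The sketch below implements the HLG OETF as published in ITU-R BT.2100, so you can see where the two halves meet.

```python
import math

def hlg_oetf(E: float) -> float:
    """HLG opto-electrical transfer function (ITU-R BT.2100).

    E is normalized scene light in [0, 1]; returns the encoded signal in [0, 1].
    The lower range uses a square-root (gamma-like) curve, the upper range a
    log curve, which keeps the signal watchable on SDR displays.
    """
    a, b, c = 0.17883277, 0.28466892, 0.55991073
    if E <= 1 / 12:
        return math.sqrt(3 * E)
    return a * math.log(12 * E - b) + c

for E in (0.0, 0.05, 1 / 12, 0.25, 0.5, 1.0):
    print(f"scene light {E:.3f} -> encoded signal {hlg_oetf(E):.3f}")
```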

Is HDR Worth It? 

At this point, you may be thinking HDR sounds like a great idea and should hopefully become the display standard in the near future. Or, perhaps, you are unconvinced and still feel like it’s an expensive marketing gimmick. 

As always, the truth lies somewhere in the middle. Let’s break down the technology’s main advantages and disadvantages so you have a clearer idea of where things stand.

Benefits of HDR

As we already mentioned, one of the key HDR benefits over SDR rendering lies in the far more natural and realistic color reproduction.

The vastly increased color gamut goes hand in hand with higher peak brightness, and regardless of the resolution or aspect ratio your monitor supports, modern HDR-ready screens will feature far superior contrast and black levels compared to their older counterparts.

All of this contributes to a far more immersive gaming or video-watching experience, with more lifelike visuals, brighter whites, darker blacks, and far more discernible details in shadows and highlights.

HDR Drawbacks

Compared to its advantages, the downsides of using HDR tech may seem almost insignificant, but it’s worth mentioning them anyway. For starters, while HDR compatibility is getting better by the day, not all digital content fully supports it yet, and video games in particular still lag behind.

This, combined with the rather high price of true HDR displays like high-end OLED and QLED TVs (and, more recently, exorbitantly priced computer monitors), still makes HDR something of a luxury.

That’s especially true when you consider how much extra power is needed to reach HDR’s high peak brightness levels, which makes some of today’s large screens decidedly unfriendly to your energy bill.

Final Thoughts

Although it’s been around for a long time, HDR is still not quite as widespread as its creators might have hoped. Even so, HDR support has been growing steadily over the past decade.

As the technology matures and more filmmakers, artists, developers, movie enthusiasts, and gamers embrace it, it’s only a matter of time before standard dynamic range is phased out.

FAQ

Is HDR the same as 4K?

No, not at all. Although they’re often confused because both are prominent terms in display marketing, they mean entirely different things. HDR, or high dynamic range, refers to a wider range of brightness and color than the standard palette offers, whereas 4K refers to the picture resolution, i.e., the number of pixels in the displayed image.

Is 4K HDR better than HD?

Absolutely. 4K resolution offers 3840 x 2160 pixels, while HD, or 720p, is only 1280 x 720 pixels. People often confuse HD with Full HD (FHD), which is 1920 x 1080 pixels, but both of these resolutions are far less sharp-looking than 4K.
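If you want to see that gap in raw numbers, a quick bit of arithmetic does the trick; the snippet below simply multiplies out the dimensions.

```python
# Pixel counts for common resolutions, compared against HD (720p).
resolutions = {
    "HD (720p)": (1280, 720),
    "Full HD (1080p)": (1920, 1080),
    "4K UHD": (3840, 2160),
}

hd_pixels = 1280 * 720
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / hd_pixels:.2f}x HD)")
```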

What is HDR, and do I need it?

HDR stands for high dynamic range. In image and video technology, it refers to a wider range of brightness and color than traditional SDR (standard dynamic range) can deliver. HDR screens also boast higher peak brightness, allowing for more realistic-looking highlights.

Strictly speaking, you do not need HDR, but it’s a clear visual upgrade on most displays, with the exception of budget DisplayHDR 400-certified models, which hardly qualify as true HDR screens anyway.



About the Author

Ivan

A true tech and gaming savant, Ivan has been fascinated by the digital world since the early days of gaming on antiques such as the ZX Spectrum and Commodore’s beloved Amiga. Whether you’re interested in the latest PC and console gaming news, antivirus software, or smartphone reviews, or simply want to learn about the newest geeky gadgets around, we at KT have you covered, and Ivan’s likely the one we’ll ask.
