High dynamic range (HDR) video is one of the biggest 4K TV feature bullet points. It can push video content past the (now outdated) limitations to which broadcast and other media standards have adhered for decades. It's impressive to see on TVs that can handle it, but it can be a confusing technical feature thanks to several iterations, each with differences that aren't well defined. We're here to explain them to you.
Dynamic Range on TVs
TV contrast is the difference between how dark and how bright the picture can get. Dynamic range describes the extremes in that difference, and how much detail can be shown in between. Essentially, dynamic range is display contrast, and HDR represents a broadening of that contrast. However, simply expanding the range between bright and dark isn't enough to improve a picture's detail. Whether a panel can reach 200 nits (relatively dim) or 2,000 nits (incredibly bright), and whether its black level is 0.1 cd/m² (washed-out, nearly gray) or 0 (completely dark), it can ultimately only show as much information as the signal it receives contains. (A nit, incidentally, is the same unit as a candela per square meter.)
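The relationship between those numbers is simple arithmetic: contrast ratio is peak luminance divided by black level. The sketch below uses the illustrative nit values from the paragraph above; they aren't measurements of any real TV.

```python
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Return peak luminance divided by black level (both in nits, i.e. cd/m^2)."""
    if black_nits == 0:
        # A display that can shut pixels off entirely (e.g. OLED) has,
        # on paper, an infinite contrast ratio.
        return float("inf")
    return peak_nits / black_nits

print(contrast_ratio(200, 0.1))    # dim panel with gray blacks: ~2,000:1
print(contrast_ratio(2000, 0.1))   # bright panel, same blacks: ~20,000:1
print(contrast_ratio(1000, 0))     # true black: infinite on paper
```

Note that a huge ratio alone doesn't help if the incoming signal only describes a narrow range, which is exactly the gap HDR fills.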
Many popular video formats, including broadcast television and Blu-ray discs, are limited by standards built around the physical boundaries presented by older technologies. Black is set to only so black, because as Christopher Guest eloquently wrote, it could get “none more black.” Similarly, white could only get so bright within the limitations of display technology. Now, with organic LED (OLED) and local dimming LED backlighting systems on newer LCD panels, that range is increasing. Both blacks and whites can reach further extremes, but video formats can’t take advantage of it. Only so much information is presented in the signal, and a TV capable of reaching beyond those limits can only work with the information present.
What Is HDR?
That’s where HDR video comes in. It removes the limitations presented by older video signals and provides information about brightness and color across a much wider range. HDR-capable displays can read that information and show an image built from a wider gamut of color and brightness. HDR video simply contains more data to describe more steps in between the extremes. This means that bright objects and dark objects on the same screen can be shown to high degrees of brightness and darkness if the display supports it, with all the necessary steps in between described in the signal and not synthesized by the image processor.
To put it more simply, HDR content on HDR-compatible TVs can get brighter, darker, and show more shades of gray in between than non-HDR content. Of course, this varies between TVs. Some can get bright and dark enough to do an HDR signal justice, while others can't. Similarly, HDR-compatible TVs can produce deeper and more vivid reds, greens, and blues, and show more color variation in between. Deep shadows aren't simply black voids; more details can be seen in the darkness, while the picture stays dark. Bright shots aren't simply sunny, vivid pictures; fine details in the brightest surfaces remain clear. Vivid objects aren't simply saturated; more degrees of color can be seen.
This requires much more data and not all media can handle it. For example, standard Blu-ray discs cannot support HDR. Fortunately, Ultra HD Blu-ray (which is distinct from Blu-ray, despite the name) can hold more data, and is built to contain 4K video, HDR video, and even object-based surround sound like Dolby Atmos. Ultra HD Blu-rays, however, require Ultra HD Blu-ray players or relatively new game consoles to play them.
Some online streaming services also offer HDR content, but you need a reliably fast connection to get it. Fortunately, if your bandwidth is high enough for 4K video, it can get HDR, too; Amazon Prime Video and Netflix recommend connection speeds of 15Mbps and 25Mbps, respectively, for 4K content regardless of whether that content is HDR or not.
What Is Color Gamut?
This is where HDR gets a bit more confusing. Wide color gamut is another feature high-end TVs have, and it’s even less defined than HDR. It’s also connected to HDR, but not directly. HDR deals with how much light a TV is told to put out, or luminance. The range and value of color, defined separately from light, is called chromaticity. They’re two separate values that interact with each other in several ways, but are still distinct.
Technically, HDR specifically only addresses luminance, because that’s what dynamic range is: the difference between light and dark on a screen. Color is a completely separate value based on absolute red, green, and blue levels regardless of the format of the video. However, they’re tied together by how we perceive light, and a greater range of light means we’ll perceive a greater range of color. HDR-capable TVs can often show what’s called “wide color gamut,” or a range of color outside of the standard color values used in broadcast TV (called Rec.709).
(Credit: Portrait Displays)
This doesn’t mean HDR guarantees a wider range of colors, or that they’ll be consistent. That’s why we test every TV for both contrast and color. Nearly all TVs today can hit Rec.709 values, but that still leaves a large gap between the colors those TVs produce and what the eye can actually see. DCI-P3 is a standard color space for digital cinema, and it’s much wider. Most modern 4K HDR TVs should be able to reach the DCI-P3 values, though many don’t quite make it. Rec.2020 is the ultimate, ideal color space for 4K TVs, and it’s wider still—but we’ve yet to see any consumer display reach those levels. And here’s the kicker: Rec.2020 applies to both SDR (standard dynamic range) and HDR, because HDR doesn’t directly address color levels.
The above chart shows the range of color the human eye can detect as an arc and the three color spaces we mentioned as triangles. As you can see, each expands significantly on the previous one.
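You can roughly quantify how much each space expands on the last. Each standard publishes CIE 1931 (x, y) chromaticity coordinates for its red, green, and blue primaries, and the gamut is the triangle they span; the shoelace formula gives its area. Keep in mind the xy diagram isn't perceptually uniform, so these ratios are only a rough comparison, not a measure of how many colors you'd actually perceive.

```python
# Chromaticity coordinates (CIE 1931 x, y) of each standard's red, green,
# and blue primaries, as published in the respective specifications.
PRIMARIES = {
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def gamut_area(primaries):
    """Area of the triangle the primaries span on the xy diagram (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

base = gamut_area(PRIMARIES["Rec.709"])
for name, prims in PRIMARIES.items():
    area = gamut_area(prims)
    print(f"{name}: {area:.4f} ({area / base:.2f}x Rec.709)")
```

Run it and you'll see DCI-P3 covers noticeably more of the diagram than Rec.709, and Rec.2020 considerably more again, matching the widening triangles in the chart.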
All of this might seem confusing, but it boils down to this: HDR doesn’t guarantee that you’ll get more color. Many HDR TVs have wide color gamuts, but not all of them. Our TV reviews tell you whether a TV is HDR-capable and what its full range of color looks like.
Types of HDR
HDR isn’t quite universal and is currently split into two major formats, with a few others in the background.
Dolby Vision
Dolby Vision is Dolby’s own HDR format. Dolby requires certification for media and screens to call themselves Dolby Vision compatible, but its metadata isn’t as fixed and absolute as HDR10’s. Dolby Vision content uses dynamic metadata. Static metadata maintains specific levels of brightness across whatever content you watch; dynamic metadata adjusts those levels for each scene or even each frame, preserving more detail in scenes that are very bright or very dark. By tweaking the maximum and minimum levels of light a TV is told to generate on the fly, the data that would otherwise be spread across the full range of light an entire movie or show uses can instead be assigned to a much narrower, targeted span. Darker scenes can preserve more detail in shadows and brighter scenes can keep more detail in highlights, because the signal isn’t telling the TV to stay ready for opposite extremes that won’t appear until the next scene.
Dolby Vision also uses metadata that’s adjusted to the capabilities of your specific display, instead of dealing with absolute values based on how the video was mastered. This means that Dolby Vision video will tell your TV what light and color levels to use based on values agreed on by the TV manufacturer and Dolby that account for the capabilities of your specific TV. It can potentially let TVs show more detail than HDR10, but that ultimately depends on how the content was mastered and what your TV can handle in terms of light and color. That mastering aspect is important, because Dolby Vision is a licensed standard and not an open one like HDR10. If Dolby Vision is available in the finished video, that probably means Dolby workflows were used all the way through.
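The advantage of per-scene metadata can be seen with a toy calculation. This is a conceptual sketch only, not Dolby's actual tone-mapping algorithm, and every number in it is hypothetical: a naive linear map squeezes the mastered range onto what the display can show, so the narrower the mastered range the metadata declares, the more of the display's range each scene gets.

```python
# Conceptual sketch -- NOT Dolby's real algorithm. All values are hypothetical.
DISPLAY_PEAK = 500.0  # nits this imaginary TV can reach

def tone_map(pixel_nits: float, mastered_max: float) -> float:
    """Naive linear map from the mastered luminance range onto the display range."""
    return min(pixel_nits, mastered_max) / mastered_max * DISPLAY_PEAK

title_max = 4000.0      # brightest pixel anywhere in the movie (static metadata)
dark_scene_max = 100.0  # brightest pixel in one dim scene (dynamic metadata)

pixel = 50.0  # a mid-shadow detail inside that dim scene

static = tone_map(pixel, title_max)        # scaled against the whole movie's peak
dynamic = tone_map(pixel, dark_scene_max)  # scaled against this scene's own peak

print(static)   # crushed into the bottom sliver of the display's range
print(dynamic)  # spread across half the display's range instead
```

With static metadata, the dim scene's shadow detail is mapped against a 4,000-nit ceiling it never approaches; with dynamic metadata, the same detail gets far more of the display's usable range.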
Dolby Vision is the most widely supported HDR format after HDR10, with content on Amazon Prime Video, Apple TV+, Disney+, HBO Max, and Netflix.
HDR10
HDR10 is the standard pushed by the UHD Alliance. It’s a technical standard with defined ranges and specifications that must be met for content and displays to qualify as using it. HDR10 uses static metadata that is consistent across all displays. This means HDR10 video sets light and color levels in absolute values, regardless of the screen it’s being shown on. It’s an open standard, so any content producer or distributor can use it freely. Every service with HDR content supports HDR10, usually along with Dolby Vision or another HDR format.
(Credit: Will Greenwald)
HDR10+
HDR10+ is a standard developed by Samsung. It builds on HDR10 by adding dynamic metadata, like Dolby Vision. It doesn’t use individualized metadata for each screen, but it still adjusts the range of light it tells the TV to display for each scene or frame. It can potentially add more detail to your picture over what HDR10 shows, and like HDR10 it’s an open standard that doesn’t require licensing or a specific production workflow.
Currently, HDR10+ is in use by Hulu, Paramount+, and YouTube.
Hybrid Log-Gamma (HLG)
Hybrid Log-Gamma (HLG) isn’t as common as HDR10 or Dolby Vision, and there’s very little content for it yet outside of some BBC and DirecTV broadcasts, but it could make HDR much more widely available. It was developed by the BBC and Japan’s NHK to give broadcasters a video format that can carry HDR signals (and SDR ones; HLG is backward-compatible). It’s technically more universal because it doesn’t use metadata at all. Instead, it combines the gamma curve that TVs use to calculate brightness for SDR content with a logarithmic curve that handles the much higher levels of brightness HDR-capable TVs can produce (hence the name Hybrid Log-Gamma).
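That hybrid curve is published in ITU-R BT.2100. A minimal sketch of the HLG transfer function (the OETF, mapping normalized scene light to a signal value) shows the two halves explicitly: a square-root (gamma-like) branch for the darker range and a logarithmic branch above it.

```python
import math

# HLG OETF constants as defined in ITU-R BT.2100.
A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene-linear light e in [0, 1] to an HLG signal in [0, 1].

    The lower range uses a square-root (gamma-like) curve, the upper range
    a logarithmic curve -- hence "hybrid log-gamma".
    """
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return A * math.log(12.0 * e - B) + C

# The branches meet smoothly at e = 1/12 (signal value 0.5),
# and full scene light maps to a full-scale signal.
print(hlg_oetf(1.0 / 12.0))  # ~0.5
print(hlg_oetf(1.0))         # ~1.0
```

An SDR TV simply reads the lower, gamma-like portion as a normal signal, which is why HLG works without metadata on both kinds of display.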
HLG can work with SDR and HDR TVs despite the lack of metadata while still holding a much wider range of light data. The only issue? Adoption. It was developed for broadcasters, and we still don’t see many broadcasters pushing 4K video over the airwaves, cable, or satellite services. HLG still has a long way to go in terms of content. Currently, it’s mostly being pushed in the UK, with some HLG nature and sports shows.
Each type of HDR offers significant improvements over standard dynamic range, but each has advantages and disadvantages. In terms of adoption, HDR10 and Dolby Vision are the only significant standards that have both plenty of content and widespread TV compatibility. Dolby Vision potentially offers a better picture, but it’s less common than HDR10 because it’s a licensed, workflow-based standard and not an open one. HDR10+ is open, but we’ll need to see more companies actually start to use it before more content becomes available. HLG has the technical potential to become the most universal standard due to its metadata-less nature, but so far it’s seen very little traction.
(Credit: Sony)
What You Need for HDR
Basically, you need HDR content, a medium through which to play HDR content, and a TV that supports HDR signals.
HDR is not simply 4K. A 4K TV might support HDR, but that doesn’t apply to all sets. If your TV doesn’t support HDR, it won’t take advantage of the additional information in the signal. Even if the TV can handle an HDR signal, it might not produce a better picture, particularly if it’s a less-expensive TV. The majority of 4K TVs available today support HDR, though cheaper models might simply not have the contrast or color range to really show it.
Most major streaming services support HDR for some 4K content. There are also UHD Blu-ray discs, which often feature HDR10 or occasionally Dolby Vision HDR.
If your TV supports HDR, it probably has access to at least some streaming services that support HDR. However, it might not have all of them, so you might want to get a separate media streamer. The Apple TV 4K, Amazon Fire TV Cube, Fire TV Stick 4K, Chromecast with Google TV, and Roku Streaming Stick 4K all support HDR10, HDR10+, Dolby Vision, and HLG.
The PlayStation 5 and Xbox Series X both support HDR10 and Dolby Vision for streaming apps as well as UHD Blu-ray playback. Of course, the all-digital versions of the consoles, lacking optical drives, can’t play UHD Blu-ray discs, but they can still stream 4K HDR content.
Is HDR Worth It?
4K is now the effective standard for TVs, and HDR is one of the most important features to consider when buying a new one. It still isn’t quite universal, but both HDR10 and Dolby Vision have proven to offer compelling improvements in contrast and color over standard dynamic range, and there’s plenty of content to watch with both. If you’re looking to make the jump to 4K and you have the budget for it, HDR is a must-have feature.
Now that you know all about HDR, make sure to read up on 8K to see why it isn’t as important as HDR, at least not yet.