[Photo: several 4K HDR TVs displaying HDR images. Credit: Brian Finke]

If you’re buying a television for the first time in a few years, get ready to learn some new lingo.

You’re going to be hearing a lot about “smart” TVs capable of streaming Netflix shows on their own, and about 4K (aka Ultra High Definition, or UHD) sets with four times as many pixels as regular HDTVs.

But according to our testers, high dynamic range, or HDR, is the TV feature with the biggest upside these days.

When done right, TVs with HDR have the higher peak brightness and wider color capability needed to present HDR content (already available in many movies and TV shows) more accurately.

But not all TVs with HDR perform equally well. Here’s what you need to know to make a good choice.

What Is HDR, Exactly?

In music, high dynamic range refers to the difference between the softest and loudest parts of the composition. In video, it’s about increasing the contrast between the brightest whites and the darkest blacks a TV can produce.

“When done well, HDR presents more natural illumination of image content,” says Claudio Ciacci, who heads the Consumer Reports TV testing program. “Though HDR demands a higher peak brightness from the TV, it doesn’t mean it has to present a blindingly bright image to the viewer. It simply means the TV has the brightness headroom needed to present the various elements in an image (a shadowy cave, sunlit facial highlights, a brightly lit lightbulb) at the brightness level required.”

When HDR is at work, you’ll notice the texture of the brick on a shady walkway or nuances in the white clouds in a daytime sky.

You’ll also see brighter, more realistic “specular highlights,” such as, say, the sun’s reflection off a car’s chrome bumper or an airplane wing. With HDR, those flashes of light pop; without it, they don’t stand out nearly as much.

HDR TVs typically produce more vibrant, varied colors, too. That’s because HDR is often paired with new “wide color gamut” technology, aka WCG.

Think of it as giving your TV a larger box of crayons to play with. Standard HDTVs can display about 17 million colors. Those with WCG can display up to a billion.
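
Those figures track the panel’s bit depth. Here’s a back-of-the-envelope sketch of the arithmetic, assuming 8 bits per color channel for a standard HDTV and 10 bits per channel for a WCG set (the typical pairing):

```python
# Back-of-the-envelope color counts. Assumes 8 bits per channel
# (red, green, blue) for a standard HDTV and 10 bits per channel
# for a wide-color-gamut (WCG) set.

standard_colors = (2 ** 8) ** 3    # 256 shades each of R, G, B
wcg_colors = (2 ** 10) ** 3        # 1,024 shades each of R, G, B

print(f"8-bit panel:  {standard_colors:,} colors")  # 16,777,216 (~17 million)
print(f"10-bit panel: {wcg_colors:,} colors")       # 1,073,741,824 (~1 billion)
```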

But you don’t get to enjoy all that fantastic contrast and color every time you turn on the TV. You have to be playing a movie or TV show mastered to take advantage of HDR and WCG. When HDR TVs first arrived on the market, content with those features was a bit hard to come by, but nearly every streaming service now offers it.

[Photo: a side-by-side comparison of a standard dynamic range (SDR) image and an HDR image.]

Types of HDR

While it may sound like one technology, there are a few types of HDR, each with a different set of technical specs.

HDR10 has been adopted as an open standard. Free to use, it’s supported by all 4K TVs with HDR, all 4K Ultra HD Blu-ray players, and all HDR programming.

A number of TVs, including models from LG and Vizio as well as Roku TVs from several brands, now also offer Dolby Vision, which is promoted as an enhanced version of HDR10. Companies pay a licensing fee to use it. Among its advantages, Dolby Vision supports “dynamic” metadata, which allows the TV to adjust brightness on a scene-by-scene or frame-by-frame basis. By contrast, HDR10 uses “static” metadata, setting brightness levels once for the entire movie or show.

But there’s a newer format called HDR10+ that uses dynamic metadata as well. It’s found primarily in Samsung 4K TVs sold since 2017. While it remains to be seen how widely it will be adopted here in the U.S., it’s supported by Amazon’s 4K streaming video service and some 4K Ultra HD Blu-ray discs, mainly from 20th Century Fox.

The dynamic metadata in both Dolby Vision and HDR10+ can help a midlevel TV that doesn’t have the brightness levels of a top-tier model adapt the content to the set’s limitations. Using a process called “tone mapping,” the metadata can guide the TV to make scene-by-scene or frame-by-frame adjustments according to brightness, color, and contrast variations in the content.
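
To make the tone-mapping idea concrete, here’s a minimal sketch, purely illustrative; real tone-mapping curves are proprietary and far more sophisticated. The scene_peak_nits value stands in for what dynamic metadata would supply, and the curve is a textbook “extended Reinhard” rolloff, not any vendor’s actual algorithm:

```python
# Minimal, illustrative tone-mapping sketch. Not any vendor's real
# algorithm: scene_peak_nits stands in for what dynamic metadata would
# supply, and the curve is a textbook "extended Reinhard" rolloff.

def tone_map(luminance_nits: float, scene_peak_nits: float,
             display_peak_nits: float) -> float:
    """Map one scene luminance value (in nits) into the display's range."""
    # Work in units of the display's peak brightness.
    l = luminance_nits / display_peak_nits
    lw = scene_peak_nits / display_peak_nits   # scene peak in the same units
    # Extended Reinhard: maps the scene peak exactly onto the display peak,
    # leaves shadows nearly untouched, and rolls off the highlights between.
    mapped = l * (1.0 + l / (lw * lw)) / (1.0 + l)
    return min(mapped, 1.0) * display_peak_nits

# A scene mastered with a 4,000-nit specular highlight, shown on a 600-nit TV:
print(tone_map(4000, scene_peak_nits=4000, display_peak_nits=600))  # 600.0
print(tone_map(100, scene_peak_nits=4000, display_peak_nits=600))   # ~86.0
```

The point to notice: the scene’s 4,000-nit peak lands exactly at the display’s 600-nit peak, while shadow detail is barely touched. That per-scene adaptation is what dynamic metadata makes possible.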

Some TVs now include support for one more HDR format, called HLG, short for hybrid log-gamma. If it’s adopted for the next generation of free over-the-air TV signals, which will follow a standard called ATSC 3.0, you’re likely to hear more about it. (It will be important for those who get TV through antennas, which are making a comeback.)

Many newer TVs have built-in support for HLG, and others can receive it via firmware updates if necessary.

Yes, that all sounds complicated.

But there’s some good news. First, your TV will automatically detect the type of HDR being used in a given movie or show and choose the right way to play it. No fiddling required. 

Second, the type of HDR doesn’t seem to be too important right now. Based on what we’ve seen in our labs, a top-performing TV can do a great job with either HDR10 or Dolby Vision.

Our advice: Instead of fretting over the type of HDR, simply buy the best TV you can.

Are All HDR TVs Created Equal?

In a word, no. Our tests show that not every TV with HDR written on the box produces equally rich, lifelike images. 

First of all, TVs are all over the map when it comes to picture quality, HDR or no HDR. But there are also challenges specific to this technology.

Most notably, a TV must be bright enough to really deliver on HDR. To understand why, you need to know your “nits,” the units used to measure brightness. (One nit equals one candela per square meter.)

Better-performing HDR TVs typically generate at least 600 nits of peak brightness, with top performers hitting 1,000 nits or more. But many HDR TVs produce only 100 to 300 nits, which is really not enough to deliver an HDR experience.

With an underpowered TV, the fire of a rocket launch becomes a single massive white flare. With a brighter television, you’d see more intense, lifelike flames, as if you were really there.

“The benefits of HDR are often lost with mediocre displays,” Ciacci says.

How to Tell a Great HDR TV From a Bad One

Unfortunately, you can’t just read the packaging—or even rely on how the picture looks in the store.

Though some TVs carry an “Ultra HD Premium” logo, indicating that they’ve been certified as high-performance sets by an industry group called the UHD Alliance, not all manufacturers participate in the program. LG and Samsung do; Sony and Vizio don’t.

You can’t rely on a TV’s claim of peak brightness, either. Most of those measurements are recorded using a standard industry test pattern, called a 10 percent window, that evaluates the brightness of a small box against a completely black background. But companies can use other methods to produce peak brightness numbers.

What to do instead? Check our TV ratings and buying guide. We now have separate scores for UHD picture quality and HDR performance. Plus, we test brightness in a way that’s different from how most other publications test it.

Remember that 10 percent window pattern? We don’t think it’s a realistic way to determine a TV’s brightness during a regular TV show or movie. That’s why Consumer Reports developed its own brightness test patterns, placing that white 10 percent window against a background of moving video. That gives us a much better idea of the set’s real brightness.
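
For the curious, here’s roughly what that standard static pattern amounts to, sketched with NumPy. The dimensions are assumptions (a 3840x2160 frame with a centered white window sized to 10 percent of the screen area), and Consumer Reports’ moving-video background is not reproduced here:

```python
import numpy as np

# Sketch of the standard static "10 percent window" test pattern: a white
# rectangle covering 10% of the screen area, centered on a black field.
# (Consumer Reports' variant swaps the black background for moving video.)

WIDTH, HEIGHT = 3840, 2160                  # one 4K UHD frame, grayscale
frame = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)

scale = 0.10 ** 0.5                         # shrink each dimension by sqrt(0.10)
win_w, win_h = int(WIDTH * scale), int(HEIGHT * scale)
top, left = (HEIGHT - win_h) // 2, (WIDTH - win_w) // 2
frame[top:top + win_h, left:left + win_w] = 255   # full-white window

print(f"Window covers {win_w * win_h / (WIDTH * HEIGHT):.1%} of the frame")  # 10.0%
```

Measuring the white window while the rest of the screen shows real content, rather than solid black, gives a number closer to the brightness you’d actually see during a movie or show.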

If you look through our ratings, you’ll see that the TVs with the best HDR often tend to be the priciest. But there are some good choices for people who want to spend less. We think HDR performance will continue to be the big differentiator among 4K TVs in 2019, but don’t be surprised if more lower-cost sets start to deliver a satisfying HDR experience, too.