TV manufacturers always look for the next leap in picture quality that will make watching TV feel like you’re looking through a crystal-clear window. HDR is the latest trend in display technology and it’s here to stay. Here’s everything you need to know about how it works, and why you may want to consider it when you buy your next TV.
HDR stands for “high dynamic range,” and while it shares a name with the photo mode on your smartphone’s camera, the two technologies aren’t actually related at all. HDR TVs are designed to create a more realistic picture by representing colors in a more lifelike way. With greater contrast between dark and bright colors, and more colors overall, HDR TVs can better show you what the camera actually saw when it was recording, and ideally, what you would see if you were watching live.
For example, in an HDR-enabled nature video, trees look greener, the sky looks bluer, and clouds look more defined because there are more colors to work with and more color contrast. HDR expands the total color range of the TV’s display and increases the number of steps between each color within that range. If you were to go from, say, black to white, a standard 8-bit 1080p or 4K TV has 256 shades of gray in between the two. A 10-bit HDR TV has 1,024.

According to Nandhu Nandhakumar, Senior Vice President of LG Electronics, a bright star twinkling in a night sky is a good example of HDR doing its best work: the star would be a crisp white while the area around it would look dark and black, not washed out. The star would also be physically brighter, because HDR screens take the top end of the color range, or highlights, and increase the actual brightness of those pixels.

That brightness, or luminance, of each color is measured in “nits,” or candela per square meter. Depending on your HDR TV, colors can range anywhere from 0.0001 nits all the way up to 10,000 nits. To put that in perspective, the current HDTV and Blu-ray standards start at a minimum of 0.117 nits and cap maximum brightness at 100 nits, in addition to having a much more limited color range.
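Those 256 and 1,024 figures come straight out of the panel’s bit depth: each color channel can show 2^bits distinct levels. Here’s a rough back-of-the-envelope sketch (the bit depths are the commonly cited ones for each class of panel):

```python
# Rough illustration: a panel's bit depth sets how many distinct
# levels each color channel can show between black and white.
for bits, label in [(8, "standard 8-bit panel"),
                    (10, "10-bit HDR10 panel"),
                    (12, "12-bit Dolby Vision master")]:
    shades = 2 ** bits  # 2^bits levels per channel
    print(f"{label}: {shades:,} shades of gray")

# standard 8-bit panel: 256 shades of gray
# 10-bit HDR10 panel: 1,024 shades of gray
# 12-bit Dolby Vision master: 4,096 shades of gray
```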
In short, HDR really makes bright colors pop and keeps dark colors dark. So if you see the sun on your TV, it will be very bright, making it feel like you’re actually looking toward the sky. Or if you see a match lit in a dark room, the match will actually feel like the only source of light. In essence, HDR improves the quality of each pixel rather than increasing their number or bumping up the refresh rate, which is how TVs have boosted picture quality for the past few years.
There are two primary standards when it comes to HDR TVs and HDR content: HDR10 and Dolby Vision. HDR10 is the open standard backed by both manufacturers and content creators across the TV industry, but you’ll never see it actually listed as “HDR10” when you’re shopping for TVs. Each manufacturer calls it something different, but if you see that a TV supports HDR in some way, you can assume they mean HDR10. Because it’s an open standard, content creators can use it without paying licensing fees.
Unlike HDR10, Dolby Vision is a proprietary HDR standard that does more, at a price. Dolby Vision supports a wider range of color luminance (up to 10,000 nits, where HDR10 maxes out at 1,000), and Dolby Vision content is mastered with 12-bit color depth (HDR10 is only 10-bit). That means Dolby Vision has a possible 68 billion colors at its disposal, whereas HDR10 has a little over one billion. Current non-HDR TVs top out at about 16 million colors.
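The color counts follow the same bit-depth arithmetic as the gray shades above, except a full color mixes three channels (red, green, and blue), so the per-channel levels get cubed. A minimal sketch:

```python
def total_colors(bits_per_channel: int) -> int:
    """Levels per channel (2^bits), cubed across red, green, and blue."""
    return (2 ** bits_per_channel) ** 3

print(f"{total_colors(8):,}")   # 16,777,216 -> the ~16 million of non-HDR TVs
print(f"{total_colors(10):,}")  # 1,073,741,824 -> HDR10's "little over one billion"
print(f"{total_colors(12):,}")  # 68,719,476,736 -> Dolby Vision's ~68 billion
```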
The biggest difference, however, is that every frame of Dolby Vision content carries metadata that tells your HDR TV how to display that specific frame. HDR10’s metadata is static: one set of instructions covers the entire movie, so you don’t get the same frame-by-frame precision you would with Dolby Vision, especially in a movie that bounces back and forth between light and dark environments. The tradeoff is that Dolby Vision content has to be played through a compatible player and output to a compatible display. And because manufacturers have to pay for a special Dolby Vision chip, a certification process, and proprietary licensing fees, they pass those extra costs on to you as the consumer.
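To make the static-versus-dynamic distinction concrete, here’s a deliberately simplified sketch. The ToneMapHint structure, its fields, and the sample values are hypothetical stand-ins, not the real SMPTE ST 2086 or Dolby Vision metadata formats:

```python
from dataclasses import dataclass

# Deliberately simplified illustration -- these names and fields are
# hypothetical, not the real SMPTE ST 2086 or Dolby Vision formats.
@dataclass
class ToneMapHint:
    max_luminance_nits: float  # brightest highlight in the mastered content
    min_luminance_nits: float  # deepest black

# HDR10-style static metadata: one hint for the whole movie, so the TV
# commits to a single tone-mapping curve up front.
hdr10_metadata = ToneMapHint(max_luminance_nits=1000.0,
                             min_luminance_nits=0.005)

# Dolby Vision-style dynamic metadata: a hint per frame (or per scene),
# so the TV can remap a dark cave and a sunlit sky differently.
dolby_vision_metadata = [
    ToneMapHint(max_luminance_nits=4000.0, min_luminance_nits=0.0001),  # bright scene
    ToneMapHint(max_luminance_nits=120.0,  min_luminance_nits=0.0001),  # dark scene
    # ...one entry per frame or scene
]
```

The real formats carry more than a min/max pair, but the shape of the problem is the same: static metadata forces one compromise curve for everything, while dynamic metadata lets the TV adapt scene by scene.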
Those costs are real: a 55-inch HDR10-enabled LG TV costs $800, while the same model with both HDR10 and Dolby Vision costs closer to $1,300. All in all, Dolby Vision has the best specs, and most would say it looks better than HDR10, but it isn’t cheap, and adoption has been slower.
It’s also important to know which manufacturers support which standards. You can find TVs that support both, but there are also TVs that only support one or the other. Vizio and LG HDR TVs support both HDR10 and Dolby Vision, though some Vizio models shipped with only Dolby Vision support and are just now getting HDR10 through firmware updates. Not every TV can add HDR10 support with a firmware update, but some can, like Sony’s 2015 line of 4K UHD TVs. Sony, Samsung, Hisense, Sharp, and most of the other players only support HDR10. Fortunately, just about any HDR TV is going to improve your picture quality over standard 1080p or 4K, so there’s no way to get totally screwed out of a nice upgrade. The two standards will likely coexist a lot like DTS and Dolby Digital, the two main audio formats that receivers decode. Keep in mind, though, that Dolby has always had a lot of support from Hollywood studios, so don’t be surprised if some future content is released as Dolby Vision-only.