
HLG and HDR10: What Are The Differences And Which Is Better?


We’ve talked a lot about HDR, or High Dynamic Range, around here because it is important. It is probably the biggest advancement we’ve seen in display technology since the advent of HD. Certainly more important than putting more pixels on the screen. But there are a lot of different types of HDR out there. We’ve covered Dolby Vision fairly extensively, but we’ve only mentioned HLG in passing. So what is HLG, and how does it differ from other HDR formats like HDR10? Let’s explore.

HDR10

HDR10 is considered to be the foundation on which most of the other HDR formats are built. It is an open format, which means it is free to use. Unlike Dolby Vision, there are no licensing fees. Therefore, many of the other formats (including Dolby Vision) are built on top of it.

The key to HDR10 is that content is supposed to be mastered to a specific nit level. Nits are a measure of brightness (one nit is one candela per square meter). HDR10 encodes brightness with the PQ transfer function, which can describe up to 10,000 nits, but most content is mastered to somewhere between 1000 and 4000 nits, with 1000 being the most common (as many displays can hit 1000 nits).
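To put numbers on that, here is a minimal Python sketch of the PQ EOTF (the curve from SMPTE ST 2084 that HDR10 is built on), which maps a normalized 0-1 signal value to absolute luminance in nits. The constants come straight from the standard; everything else is just the published formula:

```python
# PQ (SMPTE ST 2084) constants, as defined in the standard
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Map a PQ-encoded signal value (0.0-1.0) to display luminance in nits."""
    e = signal ** (1 / M2)
    return 10_000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

# The curve spends most of its code values on darker tones:
for s in (0.25, 0.5, 0.75, 1.0):
    print(f"signal {s:.2f} -> {pq_eotf(s):8.1f} nits")
# A 50% signal lands around 92 nits; only the very top of the
# range reaches into the thousands of nits.
```

Notice how non-linear it is: three quarters of the signal range only gets you to roughly 1000 nits. That is by design, since our eyes are far more sensitive to differences in shadows than in bright highlights.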

The Problem With HDR10

The issue with HDR10 is also one of its greatest strengths: it’s free, so anyone can use it. The downside is that there isn’t as much training and quality control as there should be. That’s why Dolby Vision is considered superior. Not just because it has metadata with frame-by-frame brightness information, but because there are controls in place that ensure quality.

HDR10 files have been found to have incorrect mastering data (the file says it was mastered to 1000 nits when it was really mastered to 3000) or to be missing mastering data entirely. If you don’t have a display that can do dynamic tone mapping, you may end up with an image that not only isn’t really HDR, but looks worse than an SDR image!
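To see why that matters, here is a deliberately simplified, hypothetical static tone map in Python. This is not how any real TV processes HDR, but it shows the basic failure mode: a display that trusts the file’s reported mastering peak ends up hard-clipping highlights when that metadata is wrong:

```python
def static_tone_map(nits: float, reported_peak: float,
                    display_peak: float = 1000.0) -> float:
    """Naive static tone map that trusts the file's reported mastering peak."""
    if reported_peak <= display_peak:
        # Metadata claims nothing exceeds the display's range, so values
        # pass straight through -- anything brighter simply clips.
        return min(nits, display_peak)
    # Otherwise, linearly compress the reported range to fit the display.
    return nits * display_peak / reported_peak

# Content actually mastered to 3000 nits, but the file claims 1000:
for nits in (2000.0, 3000.0):
    bad  = static_tone_map(nits, reported_peak=1000.0)  # wrong metadata
    good = static_tone_map(nits, reported_peak=3000.0)  # correct metadata
    print(f"{nits:.0f} nits -> wrong: {bad:.0f}, correct: {good:.0f}")
# With wrong metadata both highlights clip to 1000 (detail crushed);
# with correct metadata they stay distinct (667 vs 1000).
```

Real displays use far more sophisticated curves than a linear squeeze, but the principle holds: static tone mapping is only as good as the metadata it is handed.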

HLG

Hybrid Log-Gamma, or HLG, is quite different from HDR10. HLG (sometimes listed as HLG10) shares some similarities with HDR10 in that it is free and it is an HDR solution. The key difference is implementation. You’ve likely seen gamma as a setting on a display before. Setting the correct gamma curve can make a big difference in the contrast of your display.

HLG is basically an extension of your normal gamma curve. The “bottom” of the curve can be interpreted by normal (non-HDR) displays, while the top of the curve can be interpreted by HLG-capable displays. This is why HLG is preferred over HDR10 by broadcasters: HDR10 is not compatible with non-HDR displays, while HLG is. If anything, you could think of HLG as a single signal that covers both SDR and HDR, without needing the metadata that the other HDR formats rely on.
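Here is a minimal Python sketch of the HLG curve (the OETF from ITU-R BT.2100) to make that concrete. The bottom half of the signal range is an ordinary square-root, gamma-style segment that SDR displays already understand, while the top half switches to a logarithmic segment that carries the HDR highlights:

```python
import math

# HLG constants from ITU-R BT.2100
A = 0.17883277
B = 1 - 4 * A                  # 0.28466892
C = 0.5 - A * math.log(4 * A)  # 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene light (0.0-1.0) to an HLG signal value."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)           # gamma-style bottom, SDR-friendly
    return A * math.log(12 * e - B) + C   # logarithmic top for HDR highlights

# The two segments meet exactly at signal 0.5:
print(hlg_oetf(1 / 12))  # 0.5 -- crossover point
print(hlg_oetf(1.0))     # ~1.0 -- peak
```

Everything below the 0.5 crossover looks like a conventional gamma curve to an SDR display, which is exactly why the same HLG feed works on both old and new sets.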

The Problem With HLG

There are a couple of issues with HLG. First, like HDR10, it is limited to 10-bit color depth, while a format like Dolby Vision can carry 12-bit. It has also lagged behind HDR10 in being integrated into displays. While it usually only takes a firmware update to enable HLG, HDR10 (and the other HDR formats) have already seen wide adoption. So far, HLG has really only seen wide use in broadcasting, where its backward compatibility and its ability to serve SDR and HDR displays from a single stream (with no metadata to carry) make it especially attractive.

Conclusion

In the end, HLG and HDR10 both have their advantages and disadvantages. All things being equal, HLG should offer similar quality with much wider compatibility than HDR10. But because HDR10 has already seen such wide adoption, it is unlikely to go away any time soon.

Have you used HLG? Tell us your experiences in the comments or on our Facebook page!

