High-dynamic-range (HDR) video is gaining momentum. Some of your favorite movies are already available with improved color and brightness and look even better than they did in their original theatrical versions.
But some remasters have drawn fire from critics, igniting a debate about technical capability and artistic intent.
What are the benefits of HDR?
Before we consider whether the term “fake HDR” is even justified, it is important to understand what HDR video is. As the name suggests, high-dynamic-range video has an increased dynamic range compared to SDR (standard dynamic range) content.
Dynamic range is the amount of information visible in an image or video between the brightest highlights and the deepest shadows. Modern HDR video is delivered in 10 bits per channel, as opposed to eight bits per channel in SDR. This means that where SDR can display 256 shades of red, HDR can display 1,024.
More color information on screen brings the image closer to what we would see in real life. More shades of each color also make ugly “banding” on gradients less prominent. The difference is most visible in fine details, such as clouds or areas with subtle color variations.
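The arithmetic behind those numbers is simple: each channel with n bits can encode 2^n levels, and with three channels the totals multiply. A minimal Python sketch (the labels are just common associations, not a claim about any specific release):

```python
# Why bit depth matters: each extra bit per channel doubles the number
# of shades that channel can represent.

def shades_per_channel(bits: int) -> int:
    """Number of distinct levels one color channel can encode."""
    return 2 ** bits

def total_colors(bits: int) -> int:
    """Total colors with three channels (red, green, blue)."""
    return shades_per_channel(bits) ** 3

for bits, label in [(8, "SDR"), (10, "HDR10"), (12, "12-bit HDR")]:
    print(f"{label}: {shades_per_channel(bits):,} shades/channel, "
          f"{total_colors(bits):,} total colors")
# SDR: 256 shades/channel, 16,777,216 total colors
# HDR10: 1,024 shades/channel, 1,073,741,824 total colors
# 12-bit HDR: 4,096 shades/channel, 68,719,476,736 total colors
```

Going from 8 to 10 bits quadruples the shades per channel, which is exactly why gradients band less.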
HDR also increases luminance, or peak brightness. The vast majority of HDR-compatible TVs support the baseline HDR10 standard, which stipulates that content should be mastered at 1,000 nits, as opposed to the traditional 100 nits (recently revised to about 200) for standard-dynamic-range content.
This means that bright objects, such as the sun, a flashlight, or a muzzle flash, can actually look bright when displayed on an HDR-compatible screen. The extra brightness makes elements like these look much closer to how they would appear in real life, creating a more immersive viewing experience.
HDR video is something you have to watch to really appreciate, but its improvement over SDR can be huge.
RELATED: HDR Format Wars: What’s the difference between HDR10 and Dolby Vision?
What is “Fake HDR”?
The term “fake HDR” has been thrown around YouTube, Reddit, and other platforms in the wake of some high-profile Blu-ray releases. It refers to studios’ reluctance to grade their HDR releases to a peak brightness high enough to make the image pop.
According to Vincent Teoh, a professional display calibrator and reviewer, the 4K Blu-ray of Star Wars: The Last Jedi hits a maximum peak brightness of 250 nits, with the sun graded at only 200.
Teoh also found that the Blade Runner 2049 4K Blu-ray peaks at just over 200 nits, making it an “SDR movie in an HDR container.”
These HDR editions use 10-bit (in some cases, 12-bit) color depth, so they still deliver better quality than SDR. But because they lack the bright highlights seen in many other releases, some perceive them as “fake HDR.”
For reference, a super-bright LCD, such as the Vizio P-Series Quantum X, can reach a peak brightness of over 2,000 nits. Even LG’s comparatively “dim” OLED panels can handle around 700 nits. Some reviewers and Blu-ray collectors feel shortchanged by the underwhelming brightness of these “fake HDR” releases.
This does not mean that a movie looks bad; the image just does not jump off the screen as it does in other releases. Since these are major releases from some of Hollywood’s largest studios, it’s safe to assume the colorists and directors know exactly what they’re doing. The restraint in using HDR’s brightness headroom is intentional.
However, whether this justifies the term “fake HDR” is still a matter of opinion. Blu-ray packaging does not list maximum luminance, and most buyers would not understand the terminology anyway.
So movie fans have to rely on reviewers like Teoh, who have access to HDR mastering tools, to get the whole story.
HDR standards and creative intent
Two factors have contributed to the situation we covered above: the technical limitations of modern screens and creative intentions.
HDR video has not yet been standardized in any meaningful way. The closest thing to a standard is HDR10, which now has good support from both TV manufacturers and film studios. While HDR10 content is meant to be mastered at 1,000 nits peak brightness, not all TVs can reach those levels.
A screen that cannot hit those high targets tone-maps an image that exceeds its capabilities. Bright elements will still stand out thanks to the contrast between highlights and shadows. But directors are relying on each screen’s ability to tone-map correctly, which adds an element of risk: will every screen get it right?
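To see why tone mapping matters, here is a toy Python sketch, not any real TV’s algorithm: it contrasts naive hard clipping with a textbook Reinhard-style roll-off on a hypothetical 600-nit display showing 1,000-nit content. The nit values are made up for illustration.

```python
# Toy illustration of the tone-mapping trade-off. Values in nits.

def hard_clip(nits: float, display_peak: float) -> float:
    """Naive approach: anything above the display's peak is simply cut off."""
    return min(nits, display_peak)

def rolloff(nits: float, display_peak: float, content_peak: float = 1000.0) -> float:
    """Extended Reinhard curve: compresses highlights smoothly so that
    content_peak maps exactly to display_peak instead of clipping."""
    x = nits / display_peak
    w = content_peak / display_peak
    return display_peak * x * (1.0 + x / (w * w)) / (1.0 + x)

display = 600.0                 # a hypothetical midrange HDR TV
sun, flashlight = 900.0, 600.0  # two highlights graded at different levels

# Hard clipping flattens both highlights to the same 600 nits, so the
# sun no longer looks brighter than the flashlight:
print(hard_clip(sun, display), hard_clip(flashlight, display))

# The roll-off compresses both, but preserves their relative order:
print(rolloff(sun, display) > rolloff(flashlight, display))
```

The roll-off dims everything slightly to keep headroom, which is exactly the judgment call each TV maker gets to make on its own.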
The alternative is to grade your movie so that it does not exceed the capabilities of most screens. An image graded more conservatively, with bright elements capped at 200 or 300 nits, will look less punchy and vibrant, but the result is a fairly consistent image across a wide range of screens.
The Wild West of HDR standards has also created a format war between competing technologies, such as Dolby Vision and HDR10+. These newer HDR standards use dynamic metadata to help TVs adapt scene by scene or frame by frame. Plain old HDR10 has no dynamic metadata, so your TV has to decide how to tone-map on its own.
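A simplified model can show why per-scene metadata helps; this sketch is an assumption-laden illustration, not the actual HDR10+ or Dolby Vision formats. With static metadata the TV compresses every scene based on the whole film’s peak, while dynamic metadata lets a dark scene pass through untouched. The scene names and nit values are invented for the example.

```python
# Simplified model: what fraction of a scene's range fits on the display
# without compression? (Not the real HDR10+/Dolby Vision math.)

def fit_ratio(scene_peak: float, display_peak: float) -> float:
    """1.0 means the scene fits entirely; below 1.0 means compression."""
    return min(1.0, display_peak / scene_peak)

movie_peak = 1000.0   # static metadata: one peak value for the whole film
display = 600.0       # a hypothetical 600-nit TV
scenes = {"dark interior": 150.0, "sunlit desert": 1000.0}

for name, scene_peak in scenes.items():
    static = fit_ratio(movie_peak, display)   # same treatment everywhere
    dynamic = fit_ratio(scene_peak, display)  # adapted to this scene
    print(f"{name}: static={static:.2f}, dynamic={dynamic:.2f}")
# dark interior: static=0.60, dynamic=1.00
# sunlit desert: static=0.60, dynamic=0.60
```

The dark interior peaks well below the display’s limit, so dynamic metadata leaves it alone, while static metadata would needlessly compress it along with the bright desert scene.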
Then there is the question of creative intent. Some directors may decide that they do not like HDR, or would rather not use it to dazzle viewers with bright highlights. For these professionals, the benefits of HDR are color volume and accuracy, not the extra luminance that the latest TVs provide. That said, many directors do use HDR and its peak brightness to the fullest.
Either way, it is difficult to argue against someone’s creative vision. Black-and-white films were still produced long after color became standard. Some directors still shoot on 35 mm film or in a 4:3 aspect ratio.
Are these decisions wrong? Are viewers wrong to wonder what a movie would look like if it had been shot with all the technical bells and whistles available when it was made?
Food for thought, really!
RELATED: HDR formats compared: HDR10, Dolby Vision, HLG, and Technicolor
Movies that are definitely HDR
If a movie is released on Blu-ray in HDR10, Dolby Vision, or a competing format, it’s about as good as it gets until the studio decides it’s time for a remaster. If you are upgrading from DVDs or regular Blu-rays, the jump to 4K and a wider range of colors is still a good incentive.
Choosing your favorite movies based on their technical specifications is like choosing your favorite books based on the font. It can genuinely affect the overall presentation, but the underlying story, dialogue, and other elements remain the same and are just as enjoyable.
If you buy Blu-rays for their HDR features, you may want to save your money and simply avoid the releases that do not meet your expectations. Unfortunately, few people out there have access to the professional tools Teoh uses, so information is scarce at this point.
For now, you may just have to stick to watching the “good” HDR productions, such as Mad Max: Fury Road (peaks near 10,000 nits), The Greatest Showman (1,500+ nits), and Mulan on Disney+ (900+ nits).
Shopping for a new TV to watch your HDR movies on? Watch out for these six common mistakes.
RELATED: 6 Mistakes People Make When Buying A TV