HDR: De-Confusing the Standards

RedShark Replay: For a new technology that looks set to be deployed without any damaging format wars, there is still plenty of confusion regarding HDR standards and what you should be looking for when thinking of purchasing a new TV set. Here’s the information you need.

HDR is most certainly the next ‘big thing’. Unlike resolution increases, HDR promises a very tangible improvement in viewable image quality, at least on the right monitor and in the right viewing circumstances. But HDR is not always what it seems at first glance, especially when it comes to choosing a monitor or television. There are various competing standards and specifications on offer, which drastically affect the viewing experience you will have. It is enough to give even the most ardent video expert a headache.

If you are in the market for an HDR display to view content, I will attempt to clear up some of the confusion here in a way that I hope mere mortals can understand. So forgive me for not delving fully into the deep technical minutiae of the EBU and SMPTE standards; I will leave that to our technical guru, Phil Rhodes, to cover at some point. My aim here is to give an overview, in a nutshell, of what HDR is really all about when it comes to choosing a display.

The Brightest Whites Doorstep Challenge

In order to understand the effect our choice of display will have, we need to understand a few basic points about HDR. One popular conception is that HDR simply offers much more screen brightness, the theory being that a brighter display can show the brightest whites in a much more realistic way. This is correct, sort of, but it glosses over a lot of the reasons behind it, as well as what the brightest whites really are, or even whether we are really talking about white tones at all! HDR is not, as it might seem, about showing the same picture you see on a conventional display, only brighter. And this is an important point to bear in mind when it comes to your choice of display.

Regardless of the HDR standard being used, one of its main purposes is to give you greater displayed dynamic range. As we know, the main problem we have with image acquisition concerns highlights. On a standard display, existing specifications set reference white at 100 nits. So far, so simple, and one might therefore assume that on a good HDR display this same white would now sit at well over 1000 nits.

Not so. And this is the crux of HDR as it stands. That same white level in HDR will still be roughly the same brightness as it is on a standard display, as in fact will the average picture brightness overall. The reason is that HDR is designed to handle the detail in highlights much better. Take a look around you. Even in bright sunshine, is the white piece of paper on your desk as bright as the sun, or as bright as the specular highlight reflecting off a polished metal object? Of course not. What HDR gives us is the ability to replicate that difference in brightness, and the detail contained within it, much more realistically and accurately.

In other words, the average brightness of subjects such as human faces, room illumination and so on won't be that much different in HDR than in SDR (Standard Dynamic Range), when graded well. In fact, most of the tonal space in the specified transfer function curves is still given over to these parts of the picture. What HDR does have is much, much more headroom in the bright areas of the picture above the conventional 100 nit level. That results in greater creative freedom in the grade, and it makes textures, for example sea water glinting in sunshine or the surface of polished, textured metal such as copper, much more realistic. Bright light shining on a human face, too, finally becomes a creative possibility without losing skin detail or the essence of its colour.
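To put rough numbers on how the tonal space is divided up, here is a minimal Python sketch of the PQ (SMPTE ST 2084) encoding curve used by HDR10 and Dolby Vision. The constants come from the published specification; the point of the sketch is simply to show where SDR-level brightness sits on the curve, not to be a production-ready implementation.

```python
# A minimal sketch of the SMPTE ST 2084 (PQ) inverse EOTF, used here only to
# illustrate how much of the HDR signal range sits below SDR reference white.
# Constants are the ones published in the ST 2084 specification.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits: float) -> float:
    """Map absolute luminance (cd/m^2) to a normalised PQ signal value (0..1)."""
    y = min(max(nits / 10000.0, 0.0), 1.0)   # PQ is defined up to 10,000 nits
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1.0 + C3 * y_m1)) ** M2

for nits in (100, 1000, 4000, 10000):
    print(f"{nits:>5} nits -> PQ signal {pq_encode(nits):.3f}")

# Approximate output:
#   100 nits -> PQ signal 0.508
#  1000 nits -> PQ signal 0.752
#  4000 nits -> PQ signal 0.903
# 10000 nits -> PQ signal 1.000
```

In other words, roughly half of the available signal range is devoted to everything up to the old 100 nit reference white, with the rest reserved purely for highlight headroom above it.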

Much like having greater dynamic range in sound, with more headroom for loud noises such as explosions and the ability to keep quieter passages intelligible, HDR gives the opportunity to be much more subtle, as well as ‘in your face’ as and when required.

But this very fact causes a bit of a headache when it comes to choosing the right display and sorting through the different capabilities. Current display technology is still quite limited, and for now we have to choose between OLED and LCD. As we will see later, the various competing HDR standards take very different approaches to displaying such images, and not all of them will be suitable for a general viewing environment.

Because of this, choosing a display to view HDR on involves a trade-off, at least with current display technology. You can have a display with a lower maximum luminance but better blacks (OLED), or one with a higher luminance capability but a higher black level (LCD). Yer pays yer money, yer takes yer choice, and it is for this reason that the UHD Alliance specifies two different sets of display luminance requirements.
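For a rough sense of why two tiers exist at all, here is a quick back-of-the-envelope comparison in Python. The peak brightness and black level figures used are the widely reported ones for the UHD Alliance’s Ultra HD Premium certification rather than numbers taken from this article, so treat them as an assumption for illustration only.

```python
# Back-of-the-envelope comparison of the two display tiers commonly quoted
# for the UHD Alliance's Ultra HD Premium certification. The figures below
# are the widely reported ones, used here purely for illustration.

tiers = {
    "LCD-style (bright)": {"peak_nits": 1000, "black_nits": 0.05},
    "OLED-style (dark)":  {"peak_nits": 540,  "black_nits": 0.0005},
}

for name, spec in tiers.items():
    contrast = spec["peak_nits"] / spec["black_nits"]
    print(f"{name}: peak {spec['peak_nits']} nits, "
          f"black {spec['black_nits']} nits, "
          f"contrast ~{contrast:,.0f}:1")

# Roughly:
# LCD-style (bright): peak 1000 nits, black 0.05 nits, contrast ~20,000:1
# OLED-style (dark):  peak 540 nits, black 0.0005 nits, contrast ~1,080,000:1
```

The brighter tier buys you punchier highlights in a lit living room; the darker tier buys you far deeper blacks, and therefore a bigger on-paper contrast ratio, in a dim viewing environment. Neither is simply “better”, which is exactly why both routes are certified.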

Read Original Article at RedShark News