Is HDR for monitors worth it?

Susan Fernandez November 11 2021

What is HDR?

HDR stands for High Dynamic Range. It is a way of displaying images with a much wider range of brightness and color than conventional (SDR) monitors can show, so bright highlights and deep shadows can appear on screen at the same time.
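To make "dynamic range" a bit more concrete, here is a rough back-of-the-envelope comparison. Dynamic range is often expressed in photographic "stops" (doublings of brightness); the brightness figures below are illustrative assumptions, not measurements of any specific monitor.

```python
import math

def stops(peak_nits, black_nits):
    """Dynamic range in photographic stops (doublings of brightness)."""
    return math.log2(peak_nits / black_nits)

# Illustrative figures, not measurements of any specific monitor:
sdr_stops = stops(peak_nits=300, black_nits=0.3)    # typical SDR panel
hdr_stops = stops(peak_nits=1000, black_nits=0.05)  # DisplayHDR 1000-class panel

print(f"SDR: ~{sdr_stops:.1f} stops, HDR: ~{hdr_stops:.1f} stops")
# SDR: ~10.0 stops, HDR: ~14.3 stops
```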

Think of how artists work in Photoshop or Illustrator: they build an image out of layers, and each layer has its own brightness, contrast, and color adjustments. On an SDR monitor, all of that work gets squeezed into a fairly narrow range of brightness, so very bright and very dark details end up clipped or crushed together.

An HDR monitor gives you far more headroom. Bright highlights, midtones, and deep shadows can all be shown at once without one of them being sacrificed to keep the others visible, so what you see on screen is much closer to what you actually created.

The same applies to video editing: you can judge exposure and color grading accurately because the monitor can actually reproduce the range you are working in. This is why HDR monitors are great for content creators - they save a lot of guesswork when working with Photoshop, Illustrator, After Effects, and so on.

How does SDR relate to it?

SDR, short for Standard Dynamic Range, is what you have if your monitor or TV is not HDR-capable. Conventional displays are SDR systems: they are limited to a relatively narrow brightness range and a smaller color gamut, so very bright and very dark parts of a scene cannot be reproduced at the same time.

Pixel density is a separate concept: it refers to how many pixels are packed into each inch of screen space. Packing more pixels into the same area means more detail and a sharper image, but it does not change how bright or dark the display can go - that is what HDR is about.
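If you want to put a number on pixel density, the usual figure is pixels per inch (PPI), computed from the resolution and the diagonal size. A quick sketch:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch from resolution and diagonal screen size."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_inches

print(round(ppi(1920, 1080, 27)))  # ~82 PPI - 27" Full HD
print(round(ppi(3840, 2160, 27)))  # ~163 PPI - 27" 4K, same size, four times the pixels
```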

What are the image woes of HDR compared to SDR?

The first one has to do with color accuracy: out of the box, HDR colors can be all over the place, so you may need a professional calibration tool to get the best results.

Another issue is gamma, or basically how bright or dark the midtones of an image are supposed to be. HDR uses a different transfer curve than SDR, and it takes some getting used to; if a game's HDR is poorly tuned, it can actually hide details that were supposed to be visible.
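To illustrate what a gamma (transfer) curve does, here is a minimal sketch using the classic SDR gamma of 2.2. Real HDR uses a different curve (the PQ curve, SMPTE ST 2084), but the encode/decode idea is the same.

```python
def encode_gamma(linear, gamma=2.2):
    """Convert linear light (0..1) to a display signal using a simple power curve."""
    return linear ** (1.0 / gamma)

def decode_gamma(signal, gamma=2.2):
    """Convert the display signal back to linear light."""
    return signal ** gamma

# Mid-grey in linear light ends up as a fairly bright signal value:
print(round(encode_gamma(0.18), 3))   # ~0.459
print(round(decode_gamma(0.459), 3))  # ~0.18
```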

One more thing is highlights: when they end up looking like ordinary bright elements, it is harder for our eyes and brains to read the scene, because there is none of that intense "light beam" effect you get in real life at sunset, or when a phone camera applies a proper HDR merge.
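Here is a tiny sketch of what "losing detail in highlights" means in practice: on an SDR display, anything brighter than its peak gets clipped to the same white, while an HDR display with more headroom can still tell those values apart. The nit values are assumptions for illustration.

```python
def display_on_sdr(scene_nits, sdr_peak=300):
    """SDR: anything above the panel's peak is clipped to the same white."""
    return min(scene_nits, sdr_peak)

def display_on_hdr(scene_nits, hdr_peak=1000):
    """HDR: much more headroom before clipping kicks in."""
    return min(scene_nits, hdr_peak)

sunset_highlights = [250, 400, 800, 1500]   # hypothetical scene brightness in nits
print([display_on_sdr(n) for n in sunset_highlights])  # [250, 300, 300, 300] - detail lost
print([display_on_hdr(n) for n in sunset_highlights])  # [250, 400, 800, 1000] - detail kept
```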

However, making HDR scenes look realistic is quite a challenge: the changes have to be visible but not overdone, which is easier said than done... And even when it is pulled off well, the result can still feel slightly odd, because our brains are calibrated to how those effects look in real life, and a screen never quite matches what we see every day.

What does this have to do with gaming?

Well, basically everything! Since most games still run fine at Full HD (1080p) or lower resolutions, gamers don't need particularly fancy hardware to achieve high frame rates. What they do need, if they want the extra image quality, is a monitor that supports HDR.

If a monitor can reproduce something closer to real-world lighting, developers have a lot more freedom to light a scene realistically without relying on tricks to fake the missing range. And even if a game supports HDR, there is very little point in enabling it if you don't have an HDR monitor, from my point of view.

HDR formats for your monitor

There are two common ways of carrying an HDR signal to your monitor: HDMI 2.0 and DisplayPort 1.4. Strictly speaking these are connection standards rather than HDR formats, but they determine whether your PC can send an HDR picture at all.

HDMI 2.0 is where a lot of the headaches come from when getting an HDR monitor: its bandwidth is tight, so 4K 60 Hz HDR usually has to fall back to chroma subsampling, and even then your PC, GPU driver, and the game all have to agree to actually send an HDR signal, which isn't always the case.

DisplayPort 1.4 is currently the more comfortable option for HDR monitors since it has bandwidth to spare and recent GPUs support it out of the box, but then again, plenty of people don't own a DP 1.4 capable monitor yet, so here we go again with checking ports and swapping cables.
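Some rough arithmetic shows why the connection matters. HDMI 2.0 carries roughly 14.4 Gbit/s of usable video data and DisplayPort 1.4 roughly 25.9 Gbit/s, while a 4K, 60 Hz, 10-bit RGB signal needs about 15 Gbit/s before blanking overhead - which is why HDMI 2.0 typically resorts to chroma subsampling for HDR at that resolution. The numbers below are approximations.

```python
def video_bitrate_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    """Approximate uncompressed video data rate, ignoring blanking overhead."""
    bits_per_pixel = bits_per_channel * channels
    return width * height * refresh_hz * bits_per_pixel / 1e9

needed = video_bitrate_gbps(3840, 2160, 60, 10)   # 4K, 60 Hz, 10-bit RGB
print(f"4K60 10-bit RGB needs ~{needed:.1f} Gbit/s")              # ~14.9 Gbit/s
print(f"Fits HDMI 2.0 (~14.4 Gbit/s usable)? {needed <= 14.4}")   # False
print(f"Fits DP 1.4   (~25.9 Gbit/s usable)? {needed <= 25.9}")   # True
```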

The good news is that DP 1.4 monitors are already out there, but that doesn't mean they're cheap (well, let's face it, nothing that costs more than $200 is ever cheap), and not every game supports HDR, so it can be hard to find one monitor that does everything... But hey, at least you'll have spent lots of money, right?

Standards by VESA 

It all started with VESA's DisplayHDR standard, and one of the first monitors to carry the badge was ASUS's ProArt PA32UC-K. It supports up to 1,000 nits of peak brightness, which some newer monitors now exceed, but it also has excellent color gamut coverage, which gamers will appreciate since accurate colors matter more than merely "bright" ones.

Later on, more monitors picked up DisplayHDR badges at different tiers (400, 600, 1000 and so on), so keep in mind that not every monitor advertising "400 nits or more" should be treated equally.
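As a rough illustration of why the tier matters, here is a tiny sketch that maps a monitor's claimed peak brightness onto the main DisplayHDR tiers. The real certification also tests black level, color gamut, and local dimming, so this is a simplification.

```python
def displayhdr_tier(peak_nits):
    """Very simplified mapping of peak brightness to a VESA DisplayHDR tier."""
    for tier in (1000, 600, 400):
        if peak_nits >= tier:
            return f"DisplayHDR {tier} (at best)"
    return "below DisplayHDR 400 - HDR in name only"

for nits in (350, 450, 650, 1200):
    print(nits, "->", displayhdr_tier(nits))
```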

Hands down, the best thing about HDR is being able to see detail in highlights that would normally be blown out when playing games, especially compared to a normal SDR TV... That's probably why the technology took off in TVs first, but HDR-enabled gaming monitors are catching up quickly.

Either way, to get a good HDR experience from your monitor you need a proper GPU that supports HDMI 2.0 or DisplayPort 1.4, so if you haven't made the jump yet, now might just be a good time to do so.

What is local dimming?

Basically, it means the monitor can dim individual zones of its backlight independently, creating darker areas where the image needs them while keeping bright zones lit. It is built into the monitor itself, so it doesn't require any external devices on your side - but it does require a backlight that is actually divided into zones.
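Here is a minimal sketch of the idea behind zone-based dimming, assuming a hypothetical backlight split into a handful of zones: each zone's LED level follows the brightest pixel it has to show, so dark zones can go nearly black while bright zones stay at full power.

```python
def zone_backlight_levels(zone_pixel_nits, panel_peak=1000):
    """For each backlight zone, drive the LEDs just hard enough for the
    brightest pixel in that zone (0.0 = off, 1.0 = full power)."""
    return [min(max(zone), panel_peak) / panel_peak for zone in zone_pixel_nits]

# Hypothetical frame: one dark zone, one dim zone, one very bright zone
frame_zones = [
    [0.5, 2, 1],       # night sky
    [80, 120, 95],     # dim interior
    [400, 950, 600],   # sunlit window
]
print(zone_backlight_levels(frame_zones))   # [0.002, 0.12, 0.95]
```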

However, local dimming isn't a given: not all monitors have it, and even among those that do, quality varies a lot from one model to another... Some reviewers say their LG 32UD99-W looks great, while others report that their Samsung CHG70 suffers from backlight issues (including a yellowish tint) that calibration cannot fix.

VRR technology

Variable refresh rate (VRR) is a technology that reduces tearing and stutter between your PC and monitor. It lets the monitor change its refresh rate on the fly to match the frame rate your GPU is producing, so if your FPS drops, the monitor simply refreshes more slowly instead of showing torn frames (there is a small sketch of the idea further below)... But there are two main caveats with this technology right now:

For starters, not every monitor has it, and those that do don't always support the format you need (FreeSync for AMD cards, G-SYNC for NVIDIA cards). Implementations also vary in quality, so there's no point in rushing into a purchase without doing some research first.

Also keep in mind that VRR smooths out an unstable framerate rather than making the game run faster, and it can be useful outside gaming too, e.g. watching video courses mastered at 25 FPS on a display that normally runs at 60 Hz.
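Here is a minimal sketch of the core VRR idea, under assumed numbers: the monitor stretches or shortens each refresh to match the GPU's frame rate, as long as the result stays within the panel's supported refresh range (say 48-144 Hz).

```python
def vrr_refresh_hz(gpu_fps, vrr_min_hz=48, vrr_max_hz=144):
    """Monitor matches the GPU's frame rate, clamped to its supported VRR range."""
    return max(vrr_min_hz, min(gpu_fps, vrr_max_hz))

for fps in (30, 55, 90, 200):
    print(f"GPU at {fps} fps -> monitor refreshes at {vrr_refresh_hz(fps)} Hz")
# 30 fps falls below the VRR window, so the monitor sits at its minimum here;
# real implementations typically use frame doubling (LFC) in that situation.
```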

What are the prices for HDR monitors?

They aren't cheap. Many HDR monitors use VA panels to keep costs down, and IPS models with proper HDR backlights cost even more. On top of that, the technology needs high brightness (and therefore more power), built-in local dimming, and wide color gamut coverage, so you're looking at fairly pricey monitors either way.

How do you know if a monitor has HDR?

You can check the specifications, but keep in mind that terminology changes from manufacturer to manufacturer, so it's best to go to reputable review websites and read reviews of the specific product - in other words, do your homework before buying anything.

Is the picture quality equivalent to seeing an image in real life?

No, not quite - but it gets much closer than SDR, because the display can reproduce a far wider range of brightness and color at the same time. You could compare SDR to watching a scene on TV and HDR to being that much closer to actually standing there in person.

Can you compare HDR quality to pixel density?

Only loosely - they are different things. Pixel density determines how sharp an image is, while HDR determines how wide a range of brightness and color it can show. In my opinion, understanding that distinction matters more than just saying "HDR brings better image quality", because it does, but don't expect to throw your old 1080p monitor in the trash the moment you get an HDR one: the technology still has limitations and issues, so don't treat HDR monitors as perfect replacements.

But they do have a lot of potential in both gaming and content creation! Especially with 4K monitors being so common nowadays, a proper HDR monitor can give your games a whole new life.

How will HDR change your gaming experience?

In many games, toggling HDR on and off doesn't make a dramatic difference, so I would say that finding a monitor with good colors and a strong contrast ratio should be your priority, rather than just ticking the DP 1.4 or HDMI 2.0 box - the panel itself is what really affects the quality.

Having an SDR monitor and an HDR one side by side really shows the difference and lets you choose which image quality is better for what you're doing... This also brings us back to the topic of having an SDR monitor as a secondary device.

What kind of hardware do you need?

No more than most of us already have: a compatible GPU and an HDR monitor connected over HDMI 2.0 or DisplayPort 1.4 (monitors with full-array local dimming, or FALD, cost more). If you plan on buying a new GPU anyway, get one that supports HDR output, because this whole "HDR craze" isn't going anywhere anytime soon!

Is it worth it?

It depends on what you're trying to do and how much you're willing to spend. I'm pretty sure that even people who say "I don't need HDR, my standard monitor is fine" will appreciate it once they get their hands on one... But if all you care about is accurate colors, a well-calibrated SDR monitor will do the job for less money.

Another thing worth mentioning is that most monitors with HDR support require DisplayPort 1.4 or HDMI 2.0, which means an older video card might not be able to drive these modes, but we can expect broad support in newer cards along with wider availability of HDR-capable monitors and games (which brings us back to the point that HDR and pixel density are separate upgrades).

There is also VESA's Adaptive-Sync standard, which is the open foundation that AMD's FreeSync builds on and an alternative to NVIDIA's proprietary G-SYNC (of course, any adaptive sync tech only works within the refresh-rate range the monitor supports).

As for hardware requirements, you don't need much more than an Intel Core i3-6100, which costs around $120, along with a video card that has HDMI 2.0 or DisplayPort 1.4 outputs, such as the GTX 1050 Ti (released in late 2016).