Just when I thought that HDR gaming on the PC couldn't get any nastier, this happened.

Are you ready for a new battleground in the war between AMD and Nvidia? Well, it turns out that depending on whether you are using AMD or Nvidia graphics hardware, the results you get when driving a particular HDR monitor can be very different.

As Monitors Unboxed explains, the reasons for this are complex. It is not necessarily that AMD or Nvidia is better, but it certainly adds a layer of complexity to the whole "which graphics card to buy" conundrum. Ray tracing, FSR vs. DLSS, etc. are just a few of the many things to consider when choosing a new graphics card.

The research here focused on the Alienware 34 AW3423DWF. This is a slightly less expensive version of Alienware's 34-inch OLED gaming monitor that ditches Nvidia's G-Sync technology in favor of the more common VESA Adaptive Sync and AMD FreeSync.

As it turns out, the AW3423DWF actually performs differently with HDR than the original, G-Sync-equipped Alienware 34 AW3423DW. Anyway, the issue involves the performance of the AW3423DWF when using its HDR 1000 mode.

This is the mode that must be used to achieve Alienware's claimed peak brightness of 1000 nits, as opposed to the True Black HDR mode, which is only 400 nits. By default, HDR 1000 mode simply increases the brightness of everything on the screen.

This is not ideal in practice. Instead, you want HDR 1000 to increase the brightness of only the very brightest objects, while leaving darker objects, which are intended to have luminance levels below 400 nits, untouched. That, after all, is the point of HDR: to increase the contrast between bright and dark objects, not simply to increase the overall brightness.
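To make the distinction concrete, here is a minimal Python sketch, not Alienware's actual firmware logic, contrasting a crude "brighten everything" approach with one that passes darker content through and compresses only the highlights into the panel's 1,000-nit headroom. The gain factor, 400-nit knee, source peak, and linear roll-off are all illustrative assumptions.

```python
# A simplified comparison of two tone-mapping strategies for a 1,000-nit panel.

def naive_boost(nits_in: float, gain: float = 2.5) -> float:
    """'Brighten everything' approach: scale all luminance uniformly."""
    return min(nits_in * gain, 1000.0)

def highlight_only(nits_in: float, knee: float = 400.0, peak: float = 1000.0,
                   source_peak: float = 4000.0) -> float:
    """Leave content below the knee untouched; compress only highlights above
    it into the headroom between the knee and the panel's peak."""
    if nits_in <= knee:
        return nits_in
    # Linearly map [knee, source_peak] onto [knee, peak]; real displays use
    # smoother roll-off curves, but the idea is the same.
    t = (nits_in - knee) / (source_peak - knee)
    return knee + t * (peak - knee)

for sample in (50, 300, 1000, 4000):
    print(sample, round(naive_boost(sample)), round(highlight_only(sample)))
```

Run on a few sample values, the first function drags 50-nit shadows up to 125 nits, while the second leaves anything under 400 nits alone and only remaps the brightest highlights.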

The difference between AMD GPUs and Nvidia GPUs appears when trying to adjust the Alienware to achieve more accurate brightness. In the settings menu, there is an option called "Source Tone Mapping," which essentially tells the monitor to follow the brightness curve encoded in the source content to dictate brightness, rather than applying its own curve.
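For context, HDR10 content encodes absolute luminance using the SMPTE ST 2084 perceptual quantizer (PQ) curve, and "following the source" roughly means decoding that curve faithfully rather than re-mapping it. Below is a small Python sketch of the standard PQ decode; it is background illustration only and is not taken from Monitors Unboxed's testing or the monitor's firmware.

```python
# SMPTE ST 2084 (PQ) EOTF: converts a normalized HDR10 code value in [0, 1]
# back to absolute luminance in nits. Constants come from the standard.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(signal: float) -> float:
    """Decode a PQ-encoded signal (0.0-1.0) to luminance in nits (cd/m^2)."""
    e = signal ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1 / M1)

# A signal of 1.0 corresponds to the format's 10,000-nit ceiling; real panels
# top out far lower, which is where the monitor's tone mapping comes in.
for code in (0.25, 0.5, 0.75, 1.0):
    print(f"{code:.2f} -> {pq_to_nits(code):.1f} nits")
```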

The problem is that this option is only available to Nvidia GPU users; according to Monitors Unboxed, AMD GPUs instead rely on AMD's own tone mapping, which is part of the FreeSync feature set.

To be clear, this problem is not universal to all HDR monitors, but it illustrates the complexity of HDR implementation. Depending on how a given monitor implements the tone mapping pipeline for HDR content, the results can vary widely depending on the GPU used.

Since its introduction on the PC, HDR has been hit or miss. However, this is the first time we have seen reports of dramatic differences in HDR performance depending on the choice of GPU vendor. At the very least, we will keep our eyes peeled to see how the new generation of OLED and mini-LED HDR screens performs on both AMD and Nvidia GPUs when running HDR content.

If you want to learn more, you can check out the full video from Monitors Unboxed.
