For a better-looking image, scenes are drawn with brighter highlights, more shadow detail, and a wider variety of colors. In contrast to TV HDR, game HDR means more than just a nicer picture: you're more likely to dodge lurking enemies and find clues if you can see what's hiding in the bright and dark areas. But keep in mind that most games are still designed for the middle of the road: everything you need to see sits in the middle of the brightness range.
Games still need explicit HDR support to look their best, although Auto HDR in Windows 11 and on the Xbox Series X/S changes that somewhat: the operating systems can automatically expand the brightness and color ranges of non-HDR games. It's not the same as a game specifically designed to take advantage of those wider ranges, but it can give an older title a boost and make it look better than it otherwise would.
What is HDR, and why do I want it?
HDR mixes a number of components to work its magic. First, it employs a wider range of brightness levels than a standard monitor's 256 displayable levels per channel, and in some cases goes beyond a high-end monitor's true 1,024 levels. It also includes more colors than the least-common-denominator sRGB gamut. Additionally, it requires profiles to map content's color and brightness ranges optimally to the display's capabilities, a decoder in the monitor that understands that mapping, and all the related technologies that hold the puzzle pieces together, not least the operating system.
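To see why the extra levels matter, here's a minimal sketch (the ramp and sample counts are illustrative, not from any standard) of how quantizing a smooth brightness gradient to 8-bit versus 10-bit steps changes how many distinct values survive:

```python
def quantize_ramp(levels, samples=4096):
    """Quantize a smooth 0.0-1.0 brightness ramp to the given number of steps."""
    return [round(i / (samples - 1) * (levels - 1)) / (levels - 1)
            for i in range(samples)]

ramp_8bit = quantize_ramp(256)    # SDR: 256 steps per channel
ramp_10bit = quantize_ramp(1024)  # HDR: 1,024 steps per channel

# Fewer distinct steps means coarser gradients, which shows up as
# visible banding in skies, fog, and other slow transitions.
print(len(set(ramp_8bit)), len(set(ramp_10bit)))  # 256 1024
```

The same gradient that bands at 256 steps looks smooth at 1,024, which is the practical payoff of 10-bit processing.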
Because many scenes lack extreme brightness or deep shadows, or because a game doesn't make use of the wider tonal range, HDR often doesn't matter. But when a game supports it, you'll likely see better visuals in AAA titles, more scares in horror titles, fewer ambushes in FPS titles, and so on.
The question isn't whether you want HDR, but how much you want it, and how much you're willing to spend on it: not just on a display that lists "HDR10" in its specifications, but on a monitor that will actually deliver HDR-quality visuals.
Will the Xbox Series X/S and PS5 work with an HDR gaming monitor?
Yup! Under the banner of the HDR Gaming Interest Group (HGIG), Sony, Microsoft, and a host of other relevant companies have even created a set of publicly available best practices for HDR game development and monitor design for their consoles and Windows. However, HGIG is not a standards body and does not certify products, so you must still review specifications carefully. And the confusion only increases from there.
HDMI 2.1 issues
Unfortunately, the HDMI specification has devolved into such a tangle that you cannot infer capabilities from the version number alone. The specification no longer requires any of the significant new features; in other words, all the headline features that made HDMI 2.1 desirable, especially as a choice for consoles, are now optional. Going forward, every HDMI 2.0 connection will be labeled 2.1a (with the same HDMI 2.0 feature set).
The bottom line is that you must confirm support for each feature separately if you want a monitor for your console that can handle 4K at 120Hz, variable refresh rate, and auto low-latency mode. The same holds true if you want a PC monitor that connects over HDMI and supports source-based tone mapping (covered below), as well as bandwidth-intensive combinations of high resolution, fast refresh rates, and high color depth for HDR.
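A rough back-of-the-envelope calculation (ignoring blanking intervals and link-encoding overhead, so real requirements run higher) shows why those combinations are bandwidth-intensive:

```python
def raw_video_gbps(width, height, refresh_hz, bits_per_channel):
    """Approximate uncompressed RGB data rate in Gbit/s.
    Ignores blanking intervals and encoding overhead, so it understates
    the real link requirement."""
    return width * height * refresh_hz * bits_per_channel * 3 / 1e9

# 4K at 120Hz with 10-bit HDR color:
rate = raw_video_gbps(3840, 2160, 120, 10)
print(f"{rate:.1f} Gbit/s")  # ~29.9: beyond HDMI 2.0's 18 Gbit/s ceiling
```

Even this optimistic figure exceeds the 18 Gbit/s of an HDMI 2.0-class link, which is why a port must actually implement HDMI 2.1's higher-speed signaling (up to 48 Gbit/s) to carry it.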
Monitor manufacturers need to call out explicitly which features are supported; if they don't, either avoid the monitor or dig further. TFT Central does a good job of explaining the problems if you want the full details.
What qualities should a good HDR gaming monitor have?
Thanks to marketers who stretched the definition of "HDR" to cover screens in the most common price range (less than $400), the term has become rather muddied. As a result, you must weigh a number of specifications to determine whether a monitor can deliver a true HDR experience.
To help you narrow down your options, the VESA display industry organization developed DisplayHDR, a set of standards and criteria for describing HDR quality levels in consumer monitors. DisplayHDR 400 is the baby pool of HDR given its modest color gamut and brightness requirements, but it's a fine option if you're only looking for a bright SDR display.
It gets confusing because several manufacturers now refer to monitors as "HDR 600," for instance. It's never clear whether they're using that as shorthand for the corresponding DisplayHDR level because they don't want to pay for the logo certification program, or using it loosely to indicate that the monitor can reach a given tier's peak brightness. Manufacturers can choose to skip the logo and run the certification tests for internal verification instead. (You can do the same with the DisplayHDR Test tool in the Microsoft Store.)
Gaming with HDR10 and HDR10 Plus
The most fundamental requirement for a monitor to qualify as "HDR" (and the least expensive to incorporate) is compliance with the HDR10 standard. From a technical perspective, HDR10 support is essentially table stakes: it indicates that the monitor can interpret the data stream, not that it can render it well. It means the display supports the algorithms the operating system requires to map HDR content properly to the monitor's capabilities: understanding compressed color sampling in video, handling and mapping colors specified within the Rec. 2020 color space, and performing the 10-bit calculations that mapping demands (for the EOTF, SMPTE ST 2084 gamma).
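The SMPTE ST 2084 "PQ" transfer function mentioned above maps a nonlinear signal value to absolute luminance. Here's a sketch of the published EOTF (the m1, m2, c1, c2, c3 constants are the values defined in the ST 2084 specification):

```python
def pq_eotf(signal):
    """SMPTE ST 2084 (PQ) EOTF: nonlinear signal in [0, 1] -> luminance in nits.
    Constants are the standard's published m1, m2, c1, c2, c3 values."""
    m1 = 2610 / 16384          # 0.1593017578125
    m2 = 2523 / 4096 * 128     # 78.84375
    c1 = 3424 / 4096           # 0.8359375
    c2 = 2413 / 4096 * 32      # 18.8515625
    c3 = 2392 / 4096 * 32      # 18.6875
    e = signal ** (1 / m2)
    return 10_000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

print(pq_eotf(1.0))            # 10000.0 nits: the full PQ range
print(round(pq_eotf(0.5), 1))  # ~92 nits: a mid-range code, far below 5,000
```

Note how a signal of 0.5 lands near 92 nits rather than half of 10,000: PQ deliberately packs most of its code values into the dark end, where human vision is most sensitive, which is exactly the kind of 10-bit math an HDR10 display must handle.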
The HDR10 standard's creators revealed a new level at CES 2022: the upcoming HDR10 Plus Gaming standard, a version of the HDR10 Plus that has been on TVs for some time. In contrast to HDR10, which uses a single static range that must work across an entire game, HDR10 Plus Gaming's Source Side Tone Mapping (SSTM) adjusts the brightness range scene by scene, based on information inserted by the game developer.
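HDR10 Plus Gaming's actual metadata format isn't described here, but the idea behind scene-level tone mapping can be sketched with a toy model, assuming the game supplies each scene's peak luminance (the knee point and roll-off curve below are invented for illustration):

```python
def tone_map(pixel_nits, scene_peak_nits, display_peak_nits):
    """Toy source-side tone mapping: compress a scene's luminance range
    to the display's capability, scene by scene, instead of applying one
    fixed curve across the whole game."""
    if scene_peak_nits <= display_peak_nits:
        return pixel_nits  # scene already fits the display: leave it alone
    # Illustrative linear roll-off above a knee at 80% of display peak.
    knee = 0.8 * display_peak_nits
    if pixel_nits <= knee:
        return pixel_nits
    excess = (pixel_nits - knee) / (scene_peak_nits - knee)
    return knee + excess * (display_peak_nits - knee)

# A dim indoor scene (peak 300 nits) passes through untouched on a
# 600-nit display; a sunlit scene (peak 4,000 nits) is compressed to fit.
print(tone_map(250, 300, 600))    # 250: unchanged
print(tone_map(4000, 4000, 600))  # 600.0: scene peak lands at display peak
```

The per-scene `scene_peak_nits` parameter is the key difference from static HDR10: dark scenes keep their full fidelity instead of being squeezed by a curve sized for the game's brightest moment.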
It also supports variable refresh rates at 4K 120Hz on consoles, plus the ability to automatically engage a display's low-latency mode to offset the extra overhead of HDR data (more crucial for TVs than monitors, and still not implemented in the PS5 as of this writing).
Because the license also covers usage rights to certain member manufacturers' patents, HDR10 Plus requires certification and a licensing fee from hardware manufacturers (including GPU makers), but not from software developers. At CES, Samsung announced that all of its 2022 gaming displays would support HDR10 Plus.
Brightness and color
The brightness of a screen measures how much light it can produce, typically expressed in nits (candelas per square meter). Standard dynamic range (SDR) desktop monitors normally run 250 to 350 nits, while HDR displays also specify a peak brightness they can reach for brief periods, often only over a small section of the screen. At minimum, HDR-capable displays should hit a peak brightness of 400 nits, and the brightest now reach 1,600 nits. (Laptop displays benefit from higher brightness even without HDR, since they must remain viewable under many kinds of lighting, including direct sunlight.)
Regardless of how bright they can go, OLED panels achieve near-perfect blacks, which is what gives them such strong contrast. Contrast is one of the main factors that influence how we judge the quality of an image.
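Contrast is simply the ratio of peak white to black-level luminance, which is why near-zero blacks dominate the result. A quick illustration (the luminance values below are typical ballpark figures, not measurements of any specific panel):

```python
def contrast_ratio(peak_nits, black_nits):
    """Static contrast ratio: peak luminance divided by black-level luminance."""
    if black_nits == 0:
        return float("inf")  # OLED pixels can switch fully off
    return peak_nits / black_nits

print(contrast_ratio(400, 0.4))     # 1000.0: a typical IPS LCD panel
print(contrast_ratio(400, 0.0005))  # 800000.0: OLED-class blacks
```

Dividing the black level by 800 multiplies contrast by 800 at the same peak brightness, which is why a 400-nit OLED can look punchier than a much brighter LCD.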
The color space you're most interested in for gaming monitors, and displays in general, is P3, which comes in two slightly different varieties: DCI-P3 and D65 P3. In practice, their only difference is the white point; DCI's is slightly warmer (6300K as opposed to 6500K) and was developed for film editing. Nevertheless, I regularly see DCI-P3 stated in specifications when D65 is intended.
That's okay, since the D65 variant, which Apple championed for its own screens, is the one that matters for gaming monitors. Both have the same gamut, so I simply use "P3" unless I'm explicitly distinguishing between the two. (Trained eyes can tell the two whites apart, but most people won't care.)
Gamuts are also sometimes expressed as a percentage of Adobe RGB, which is acceptable. Because printers use cyan ink, Adobe RGB skews somewhat toward the green/cyan end of the spectrum, while P3 extends further into the greens and yellows, which are easier for high-quality displays to produce. And that's basically why "over a billion colors" in a spec sheet (the result of using 10-bit math) is meaningless on its own. Which billion matters.
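One way to see "which billion matters" is to compare gamut triangle areas in the CIE 1931 xy diagram, using the published primaries for sRGB, Adobe RGB, and P3 (note that area in xy space is only a rough proxy for perceptual coverage, and says nothing about which hues each gamut favors):

```python
def gamut_area(primaries):
    """Area of a gamut triangle in the CIE 1931 xy diagram (shoelace formula).
    primaries: [(rx, ry), (gx, gy), (bx, by)] chromaticity coordinates."""
    (rx, ry), (gx, gy), (bx, by) = primaries
    return abs(rx * (gy - by) + gx * (by - ry) + bx * (ry - gy)) / 2

SRGB  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
ADOBE = [(0.640, 0.330), (0.210, 0.710), (0.150, 0.060)]
P3    = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

# Both wide gamuts dwarf sRGB, but they extend in different directions:
# Adobe RGB toward greens/cyans, P3 toward reds and yellow-greens.
for name, prim in [("sRGB", SRGB), ("Adobe RGB", ADOBE), ("P3", P3)]:
    print(f"{name}: {gamut_area(prim) / gamut_area(SRGB):.0%} of sRGB's area")
```

Both wide gamuts come out roughly a third larger than sRGB by this measure, which is why "95% of P3" and "95% of Adobe RGB" are meaningful claims while "a billion colors" is not.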
HP and Microsoft created the sRGB color space in 1996 to provide least-common-denominator color matching in Windows. It's roughly equivalent to the Rec. 709 SDR video standard, so any monitor you consider for decent HDR viewing should unquestionably cover much more than 100% of that range. The preceding chart makes it clear why sRGB-calibrated displays and pictures have weak greens and why everything appears low-contrast: sRGB can't achieve high saturation for most colors.
In my experience, a competent HDR monitor should reach a peak brightness between 600 and 1,000 nits and cover at least 95% of the P3 or Adobe RGB color gamut. (Windows looks terrible in HDR mode thanks to the operating system's poorly designed components, sRGB-only assumptions, limited brightness handling, and math.)
All screen technologies except OLED create images by shining a backlight through a stack of liquid crystal layers and color filters; OLED pixels illuminate themselves. Most backlit panels can exhibit certain anomalies, most notably so-called "backlight bleed," which is actually an artifact of edge illumination and appears as light around the edges of a black screen.
Mini-LED, a newer backlighting method that is excellent for HDR, lets a monitor use local dimming, as TVs do, to generate high brightness with minimal bleed and fewer bright halos next to dark areas; the brighter the display, the more obvious unwanted brightness tends to be. The latest generation of HDR displays, with brightness of 1,000 nits or more, uses mini-LED. As with TVs, more local dimming zones are better.
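Local dimming itself is conceptually simple. A toy sketch (the zone layout and frame below are invented for illustration) shows why a small bright object next to black lights up its entire zone, producing a halo, and why more, smaller zones help:

```python
def zone_backlight(frame, zone_w, zone_h):
    """Set each backlight zone to the brightest pixel it must display.
    frame: 2D list of target pixel luminances in nits."""
    rows, cols = len(frame), len(frame[0])
    zones = []
    for zy in range(0, rows, zone_h):
        zone_row = []
        for zx in range(0, cols, zone_w):
            zone_row.append(max(frame[y][x]
                                for y in range(zy, min(zy + zone_h, rows))
                                for x in range(zx, min(zx + zone_w, cols))))
        zones.append(zone_row)
    return zones

# A black 4x8 frame with one 1,000-nit pixel: only one zone lights up,
# but the whole zone does. That leaked light around the pixel is the halo.
frame = [[0] * 8 for _ in range(4)]
frame[1][1] = 1000
print(zone_backlight(frame, 4, 2))  # [[1000, 0], [0, 0]]
```

Shrinking the zones (say, 1x1 per pixel, the OLED limit) confines the light to exactly the pixels that need it, which is why zone count is the headline spec for mini-LED backlights.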
But all those LEDs can produce considerable heat. One trend since monitors with mini-LED arrays first launched has been to reduce the number of zones: some 2022 monitors, for instance, contain only half the zones of the original 1,152-zone models.
Samsung's QD-OLED panels are a newer development that combines Quantum Dot color rendering with a blue OLED emission layer to achieve high contrast and quick response times, using the Quantum Dot layer to render a wide range of colors. The Alienware 34 QD-OLED is the first monitor to ship with the screen. The AW34 straddles the brightness spectrum by offering both a more constrained 1,000-nit mode and a regular 400-nit HDR option, which is better than it sounds thanks to the contrast provided by its near-perfect blacks. It gives off some heat, but it doesn't get nearly as hot as conventional 1,000+ nit monitors.
Since prices rise with brightness, 400-nit panels are particularly attractive to both buyers and sellers. Adding features such as a high refresh rate can drive the cost of a gaming monitor up further.