Is HDR good for gaming?
It depends on what you want to get out of gaming. Whether or not HDR is worth it depends on a few factors:

High-quality graphics: If you like AAA games with high-quality, cinematic graphics, HDR might be worth it; it makes those games look gorgeous. If you're into more competitive gameplay, HDR is less important.
Game support: Whether HDR is worth it may come down to whether the games you enjoy actually support it. Most games produced before 2017 don't support HDR, and games like Undertale and Minecraft that rely on a more retro graphical style don't support HDR because they don't need it.
Equipment: To experience HDR, you need a computer with a graphics card and CPU that support HDR, or a game console that supports HDR, plus a monitor or TV that supports HDR. Not all HDR monitors are equal: a low-cost HDR monitor probably won't render as good an image as a more expensive HDR monitor with more features.
What is HDR?
HDR (high dynamic range) is a video format that expands brightness and color gamut. It mostly expands brightness and darkness levels, also known as luminance. OLED displays, and LCD displays with local dimming zones, can boost the brightness in certain areas of the screen or switch areas off completely, making them fully dark. This provides greater contrast between brightness and darkness. For example, in a nighttime scene, HDR allows the night sky to be truly dark (with no backlighting) while bright stars really pop. Additionally, some HDR formats also expand the color gamut, allowing for more shades of color and smoother transitions in gradient blends. There isn't just one HDR format; there are a few competing formats:

HDR10: HDR10 is one of the main standards for HDR content. Nearly all HDR displays and services with HDR content support HDR10. It is an open standard that can be freely used by any content producer. HDR10 uses what is called "static metadata," meaning it uses the same brightness and color profile across all content.
HDR10+: HDR10+ is a standard developed by Samsung. Like HDR10, it is an open standard that can be freely used by any content producer, but it's not as widely supported as HDR10. It uses "dynamic metadata," which means it can adjust the color and brightness profile for each scene or even frame by frame. This allows it to preserve more detail in bright and dark scenes (the first sketch after this list illustrates the static-versus-dynamic difference).
Dolby Vision: Dolby Vision is an HDR format made by Dolby. Like HDR10+, it uses dynamic metadata and can adjust the color and brightness profile per scene or frame by frame, and it can adjust to the capabilities of your display on the fly. Unlike HDR10 and HDR10+, Dolby Vision is not an open standard: content creators need a license to use it, and HDR displays need to be certified to claim they are capable of it.
HLG: HLG stands for "hybrid log-gamma." It is a standard developed by the BBC in Britain and Japan's NHK. HLG doesn't use metadata at all. Instead, it extends the gamma curve that standard dynamic range (SDR) TVs use to calculate brightness levels, which makes it backward-compatible with SDR TVs (the second sketch after this list plots this curve). It is also an open standard that can be used by any content creator. However, it's not as widely used as other formats, with very little HLG content available at the moment. It also doesn't reach the same level of darkness other HDR formats can, so it loses detail in dark scenes.
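To make the static-versus-dynamic distinction concrete, here is a minimal Python sketch. MaxCLL (maximum content light level) and MaxFALL (maximum frame-average light level) are real HDR10 metadata fields, but the scene names and nit values below are made-up illustrations, not figures from any actual title.

```python
from dataclasses import dataclass

@dataclass
class HdrMetadata:
    max_cll: int   # Maximum Content Light Level, in nits
    max_fall: int  # Maximum Frame-Average Light Level, in nits

# HDR10 (static metadata): one profile describes the entire title,
# so the display tone-maps every scene against the same limits.
hdr10_title = HdrMetadata(max_cll=1000, max_fall=400)

# HDR10+ / Dolby Vision (dynamic metadata): each scene (or frame)
# carries its own profile, so a dim scene isn't tone-mapped as if it
# contained the brightest highlight in the whole film.
dynamic_title = {
    "scene_01_night": HdrMetadata(max_cll=120, max_fall=40),
    "scene_02_day": HdrMetadata(max_cll=1000, max_fall=400),
}
```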
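And here is a sketch of the HLG transfer curve itself, using the published constants from ITU-R BT.2100. Below a threshold the curve is the same square-root (gamma-style) response an SDR camera applies; above it, a logarithmic segment takes over to carry highlights, which is what makes HLG backward-compatible with SDR.

```python
import math

# Constants defined in ITU-R BT.2100 for the HLG OETF.
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene light e (0..1) to an HLG signal value (0..1)."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # gamma-like segment, matches SDR
    return A * math.log(12 * e - B) + C  # log segment for highlights

# The two segments meet smoothly: both give 0.5 at e = 1/12,
# and the log segment reaches 1.0 at e = 1.
print(hlg_oetf(1 / 12))  # 0.5
print(hlg_oetf(1.0))     # ~1.0
```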
What do you need for HDR gaming?
A game console that supports HDR. Most of the newest game consoles support HDR, including the PlayStation 5 and PlayStation 4, the Xbox Series X and S, and the Xbox One X and S. The Nintendo Switch does not support HDR.
A PC with a 7th-generation Intel Core processor or better. If you prefer gaming on your PC, you will need a PC with a 7th-generation Intel processor or better. This includes Intel Xeon and Celeron processors, as well as the Core i3-7101, i5-7500, and i7-7700 or greater.
An HDR-capable video card. You will also need a graphics card that supports HDR. HDR-capable video cards include the NVIDIA GTX 10 series or better, NVIDIA RTX 20 series or better, and AMD Radeon 5 series or better.
An HDR-capable monitor or TV. Most importantly, to enjoy HDR gaming, you need an HDR monitor or TV. This is where things get complicated, because not all HDR monitors and TVs are the same. Not all HDR monitors can reach the same brightness and darkness levels, some sacrifice frame rate in order to display an HDR image, and low-cost HDR displays typically don't produce as good an image as high-end displays. The following are some features to look for in an HDR monitor or TV:

Support for all HDR formats: An HDR monitor or TV should support HDR10 and Dolby Vision at a minimum. Support for HDR10+ and HLG is an added bonus. Keep in mind that support for a specific format is no indication of the monitor's capabilities: a monitor that supports HDR10+ can decode and display HDR10+ content, but that doesn't mean it can reach the luminance and color levels HDR10+ is capable of carrying.
Peak brightness: A monitor's peak brightness is measured in nits, or candela per square meter (cd/m²).
Black levels: Black levels measure how dark a screen can get. Dolby recommends a TV that can reach black levels of 0.005 nits or lower; Intel recommends a black level of 0.44 nits.
Color depth: Standard dynamic range monitors have a color depth of 8 bits, meaning they can display as many as 16.7 million colors. A true 10-bit monitor can display as many as 1.07 billion colors (the sketch after this list shows the arithmetic). However, most HDR monitors are not true 10-bit monitors: many monitors and TVs marketed as 10-bit are actually 8-bit monitors with 2 extra bits for dithering (8+2-bit). An 8+2-bit monitor will still display a better image than a standard 8-bit monitor, but it is not a true 10-bit monitor, and true 10-bit monitors are pretty rare.
Color space: Color space defines the spectrum of colors that can be displayed, rather than just how many colors can be displayed. sRGB is the common color space used to measure standard dynamic range monitors; HDR monitors use DCI-P3, a much wider array of colors. A good HDR monitor should be able to display at least 90% of the DCI-P3 color space.
OLED or LED with local dimming: Most LCD screens use LEDs to backlight the panel, but when the screen is dark, some of the backlighting usually leaks through. To combat this, many LED TVs and monitors have local dimming zones, where areas of the backlight dim or turn off to make the darks on screen appear darker; the more local dimming zones a display has, the better. With OLED displays, each pixel produces its own light without the need for a backlight, so OLEDs can reach greater levels of darkness and brightness than a standard LED monitor.
Refresh rate: Finally, refresh rate is something to consider when shopping for an HDR gaming display. Refresh rate determines how quickly the monitor can draw a new frame; higher refresh rates allow for smoother motion and quicker response in games. A gaming monitor should have a refresh rate of at least 60 Hz (60 frames per second), the standard for the newest game consoles. Many PC gamers prefer 144 Hz or higher, but an HDR monitor capable of 144 Hz is going to cost double or triple the price. Finding a monitor with all the features you want at a price you can afford is a bit of a balancing act.
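The color-depth and contrast figures above are simple arithmetic, sketched below in Python. An RGB pixel has three channels, so the total color count is (2^bits)³; the 1,000-nit peak used in the contrast example is just an illustrative number, not a figure from the article or any spec.

```python
def displayable_colors(bits_per_channel: int) -> int:
    """Total colors on an RGB panel: 2**bits shades per channel, cubed."""
    return (2 ** bits_per_channel) ** 3

print(f"{displayable_colors(8):,}")   # 16,777,216 (~16.7 million, 8-bit SDR)
print(f"{displayable_colors(10):,}")  # 1,073,741,824 (~1.07 billion, true 10-bit)

def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Static contrast: peak brightness divided by black level."""
    return peak_nits / black_nits

# A hypothetical 1,000-nit panel hitting Dolby's 0.005-nit black level:
print(f"{contrast_ratio(1000, 0.005):,.0f}:1")  # 200,000:1
```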