Everything you need to know about HDR: in computer monitors, in computer games, and whether HDR monitors really make a difference

Everything about HDR in smartphones: what it is, why it matters, and where you will find it

Before the advent of HDR (High Dynamic Range), manufacturers chased ever higher screen resolutions and pixel densities, and that trend continues even now. But HDR is a far more useful technology: even high resolution does not affect picture quality as much as HDR does.

Extended dynamic range makes the image noticeably brighter, more realistic and more detailed, especially in its lightest and darkest areas, which a regular SDR screen simply clips. Those areas are no longer featureless patches; detail becomes visible in them. Colors gain more intermediate shades, the picture loses its garish, oversaturated look and becomes more dimensional and pleasant to look at.

The human eye can perceive far wider gradations of color and brightness than an SDR screen can convey, and the new technology helps reproduce images without loss of quality, with high brightness and detail.

Take, for example, an image of a person standing against a bright background: a regular display will show a relatively light, even overexposed background, while a 4K HDR TV will let you see detail in it. Or take a picture of a sunset, with orange streaks across the whole sky, a wide shot in shadow and a blindingly bright sun dipping below the horizon. On an SDR screen the shadowed areas will be barely distinguishable and the streaks will merge into one mass, while with HDR the picture becomes more detailed: you can make out the outline of the sun, houses and trees become visible in the shadows, and each streak in the sky is clearly rendered.


On the left is an image on an SDR screen, on the right is on an HDR screen

By the way, there is an opinion that to appreciate HDR you have to watch HDR content exclusively. In fact, even ordinary video looks better on a display with extended dynamic range than on an SDR screen: the picture is brighter and has more contrast. Still, the full benefit of the technology only shows when you play genuinely HDR content.

We can't help but mention HDR formats, which are supported not only by TVs but also by mobile devices. There are currently two popular standards on the market: HDR10 and Dolby Vision. HDR10 is the more common, open format, in which the user can adjust brightness and other image settings. It is supported by default on 4K Ultra HD Blu-ray discs and by game consoles. The standard specifies 10 bits of color per RGB channel and a peak brightness of 1000 cd/m².

The Dolby Vision format was created by Dolby and is billed as the standard of the future. User adjustment is not provided for: the video is assumed to arrive already perfectly calibrated, and a TV that supports the format carries a dedicated hardware chip. For Dolby Vision, 10-bit color is the minimum and up to 12 bits is allowed. Dolby Vision content can be mastered at a peak brightness of 10,000 cd/m², although, given the capabilities of today's equipment, it is typically mastered at around 4,000 cd/m².
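To make those bit depths concrete, here is a small Python sketch that counts the tonal steps per channel and the total number of RGB values that 8-, 10- and 12-bit encodings can represent; it is plain arithmetic, nothing more.

```python
# Tonal steps per channel and total encodable colors for the bit depths
# mentioned above (8-bit SDR, 10-bit HDR10, up to 12-bit Dolby Vision).

def color_stats(bits_per_channel: int) -> tuple[int, int]:
    levels = 2 ** bits_per_channel          # gradations per R/G/B channel
    total_colors = levels ** 3              # all possible RGB combinations
    return levels, total_colors

for bits in (8, 10, 12):
    levels, total = color_stats(bits)
    print(f"{bits}-bit: {levels:>5} levels per channel, {total:,} possible RGB values")

# Expected output:
#  8-bit:   256 levels per channel, 16,777,216 possible RGB values
# 10-bit:  1024 levels per channel, 1,073,741,824 possible RGB values
# 12-bit:  4096 levels per channel, 68,719,476,736 possible RGB values
```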

As for HDR content, the field is developing actively. Series and entertainment programming in Dolby Vision are produced by Sony Pictures, Universal, Warner Bros. and other companies, and can be found on Amazon, Netflix, Ivi.ru, Google Play, iTunes and YouTube. Many TV series are already shot in HDR.

Modern game consoles and even some computers support the HDR10 format, especially when it comes to big-budget games.

HDR in mobile devices

In mobile devices, HDR technology likewise aims to make visual content more detailed and vibrant, although the user will not get the full experience of watching on a 65-inch TV. Even on small displays, though, the improved color rendering, the high level of detail in both sunlit and dimly lit scenes, and the absence of blown-out highlights are clearly visible.

Some modern smartphones (still few of them) support the HDR10 and Dolby Vision formats, but there is also a standard designed specifically for mobile devices, Mobile HDR Premium, developed earlier this year by the UHD Alliance. It requires a device to offer a screen resolution of 60 pixels per degree, support 10-bit video, deliver a dynamic range from 0.005 to 540 cd/m², and cover 90% of the DCI-P3 color gamut. Content that meets the standard will carry the corresponding logo.
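To put that 0.005-540 cd/m² requirement in perspective, the Python sketch below converts a luminance span into a contrast ratio and into photographic stops (each stop doubles the luminance). The Mobile HDR Premium numbers come from the paragraph above; the "typical SDR panel" figures are rough illustrative assumptions, not measurements.

```python
import math

def dynamic_range(min_nits: float, max_nits: float) -> tuple[float, float]:
    """Return (contrast ratio, range in stops) for a luminance span."""
    contrast = max_nits / min_nits
    stops = math.log2(contrast)       # each stop is one doubling of luminance
    return contrast, stops

# Mobile HDR Premium floor and ceiling quoted above
contrast, stops = dynamic_range(0.005, 540)
print(f"Mobile HDR Premium: {contrast:,.0f}:1, about {stops:.1f} stops")
# -> 108,000:1, about 16.7 stops

# A typical SDR panel (illustrative numbers only): roughly 0.3 to 300 nits
contrast, stops = dynamic_range(0.3, 300)
print(f"SDR-ish panel:      {contrast:,.0f}:1, about {stops:.1f} stops")
# -> 1,000:1, about 10.0 stops
```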

Some people are skeptical about HDR on small screens: the argument is that you cannot always appreciate the benefits of a wide-dynamic-range smartphone display, especially an OLED one. After all, we do not sit in one place with a smartphone the way we do in front of a TV; we move around, and the lighting and viewing angle keep changing. The display can also adjust its brightness automatically to ambient light. Manufacturers assure us, however, that the software and hardware take these factors into account and that the user is guaranteed the best possible viewing experience. In any case, we usually watch video on a smartphone looking at the screen straight on, so the drawback of shifting viewing angles can be considered minor.

Another complaint is that high peak brightness means higher battery drain. In reality, the average device needs charging every evening anyway, and the drop in battery life is only noticeable if you watch HDR content non-stop. Besides, top models usually come with fast charging.

Most likely, HDR in smartphones will become a standard for flagships in the near future. In any case, the technology has good prospects: unlike 3D, which few people need now, HDR provides real advantages in terms of picture quality. And there has clearly been a trend toward an increase in streaming services that support HDR.

Smartphones with HDR

Some manufacturers, as we said, have already implemented HDR technology in the screens of their top models - we are primarily talking about LG, Samsung and Apple. Let's talk briefly about smartphones whose displays already fully support extended dynamic range.

Samsung tried to introduce HDR in the Galaxy Note 7, but in the end the first properly working smartphone with full extended dynamic range support was the LG G6. It supports both the HDR10 and Dolby Vision standards, just like the brand's TVs. Its IPS screen with Quad HD+ resolution is notable for good viewing angles, which removes the hypothetical drawbacks of HDR mentioned above.

The cherry on top was HDR10 support in LG's newer smartphone, the V30, and its V30+ variant. The device keeps the 18:9 aspect ratio of its predecessor, so films in 21:9 format produce much smaller black bars than on 16:9 displays; on the POLED panel used here, those bars are barely noticeable. The V30's 2880x1440 display covers 148% of the sRGB color space and 109% of DCI-P3. The devices have not yet gone on sale in Russia and are expected by the end of the year.
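The point about 21:9 films leaving smaller black bars on an 18:9 screen is easy to check with a few lines of Python; the aspect ratios are the ones mentioned above, and the rest is elementary geometry.

```python
def letterbox_fraction(display_aspect: float, content_aspect: float) -> float:
    """Fraction of the screen height taken by black bars when content is
    wider than the display (letterboxing)."""
    if content_aspect <= display_aspect:
        return 0.0                       # content fills the height, no bars
    return 1.0 - display_aspect / content_aspect

cinema = 21 / 9                          # ~2.33:1 cinematic content
for name, aspect in (("16:9", 16 / 9), ("18:9", 18 / 9)):
    bars = letterbox_fraction(aspect, cinema)
    print(f"{name} screen: {bars:.1%} of the height is black bars")
# 16:9 screen: 23.8% of the height is black bars
# 18:9 screen: 14.3% of the height is black bars
```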

The new iPhone X received an OLED display with support for HDR10 and Dolby Vision, as well as True Tone technology: it not only delivers the visual accuracy of HDR but also adjusts color temperature to the ambient lighting. The stated contrast ratio is a very high 1,000,000:1, with brightness up to 625 cd/m². Full HDR support was also expected from the iPhone 8 and 8 Plus, but Apple says that users of those phones will see improved dynamic range, contrast and a wider color gamut when playing HDR content, while only the iPhone X display can show the technology's real capabilities.

The screens of Samsung's flagship Galaxy S8 and S8+ were the first to be certified under the aforementioned Mobile HDR Premium standard, and they also support HDR10 and Dolby Vision. Both smartphones have bright AMOLED displays with highly detailed images and a resolution of 2960x1440 pixels.


From left to right: Samsung Galaxy Note 8, S8+ and S8

But while the peak brightness of the Galaxy S8 is around 1020 cd/m², for the new Galaxy Note 8 the figure is already 1200 cd/m². Its 6.3-inch AMOLED display has a color gamut almost 1.5 times wider than sRGB, and HDR films will look more impressive on it than on the S8 and S8+ screens, although on mobile devices the difference is not that significant.


Model                   Matrix              Resolution    HDR support           Price
LG G6                   IPS                 2880x1440     HDR10, Dolby Vision   from 39,990 rubles
LG V30/V30+             POLED               2880x1440     HDR10                 from 50,000 rubles
Apple iPhone X          OLED                2436x1125     HDR10, Dolby Vision   from 79,990 rubles
Samsung Galaxy S8/S8+   AMOLED 5.8"/6.2"    2960x1440     Mobile HDR Premium    from 49,990/54,990 rubles
Samsung Galaxy Note 8   AMOLED 6.3"         2960x1440     Mobile HDR Premium    from 59,990 rubles

HDR is coming. Let's figure out what it is and how it applies to video cards, monitors and PC games.

Everyone is talking about "HDR" these days, and if you care about graphics settings in PC games, you may be wondering what the big deal is - after all, it has been around for years! Well, I have news for you: all those checkboxes and options labeled "HDR" that you've been seeing in games since Half-Life 2 are not true HDR.

And that option on your cell phone camera marked “HDR”? It’s also not exactly HDR, although, like those fake HDR checkboxes in game settings, the goal is the same: to improve image quality.
Confused? Don't be alarmed: when it comes to misconceptions about HDR, this is just the tip of the iceberg. So what is it, and how can it be useful for video games? The first question is easy to answer; the second will take a little longer. Read our article and you will find out what HDR is and what graphics card and monitor you need to take full advantage of it.

No, this is not true HDR.

HDR (High Dynamic Range) is a general term for image and video technologies with a brightness range that exceeds the capabilities of standard technology. Despite what you may have heard from the "prophets of 4K", resolution does not play the main role in creating a high-quality picture. Beyond the current limits of Windows display scaling at around 110 DPI, the number of pixels matters less than what you do with them.

Contrast, brightness and color saturation all become important once the desired resolution has been reached, and it is HDR that improves them. And this is no incremental upgrade: HDR's strict requirements will demand new hardware from almost everyone, and you won't need benchmarks or a trained eye to see the difference.

For starters, the minimum HDR specification requires an LCD monitor with a brightness of at least 1000 cd/m² (nits) to meet the new "Ultra HD Premium" standard. Top gaming monitors, whose brightness reaches 300-400 nits at best, don't come close. Good laptops don't fare much better, squeezing only about a hundred nits more out of their displays. Even mobile phones, with their fantastic everything-visible-in-bright-sunlight screens, can boast only around eight hundred nits. When it comes to brightness, HDR-capable displays leave all these "competitors" in pitch darkness.

HDR also improves color reproduction, since it calls for 10-12 bits per color channel. Most personal computers display only 6-8 bits per channel using the sRGB standard, which covers barely a third of the visual range HDR targets. And even where the hardware can do more, software quirks make using wider color modes inconvenient and impractical.

sRGB (right) displays only a third of the available HDR colors.

Currently, desktop monitors that support wide color gamut (WCG) are usually used for professional purposes such as photography or medical imaging. Games and other software simply ignore all those extra colors, which makes them look oversaturated and off on WCG monitors unless the hardware emulates the narrower color space. HDR eliminates the confusion by including metadata in the video stream, so colors display correctly in applications and the monitor's capabilities are used to the full.
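Conceptually, HDR10's static metadata is a small, fixed block that describes the mastering display and the content's peak light levels. The Python dataclass below is only a schematic of that idea; the field names and the example values (roughly a Rec. 2020 container mastered at 1000 nits) are illustrative and do not parse any real container format.

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """Schematic of the static metadata HDR10 attaches to a whole stream.

    One block like this describes the entire title; Dolby Vision instead
    carries dynamic metadata that can change scene by scene.
    """
    # Chromaticity of the mastering display's primaries and white point (CIE xy)
    red_primary: tuple[float, float]
    green_primary: tuple[float, float]
    blue_primary: tuple[float, float]
    white_point: tuple[float, float]
    # Luminance range of the mastering display, in cd/m2 (nits)
    mastering_max_nits: float
    mastering_min_nits: float
    # Content light levels, in nits
    max_cll: float    # brightest single pixel anywhere in the stream
    max_fall: float   # brightest average frame in the stream

# Illustrative values only
example = HDR10StaticMetadata(
    red_primary=(0.708, 0.292),
    green_primary=(0.170, 0.797),
    blue_primary=(0.131, 0.046),
    white_point=(0.3127, 0.3290),
    mastering_max_nits=1000.0,
    mastering_min_nits=0.005,
    max_cll=1000.0,
    max_fall=400.0,
)
print(example.max_cll, example.max_fall)
```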

WCG is only one part of HDR. Eizo's professional ColorEdge series, for example, supports WCG, but its color reproduction is still nowhere near HDR, for all the quality, cost and excellent reputation of those monitors.

To cope with the additional data, HDR requires an HDMI 2.0a connection - a long-overdue upgrade from the low-bandwidth HDMI 1.4.
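A back-of-the-envelope calculation shows why the older interface falls short. The Python sketch below counts only active pixel data and ignores blanking intervals, audio and protocol overhead, and the ~8.2 and ~14.4 Gbit/s payload figures for HDMI 1.4 and 2.0 are approximate, so treat the comparison as a rough illustration.

```python
def active_pixel_gbps(width: int, height: int, fps: int, bits_per_pixel: int) -> float:
    """Raw active-pixel data rate in Gbit/s (blanking, audio and overhead ignored)."""
    return width * height * fps * bits_per_pixel / 1e9

# 4K60 with 10-bit color: full RGB/4:4:4 needs 30 bits per pixel, 4:2:0 about 15
full_rgb = active_pixel_gbps(3840, 2160, 60, 30)
chroma_subsampled = active_pixel_gbps(3840, 2160, 60, 15)

print(f"4K60, 10-bit RGB   : {full_rgb:.1f} Gbit/s")           # ~14.9
print(f"4K60, 10-bit 4:2:0 : {chroma_subsampled:.1f} Gbit/s")  # ~7.5

# Approximate usable video payload: HDMI 1.4 ~8.2 Gbit/s, HDMI 2.0 ~14.4 Gbit/s.
# Full 10-bit RGB at 4K60 does not even fit HDMI 2.0's payload, which is why
# 4K60 HDR is usually carried with chroma subsampling over HDMI 2.0/2.0a.
```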

HDR is very promising, but its path is a thorny one. The biggest problem is not technical hurdles but competing, often incompatible standards that threaten to lead early adopters into costly dead ends.

And it's not true HDR either.

There are now two main HDR standards: the proprietary Dolby Vision, with 12-bit color and dynamic metadata, and the open HDR10 standard, which supports 10-bit color and provides only static metadata at the start of the video stream.

Dolby Vision, with all its licensing fees and additional hardware, is the most expensive option and the harder one to adopt. Most display manufacturers and content providers have turned their attention to HDR10 instead - notably Microsoft with the Xbox One S and Sony with the PS4, a telling sign that the games industry is not standing still. All of this makes HDR10 the easiest and most obvious choice for console gamers.

Dolby Vision's supporters point to its greater color depth, its stricter hardware requirements, its ability to adjust color and brightness dynamically frame by frame, and its compatibility with HDR10 content. The market, however, is following the cheaper competitor, which is good enough for the quality standards in play.

Since HDR10 is not compatible with Dolby Vision, the latter ends up a technically superior but more expensive, license-encumbered standard that may well wither over the next couple of years while its open rival thrives. If you are an avid movie fan willing to spend on quality, nothing stops you from enjoying Dolby's more advanced offering. Gamers, however, have different priorities. Okay, enough about living rooms - what about HDR on PC?

PC makers are racing to join the new HDR push, but you don't have to wait for a winner. The top TV segment already has HDR hardware, and it is also the best way to see the difference firsthand, since a small but growing library of HDR demos, movies and TV shows with side-by-side comparisons is now available. All of our recommended gaming TVs support HDR, which makes them a great monitor replacement if you value size and picture quality over the traditionally low response times of monitors.

TVs like Samsung's KS8000 and KS9800 series can easily serve as computer displays and put you on the crest of the first wave of HDR content. Just keep in mind that the price of admission to this club is high, and using a TV for gaming has drawbacks you need to be prepared for.

Video cards

Where PCs are already ready for HDR is the graphics card market. While monitors haven't kept up with their TV counterparts, mid-range and high-end GPUs have been poised for the transition for about a year now, thanks to the healthy rivalry between Nvidia and AMD.

Nvidia started talking about HDR back in the 900-series days, and all of its current Pascal accelerators are HDR-ready. AMD joined the game a little later with the 390X, and its Polaris line also supports HDR. If you bought your graphics card recently, there is a good chance that at least part of your system is ready for the change.

And this is precisely the question that worries PC gamers most: the games themselves. While new titles will ship with HDR support, older ones will not be able to show off an expanded color palette and contrast without patches. They will run fine on new systems, but without fresh patches you won't see any difference.

Luckily, adding HDR doesn't require a major code rewrite: a fairly simple mapping process expands the SDR color range into HDR without much effort.
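To give a feel for what such a mapping can look like, here is a deliberately naive Python sketch: it linearizes an 8-bit sRGB value with the standard sRGB transfer function and pins SDR reference white to a chosen luminance. Real HDR patches do far more (highlight reshaping, PQ or HLG encoding, per-title grading), so the 100-nit target and the whole approach are illustrative assumptions.

```python
def srgb_to_linear(value_8bit: int) -> float:
    """Standard sRGB decoding of one 8-bit channel to linear light (0..1)."""
    c = value_8bit / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def naive_sdr_to_hdr_nits(value_8bit: int, sdr_white_nits: float = 100.0) -> float:
    """Map an SDR channel value to absolute luminance by pinning SDR
    reference white (255) to sdr_white_nits. Purely illustrative: a real
    HDR regrade would also reshape highlights and encode to PQ or HLG."""
    return srgb_to_linear(value_8bit) * sdr_white_nits

for v in (16, 128, 235, 255):
    print(f"8-bit {v:>3} -> {naive_sdr_to_hdr_nits(v):6.2f} nits (SDR white at 100 nits)")
```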

HDR on the left, normal screenshot on the right. Nvidia is working on adding HDR to Rise of the Tomb Raider for PC. Expect a lot of patches for popular games once HDR displays become widely available.

This means that popular games may get remasters or patches that add HDR support, and the modding community will go further still, to classics the developers won't even look at. Unlike the simulated or "soft" HDR previously used in games and photography, hardware HDR makes a very noticeable difference in image quality, which the entertainment industry has already noticed. And yet we are still far from that bright, colorful nirvana.

The real problem facing PC players interested in HDR is that the technology barely exists on the platform yet. Nvidia is actively working on a patch for Rise of the Tomb Raider, and games that support HDR on consoles will probably get it on PC as well (Forza, Battlefield, Gears of War and others are being discussed). At the time of writing there is no firm information.

Until those patches and games appear, HDR will remain the preserve of Blu-ray and streaming content, along with console hits whose players will get to enjoy the new technology in full. It will still be some time before HDR reaches PCs, so sit back, enjoy your Steam catalog and look forward to the visual feast to come.

Everyone has been talking about HDR lately, and if you pay attention to the video settings in computer games, you're probably wondering what's so special about it - it has been around for years! Well, here's the thing: remember those "HDR" checkboxes you've seen in games ever since Half-Life 2? It turns out they're not 100% HDR.

The "HDR" option in your phone's camera app? That isn't quite it either, although, as with those false HDR checkboxes in game settings, the goal of improving the image is the same.

Confused? Don't be upset. When it comes to misconceptions about HDR, this is just the beginning. So what is HDR, and how can it change PC gaming? The answer to the first question is straightforward; the second will take some time to unpack. Read on to understand what HDR is all about and which monitors and graphics cards support it.

What is HDR?

HDR (literally "high dynamic range") is an umbrella term for technologies that expand the color and contrast range of displays far beyond the capabilities of standard hardware. Despite what you may have heard, resolution is not the most important component of image quality. Beyond the current limits of Windows interface scaling at around 110 DPI, the number of pixels matters much less than what is done with them.

Brightness, contrast and color saturation become paramount to image quality once the required resolution is reached, and HDR improves exactly these parameters. This is not just another incremental update: HDR imposes specific physical requirements on the hardware, and it doesn't take a trained eye to notice the change in the image.

HDR requires LCD screens to reach at least 1000 cd/m² (nits) of brightness to meet the new "Ultra HD Premium" standard. High-quality gaming monitors, which top out at about 300-400 nits, aren't even close. Good laptops aren't much better, offering only a hundred nits or so more. Even cell phones, with their sci-fi screens that stay readable in bright sunlight, only manage about 800 nits. When it comes to brightness, HDR-compatible screens give their predecessors no chance.

Colors are transformed as well: HDR calls for 10 or 12 bits per color channel. Most displays provide only 6 or 8 bits per channel, using the sRGB color space, which covers just a third of the visual spectrum HDR targets - in effect, we see colors as much poorer than they really are. And even where the hardware is capable of more, software quirks make the enhanced color modes inconvenient to use.


left - HDR spectrum, right - sRGB spectrum

Nowadays, PC monitors that support a wide color gamut (WCG) are typically used for specific applications such as image editing or medicine. Games and other software simply don't recognize the extra colors and often look badly calibrated when their narrower color space is shown on a wide-gamut display, unless the hardware does extra work to emulate that narrower space. HDR standards prevent the confusion by including metadata in the video stream, which helps handle the color space correctly, keeps applications looking right, and makes optimal use of the display's capabilities.

To handle the additional HDR data, HDMI 2.0a is also required, a long-overdue upgrade from the ubiquitous low-bandwidth HDMI 1.4 standard.

Existing problems with HDR technology

HDR promises a lot, but its future path is not yet clear. The biggest problems arise not from technical obstacles, but from competing, partially incompatible standards.

There are currently two main HDR standards. The first is the proprietary Dolby Vision, with 12-bit color depth and dynamic metadata. Its rival is the open HDR10 standard, which supports 10-bit color and provides only static metadata at the beginning of the video stream.
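The practical difference between the two metadata models can be sketched in a few lines of Python: with static metadata the display picks one tone-mapping ceiling for the whole title, while dynamic metadata lets it adapt scene by scene. The panel brightness, the per-scene peaks and the min() heuristic below are all invented purely for illustration.

```python
# Illustrative only: how a display might choose its tone-mapping ceiling
# with static metadata (one value per title) vs dynamic metadata (per scene).

PANEL_PEAK_NITS = 700.0                      # hypothetical TV peak brightness
scene_peaks = [180.0, 950.0, 4000.0, 60.0]   # invented per-scene peak levels

# HDR10-style: one static peak value for the whole stream forces one compromise
static_peak = max(scene_peaks)
static_target = min(static_peak, PANEL_PEAK_NITS)
print(f"Static metadata : every scene tone-mapped as if it peaked at {static_target} nits")

# Dolby-Vision-style: per-scene metadata lets modest scenes keep their headroom
for peak in scene_peaks:
    dynamic_target = min(peak, PANEL_PEAK_NITS)
    print(f"Dynamic metadata: scene peaking at {peak:>6} nits mapped to {dynamic_target} nits")
```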

Dolby Vision, which requires licensing fees and additional hardware, is the more expensive option, and that has limited its adoption. Many screen manufacturers and content providers have decided to back HDR10, including Microsoft with the Xbox One S and Sony with the PS4, which shows which way the games industry is leaning. As a result, HDR10 is considered the optimal choice for console fans.

Dolby Vision fans praise its color depth, its stricter hardware requirements and its ability to adjust color and brightness dynamically frame by frame, along with its compatibility with HDR10 content. But the market is moving toward the less expensive and perfectly adequate HDR10 standard.

Since HDR10 is not interchangeable with Dolby Vision, Dolby's technically superior but proprietary and expensive standard will likely stall in the coming years. Still, if you are an avid film watcher who buys for quality and content - exactly what Dolby's latest products promise - price may not matter that much, and you will happily enjoy the richer picture. Gamers, though, will likely have other priorities.

Where is HDR used today?

PC makers are rushing to join the HDR race, but you don't have to wait for them to try it out. HDR hardware is already available to consumers in the high-end TV market. It is also the best way to see HDR in action, since there is already a small but steadily growing library of HDR demos, films and shows with side-by-side comparisons that make the difference in the image obvious. All of our recommended TV models support HDR, making them a worthy monitor replacement if image size and quality matter more to you than fast response times.

Samsung's KS8000 and KS9800 series TVs can handle a variety of tasks as well as good PC displays, letting their owners be among the first to enjoy HDR content as it becomes available on PC. Just remember that all this costs a lot, and using a TV for gaming has some disadvantages.

HDR Compatible Graphics Cards

The one place where computers are already prepared for HDR is the graphics card market. While monitors lag behind their TV counterparts, mid-range and high-end GPUs have been ready for the change for almost a year now, thanks to healthy competition between Nvidia and AMD.

Nvidia has been talking about HDR since the introduction of the 900-series cards and now certifies all its Pascal-based cards as HDR-ready. AMD is slightly behind, with the 390X and the Polaris line as its first HDR-compatible cards. If you bought a video card recently, chances are that at least part of your system is already prepared for the change.

When will HDR appear in computer games?

The most difficult question is what all this means for gamers. While some new releases will ship with HDR support, older games will not be able to use the wider color and contrast capabilities without fixes. Those older games may run fine on HDR-equipped systems, but you won't see any improvement without fresh code.

Fortunately, adopting the technology does not require rewriting the software from scratch: a fairly simple mapping process can extend SDR color maps into HDR ranges without significant effort.

This means that popular games may later be patched or remastered with HDR support, and modders will go back to classics the developers won't even touch. Unlike the simulated or averaged HDR used in earlier games and in photography, hardware HDR is already gaining traction in the entertainment industry. Still, the ideal is some way off.

The real problem facing PC gamers interested in HDR today is that the platform does not yet have the technology. Nvidia is developing a patch for Rise of the Tomb Raider, and games that support HDR on consoles may get PC versions with it as well (Forza, Battlefield, Gears of War and others, for example). But nothing is certain yet.


left - HDR mode, right - standard

Until such patches and games appear, Blu-ray and console users will be the first to experience the enhanced visuals. So you can relax: it will be a while before HDR arrives on PC, so for now just enjoy your Steam catalog and wait.


When CES took place in early 2017, it became clear that computer monitors supporting the HDR standard would soon start filling store shelves. All the major manufacturers are already selling such models, each with impressive specifications. We'll look at one of them in detail soon, but for now let's focus on the theory that will help you decide whether buying an HDR-capable monitor is worth it right now.

HDR in PC format

The standard explanation describes HDR (High Dynamic Range) as a set of standards designed to expand the color and contrast range of video and images beyond what standard hardware can deliver. Simply put, HDR improves contrast, brightness and color saturation, producing a noticeably more detailed picture.

HDR vs SDR

In practical terms, for most users this means completely replacing their existing devices to make a clear difference in image quality. Why a complete replacement? Because standard devices, particularly monitors, do not meet the requirements for HDR certification.

Let's start with the brightness requirements. To be considered "HDR ready", a display must deliver at least 1000 cd/m² (nits) of brightness. High-end monitors provide 300-400 nits, i.e. nowhere near what is needed. Good laptops manage only about 100 nits more. Even smartphone displays designed for good visibility in bright sunlight rarely exceed 800 nits (the Galaxy Note8, at 1,200 nits, is one of the exceptions). In other words, 99% of today's displays do not support HDR.

Now let's move on to color reproduction. HDR technology requires the monitor to support 10-bit or 12-bit color depth. However, standard monitors are only capable of 6- or 8-bit color using the sRGB color gamut, which only covers a third of the HDR visual spectrum.

Monitor models that support Wide Color Gamut (WCG) technology meet the color requirements, but their extended capabilities only work with professional software (graphics editors, for example). Games and other software simply ignore the extra colors and often look "washed out" if the hardware cannot emulate the narrower color space.

HDR avoids this confusion by providing metadata that describes the color space correctly. This is what helps render the image properly and lets all software make optimal use of the display's capabilities.

Here, however, a big "but" needs to be added for those of you who work in photography, graphic design or video production. The brighter, more saturated colors of HDR monitors may not suit you - not because you won't like them, but because they won't meet your professional needs: their "liveliness" comes at the expense of realistic color reproduction. WCG models remain your ideal choice. So if you are reading this to find out what the technology offers your line of work, you simply won't find anything.
Two Dell designer monitors. On the left, a WCG screen with realistic color reproduction; on the right, an HDR display - the higher color saturation is easy to notice.

A muddle of standards

Next we'll talk about the experience from the perspective of the average user and PC gamer, but first let me untangle the huge tangle of HDR standards for you. There are currently four, but only two are widely used in consumer electronics: the proprietary Dolby Vision, with its 12-bit color and dynamic metadata, and the open HDR10 standard, which supports 10-bit color and only static metadata. The other two are HLG, developed by the BBC and used by YouTube, and Advanced HDR, created by Technicolor and used primarily in Europe.
Difference between SDR, HDR with static metadata (HDR10) and HDR with dynamic metadata (Dolby Vision).

But let's return to HDR in computer monitors, with an emphasis on gaming. Requiring a license fee and additional hardware, Dolby Vision is the more expensive of the two standards, and its cost is the main reason for its slow adoption. Even though Dolby Vision provides greater color depth and dynamic frame-by-frame adjustment, game developers are choosing the cheaper but adequate HDR10 - and that goes not only for PC manufacturers but also for the console makers: Microsoft (Xbox One S and Xbox One X) and Sony (PS4 and PS4 Pro). Major HDR10 proponents such as Samsung and Amazon are even actively disputing the argument that Dolby Vision delivers higher image quality. That struggle has produced an update of sorts called HDR10+, which addresses some of HDR10's weaknesses.

Does all this mean that HDR10 will become the universal HDR standard for computer monitors and games? Not at all. Dolby Vision's developers have recently made it easier to integrate their technology into games and GPUs through patches, firmware or driver updates, and this spring NVIDIA joined the ranks of key supporters of Dolby Vision in games.
NVIDIA booth at Computex 2017. On the left is a standard SDR monitor, on the right an HDR monitor. Photo: TechPowerUp

(PC)Gaming in HDR

Console players are luckier when it comes to HDR. They benefited from the inclusion of the standard in high-end TVs, and console manufacturers and game developers (specifically for consoles) quickly saw the visual advantage of HDR screens over standard TVs. From a purely practical standpoint, it's easier for a consumer to justify investing more in a screen that serves as the entertainment center of their home than one that sits on their desk.

However, PC gamers can thank their console comrades: the spread of HDR in TVs like LG's C6 and C7 series, which double as giant PC monitors, has allowed PC enthusiasts to enjoy the first wave of HDR content.

But still, which monitor models should you keep an eye on? Three of the most promising announced HDR monitors quickly disappointed by failing to meet all the HDR10 requirements, and therefore do not support true HDR. Two of them, the Dell S2718D and the LG 32UD99, can accept an HDR signal but lack the color range and brightness needed to do HDR content justice. The third, the BenQ SW320, meets the color requirements but not the brightness ones. That leaves the following models on the list: Acer Predator X27, Acer Predator X35, ASUS ROG Swift PG27UQ, ASUS ROG Swift PG35VQ, Samsung CHG70 and Samsung CHG90.
The ASUS ROG Swift PG35VQ is one of the most promising HDR models on the market at the moment.

The next logical question is: what is the situation with the GPU? In this regard, computers have long been prepared thanks to the war between NVIDIA and AMD, as well as their mid- and high-end video cards.

NVIDIA began integrating HDR into its Maxwell-generation GPUs (the earlier 900 series) and continued certification with the new 1000 series built on the Pascal architecture. AMD's first certified video cards were models of the 390X and Polaris families. Simply put, if your graphics card was manufactured within the last four years, you shouldn't have any problems. However, to take full advantage of what a new HDR display has to offer, you will have to invest in one of the latest video card models.

The real problem with HDR for PC gamers

If money is not an issue, buying an HDR monitor and the computer hardware to match won't be a problem. But before you run to the store, it is worth looking into the availability of suitable content - and the situation there is, unfortunately, so-so. Yes, new games are appearing with native HDR support, but older games cannot adapt to the technology, at least not without dedicated patches.

Integrating HDR does not require major changes to the software, but that does not change the fact that the amount of HDR content currently available to PC gamers is modest. In fact, only a handful of games support the standard: Shadow Warrior 2, Deus Ex: Mankind Divided, Hitman, Resident Evil 7, Obduction, Paragon, the patched version of Mass Effect: Andromeda, Need for Speed: Payback and Star Wars: Battlefront 2. The cross-platform Gears of War, Battlefield and Forza Horizon 3 support HDR in their console versions, but the feature is not available on PC. Some time ago NVIDIA was actively working on an HDR patch for Rise of the Tomb Raider, but there has been no news from the company about its progress for a long time.

Game developers are embracing the idea of HDR, but console games will be the first to support it, and PC gamers are left (again) in the background. It will be a few more years before HDR becomes a truly important feature of computer monitors. At the moment the standard is not among the must-have parameters a gaming monitor needs in order to be worth your attention. As with 4K, HDR is an investment in the future.

One piece of advice in closing: buy a monitor today that meets your current needs. If HDR matters to you, this nice bonus will cost a few hundred dollars extra, but it offers a (modest) guarantee that your brand-new monitor will stay relevant for a long time.


Modern technologies have become so firmly embedded in our lives that it is sometimes hard to keep track of their variety. When choosing a TV, for example, you have to weigh many parameters, some of which may puzzle an unprepared buyer. In this article we will talk about HDR, a technology that has become extremely relevant over the last year and a half.

In practice, until the beginning of 2016 HDR was found only in select TV models, and there was vanishingly little matching content. Fortunately, this began to change over the course of the year, with more and more manufacturers equipping their TVs with HDR support. Game console makers also added support for the technology in their updated models, and Sony went even further, enabling HDR on the original PlayStation 4 through a software update.

Most importantly, suitable content capable of showing what HDR-equipped devices can do has appeared and continues to appear.

So is all the fuss about HDR worth it, and should consumers invest in compatible hardware?

Image: tomsguide.com

What does it look like?

Let's try to figure it out and start by explaining the essence of the technology.

Any TV is characterized by contrast and color accuracy. Contrast determines how bright and how dark an image the device can display while keeping it distinguishable to the viewer. Color accuracy, in turn, describes how close the colors on the screen come to the real thing.

Curiously, most potential buyers, offered a choice between a TV with higher resolution and one with lower resolution but higher contrast, will pick the second. Saturation and variety of color are the priority when choosing; picture brightness wins out over resolutions above 4K. Buyers choose with their eyes.

What's the point?

HDR (High Dynamic Range), or extended dynamic range, makes the choice described above even more obvious: it makes light tones lighter and dark tones darker. HDR widens the color range and the maximum contrast, so images look deeper and richer. The standard primaries - red, green and blue - gain additional shades and combinations, which directly improves image quality.

WCG (Wide Color Gamut) technology goes hand in hand with HDR and further expands the available range of colors. Viewers who have never encountered these technologies before will be pleasantly surprised by how many more shades of the same, seemingly familiar colors they can see.


Image: wired.com

It is important to understand that the HDR technology being introduced into modern TVs and devices connected to them is seriously different from what has been present in our smartphone cameras for some time.

Television HDR increases contrast and the range of available colors to make the on-screen image more realistic and show it in natural tones. Camera HDR, in turn, combines several shots into one to obtain the best possible image, taking the most successful elements of all the frames captured. The difference between the two HDRs is therefore fundamental.
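For contrast, camera-style HDR works on the capture side. The toy Python function below merges one pixel's value across a bracketed set of exposures with weights that favor well-exposed mid-tones, which is the general idea behind exposure fusion; the pixel values are invented, and real camera pipelines also align frames and operate on whole images with far more sophisticated weighting.

```python
def midtone_weight(value: float) -> float:
    """Weight a pixel by how far it is from being crushed (0) or blown out (255)."""
    return max(0.0, 1.0 - abs(value - 127.5) / 127.5)

def fuse_exposures(exposures: list[float]) -> float:
    """Weighted average of the same pixel across a bracketed set of shots."""
    weights = [midtone_weight(v) for v in exposures]
    total = sum(weights)
    if total == 0:                      # every exposure clipped: fall back to a plain mean
        return sum(exposures) / len(exposures)
    return sum(w * v for w, v in zip(weights, exposures)) / total

# One pixel of a bright sky as captured at -2 EV, 0 EV and +2 EV (invented values)
bracket = [140.0, 230.0, 255.0]
print(f"Fused value: {fuse_exposures(bracket):.1f}")   # leans toward the usable 140/230 shots
```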

How is it implemented?

HDR technology consists of two integral parts: the display and the content.

The TV is actually the simpler part of the two: it needs to be able to light certain areas of the screen more brightly than a conventional set without HDR support can.

It's much more complicated with content, which must be produced with HDR in mind for high dynamic range to show up on screen. Most films and many TV series made in the last decade support HDR, and it can be brought out without any artificial additions to the original picture. In fact, the main obstacle to getting HDR content to your TV comes down to data delivery.

Video created with high dynamic range gets compressed so it can be streamed to your TV, computer or other device. As a result, the viewer at best sees a picture that the display tries to reproduce using whatever image-enhancement technologies are built into it.


Image: whathifi.com

That is why only content from certain sources is displayed in true HDR: your TV receives additional metadata telling it exactly how to render each specific scene. This assumes, of course, that the playback device supports the technology.

There are also hardware requirements. Not only your TV but also your player or set-top box must have an HDMI port of version 2.0 or higher. Most equipment released from 2015 onward supports the HDMI 2.0 standard, which can be upgraded to HDMI 2.0a in software; the newer version of the standard is needed to carry the metadata mentioned above.

Manufacturers have also agreed to award a UHD Premium certificate to TVs that support 4K resolution and HDR technology; it is worth looking for it when buying. It is also worth noting that the 4K Blu-ray format comes with HDR support by default.

Summing up

Of course, HDR technology in TVs is not yet as vital as manufacturers make it out to be, but it is now the main driving force in the industry. The race for resolutions above 4K has faded into the background, giving way to extended dynamic range.

Although the best results come from combining the two (higher resolution and HDR), at this stage it is preferable to choose a TV with HDR support unless you are willing to pay a premium for resolution above 4K. With suitable content, the picture quality will pleasantly surprise you either way. You can't fool your eyes: brighter, more saturated and more varied colors beat an ultra-high-resolution panel.

So when buying a new TV at the end of 2016, it is worth making sure it at least has HDR support; resolution above 4K remains a pleasant extra, but one that inflates the final price considerably without delivering the same emotional payoff.