HDR technology arrived on our TVs four years ago. Since then it has generated so much debate, and is so present in brands' marketing, that many users probably feel it has been with us much longer. But no: the main manufacturers began introducing it in their televisions in 2015.
The alliance that HDR technology and TVs have forged is very solid but, in fact, this technique was born in the 1930s at the hands of the American photochemist Charles Wyckoff, who conceived it to give his photographs a much wider dynamic range than the usual shots of the time. Knowing its origin helps us put this technology in perspective, and that is a good starting point for the goal of this article: to discover what this technique offers us today in our televisions, and which keys we need to know so that it does not intimidate us.
What is HDR and what impact does it have on image quality?
I am aware that most people reading this text are familiar with HDR technology and know more or less accurately what it is, but it does not hurt to review it to strengthen the foundation of this article. The acronym HDR stands for High Dynamic Range, which hints that we are talking about an innovation that seeks to offer us images with a very wide range of luminance. A simple, though not entirely rigorous, way to define luminance is to interpret it as the luminous intensity, or amount of light, that a surface is capable of emitting.
When we use the term “luminance range” in this context, what we are describing is nothing more than the ability of a television to deliver a set of levels of different luminous intensity. Or, put another way, a brightness scale with a specific gradation. The dynamic range of a television grows as the distance between the intensity of the darkest areas of the image and the intensity of the most illuminated areas increases. This tells us that both the minimum and the maximum brightness a television can deliver matter in this context.
But the brightness the TV can deliver is not all that matters; so does the number of distinct brightness levels it can reproduce. The more levels with different brightness we have between the minimum and the maximum, the better. A greater number of levels helps the television recover more information both in the darkest areas and in the highlights (the most illuminated areas).
The quality of the HDR offered by a television is conditioned by its ability to deliver brightness, its contrast ratio and the color space it is capable of reproducing
Once we reach this point the equation gets more complicated, because other technologies have an important impact on HDR quality and, above all, condition the contrast of the TV: the type of panel and, if it is an LCD panel, the backlighting scheme it uses. Each of the tiny organic cells of an OLED panel is capable of generating its own light, so these panels can turn each of their pixels on and off completely independently. This property allows them to deliver a very high native contrast and very deep blacks. And this high contrast has a positive impact on HDR, because very intense blacks accentuate, by direct comparison, our subjective perception of the intensity of the most illuminated areas of the image.
TVs that opt for an LCD panel, unlike OLEDs, require an additional light source because the panel itself cannot emit light. This need can be solved with one of two strategies: edge LED backlighting or FALD (Full Array Local Dimming) backlighting. In the first, the light source is placed at the margins of the panel, while the second uses a matrix of LED diodes placed directly behind the panel. This last option allows the backlight to be dimmed by zones with more precision, so it is the one that delivers the best results when it comes to achieving higher contrast and recovering more information in the darkest areas.
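The zone dimming idea can be sketched in a few lines of Python. This is a deliberately simplified illustration with hypothetical names; a real TV's algorithm also smooths transitions between zones and compensates the LCD pixel values to avoid halos:

```python
# Illustrative sketch of FALD-style local dimming: split a grayscale frame
# into zones and drive each zone's LEDs at the brightness of its brightest
# pixel, so dark zones can go nearly black while bright zones stay lit.

def fald_zone_levels(frame, zone_rows, zone_cols):
    """frame: list of lists of pixel brightness values in 0.0-1.0."""
    height, width = len(frame), len(frame[0])
    zh, zw = height // zone_rows, width // zone_cols
    levels = []
    for zr in range(zone_rows):
        row_levels = []
        for zc in range(zone_cols):
            zone_pixels = [frame[y][x]
                           for y in range(zr * zh, (zr + 1) * zh)
                           for x in range(zc * zw, (zc + 1) * zw)]
            row_levels.append(max(zone_pixels))  # brightest pixel rules the zone
        levels.append(row_levels)
    return levels

# A frame that is dark on the left half and bright on the right:
frame = [[0.02] * 4 + [0.9] * 4 for _ in range(4)]
print(fald_zone_levels(frame, 2, 2))  # left zones stay dim, right zones bright
```

With edge lighting there is effectively a single zone, so the dark half of this frame would receive the same backlight as the bright half, raising its black level.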
As we have seen, the design of OLED panels allows them to deliver the highest native contrast and the deepest blacks, while the additional light source that LCD panels require allows the latter to deliver more light. At this point it is reasonable to ask which of these two technologies offers the best experience when playing HDR content. The answer is interesting: the result is conditioned by all the technologies that have a direct impact on the image quality of the TV, and not only by the type of panel used.
A model with an OLED panel can be less bright than a comparable LED LCD TV and still offer spectacular HDR, because the combination of its high native contrast, its ability to recover information in dark areas and highlights, and its light delivery capacity makes it possible. At the same time, an LCD TV with FALD backlighting can also offer us superb HDR, despite having a native contrast much lower than that of an OLED model, because its high brightness can compensate to some extent for the absence of blacks as deep as those of OLED technology.
UHD Alliance has defined two different HDR standards: one for LED LCD TVs and the other for OLED models
In our perception of dynamic range, the intensity levels of the darkest areas are as important as those of the most illuminated areas, because that perception is built from the relationship between the two. In fact, the distance separating both limits, and the number of levels between them, matter more than their absolute values, accepting that one limit is the minimum possible level of luminous intensity and the other the maximum brightness level. We can picture this idea as a recipe whose flavor benefits more from the balance between all the ingredients than from the predominance of any one of them.
The companies and organizations involved in defining HDR standards are aware of this “recipe”, which has led UHD Alliance, an association in which practically all consumer electronics brands, film producers and television studios are represented, to define two different standards: one for LCD TVs and another for OLED models.
The former, the LCDs, are considered HDR-capable if they can deliver brightness peaks of at least 1,000 nits and a minimum luminous intensity level equal to or less than 0.05 nits. OLED TVs, on the other hand, qualify to play HDR content if they manage to deliver brightness peaks of at least 540 nits and their minimum luminous intensity level is equal to or less than 0.0005 nits. As you can see, these standards are adapted to the characteristics of LCD and OLED panels, but they are not the only conditions a TV must meet to play HDR content.
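A quick calculation shows the contrast ratio each tier implies, and why the OLED tier can trade a lower brightness peak for much deeper blacks:

```python
# Contrast ratio implied by the two certification tiers described above:
# peak brightness divided by the minimum black level, both in nits.

def contrast_ratio(peak_nits, black_nits):
    return peak_nits / black_nits

lcd = contrast_ratio(1000, 0.05)     # LCD tier: >= 1,000 nits, <= 0.05 nits
oled = contrast_ratio(540, 0.0005)   # OLED tier: >= 540 nits, <= 0.0005 nits

print(f"LCD:  {lcd:,.0f}:1")   # 20,000:1
print(f"OLED: {oled:,.0f}:1")  # 1,080,000:1
```

Even with roughly half the peak brightness, the OLED tier's far lower black floor yields a contrast ratio around fifty times higher, which is the balance-of-ingredients idea from the previous paragraphs in numbers.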
The depth of color also matters, and much
HDR technology came to the world of televisions with the ambition of offering us images closer to what we can see in the real world, and this goal is only achievable if, in addition to brightness delivery and contrast ratio, we attack other fronts. The one clearly defined in the HDR formats competing most strongly in the market is color depth, which tells us how many tones a TV can reproduce. As you can imagine, a greater capacity for color reproduction allows a television to offer us images closer to those we can see in the real world. And that is the goal.
In the next section of the article we will look into the “rules” stipulated by the current HDR standards, but at this point it is worth knowing that the HDR10 specification requires a color depth of 10 bits (although there are tricks that allow this requirement to be circumvented, as we will see below), and Dolby Vision works with 10 bits but ideally proposes a color depth of 12 bits. The televisions we can currently find on the market incorporate 8- or 10-bit panels, so 12 bits remains an ambition that, for the moment, we can only touch with our fingertips.
An 8-bit panel can reproduce 256 different shades (2⁸) of each of the basic colors that make up the images (red, green and blue). This means the TVs that use them can reproduce a space of up to 16.7 million different colors (256³). This color range is not bad, but it pales in comparison with the color space that 10-bit panels can reproduce. If we do the same calculations with these panels, we find that they can represent each basic color with a range of 1,024 different shades (2¹⁰).
10-bit panels can reproduce a space of more than 1.073 billion different colors
As a result, panels with a color depth of 10 bits manage to reproduce a space of just over 1.073 billion different shades (1,024³). As you can see, the difference between them is enormous, which means that the realism of the images on a 10-bit panel, and their chromatic precision, are significantly greater than those offered by 8-bit panels.
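The figures above come from simple arithmetic, which we can verify in a few lines:

```python
# Shades per primary and total representable colors for a given panel
# bit depth: 2^bits levels per channel, cubed across R, G and B.

def color_space(bits_per_channel):
    shades = 2 ** bits_per_channel   # levels per primary (R, G, B)
    return shades, shades ** 3       # total representable colors

for bits in (8, 10, 12):
    shades, colors = color_space(bits)
    print(f"{bits}-bit: {shades:,} shades per primary, {colors:,} colors")
# 8-bit:  256 shades per primary, 16,777,216 colors (~16.7 million)
# 10-bit: 1,024 shades per primary, 1,073,741,824 colors (~1.073 billion)
```

The same formula shows why Dolby Vision's 12-bit ambition matters: it would yield 4,096 shades per primary, or about 68.7 billion colors.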
However, there is a technique that can, to some extent, help 8-bit panels offer somewhat more accurate color reproduction. It is a processing algorithm that uses FRC (Frame Rate Control) techniques to generate a wider color space, using adjacent pixels of different colors that give us the feeling of seeing a third color that, in reality, is not part of the 8-bit panel's original color space. Several weeks ago we had the opportunity to analyze Hisense's ULED H65U9A TV, which has an 8-bit VA panel and uses this technique; its colorimetry during our tests was convincing, but it does not match the performance of a native 10-bit panel in this area.
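The dithering idea behind FRC can be illustrated with a toy sketch. This version works temporally, alternating two adjacent panel levels across frames so that their average approximates an in-between shade; real implementations combine temporal and spatial dithering and are considerably more sophisticated:

```python
# Toy sketch of FRC-style temporal dithering: an 8-bit panel cannot show
# a fractional level such as 100.25, but it can alternate levels 100 and
# 101 across frames so the eye averages them into that in-between shade.

def frc_frames(target, frames=4):
    """Return the per-frame levels whose average approximates `target`."""
    low, high = int(target), int(target) + 1
    fraction = target - low
    high_count = round(fraction * frames)  # frames shown at the higher level
    return [high] * high_count + [low] * (frames - high_count)

seq = frc_frames(100.25, frames=4)
print(seq, sum(seq) / len(seq))  # [101, 100, 100, 100] averages to 100.25
```

Repeating this over 4-frame groups lets an 8-bit+FRC panel simulate the four intermediate steps that separate consecutive levels of a true 10-bit panel, which is why such sets can accept a 10-bit signal even though the native panel cannot display it directly.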
Interestingly, there are also televisions equipped with native 10-bit panels that carry out 12-bit color processing and management, which provides an extra level of quality that is also worth taking into account. However, this technique is usually present only in high-end and premium-range models, so its price is almost always higher than that of televisions that incorporate a standard 10-bit panel and lack this processing.
These are the currently consolidated HDR standards
Before we look into the characteristics of the HDR formats we can find today on the market, it is important to bear in mind that this technology does not depend only on the display device, regardless of whether it is a television, a projector or a monitor. The first link in the playback chain is the content, and if it does not incorporate this technique and does not conform to one of the standards we are going to review next, there is nothing to be done.
Fortunately, some video-on-demand services, such as Netflix, Amazon Prime Video or Rakuten TV, have an increasingly broad catalog of HDR content. We also have interesting options, although it is a less popular format, if we turn to 4K Blu-ray movies, which also tend to embrace HDR. In any case, the supply keeps growing, and it is likely that in the medium term practically all films and series of any significance will incorporate this technology.
HDR10

This is undoubtedly the most widespread standard, for two fundamental reasons: its requirements are the least demanding, and it is open, so any consumer electronics manufacturer can use it without paying license fees. For a TV to be compatible with HDR10, it must have a panel with a color depth of 10 bits and be capable of delivering brightness peaks of at least 1,000 nits. An interesting point: these brightness peaks are measured in some areas of the screen, not across the whole panel simultaneously, which makes it easier for brands to reach this mark.
An important shortcoming of this standard, one that other options solve as we will see next, is that it does not use dynamic metadata. The television receives precise instructions on how the lighting should be treated when playback of the content starts, and those conditions do not vary for the rest of the playback. In any case, all TVs that are Ultra HD Premium certified are, at the very least, compatible with HDR10. This standard was defined by the UHD Alliance consortium to establish common requirements that all latest-generation television sets aspiring to wear this logo must meet.
HDR10 requires a 10-bit color depth and a maximum brightness delivery capacity of at least 1,000 nits
An Ultra HD Premium TV, whatever its brand, must have a 10-bit panel and at least 4K UHD resolution (3,840 x 2,160 pixels), it must be able to reproduce at least 90% of the DCI-P3 color space, and it has to meet the brightness delivery conditions we reviewed a few paragraphs above: OLED models must deliver a minimum brightness not exceeding 0.0005 nits and a maximum brightness of at least 540 nits, while LED LCDs move between 0.05 nits and 1,000 nits or more.
On the market we can find televisions that do not carry the Ultra HD Premium logo, and therefore fail to meet at least one of the requirements of this certification, but that, curiously, do boast of being able to play HDR content. Normally they are not Ultra HD Premium because they use an 8-bit panel or fail to deliver brightness peaks of at least 1,000 nits. Putting all these models in the same bag may be unfair in some cases, but the reasonable thing is to be wary if we come across one of them, because the experience they normally offer with HDR content falls short of what you would expect from this technology.
HDR10+

This standard is an extension of HDR10 and, as such, requires that TVs aspiring to incorporate it use a 10-bit panel and be capable of delivering brightness peaks of at least 1,000 nits. Initially it was backed by Samsung, Amazon and Panasonic, but more and more companies are betting on it because, like HDR10, it is a free standard and does not require paying license fees.
The interesting thing is that there is a very important difference between HDR10 and HDR10+ that allows the latter specification to offer a more accomplished experience with HDR content: it can work with dynamic metadata. This information is associated with the content we are playing and indicates precisely how the television should treat the lighting. The difference is that HDR10+ metadata is dynamic and therefore tells the TV how it should illuminate each sequence, instead of doing so only at the start of playback with no possibility of adjusting it afterwards. This improvement places HDR10+ a bit closer to Dolby Vision, a proposal that, as we will see below, is more ambitious on other fronts.
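The static-versus-dynamic distinction can be illustrated with a toy sketch. The field names below are hypothetical simplifications invented for this example; the real metadata formats (SMPTE ST 2086 for HDR10, SMPTE ST 2094 for HDR10+) are far richer:

```python
# Toy illustration of static vs dynamic HDR metadata. Field names are
# hypothetical; the point is only the per-title vs per-scene distinction.

static_metadata = {                    # HDR10-style: one set per title
    "max_content_light_level": 1000,   # nits, for the whole film
}

dynamic_metadata = [                   # HDR10+-style: hints per scene
    {"scene": 1, "peak_nits": 200},    # dim interior scene
    {"scene": 2, "peak_nits": 950},    # bright outdoor scene
]

def tone_map_target(scene_number):
    """With dynamic metadata the TV picks a per-scene target; with only
    static metadata it must use the same figure for every scene."""
    for entry in dynamic_metadata:
        if entry["scene"] == scene_number:
            return entry["peak_nits"]
    return static_metadata["max_content_light_level"]

print(tone_map_target(1), tone_map_target(2))  # 200 950
```

With only the static figure, a TV dimmer than 1,000 nits has to apply one tone-mapping compromise to the whole film; the per-scene values let it keep the dim scene intact and compress only the bright one.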
Dolby Vision

From a strictly technical point of view, the standard proposed by Dolby is the most ambitious. It has been designed to work with panels with a color depth of 12 bits and can manage brightness deliveries of up to 10,000 nits. Currently no TV offers these features (neither 12-bit panels nor a maximum brightness of 10,000 nits), so the models already compatible with this technology make do with 10-bit panels and more modest brightness deliveries. This tells us something important: we still cannot extract the full potential of Dolby Vision.
In addition, this standard uses dynamic metadata. In fact, HDR10+ was in some way inspired by this feature of Dolby Vision to reduce the distance separating HDR10 from Dolby's proposal. As you can see, everything looks great so far, but now comes the main weakness of this specification: the technology belongs to Dolby, so manufacturers of televisions and other devices that want to use it must pay the relevant license fees to the San Francisco company and introduce a specific processing chip into their products. The impact of implementing this certification on the price of televisions has meant that only a few brands bet on Dolby Vision, such as LG, Panasonic or Sony, and, moreover, only in their high-end and premium-range TVs.
HLG (Hybrid Log-Gamma)
This HDR format is not an alternative to HDR10+ or Dolby Vision. In reality, it is a complement that aims to reach where those two standards have not “attacked”: television broadcasts, whether live or recorded. In fact, its creators and main promoters are the BBC and NHK, the public broadcasters of the United Kingdom and Japan respectively. HLG has been very well received by consumer electronics manufacturers from the beginning because it is a free format, which has led almost all television manufacturers to introduce it in their products.
This standard has a great advantage: it can coexist without any problem with the installed base of televisions that, due to their age, are not HDR-compatible. If a model receives a signal with HLG and is not able to interpret it, it will simply play the content completely normally, just without HDR. But when a TV with HLG support receives a signal that incorporates this technology, it interprets it correctly and shows the content with HDR.
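This backward compatibility comes from the shape of the HLG transfer curve defined in ITU-R BT.2100: the lower half of the signal follows a conventional gamma-like square-root segment an SDR set can display acceptably, while the logarithmic upper half carries the HDR headroom. A short sketch of the opto-electrical transfer function (OETF):

```python
import math

# Sketch of the HLG OETF from ITU-R BT.2100: maps normalized scene light
# e in [0, 1] to a signal value in [0, 1]. Below 1/12 the curve is a
# gamma-like square root (SDR-compatible); above it, logarithmic (HDR).

A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_oetf(e):
    if e <= 1 / 12:
        return math.sqrt(3 * e)             # SDR-compatible segment
    return A * math.log(12 * e - B) + C     # logarithmic HDR segment

print(round(hlg_oetf(1 / 12), 3))  # 0.5: the two segments meet here
print(round(hlg_oetf(1.0), 3))     # 1.0: full-scale highlight
```

An SDR set simply interprets the whole signal through its usual gamma: midtones land close enough to look normal, and only the brightest highlights are rendered less faithfully, which is exactly the graceful degradation described above.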
Of course, it is important to bear in mind that this standard does not use metadata, so its impact on our experience is smaller than that of HDR10+ and Dolby Vision. In addition, this technology is still being deployed and has only been used in the transmission of some sporting events, so it does not look like we will be able to enjoy it on a massive scale in the short term.
Technicolor HDR or Advanced HDR
This standard has so far received a very timid reception from consumer electronics brands (the one defending it most firmly is LG), so it is lagging behind HDR10+, Dolby Vision and HLG. Behind it is Technicolor, a French company very well established in the film industry practically since its inception. Even so, although it has been developed by a company that obviously has business interests, it is an open standard, unlike Dolby Vision.
Advanced HDR offers solutions for two different use scenarios: the transmission of television broadcasts with HDR via DVB-T/T2 or satellite, and the conversion of content lacking HDR to equip it with this technology. Its way of proceeding in the first of these scenarios is to encode the content during recording with HDR and a 10-bit color depth using the HEVC video compression standard. Next, the encoder processes the HDR video to generate, on the one hand, a video stream without HDR and with an 8-bit color depth, and, on the other, the metadata needed to recover the information associated with the HDR.
This strategy has two advantages that can make a difference in television broadcasts: it saves bandwidth when broadcasting content in both HD and 4K UHD, and it allows the content to be played correctly by television sets that lack HDR, which will simply ignore the metadata. Separately, Technicolor has developed an algorithm that, according to the company, performs “intelligent color management” capable of analyzing non-HDR content in real time in order to adjust the luminance of the dark regions, the midtones and the most illuminated areas, giving the content HDR. On paper it does not look bad, but we cannot draw conclusions without knowing what real impact this technology has on image quality.
And the options that generate confusion: Q HDR 1000, HDR Converter and others
The five HDR formats we have reviewed so far are the ones with the most support; they have established themselves as standards and have a clear vocation for multi-brand adoption, even when they have been developed by a single company. However, it is enough to take a walk through any store that sells TVs, or browse the websites of the major manufacturers, to realize that brands use many other names to describe the HDR capabilities of their TVs. Samsung, for example, uses the logos Q HDR 1000, Q HDR 1500, Q HDR 2000, and so on. In fact, its more advanced models are compatible with HDR10+, and the number that follows the “Q HDR” designation indicates the maximum brightness delivery capacity of each model: it simply reflects the maximum nits. It is evident that these logos are nothing more than a marketing strategy.
All manufacturers use their own names to describe the HDR capabilities of their TVs, which can cause confusion among users
All brands make moves similar to this one. LG, for example, implements HDR Converter technology on its OLED TVs, which is nothing more than an algorithm that adjusts color, brightness and contrast in real time to emulate HDR in content that lacks this feature. It is not a bad idea, but it is important that users are able to distinguish between the authentic HDR standards, which propose a content encoding and playback strategy that has a clear impact on image quality and can be adopted by any brand (whether or not a fee is paid for using the format); the proprietary technologies that brands develop exclusively for themselves; and the designations that only seek to attract the attention of potential buyers and lack the support of other brands.
Who is who in the battle of the HDR
The HDR standard with the most support today is, without a doubt, HDR10. As we have seen, it is a free standard and, as such, it has been blessed by virtually all consumer electronics brands, by content creators and by the main video-on-demand platforms. The reception of HDR10+ has been more timid: for now it is backed by Samsung, 20th Century Fox, Amazon (through its Prime Video platform) and Panasonic, although other brands may gradually join them, because its ability to handle dynamic metadata makes it the most solid alternative to Dolby Vision.
This last proposal, Dolby's, is the most ambitious in terms of the image quality it can offer. It enjoys fairly significant support, but clearly less than HDR10 has received, which is understandable considering it is a proprietary standard. Some of the companies that support it are LG, Sony, Panasonic, Amazon (through its Prime Video platform), Netflix, Rakuten TV, Apple (through the iTunes movie service), VUDU, Google (through Chromecast Ultra), Hisense, TCL and Vizio.
The HLG standard also has fairly significant backing. It is championed by Sony, LG, Samsung, Philips, Loewe, JVC, Panasonic, Apple, Hisense and Toshiba, among other brands. And it will probably be supported by more companies in the future, because it is a non-proprietary standard that is relatively easy to implement on any device already compatible with HDR10. The turning point will possibly come when the deployment of this technology is completed and television channels begin to broadcast content with HLG. At that point it is reasonable to expect it to receive even more support.
Finally, Technicolor's proposal has, for the moment, rather timid support if we stick to the consumer market. In the film industry it does have strong backing, and we can find it in many of the films currently showing in theaters. But in the consumer electronics market, Advanced HDR is currently supported by LG, Philips and Funai. It is possible that some Chinese television manufacturers, such as Hisense or TCL, will get on the bandwagon, because several Chinese television stations have confirmed that they will use this format in the future, but the current situation seems to indicate that it still has a long way to go.
HDR and videogames
Video games benefit from HDR technology just as much as movies do, and its impact on our experience is just as strong, as long as all the links in the chain handle it correctly. We can currently enjoy a game with HDR using a PC equipped with an NVIDIA GeForce GTX 10-series graphics card or later, or an AMD card with a Polaris GPU or later, as well as an HDR-compatible monitor.
If we stick to consoles, the PlayStation 4, PlayStation 4 Pro, Xbox One S and Xbox One X can process games with HDR. We also, of course, need to connect them to an HDR television. But what kind of HDR? Simply HDR10. It is the standard most widely used in the world of video games, although Dolby Vision also has some minimal support on Windows and Xbox. However, the most popular Dolby technology in the gaming industry is Atmos. One last important note: as with movies, HDR must be implemented in the games themselves. Otherwise, even if our console and our television incorporate this technology, we cannot enjoy it.
How to find out which HDR our new TV supports
Confirming whether a television implements any of the HDR standards discussed in this article should not require any effort from the user. This technology is important enough for brands to reflect it clearly both on the packaging of their televisions and in their advertising. And they usually do. This information is also almost always included in the specifications found in the TV's manual and on the brand's website.
Even so, I suggest we imagine we have a relatively modern television and have kept neither its instructions nor its packaging. To complicate things further, let's accept that we cannot find its specifications on the manufacturer's website either. This situation is unlikely, but we still have room for maneuver. The easiest way to find out whether the TV is compatible with HDR technology is to turn to a video-on-demand service that we know for certain offers HDR content.
We assume we are dealing with a relatively modern television that has some Smart TV platform pre-installed. Otherwise, we can settle the matter at a stroke, because it almost certainly will not have HDR. Once we accept this starting point, we can assume it will have pre-installed, at the very least, the Netflix app, which is possibly one of the most popular at the moment.
This platform has a significant amount of content compatible with HDR10 and Dolby Vision, so we just have to look for any movie we know for certain has HDR, such as ‘Annihilation’, or a series like ‘Mindhunter’, among many other options. If our TV is compatible with HDR, the Netflix app will reflect it clearly on the title's info page. This same check can also be carried out with Amazon Prime Video, Rakuten TV or other video-on-demand platforms. If after all this we confirm that our TV does indeed have HDR, we just have to get comfortable and find the content that lets us fully enjoy this technology.