Talk:High-dynamic-range television


This article uses American English dialect and spelling.
According to WP:ENGVAR, this should not be changed without broad consensus.

Nothing about HDR-capable TVs and sources of HDR video for them.

I came to this article because I was curious about the utility of buying an HDR-capable TV. However, I find no helpful information. It is not even clear if I need to worry about which standard the TV supports. It is not clear what my options would be for sources of HDR video to exploit the capability of such a set. DrHow (talk) 23:00, 19 November 2016 (UTC)[reply]

Update: After posting the above criticism of the article, I came across the following article: http://arstechnica.com/gadgets/2016/12/high-dynamic-range-explained-theres-a-reason-to-finally-get-a-new-tv/ It does a very good job of answering the questions I had originally come to Wikipedia to resolve. I suspect that it would be appropriate to include a link to the Ars Technica article in the Wikipedia article; but I am not sure about the criteria which justify such a link. DrHow (talk) 18:57, 14 December 2016 (UTC)[reply]

Opening paragraph needs a lot of work

This is the current opening paragraph:

High-dynamic-range video (HDR video) describes high dynamic range (HDR) video that is greater than standard dynamic range (SDR) video which uses a conventional gamma curve. SDR video, when using a conventional gamma curve and a bit depth of 8-bits per sample, has a dynamic range of about 6 stops (64:1). When HDR content is displayed on a 2,000 cd/m2 display with a bit depth of 10-bits per sample it has a dynamic range of 200,000:1 or 17.6 stops. You will not find this on most displays.

It's pretty terrible. Let's start with the first sentence:

High-dynamic-range video (HDR video) describes high dynamic range (HDR) video...

So, HDR video describes HDR video?

... that is greater than...

What does it mean to be "greater"? Greater in what measurement?

... standard dynamic range (SDR) video which uses a conventional gamma curve. SDR video, when using a conventional gamma curve...

One sentence implies that using a conventional gamma curve is part of what defines SDR video, while the next seems to indicate that a conventional gamma curve is merely optional for SDR video. There's also no reason to repeat that SDR video uses a conventional gamma curve twice in such close succession.

... and a bit depth of 8-bits per sample, has a dynamic range of about 6 stops (64:1). When HDR content is displayed on a 2,000 cd/m2 display with a bit depth of 10-bits per sample it has a dynamic range of 200,000:1 or 17.6 stops.

Why are so many dense specs/units/measurements just dropped in like this? They don't really add to understanding. Why does the HDR spec mention a certain required cd/m2 display, but the SDR spec doesn't? And after reiterating that part of what sets SDR video apart is its use of a "conventional gamma curve", why isn't the type of gamma curve HDR video uses given?

You will not find this on most displays.

Is this still true? When is it true? Does it need to be said in the intro paragraph? It seems like it would be best to just explain what HDR video is in the intro paragraph and keep further information about how it's being used in the rest of the content. Onlynone (talk) 13:54, 7 September 2017 (UTC)

The whole article is pretty terrible IMO. It is way too technical for Wikipedia's lay audience of non-engineers. — Preceding unsigned comment added by Chris319 (talkcontribs) 00:45, 13 May 2018 (UTC)[reply]

No mention of VESA DisplayHDR Specifications?

https://displayhdr.org/performance-criteria/ --195.137.93.171 (talk) 20:26, 8 July 2018 (UTC)[reply]

It has been added by someone. Ff4ever (talk) 12:52, 25 April 2021 (UTC)[reply]

SDR contrast ratio

The source of the 1:64 contrast ratio claim seems to suggest that this is the highest one can get if the video is to be wholly free of banding. Considering that few prioritize perceptually perfect precision over dynamic range, and that it is still difficult for anyone to detect the lack of precision in conventional SDR video, this point seems rather moot. Logically, with the ITU-R standards, digital video has a contrast ratio of 1:990 (the smallest non-zero level being 1/(220×4.5) of peak), excluding black. This needs some clarification. 80.162.33.59 (talk) 19:59, 19 September 2018 (UTC)[reply]
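For reference, here is a minimal sketch of the arithmetic behind both figures, assuming the 220 usable code values of 8-bit limited-range video and the 4.5 linear-segment slope of the BT.709 OETF:

    import math

    # Banding-free reading of the source: about 6 stops of dynamic range.
    ratio_banding_free = 2 ** 6                      # 64:1

    # Smallest-representable-level reading: the first code value above black
    # maps to 1/(220 * 4.5) of peak, per the BT.709 linear segment.
    ratio_first_code = 220 * 4.5                     # 990:1
    stops_first_code = math.log2(ratio_first_code)   # ~9.95 stops

    print(f"{ratio_banding_free}:1 = 6 stops")
    print(f"{ratio_first_code:.0f}:1 = {stops_first_code:.2f} stops")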

The opening paragraph says it all.. or basically nothing

[1]

Greater range of what, exactly? Different colours? Contrast? Brightness? Very poorly written article. I'd like to add to the other comments that a very basic explanation of what HDR actually does should be added. — Preceding unsigned comment added by Quaelgeist (talkcontribs) 17:21, 3 October 2018 (UTC)[reply]

References

  1. ^ High-dynamic-range video (HDR video) describes video having a dynamic range greater than that of standard-dynamic-range video (SDR video), which uses a conventional gamma curve.[1] SDR video, when using a conventional gamma curve and a bit depth of 8-bits per sample, has a dynamic range of about 6 stops (2⁶ = 64:1).[1] When HDR content is displayed on a 2,000 cd/m² display with a bit depth of 10-bits per sample it has a dynamic range of 200,000:1 or 17.6 stops,[1] a range not offered by the majority of current displays.[1]

Please substantiate the claim that HDR10+ only supports 1000/4000 nits?

This article repeats claims (taken from a Dolby source) that Dolby Vision is superior to HDR10+ because the former supports 10,000 nits while the latter allegedly only supports 4,000 nits. Can we either get a primary source for this that is not "a claim made by Dolby or an article repeating a claim made by Dolby", or remove it? I could not find any such limit in the standards HDR10+ is based on (e.g. SMPTE ST2094-40), which in fact specify that the coded metadata's mastering display brightness also has an upper bound of 10,000 nits. (And, of course, the PQ function HDR10+ is based on supports the exact same range as Dolby Vision, so the argument doesn't really make sense to begin with.)

If I do not receive an answer to this question, then after some further research I will edit the article to remove this passage, in order to avoid misrepresenting HDR10+ as a result of Dolby marketing (which has been known to consist mostly of blatant lies for pretty much as long as Dolby has been around). 85.216.96.210 (talk) 21:58, 7 November 2018 (UTC)[reply]
I initially added some citation-needed tags on this claim, but after seeing this discussion I've decided to WP:BOLD and remove it. --Rcombs (talk) 03:16, 18 December 2018 (UTC)[reply]
All HDR formats that use PQ support 10,000 nits. This includes HDR10, HDR10+ and most Dolby Vision profiles. Content can, however, be graded on a more limited mastering display. Today, Dolby Vision requires at least 1,000 nits for the mastering display, and content is commonly graded at 4,000 nits. There is no minimum requirement for HDR10, HDR10+ and HLG10; today, content is commonly mastered at 1,000 to 4,000 nits. See the new comparison table and this source: https://www.rtings.com/tv/learn/hdr10-vs-dolby-vision - Ff4ever (talk) 12:48, 25 April 2021 (UTC)[reply]
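To make the shared ceiling concrete, below is a minimal sketch of the SMPTE ST 2084 (PQ) EOTF that all of these formats use; the constants come from the standard, and a normalized signal of 1.0 evaluates to exactly 10,000 cd/m² regardless of format (the function name is just illustrative):

    # SMPTE ST 2084 (PQ) EOTF: normalized signal E' in [0, 1] -> luminance in cd/m^2.
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875

    def pq_eotf(signal: float) -> float:
        """Map a normalized PQ signal value to absolute luminance in nits."""
        p = signal ** (1 / m2)
        return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

    print(pq_eotf(1.0))   # 10000.0 -- the ceiling for any PQ-based format
    print(pq_eotf(0.5))   # ~92 nits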

HDR does not mean deeper blacks

The first paragraph's statement that HDR means deeper blacks is false. The only limiting factor on how dark the blacks are is the minimum black level of the display itself. HDR only increases the maximum nit value contained in the image encoding; it does nothing to lower the intrinsic black levels of the source content. See the following article for a technical description of HDR and a lot of the misconceptions being spread about it (including by Wikipedia, it would seem): https://www.lightillusion.com/uhdtv.html 94.175.102.211 (talk) 17:41, 21 July 2019 (UTC)[reply]

Since this article is not only about content/encoding, I've edited the opening paragraph to be more specific. --Ajul1987 (talk) 18:26, 6 August 2019 (UTC)[reply]

Mmm. No. The sRGB curve (in violet in the graph at https://docs.microsoft.com/en-us/windows/win32/direct3ddxgi/images/hdr-bright-and-dark.png) shows that when light intensity falls below 0.1 nit or rises above 100 nits, there is no more differentiation in the color value, while with HDR (ST.2084 PQ) the range is 0.001 to 10,000 nits. See https://docs.microsoft.com/en-us/windows/win32/direct3ddxgi/high-dynamic-range-and-wide-color-gamut — Preceding unsigned comment added by ZBalling (talkcontribs) 01:11, 9 November 2019 (UTC)[reply]
Cite for "deeper blacks" added ZBalling (talk) 01:22, 9 November 2019 (UTC)[reply]
I don't think that particular graph is an accurate literal representation. For example, if you read the paragraph above the graph:
"Paper white defines how bright white should be, for example in a controlled dark environment like a movie theater, 80 nits is typically used, in contrast to a PC monitor which could be 220 nits (one "nit" is short for one candela per square meter, and is a unit of light intensity)."
Indeed, most SDR transfer functions practically operate in terms of the display's "native" white rather than an absolute luminance like PQ defines. Technically sRGB does define a white of 80 nits, but in practice few people use it this way, and even the graph is not set to this level. Furthermore, sRGB does not define a non-zero black point, and a zero black point was not a reasonable representation of what consumer displays could do until the relatively recent advent of OLEDs. Some other SDR standards do, however, define or take into account a display black point, such as BT.1886, which to my understanding is the most popular current display standard for SDR video. Perhaps the display black point is where the "0.1 nits" of the graph came from. Either that, or they took the value of the first 8-bit code point above zero for a white level equal to a typical-ish PC monitor. But "0.1 nits" is not in the sRGB definition itself.
Ultimately it's not possible to strictly say whether "HDR" involves "deeper blacks" or not unless you specify whether you are referring to capture, production, encoding, display, etc. and which specific standards, products, etc. you are comparing. That said, if I were forced to pick either "yes" or "no" with no elaboration, I would say "yes, HDR means deeper blacks". Otherwise, you could just as easily argue that HDR doesn't mean brighter whites either, since in theory you could display an SDR image on a display with a native white point of 10,000 nits, or since you can't get more than 400 nits out of a 400-nit "HDR" display! --Ajul1987 (talk) 23:40, 7 January 2020 (UTC)[reply]
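As a reference for the BT.1886 point above, here is a minimal sketch of the EOTF from Annex 1 of that recommendation; it is explicitly anchored to both the display's white and black luminances, and the parameter defaults below are only illustrative:

    def bt1886_eotf(v: float, lw: float = 100.0, lb: float = 0.1) -> float:
        """BT.1886 Annex 1 EOTF: normalized signal V in [0, 1] -> luminance in cd/m^2.

        lw = display peak white, lb = display black level (illustrative defaults).
        Because the curve is anchored to both, how deep the blacks go is a
        property of the display.
        """
        gamma = 2.4
        n = lw ** (1 / gamma) - lb ** (1 / gamma)
        a = n ** gamma                   # gain
        b = lb ** (1 / gamma) / n        # black-level lift
        return a * max(v + b, 0) ** gamma

    print(bt1886_eotf(0.0))   # = lb, 0.1 nit
    print(bt1886_eotf(1.0))   # = lw, 100 nits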
The CONSENSUS is as follows. 80 nits was redefined to 100 nits for sRGB SDR in black-body ambient light, and other ambient lights have other nit values (I know only about standard illuminants, but under UV light, e.g., there should also be a correct transfer). Every display that does not conform to this is broken, and that is most of them, unless you combine an LG CX TV in Filmmaker mode with Apple's True Tone mode. That still did not happen. Shameful, if you ask me. The black point should also be different in those two cases (HDR vs SDR). But here the CONSENSUS is the opposite: display black in SDR can be as black as in HDR, as this does not really matter! It can matter only on an (AM)OLED display. Deeper blacks have nothing to do with the black point itself. The deeper blacks are just a consequence of the PQ function standard; that is how it works. 2A00:1370:812C:DA6:1B2:6C07:3ECD:310D (talk) 14:30, 10 April 2020 (UTC)[reply]
They did not redefine anything. BT.470, BT.601, BT.709 and BT.2020 (BT.2020 without BT.2100, that is, SDR) are all scene-referred, while sRGB is display-referred. 80 nits for sRGB is just by definition of the EOTF, while scene-referred video uses BT.1886 as an EOTF, and 100 nits is also not really defined; BT.1886 depends on the black point and white point. 2A00:1370:812D:F205:C0A5:95D7:D09C:EE8C (talk) 16:25, 18 April 2021 (UTC)[reply]

Draft

Draft:HDR10+ was submitted for review at AFC. It would become the main article for the HDR10+ section, so if someone could look at the draft it would be appreciated. Otr500 (talk) 13:37, 19 November 2019 (UTC)[reply]

Some paragraphs are not related to this article. Should be moved to the "HDR imaging" article

The paragraphs under the sub-heading "Capture" and the paragraph under the sub-heading "Production" are not related to this article. This article talks about the HDR formats used to store and distribute HDR videos and images, such as HDR10 and Dolby Vision. This should not be confused with the photography technique also called "HDR", which expands the dynamic range captured by a digital camera. The paragraphs should be moved to the article "HDR imaging". — Preceding unsigned comment added by Ff4ever (talkcontribs) 15:35, 31 January 2021 (UTC)[reply]

I see that you did not move that info to HDR imaging yet. I think it is important not to move it there, since that article is not as visited as this one, and since it is more about HDR inside an SDR container, which is not about range but about faking range on monitors that do not limit SDR content to 80 nits for sRGB and to 100 (or 203) nits for BT.709 content (which, as you should be aware, is most of them, except LG in Filmmaker mode). 109.252.90.92 (talk) 15:41, 17 April 2021 (UTC)[reply]

Color primaries illustration needed

An illustration of the three sets of color primaries related to HDR is needed for the section about chromaticity. There are already individual illustrations for Rec.2020/Rec.2100, DCI-P3 and Rec.709/sRGB. An illustration showing all three in the same picture is needed. Ff4ever (talk) 12:18, 23 April 2021 (UTC)[reply]

That DCI-P3 is actually DCI-D65. 2A00:1370:812D:F205:B475:A8D:A0FA:9539 (talk) 19:38, 23 April 2021 (UTC)[reply]

HDR/WCG in film and in Photo CD, 8 bit HDR (Nvidia GPUs and files), no HDR BT.2020 container in Netflix

So that is to prevent edit warring with you know who. 2A00:1370:812D:EB35:380D:2268:F7CE:5D8 (talk) 11:47, 1 May 2021 (UTC)[reply]

8 bit HDR

First of all: 8-bit HDR in AVC is possible and is supported by the LG C9; it triggers HDR. One just has to tag the file as PQ transfer, and the 8-bit to 10-bit mapping is even defined by BT.709. A sample is here: https://disk.yandex.ru/i/ZKd0INUrpHtoDg I will also point out that 10 bit is not required by DisplayHDR (but accepting HDR10 is), and I wrote about it in this very article. Nvidia and the Windows HD Color menu do support 8-bit HDR; see the screenshot here: https://www.avsforum.com/threads/2020-lg-cx%E2%80%93gx-dedicated-gaming-thread-consoles-and-pc.3138274/post-60688621 2A00:1370:812D:EB35:380D:2268:F7CE:5D8 (talk) 11:47, 1 May 2021 (UTC)[reply]
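For a rough sense of what 8-bit PQ implies, here is a minimal sketch (reusing the ST 2084 EOTF constants from the standard, and assuming full-range 8-bit codes purely for simplicity) showing that adjacent 8-bit code values sit several percent apart in luminance, which is why 10 bits is the usual minimum even though 8-bit signaling can trigger an HDR mode:

    # Luminance spacing of adjacent 8-bit PQ code values (full range assumed).
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_eotf(signal: float) -> float:
        p = signal ** (1 / m2)
        return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

    for code in (100, 101, 200, 201):
        print(code, round(pq_eotf(code / 255), 1), "nits")
    # 100 -> ~29.7 nits, 101 -> ~31.1 nits: a ~4% jump between adjacent codes,
    # large enough to risk visible banding in smooth gradients.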

HDR and WCG

Next: Netflix does not use a BT.2020 container for production. It instead uses Display P3 MXF files with JPEG 2000 and a PQ transfer, and converts to a BT.2020 container for streaming. There are even reliable sources in the DCI-P3 article. See a sample here: https://trac.ffmpeg.org/ticket/9145 and the full version here: https://opencontent.netflix.com/ (Sol Levante). Also, there are not many movies with colors outside DCI-P3; examples are Planet Earth II and the latest Star Wars. Next: HDR/WCG tech is very old. 2A00:1370:812D:EB35:380D:2268:F7CE:5D8 (talk) 11:47, 1 May 2021 (UTC)[reply]

HDR film

For example, film has always supported HDR; an example is the 1968 (!) movie 2001: A Space Odyssey, which was shot on film stock that supports HDR and is now presented as such on Dolby Vision Blu-ray. 2A00:1370:812D:EB35:380D:2268:F7CE:5D8 (talk) 11:47, 1 May 2021 (UTC)[reply]

xvYCC and sYCC

xvYCC supported WCG too: as it is mandatorily limited-range, it used Cb and Cr values outside 16–240 to encode WCG (that is, colors outside BT.709). The same can be done for sRGB and is called sYCC. 2A00:1370:812D:EB35:380D:2268:F7CE:5D8 (talk) 11:47, 1 May 2021 (UTC)[reply]
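A minimal sketch of that mechanism, assuming 8-bit limited-range quantization and the standard BT.709 matrix coefficients (the function name is illustrative): chroma codes outside the nominal 16–240 range decode to R'G'B' components outside [0, 1], i.e. colors the BT.709 gamut cannot represent:

    def decode_bt709(y: int, cb: int, cr: int):
        """8-bit limited-range Y'CbCr -> normalized non-linear R'G'B' (BT.709)."""
        yn = (y - 16) / 219
        pb = (cb - 128) / 224
        pr = (cr - 128) / 224
        r = yn + 1.5748 * pr
        g = yn - 0.18732 * pb - 0.46812 * pr
        b = yn + 1.8556 * pb
        return r, g, b

    print(decode_bt709(126, 128, 160))   # nominal-range chroma: a muted red inside the gamut
    print(decode_bt709(126, 128, 250))   # Cr above 240: R > 1, a color outside BT.709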

Photo CD

Next: Photo CD also supported HDR, not only WCG. WCG was supported by the same means as xvYCC, i.e. the very same extended transfer function. But it also used different quantization to preserve highlights for HDR; superwhite was used to show it on TV. Superwhite is still supported, even by the LG C9. I will quote from the Photo CD article: "However, in practice the color space of Photo CD images varies significantly from Rec. 709. Firstly, the Photo CD encoding scheme allows greater than 100% values for color components, thus allowing Photo CD images to display colors outside of the nominal Rec. 709 gamut". Now, there was a lot of theoretical work behind the PQ function; it is mostly based on the Barten ramp, i.e. ITU-R Report BT.2246. All of this should be mentioned in the article, if we do not want to be some second-rate BS source. 2A00:1370:812D:EB35:380D:2268:F7CE:5D8 (talk) 11:47, 1 May 2021 (UTC)[reply]
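As a rough illustration of the superwhite headroom mentioned above (this assumes ordinary 8-bit limited-range quantization, not Photo CD's actual scheme):

    # Limited-range 8-bit luma: 16 = black, 235 = nominal (100%) white, but codes
    # 236-254 stay legal and carry highlights above nominal white ("superwhite").
    for code in (235, 240, 254):
        print(code, f"{(code - 16) / 219:.3f}")   # 1.000, 1.023, 1.087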