I wanted to know this as well and couldn't find any info, so I went and measured it myself using Nvidia ShadowPlay and DaVinci Resolve.

Tl;dr: Windows 10 is using sRGB (piecewise) gamma for SDR content in HDR.

Here are 4 versions of the graph comparing my measurements against gamma 2.4, gamma 2.2 and sRGB piecewise, where it can clearly be seen that the measured curve near perfectly matches sRGB piecewise gamma (most easily noticeable in the log-linear and log-log versions of the graph):

These measurements are monitor agnostic (or at least should be), since ShadowPlay captures the data before it is sent to the screen. That said, in the name of science I'll mention that I used a Sony A95K as my display, even though that shouldn't matter: I wasn't measuring nits on the display, I was instead calculating the nit values from the PQ values captured by ShadowPlay and stored inside an .exr that I exported from DaVinci Resolve (the PQ math involved is sketched at the end of this post).

I measured this with the HDR/SDR brightness balance in Windows set to 5 to raise peak brightness from 80 to 100 nits, and with output in the Nvidia Control Panel set to 12 bit, as I noticed that setting it to 12 bit instead of 10 produces a noticeably higher quality SDR gradient (even though this screen itself can't display 12 bit, it does accept 12 bit input, which made this possible to enable).

Even with 12 bit enabled I still measured some quantization artifacts, which you can see as the juddering in the blue line on the graphs above. Shades just repeat sometimes, and more so towards the highlights. For example, the 8 bit values 249 and 250 both produce an output of 94.4889 nits, and such repetitions occur 12 times in total (see the quantization sketch at the end of this post).

Here is the Google sheet I created and used to do this, which contains all the measurement details and formulas:

I measured this using the green channel only, as the red and blue channels had an offset and were considerably less accurate, even dipping downwards at RGB 109 / 44 nits. The green channel also terminated at 100 nits (100.2210 to be precise), while red and blue went up to 106 nits. Additionally, the red and blue channels map the 8 bit value 0 to 0.0004 nits instead of 0 nits like the green channel does, which means SDR in HDR has a slight white balance offset towards purple.

![]()

I can't be 100% sure, however, whether these imperfections are issues with the Windows SDR-in-HDR implementation or artifacts caused by Nvidia ShadowPlay's HDR recording in Windows, but one of those two companies definitely messed up, and my money is on Microsoft (*see edit). I did use the highest possible quality recording to minimize the contribution of compression artifacts.

Edit: I was wrong about the artifacts. I did more testing and comparisons and they definitely only happen in 10 bit mode, and are also visible in the 10 bit video that ShadowPlay records, but they do not exist in reality.

This is the test pattern I created and used to measure all this:

![]()
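
For anyone who wants to redo the math without the sheet: converting a captured PQ code value to nits is just the standard SMPTE ST 2084 EOTF, nothing exotic. Here is a minimal Python sketch (NumPy only; the function names are mine, not anything from ShadowPlay or Resolve):

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_to_nits(e):
    """ST 2084 EOTF: normalized PQ code value (0..1) -> luminance in nits."""
    e = np.asarray(e, dtype=np.float64)
    p = np.power(e, 1.0 / M2)
    return 10000.0 * np.power(np.maximum(p - C1, 0.0) / (C2 - C3 * p), 1.0 / M1)

def nits_to_pq(y):
    """Inverse EOTF: luminance in nits -> normalized PQ code value (0..1)."""
    y = np.asarray(y, dtype=np.float64) / 10000.0
    p = np.power(y, M1)
    return np.power((C1 + C2 * p) / (1.0 + C3 * p), M2)

# Example: a pixel value of ~0.5081 read out of the exported EXR
# decodes to roughly 100 nits.
print(pq_to_nits(0.5081))
```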
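
And here is how the three candidate reference curves in the comparison graphs can be generated, assuming a 100 nit paper white (which is what brightness balance 5 gave me). Lay the measured nit values over these and it's obvious which curve they follow:

```python
import numpy as np

def srgb_eotf(v):
    """Piecewise sRGB EOTF (IEC 61966-2-1): encoded 0..1 -> linear 0..1."""
    v = np.asarray(v, dtype=np.float64)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

PAPER_WHITE = 100.0  # nits; SDR white at HDR/SDR brightness balance 5

codes = np.arange(256) / 255.0                  # all 8 bit shades, normalized
nits_srgb = PAPER_WHITE * srgb_eotf(codes)      # piecewise sRGB
nits_g22  = PAPER_WHITE * codes ** 2.2          # pure gamma 2.2
nits_g24  = PAPER_WHITE * codes ** 2.4          # pure gamma 2.4

# e.g. 8 bit value 128: sRGB gives ~21.6 nits, gamma 2.2 gives ~22.0,
# gamma 2.4 gives ~19.1 -- the curves are easy to tell apart mid-range.
```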
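
As for the repeating shades: I can't prove where the quantization happens, but the mechanism is easy to reproduce. Towards the highlights the nit spacing between adjacent 8 bit sRGB shades (roughly 0.9 nits per step near white at 100 nit paper white) approaches the nit spacing of 10 bit PQ codes (roughly 1 nit per step around 95 nits), so neighbouring shades start collapsing onto the same PQ code; at 12 bit the PQ steps are 4x finer, so the repetitions I measured there can't be explained by output quantization alone. A sketch of the 10 bit case, reusing the definitions from the two snippets above:

```python
# Continuing from the sketches above (needs pq_to_nits, nits_to_pq and
# nits_srgb): push the 8 bit sRGB ramp through a 10 bit PQ signal, the way
# a 10 bit output/capture path would quantize it.
pq_10bit = np.round(nits_to_pq(nits_srgb) * 1023.0) / 1023.0
decoded = pq_to_nits(pq_10bit)

# Adjacent 8 bit shades that land on the same 10 bit PQ code come back as
# identical nit values -- these are the "repeated shades", and they cluster
# towards the highlights.
pairs = np.flatnonzero(np.diff(decoded) == 0.0)
print(f"{len(pairs)} collisions, at 8 bit values: {pairs.tolist()}")
```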