benoitvm
2015-02-10 23:41:25 ( ID:iycpwrfd3yk )
Hi,
I wonder how 10-bit sources are reduced to 8-bit during conversion. Is it just plain stripping of the two LSBs?
What are the implications for how to best prepare the 10-bit source (dithering?)
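To illustrate what I mean by "plain stripping" versus something smarter, here is a small sketch (my own illustration, not how TVMW necessarily does it) comparing truncation with rounding when going from 10-bit to 8-bit:

```python
import numpy as np

# Hypothetical 10-bit luma samples (valid code range 0..1023).
src10 = np.array([0, 1, 2, 3, 511, 512, 1021, 1022, 1023], dtype=np.uint16)

# Plain LSB stripping: drop the two low bits (floor division by 4).
truncated = (src10 >> 2).astype(np.uint8)

# Rounding: add half an 8-bit step (2) before shifting, clip to 0..255.
rounded = np.clip((src10.astype(np.int32) + 2) >> 2, 0, 255).astype(np.uint8)

print(truncated.tolist())  # [0, 0, 0, 0, 127, 128, 255, 255, 255]
print(rounded.tolist())    # [0, 0, 1, 1, 128, 128, 255, 255, 255]
```

Truncation always rounds down, which introduces a small systematic darkening bias; rounding (or better, dithering) avoids that, which is why I am asking whether pre-dithering the 10-bit source is worthwhile.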
Maybe unrelated to the above question, but I was doing some tests with a JPEG 2000-encoded video (4:2:2, 10-bit, Avid implementation) in a QuickTime MOV container, and noticed that the luma range was incorrectly detected by TVMW (the output needs a significant contrast & brightness boost).