On 13 February 2014 23:10, Stefan Dösinger <stefan@codeweavers.com> wrote:
> * Unfortunately different implementations(Windows-NV and Mac-AMD tested) interpret some colors vastly
> * differently, so we need a max diff of 18. */
And actually, is there any chance the "vastly differently" here is because of using BT.601 vs. BT.709 coefficients, or not including the headroom/footroom scaling + offsets?
On 2014-02-14 20:24, Henri Verbeet wrote:
> On 13 February 2014 23:10, Stefan Dösinger <stefan@codeweavers.com> wrote:
>> * Unfortunately different implementations(Windows-NV and Mac-AMD tested) interpret some colors vastly
>> * differently, so we need a max diff of 18. */
> And actually, is there any chance the "vastly differently" here is because of using BT.601 vs. BT.709 coefficients, or not including the headroom/footroom scaling + offsets?
Maybe, but I can't look into either driver. If I remember correctly, I got the reference values from the GeForce 7 driver on Windows, and our shader results match them pretty well (plus or minus 2 or so). GeForce 8+ cards on Windows are off pretty far, and the GL_APPLE_ycbcr_422 support on OS X deviates even more.
I think we're also doing some things wrong with the coefficients in our shader. I read about a common pitfall in handling 8-bit integer chroma that I thought we were falling into, but I don't remember the details.