2015-03-03 12:22 GMT+01:00 Matteo Bruni matteo.mystral@gmail.com:
For the 8-bit format you mean that 1 is not representable in the signed u/v components, correct? Also, there's a "singed" typo, but otherwise this looks good to me.
Yes, 1 and -1 (the signed input values, not the post-scaled -1.0 and 1.0) are missing from U and V. The reason is that this way I can reuse the expected color array. Q in D3DFMT_Q8W8V8U8 tests some 8-bit-specific values. I'll see if I can clarify that comment a bit.
{0x4067, 0x53b9, 0x0421, 0xffff}, /* 0x4067, 0x53b9, 0x0421, 0xffff */
{0x8108, 0x0318, 0xc28c, 0x909c}, /* 0x8108, 0x0318, 0xc28c, 0x909c */
- };
What did you mean to put in the comments here? The ones currently there just duplicate the actual contents of the array.
I thought I had removed those comments. They were a sort of backup of the values while I was experimenting with the problem GeForce 7 cards have.
This is clearly my brain failing right now, but why is blue expected to be 0xff when the texture component is missing?
All formats return 1.0 for channels that aren't defined by the format. E.g. D3DFMT_G32R32F returns R, G, 1.0, 1.0. That differs from GL, which returns 0.0 for missing color channels and 1.0 for missing alpha.
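As an illustration, here is a minimal standalone C sketch of that expansion rule (the struct and function names are made up for this example, they're not from the patch):

#include <stdio.h>

struct vec4 { float r, g, b, a; };

/* Expand a two-channel D3DFMT_G32R32F texel the way d3d9 reads it back:
 * channels the format doesn't define come back as 1.0. GL would instead
 * return 0.0 for the missing blue and 1.0 for the missing alpha. */
static struct vec4 expand_g32r32f(float r, float g)
{
    struct vec4 c = {r, g, 1.0f, 1.0f};
    return c;
}

int main(void)
{
    struct vec4 c = expand_g32r32f(0.25f, 0.5f);
    printf("%.2f %.2f %.2f %.2f\n", c.r, c.g, c.b, c.a); /* 0.25 0.50 1.00 1.00 */
    return 0;
}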
Cool additional test for UpdateSurface, thank you for that. Here and in the previous test I guess you might want to put the slop_broken check into a broken(), unless that also fails on Linux.
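Roughly this is what I have in mind, just as a sketch of the idiom (assuming the usual color_match()/getPixelColor()-style helpers; slop and slop_broken stand in for whatever tolerances the patch actually uses):

color = getPixelColor(device, 320, 240);
ok(color_match(color, expected, slop)
        || broken(color_match(color, expected, slop_broken)),
        "Got unexpected color 0x%08x.\n", color);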
I also think I fixed that before sending. I wonder if I forgot a git commit --amend...