H. Verbeet wrote:
On 06/05/06, Ivan Gyurdiev ivg2@cornell.edu wrote:
[ applies on top of Mike's tree, but hopefully on the main tree too ]
R8G8B8 means red occupies the most significant byte of the packed value. On a little-endian system the texture is therefore stored in memory starting with blue. Since it is read byte-by-byte (GL_UNSIGNED_BYTE), the format needs to be flipped to GL_BGR.
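To illustrate (a standalone sketch, not Wine code): on a little-endian machine the 24-bit packed value 0xRRGGBB lands in memory as the bytes BB GG RR, which is exactly the layout GL_BGR + GL_UNSIGNED_BYTE describes.

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        /* R=0x11, G=0x22, B=0x33 packed as R8G8B8 (red in the most
         * significant bits of the 24-bit value) */
        unsigned int packed = (0x11 << 16) | (0x22 << 8) | 0x33;
        unsigned char bytes[4];

        memcpy(bytes, &packed, sizeof(packed));

        /* On little-endian this prints "33 22 11": blue comes first in
         * memory, so reading byte-by-byte (GL_UNSIGNED_BYTE) needs GL_BGR. */
        printf("%02x %02x %02x\n", bytes[0], bytes[1], bytes[2]);
        return 0;
    }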
This makes Demo #2 (texture mapping: http://www.zanir.szm.sk/dx00-09.html) show the correct colors. Also, in HL2, it gives Barney's face a normal skin color (previously it was blue). I think Stefan Dosinger saw improvement in some other demos as well.
There might be more to this. This was changed the other way around not too long ago. See http://source.winehq.org/git/?p=wine.git;a=commit;h=252c4adb965a26db19c1c916...
Well, I haven't tested Age of Mythology - maybe Aric can see if the patch breaks it [maybe check exactly which formats are used in that game?]
The way I interpret it currently, R3G3B2 would also be broken by reversal to GL_RGB. [Here we're reading a single byte, so imho the fields should be read in RGB order, but they are then reversed by the _REV suffix (GL_UNSIGNED_BYTE_2_3_3_REV), so the end format appears to be BGR... but maybe I'm misunderstanding how this works.]
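(As an illustration of the GL packing rules only, not Wine code: with GL_UNSIGNED_BYTE_3_3_2 the first component of the format is taken from the most significant bits of the byte, while the _REV variant takes it from the least significant bits.)

    #include <stdio.h>

    int main(void)
    {
        /* A D3DFMT_R3G3B2 texel: red in the 3 most significant bits,
         * blue in the 2 least significant ones.  R=5, G=3, B=1. */
        unsigned char texel = (0x5 << 5) | (0x3 << 2) | 0x1;

        /* GL_RGB + GL_UNSIGNED_BYTE_3_3_2: the first component (red) is
         * read from the most significant bits, matching R3G3B2. */
        printf("3_3_2:     R=%d G=%d B=%d\n",
               texel >> 5, (texel >> 2) & 0x7, texel & 0x3);

        /* GL_RGB + GL_UNSIGNED_BYTE_2_3_3_REV: the first component (red)
         * is read from the least significant bits instead. */
        printf("2_3_3_REV: R=%d G=%d B=%d\n",
               texel & 0x7, (texel >> 3) & 0x7, texel >> 6);
        return 0;
    }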
It might just be that the glDrawPixels call should've been changed instead last time. (Shouldn't those simply use the format in This->glDescription.glFormat in the first place?)
I think it should. I have a patch that does exactly this, but I'm not sure how to test it (render targets?). Other things in there look wrong too (they don't match the formats in LockRect that come from utils.c).
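Roughly what I have in mind (a hypothetical sketch, not the actual patch; width/height/memory stand in for whatever the surrounding code already uses, and I'm assuming glDescription also stores the matching glType chosen at surface creation):

    /* Instead of a hard-coded format/type pair, reuse whatever was
     * chosen for the surface when its GL description was set up. */
    glDrawPixels(width, height,
                 This->glDescription.glFormat,
                 This->glDescription.glType,  /* assumed matching type field */
                 memory);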