Hi,
Today I tested Max Payne, which uses D3D8. Before I started the D3D8 -> WineD3D transition the game worked fine, but now it no longer works correctly when you let the game use 16-bit textures in its setup menu: all textures are missing. The problem appears to be a glTexImage2D call in wined3d's LoadTexture (surface.c), which fails with GL_INVALID_OPERATION. Similar code worked fine before the d3d8 transition.
After analysing the functions it appears that in 16-bit mode the game wants to use D3DFMT_R5G6B5. In d3d8 this was translated to GL_RGB, while wined3d uses GL_BGR. (Oliver changed it at some stage; he changed other formats too, claiming they were incorrect, but gave no further explanation.)
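To make the difference concrete, this is roughly what the two paths boil down to for D3DFMT_R5G6B5. It is only a sketch, not the literal surface.c code, and it assumes the type stays GL_UNSIGNED_SHORT_5_6_5 as it was in d3d8:

    /* old d3d8 behaviour (works) */
    format = GL_RGB;   type = GL_UNSIGNED_SHORT_5_6_5;

    /* current wined3d behaviour (glTexImage2D -> GL_INVALID_OPERATION) */
    format = GL_BGR;   type = GL_UNSIGNED_SHORT_5_6_5;

    /* note: if the type really is the packed 5_6_5 one, the GL spec only
     * defines it for format GL_RGB, which would explain the error */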
I'm using the latest nvidia drivers and the GL_EXT_BGRA extension is present, so in theory GL_BGR should work, but for some reason it doesn't. (Perhaps it is a driver bug; I found similar reports on Google for both the ati and nvidia drivers, but no solution.) If I change GL_BGR to GL_RGB I don't get the invalid operation and the game works fine (the textures look correct too).
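Concretely, the call that works for me looks like this. The internal format, dimensions and data pointer here are just placeholders for whatever LoadTexture already passes; only the format argument changes:

    glTexImage2D(GL_TEXTURE_2D, level, GL_RGB5, width, height, 0,
                 GL_RGB,                  /* instead of GL_BGR */
                 GL_UNSIGNED_SHORT_5_6_5, /* type used for D3DFMT_R5G6B5 */
                 data);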
According to MSDN the format is just the internal format. Someone in #winehackers told me that Windows prefers BGR because it is more efficient, but the format is only used internally, and the OpenGL driver does the conversion for us anyway. http://msdn.microsoft.com/archive/default.asp?url=/archive/en-us/directx9_c/...
What should I do? Just change the behaviour back to GL_RGB, which works? Note that the same issue likely affects some of the other formats Oliver changed as well.
Regards, Roderick