On Friday, 6 August 2021 17:01:48 EAT, Henri Verbeet wrote:
> There is no Metal backend in upstream Wine ;)
Whoops, all that Mac work has brain-damaged me :-)
> It seems a little awkward to tie this to WINED3D_PIXEL_CENTER_INTEGER. Would it make sense to instead detect the filling convention during adapter initialisation, so that we can get rid of this for d3d9 and before as well if possible? We could then just store the filling convention offset in the wined3d_d3d_info structure.
Yeah, I toyed with the idea of detecting it at adapter init, but for that I'd need a card that behaves differently; otherwise I'm just shooting blind. The ones I tested (AMD Radeon 560, GeForce 650M, Intel HD 4000, Intel HD Graphics 615, Apple M1 - the last 3 only on macOS, the others on both macOS and Linux) all behaved uniformly. I remember we had GPU-specific issues back in the GeForce 7/8 days; unfortunately my GeForce 7s have all died and my r500 card is a long way away.
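For reference, the kind of probe I have in mind would be something like this (untested sketch with made-up names; it assumes a compatibility-profile GL context with FBO entry points already resolved, and how the result should map to an offset stored in wined3d_d3d_info is exactly the open question):

#include <GL/gl.h>

/* Returns nonzero if pixels whose centers lie exactly on a quad's lower
 * edge get filled. The GL spec only requires that an edge shared between
 * two triangles produces each fragment exactly once; which side wins the
 * tie-break is implementation-defined, which is what we'd be probing. */
static int lower_edge_filled(void)
{
    GLuint fbo, rb;
    unsigned char pixel[4];

    glGenRenderbuffers(1, &rb);
    glBindRenderbuffer(GL_RENDERBUFFER, rb);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA8, 4, 4);
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, rb);

    glViewport(0, 0, 4, 4);
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    /* Quad whose lower edge lies at window y = 1.5 (NDC y = -0.25), i.e.
     * exactly on the pixel centers of row 1. A top-left rasteriser leaves
     * row 1 black, a bottom-left one fills it. */
    glColor3f(1.0f, 1.0f, 1.0f);
    glBegin(GL_QUADS);
    glVertex2f(-1.0f, -0.25f);
    glVertex2f( 1.0f, -0.25f);
    glVertex2f( 1.0f,  1.0f);
    glVertex2f(-1.0f,  1.0f);
    glEnd();

    glReadPixels(0, 1, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, pixel);

    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glDeleteFramebuffers(1, &fbo);
    glDeleteRenderbuffers(1, &rb);

    return pixel[0] != 0;
}

A second probe with a vertical edge through a column of pixel centers would catch rasterisers that handle the x and y tie-breaks differently. Without a card that actually deviates, though, I can't tell whether any of this would ever return something other than the uniform behavior I'm seeing.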
Afaics we have no test that checks whether we do the right thing in d3d9 and earlier. I'll add one; that should hopefully give some clues as to whether the 63.0/128.0 offset is still correct on today's GPUs or whether we need a flat 1.0/2.0 on some. If it's the former - and I am not aware of any d3d <= 9 games with pixel boundary issues right now - there might be a genuine d3d9/d3d10 difference.
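The test I have in mind would look roughly like this (sketch only, in the style of the existing tests in dlls/d3d9/tests/visual.c; create_window(), create_device() and getPixelColor() are the usual helpers from that file, and the expected colors are precisely what needs verifying on real hardware):

static void test_d3d9_filling_convention(void)
{
    IDirect3DDevice9 *device;
    IDirect3D9 *d3d;
    D3DCOLOR color;
    HWND window;
    HRESULT hr;

    /* Pretransformed quad whose left/right edges run exactly through the
     * pixel centers of columns 128 and 512 of the 640x480 back buffer
     * (d3d9 puts pixel centers at integer coordinates). D3d's top-left
     * rule should fill column 128 but not column 512. */
    static const struct
    {
        struct vec4 position;
        DWORD diffuse;
    }
    quad[] =
    {
        {{128.0f, 128.0f, 0.0f, 1.0f}, 0xffff0000},
        {{128.0f, 352.0f, 0.0f, 1.0f}, 0xffff0000},
        {{512.0f, 128.0f, 0.0f, 1.0f}, 0xffff0000},
        {{512.0f, 352.0f, 0.0f, 1.0f}, 0xffff0000},
    };

    window = create_window();
    d3d = Direct3DCreate9(D3D_SDK_VERSION);
    ok(!!d3d, "Failed to create a D3D object.\n");
    if (!(device = create_device(d3d, window, window, TRUE)))
    {
        skip("Failed to create a D3D device.\n");
        goto done;
    }

    hr = IDirect3DDevice9_Clear(device, 0, NULL, D3DCLEAR_TARGET, 0xff0000ff, 1.0f, 0);
    ok(hr == D3D_OK, "Got hr %#x.\n", hr);
    hr = IDirect3DDevice9_SetFVF(device, D3DFVF_XYZRHW | D3DFVF_DIFFUSE);
    ok(hr == D3D_OK, "Got hr %#x.\n", hr);
    hr = IDirect3DDevice9_BeginScene(device);
    ok(hr == D3D_OK, "Got hr %#x.\n", hr);
    hr = IDirect3DDevice9_DrawPrimitiveUP(device, D3DPT_TRIANGLESTRIP, 2, quad, sizeof(*quad));
    ok(hr == D3D_OK, "Got hr %#x.\n", hr);
    hr = IDirect3DDevice9_EndScene(device);
    ok(hr == D3D_OK, "Got hr %#x.\n", hr);

    /* Left edge through pixel centers should be filled, right edge not. */
    color = getPixelColor(device, 128, 240);
    ok(color == 0x00ff0000, "Left edge: got unexpected color 0x%08x.\n", color);
    color = getPixelColor(device, 512, 240);
    ok(color == 0x000000ff, "Right edge: got unexpected color 0x%08x.\n", color);

    IDirect3DDevice9_Release(device);
done:
    IDirect3D9_Release(d3d);
    DestroyWindow(window);
}

If that passes everywhere with the current 63.0/128.0, it's good evidence the old value is still fine for d3d <= 9 and the new behavior only needs handling on the d3d10+ side.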