On Mon Jun 20 22:00:21 2022 +0000, Matteo Bruni wrote:
It seems pretty clear and reasonable to me, actually. It looks like ChoosePixelFormat() returns a pixel format with stencil bits when they are requested; matching the depth bits has lower priority. As it turns out, the only pixel format with stencil bits that's supported everywhere is D24S8. D32S8 is apparently supported on Windows AMD, but not on Windows Nvidia. FWIW, Linux AMD doesn't return any D32 visual / fbconfig at all.

I'm attaching some more changes on top of yours: [choosepixelformat.txt](/uploads/b892cfcd3cb9fb13f8eee3ab81b2ddfb/choosepixelformat.txt). Basically, I swapped the stencil and depth blocks in wglChoosePixelFormat(), making sure we handle stencil before checking depth.
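To illustrate the reordering, here's a minimal sketch (not the actual Wine code; the struct fields and helper names are hypothetical) of a format comparison where the stencil check runs before the depth check, so a stencil mismatch can never be outweighed by a closer depth match:

```c
#include <stdio.h>

struct pixel_format
{
    int depth_bits;
    int stencil_bits;
};

/* Penalty for mismatching a requested bit count: an exact match is best,
 * surplus bits are acceptable, missing bits are worst. */
static int bit_penalty(int requested, int available)
{
    if (!requested) return 0;                  /* caller doesn't care */
    if (available < requested) return 1 << 16; /* missing bits: huge penalty */
    return available - requested;              /* surplus bits: small penalty */
}

/* Pick the best of 'count' formats for 'req'. Stencil is compared before
 * depth, mirroring the swapped blocks in wglChoosePixelFormat(). */
static int choose_format(const struct pixel_format *req,
                         const struct pixel_format *formats, int count)
{
    int i, best = -1;

    for (i = 0; i < count; i++)
    {
        int s_new, s_best, d_new, d_best;

        if (best < 0) { best = i; continue; }

        /* Stencil first: only fall through to depth on a tie. */
        s_new = bit_penalty(req->stencil_bits, formats[i].stencil_bits);
        s_best = bit_penalty(req->stencil_bits, formats[best].stencil_bits);
        if (s_new != s_best) { if (s_new < s_best) best = i; continue; }

        d_new = bit_penalty(req->depth_bits, formats[i].depth_bits);
        d_best = bit_penalty(req->depth_bits, formats[best].depth_bits);
        if (d_new < d_best) best = i;
    }
    return best;
}

int main(void)
{
    /* D32 (no stencil) vs D24S8: with stencil checked first, a request for
     * 32 depth + 8 stencil picks D24S8 (index 1), matching what the Windows
     * drivers appear to do. */
    const struct pixel_format formats[] = { { 32, 0 }, { 24, 8 } };
    const struct pixel_format req = { 32, 8 };

    printf("chose format %d\n", choose_format(&req, formats, 2));
    return 0;
}
```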
I'm also attaching the output of the tests on Windows Nvidia with my patch applied.
[output_nvidia.txt](/uploads/99fd7148e6a77bfea79713e71aa0a03b/output_nvidia.txt)