On Thursday, 29 May 2014, at 10:56:57, you wrote:
On 2014-05-29 01:33, Andrei Slăvoiu wrote:
Drop the check for glsl_version. All wine shaders use #version 120 so it doesn't matter.
The point of this code is to check the capabilities of the card, not the capabilities of our d3d implementation. Thus, to prevent SM 3 cards from being reported with PCI IDs that suggest SM 4 capabilities, you need to check for all the other functionality that's supported by EXT_gpu_shader4 / GLSL 1.30.
Even if we have a SM 4 capable card our d3d9 implementation does not expose SM 4 support. But neither does Microsoft's d3d9 implementation.
Correct me if I'm wrong, but the code that decides what shader model d3d9 exposes is shader_glsl_get_caps, which looks like this:

    if (gl_info->supported[EXT_GPU_SHADER4] && gl_info->supported[ARB_SHADER_BIT_ENCODING]
            && gl_info->supported[ARB_GEOMETRY_SHADER4]
            && gl_info->glsl_version >= MAKEDWORD_VERSION(1, 50)
            && gl_info->supported[ARB_DRAW_ELEMENTS_BASE_VERTEX]
            && gl_info->supported[ARB_DRAW_INSTANCED])
        shader_model = 4;
    /* ARB_shader_texture_lod or EXT_gpu_shader4 is required for the SM3
     * texldd and texldl instructions. */
    else if (gl_info->supported[ARB_SHADER_TEXTURE_LOD] || gl_info->supported[EXT_GPU_SHADER4])
        shader_model = 3;
    else
        shader_model = 2;
So wine's d3d9 will expose SM 3 with just GLSL 1.20 and GL_ARB_shader_texture_lod. Or am I missing something?
There's also GLX_MESA_query_renderer. It gives us the PCI IDs and video memory size directly, without all the string-parsing guesswork. We can't use it in wined3d directly, though. A possible approach would be to expose a similar WGL_WINE/MESA_query_renderer extension from winex11.drv and winemac.drv and use that in wined3d. The current wined3d guesswork code could be moved to winex11.drv and used in cases where GLX_MESA_query_renderer is not supported. (OSX has similar functionality that's always available.)
I was wondering what it would take to get rid of all this guessing and get the PCI ID directly; thanks for the pointers. I'll look into it after I get a better understanding of the existing code.
User32 also has a function (EnumDisplayDevices) to query the GPU identification string, and some applications (the Company of Heroes demo, for one) fail if the user32 GPU name differs from the d3d9 GPU name. So maybe a WGL extension isn't quite the right interface; it should be something that does not require a GL context, so user32.dll can use it as well. EnumDisplayDevices does not expose all the information wined3d needs though: the PCI IDs and video memory size are missing, I think.
So use the string provided by EnumDisplayDevices and the PCI ID and memory size from the WGL extension?