That said, GL_ARB_vertex_blend is not very common. Only MacOS supports it, and it is software emulated there. The Linux ATI driver used to support it until ATI switched to a different codebase. So this constant ends up being 0 almost everywhere. The extension was dropped in favor of vertex programs, since vertex programs appeared shortly after vertex blending and no GL app was using it at that time. On Windows, drivers emulate vertex blending using vertex shaders.
I am not sure if 0 is the correct value to return when there is no vertex blending support. What happens if you set it to 1? 1 would be a reasonable default as well, because we're still able to transform vertices with one matrix (i.e., an unblended transformation). It could also be that this game silently expects vertex blending to be supported: for one thing, all DX8+ cards support this feature on Windows, and on older cards Windows has software vertex processing, which can perform blending in a slow but working fashion.
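To illustrate what the cap means to an application, here is roughly the kind of check a D3D9 app (or d3dx9) might do; the function and its fallback logic are invented for illustration, only GetDeviceCaps, the MaxVertexBlendMatrices field and the D3DRS_VERTEXBLEND render state are real D3D9 API:

#include <d3d9.h>

/* Hypothetical example of an app reacting to the cap; the decision logic is
 * made up for illustration. */
static void setup_vertex_blending(IDirect3DDevice9 *device)
{
    D3DCAPS9 caps;

    IDirect3DDevice9_GetDeviceCaps(device, &caps);

    if (caps.MaxVertexBlendMatrices >= 4)
    {
        /* Fixed function blending with 3 explicit weights (the 4th is implicit). */
        IDirect3DDevice9_SetRenderState(device, D3DRS_VERTEXBLEND, D3DVBF_3WEIGHTS);
    }
    else
    {
        /* Cap is 0 or 1: only a single world matrix, no blending.
         * An app that silently assumes blending support may break here. */
        IDirect3DDevice9_SetRenderState(device, D3DRS_VERTEXBLEND, D3DVBF_DISABLE);
    }
}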
I have a patch for software emulated vertex blending somewhere, but I dropped it because I didn't come across an app which uses this functionality. Maybe it's time to revive it and continue the work on a fixed function replacement shader.
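For reference, the core of such a software path is just a weighted sum of matrix transforms per vertex; the sketch below (types and names invented, D3D-style row vectors assumed, not the actual patch) shows the math a CPU fallback or a replacement shader has to reproduce:

typedef struct { float m[4][4]; } blend_matrix;
typedef struct { float x, y, z; } blend_vec3;

/* Blend a position with 'matrix_count' world matrices. The vertex stores
 * matrix_count - 1 weights; the last weight is implicit so they sum to 1. */
static blend_vec3 blend_vertex(const blend_vec3 *v, const blend_matrix *world,
                               const float *weights, unsigned int matrix_count)
{
    blend_vec3 out = {0.0f, 0.0f, 0.0f};
    float weight_sum = 0.0f;
    unsigned int i;

    for (i = 0; i < matrix_count; ++i)
    {
        const blend_matrix *m = &world[i];
        blend_vec3 t;
        float w;

        if (i == matrix_count - 1)
            w = 1.0f - weight_sum;          /* implicit final weight */
        else
            weight_sum += (w = weights[i]);

        /* Row vector times row-major matrix; the position's w component is 1. */
        t.x = v->x * m->m[0][0] + v->y * m->m[1][0] + v->z * m->m[2][0] + m->m[3][0];
        t.y = v->x * m->m[0][1] + v->y * m->m[1][1] + v->z * m->m[2][1] + m->m[3][1];
        t.z = v->x * m->m[0][2] + v->y * m->m[1][2] + v->z * m->m[2][2] + m->m[3][2];

        out.x += w * t.x;
        out.y += w * t.y;
        out.z += w * t.z;
    }
    return out;
}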
Even on Windows, no OpenGL implementation that I know of supports GL_ARB_vertex_blend other than ATI's (and as you've pointed out, the new OpenGL stack that the Radeon HDs use on Windows and that all Radeons now use on Linux has dropped it).
History lesson: Both GL_ARB_vertex_blend and D3D vertex blending are historically tied to a specific piece of hardware: the original ATI Radeon (GL_ARB_vertex_blend started out as GL_ATI_vertex_blend). Presumably, due to ATI's historically spotty OpenGL driver performance (okay, "historically" is being kind to ATI), any game developer who wanted to use the Radeon's capabilities targeted their game for D3D rather than OpenGL. Consequently, when nVidia released the GeForce 3 (the first video card with vertex shaders) they did implement D3D vertex blending via shaders in their D3D driver (since games were using it, and it wouldn't do if the "new" GF3 couldn't handle effects that the "old" Radeon could), but they didn't bother to implement it in their OpenGL driver, even though ATI had gotten it ARB-approved some months before the GF3 came out, since no games/apps were using it. The other video hardware makers (the few that were left by the DX8 era) followed NV's example: implement D3D vertex blending via shaders or in software (for entry-level and onboard hardware without vertex shaders), but don't bother with GL_ARB_vertex_blend since no one uses it.
The upshot is that GL_ARB_vertex_blend is an ARB extension In Name Only; it's effectively ATI-only--and even ATI has abandoned it now. On the other hand, D3D vertex blending is universally available on modern and semi-modern hardware regardless of maker, and it looks like even some D3D9 games still use it instead of using vertex shaders. So it's better for WINE to implement vertex blending via vertex programs rather than depending on GL_ARB_vertex_blend.
--AWJ--
I am a bit lost now, so here's a quick recap:
1- There are some programs that call ConvertToIndexedBlendedMesh in d3dx9_30.dll.
2- Wine returns (based on what OpenGL reports) MaxVertexBlendMatrices=0.
3- d3dx9_30.dll misbehaves and generates vertex declarations with odd type codes and offsets when MaxVertexBlendMatrices=0.
4- When we hack Wine in IDirect3DDevice9Impl_GetDeviceCaps to return 4, d3dx9_30.dll works properly and the programs run right.
So my question is: is there any reason not to have Wine return MaxVertexBlendMatrices=4 (as it works regardless of what OpenGL reports), and if so, is there any other way for people who want to use certain programs (like Everquest) to 'hack' its value, other than manually modifying IDirect3DDevice9Impl_GetDeviceCaps to make it return 4?
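Roughly, the hack being discussed amounts to a fixup like the following (a simplified sketch, not the actual code in IDirect3DDevice9Impl_GetDeviceCaps; the helper and where it would be called from are assumptions):

#include <d3d9.h>

/* Simplified sketch of the discussed hack: after the caps structure has been
 * filled in from what OpenGL reports, force the vertex blending caps to a
 * fixed value. Not actual Wine code. */
static void force_vertex_blend_caps(D3DCAPS9 *caps)
{
    if (!caps->MaxVertexBlendMatrices)
    {
        /* No GL_ARB_vertex_blend, so 0 was reported; pretend we can blend
         * with up to 4 world matrices anyway. */
        caps->MaxVertexBlendMatrices = 4;
        caps->MaxVertexBlendMatrixIndex = 0;
    }
}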
On Sunday, 10 February 2008 11:27:22, Julio Fernandez wrote:
So my question is: is there any reason not to have Wine return MaxVertexBlendMatrices=4 (as it works regardless of what OpenGL reports), and if so, is there any other way for people who want to use certain programs (like Everquest) to 'hack' its value, other than manually modifying IDirect3DDevice9Impl_GetDeviceCaps to make it return 4?
Returning 4 makes Everquest happy, but will break other apps.
I think the only way to fix this properly is to complete Henri's fixed function pipeline replacement shader in order to implement vertex blending using shaders. Unfortunately, this is a major task.
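To illustrate the idea (this is not Henri's actual shader, just a rough sketch with invented uniform and attribute names), the replacement vertex shader essentially has to do the same weighted transform in GLSL, e.g. as a source string wined3d could hand to glShaderSource():

static const char vertex_blend_shader[] =
    "uniform mat4 world_matrices[4];\n"   /* blend (world) matrices */
    "uniform mat4 view_proj;\n"           /* combined view * projection */
    "attribute vec3 blend_weights;\n"     /* three stored weights */
    "void main(void)\n"
    "{\n"
    "    float w3 = 1.0 - blend_weights.x - blend_weights.y - blend_weights.z;\n"
    "    vec4 pos = blend_weights.x * (world_matrices[0] * gl_Vertex)\n"
    "             + blend_weights.y * (world_matrices[1] * gl_Vertex)\n"
    "             + blend_weights.z * (world_matrices[2] * gl_Vertex)\n"
    "             + w3 * (world_matrices[3] * gl_Vertex);\n"
    "    gl_Position = view_proj * pos;\n"
    "}\n";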