Hi,
The past few days I have experimented with Oliver's d3d8 patches. As mentioned before, the patches work fine except that shaders have problems. There's one texture/surface-related problem I'm trying to debug, but I don't know how to proceed or how to fix it.
When you create a texture in d3d8 you can specify in what type of memory it should be stored. There are roughly two options: store the texture in video memory or in system memory. Textures in system memory have limitations compared to video memory, the main one being that they can't be used directly for rendering. To render such a texture you need, for instance, to copy its data to another texture (one which isn't stored in system memory).
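To illustrate, the pool is picked at creation time; a rough sketch of both cases and of the documented copy path ('device' is an IDirect3DDevice8 *, error handling omitted):

    #include <d3d8.h>

    IDirect3DTexture8 *vidmem_tex, *sysmem_tex;

    /* A texture the card can render from directly: */
    IDirect3DDevice8_CreateTexture(device, 256, 256, 1, 0,
            D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &vidmem_tex);

    /* A texture in system memory; according to MSDN it can't be used
     * for rendering, only e.g. as the source of UpdateTexture(): */
    IDirect3DDevice8_CreateTexture(device, 256, 256, 1, 0,
            D3DFMT_A8R8G8B8, D3DPOOL_SYSTEMMEM, &sysmem_tex);

    /* The documented way to render sysmem data: copy it to the
     * vidmem texture first. */
    IDirect3DDevice8_UpdateTexture(device,
            (IDirect3DBaseTexture8 *)sysmem_tex,
            (IDirect3DBaseTexture8 *)vidmem_tex);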
The problem I'm having appears in 3dmark2001, which creates some textures in system memory and then renders them directly using DrawPrimitive. This works on Windows, even though according to the MSDN information I gave above it is illegal. In wined3d, our LoadTexture call checks whether the texture is in system memory and, if it is, we don't render it (which is correct according to MSDN). This 'correct' piece of code causes no textures to appear in 3dmark; removing the check fixes 3dmark.
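The check boils down to something like this (simplified; the actual wined3d code and field names differ):

    /* In LoadTexture: MSDN says a sysmem texture can't be rendered
     * from, so skip the GL upload for it. */
    if (This->pool == D3DPOOL_SYSTEMMEM)
        return; /* dropping this return makes 3dmark's textures show up */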
I tried to test the behaviour of textures in system memory on Windows using a multitexturing sample from codesampler.com. By default the sample uploaded textures to video memory, and I changed it to use system memory. Directly rendering textures from system memory didn't work, which is what the MSDN docs led me to expect. (BTW, the demo doesn't work correctly on Cedega; they seem to allow rendering of textures in system memory. I guess this is a bug, but perhaps it's a 'feature'.) The sample also works correctly on Wine with the d3d8->wined3d patches; on the current d3d8 it would be as broken as on Cedega. (The demo is available from http://roderick.student.utwente.nl/sample.tar.gz; of the included files, dx8_multitexture.exe is the original sample and sample.exe is the one which uses textures from system memory.)
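The modification itself is trivial; besides the pool change, the sample essentially does this every frame (a sketch, not the sample's literal code):

    IDirect3DDevice8_SetTexture(device, 0,
            (IDirect3DBaseTexture8 *)sysmem_tex);
    IDirect3DDevice8_DrawPrimitive(device, D3DPT_TRIANGLESTRIP, 0, 2);
    /* On Windows no texture shows up, matching MSDN; on Cedega (and on
     * our current d3d8) the texture is rendered anyway. */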
To summarize: 3dmark2001 possibly uses incorrect d3d8 code, but that code works fine on Windows. I tried to verify the behaviour with a test (perhaps not a good one), and both Windows and Wine appeared to behave correctly. Further, Oliver had the same problem in Max Payne 2 (it uses the same 3D engine, I think) and he used a similar hack.
Does anyone have good ideas for locating the problem? I think my d3d8 test is good enough, but perhaps it isn't. I have uploaded a log of 3dmark to http://roderick.student.utwente.nl/log.bz2. The problem appears near the 'LoadTexture' lines for textures created with 'pool 2' (D3DPOOL_SYSTEMMEM).
Regards, Roderick
Hi,
> To summarize: 3dmark2001 possibly uses incorrect d3d8 code, but that code works fine on Windows. [...]
One thing that I've learned since I started working on Wine is not to trust MSDN. During my ddraw/d3d work I've seen a lot of such things, and the ddraw code contains a lot of comments stating that MSDN is incorrect. At university I've met some Windows application developers who thought MSDN was horrible.
In this case, what speaks against simply allowing rendering from sysmem textures? In Wine this doesn't make much difference (as we 'only' forward to GL).
For Windows I'd say this might depend on the driver. Perhaps in Microsoft's design sysmem textures can't be used for rendering, but most drivers either ignore the sysmem property completely (and place everything into vidmem) or are simply able to render from sysmem textures. If you are unlucky, you might have a driver which can't do so, and some apps are broken.
On Sunday 05 February 2006 22:44, Stefan Dösinger wrote:
> Perhaps in Microsoft's design sysmem textures can't be used for rendering, but most drivers either ignore the sysmem property completely (and place everything into vidmem) or are simply able to render from sysmem textures. [...]
Hi,
No, I think the "system memory pool" is only used as an optimisation flag by drivers. To me it means that game developers use:
- "video memory pool": preference for the texture to always be in video card memory.
- "system memory pool": the texture may not always be in video card memory.
So drivers should load the "video mem pool" first and, if graphics card memory remains, the "system mem pool" (which can be swapped out when needed).
I think many drivers also do some optimisations, such as tagging intensively used textures as "video memory pooled" (and untagging textures that are never used).
That's why Oliver did a lot of work on "memory pooling", but now we need a "memory" scheduler (one that knows the size of graphics card memory) :)
Keep up the good work :)
Regards, Raphael
There does exist a D3DDEVCAPS_TEXTURESYSTEMMEMORY property: http://msdn.microsoft.com/archive/default.asp?url=/archive/en-us/dx81_c/dire... However, I don't think Wine sets it. I wonder if SetTexture should perhaps return a specific error code if a video card doesn't support that. I.e., perhaps 3DMark01 just attempts to use D3DPOOL_SYSTEMMEM textures and falls back to something else if it fails to set them. That's just a guess, of course.
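If that guess is right, the application side would look roughly like this (a hypothetical sketch, not taken from 3DMark01):

    D3DCAPS8 caps;
    D3DPOOL pool;

    IDirect3DDevice8_GetDeviceCaps(device, &caps);
    if (caps.DevCaps & D3DDEVCAPS_TEXTURESYSTEMMEMORY)
        pool = D3DPOOL_SYSTEMMEM; /* device can texture from sysmem */
    else
        pool = D3DPOOL_DEFAULT;   /* fall back to vidmem textures */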
On Monday 06 February 2006 09:37, H. Verbeet wrote:
> There does exist a D3DDEVCAPS_TEXTURESYSTEMMEMORY property. [...] I wonder if SetTexture should perhaps return a specific error code if a video card doesn't support that. [...]
The problem was indeed related to device caps. Wine didn't set any caps; it didn't even set the capability flag indicating that the card has video memory. Because of that, the application tried to use system memory as video memory, which is allowed when D3DDEVCAPS_TEXTURESYSTEMMEMORY is set. I have added some caps, and also some checks to SetTexture, to fix the problems.
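In rough terms the fix amounts to this (paraphrased, not the literal patch; 'texture_pool' and 'dev_caps' stand in for the real wined3d fields):

    /* In GetDeviceCaps: advertise, among other previously missing caps,
     * that textures are fetched from video memory: */
    pCaps->DevCaps |= D3DDEVCAPS_TEXTUREVIDEOMEMORY;

    /* In SetTexture: reject a sysmem texture unless the device claims
     * to support texturing from system memory: */
    if (texture_pool == D3DPOOL_SYSTEMMEM
            && !(dev_caps & D3DDEVCAPS_TEXTURESYSTEMMEMORY))
        return D3DERR_INVALIDCALL;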
Roderick