I tried to play Supreme Commander using the pbuffer option instead of fbo. I was quite happy with it, since I gained a fair amount of performance (I mean, something I really COULD see), but after a while the performance dropped dramatically, to ~4-5 fps.
I tested quite a few things, and I finally found that pixel buffers were not taken into account when calculating the available texture memory. The game then allocated more textures, and good old OpenGL didn't dare complain and quietly put them in system memory.
Attached is a patch which should solve the problem.
For those who are curious, try setting VideoMemorySize to 200 instead of 256. It works just like a charm.
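For reference, both settings are string values under the Direct3D registry key; something like this (assuming a 256MB card):

    [HKEY_CURRENT_USER\Software\Wine\Direct3D]
    "OffscreenRenderingMode"="pbuffer"
    "VideoMemorySize"="200"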
From 8e7b7e517b15e2ddb2cdd1526dfab3dfbf856bd5 Mon Sep 17 00:00:00 2001
From: Jérôme Gardou <jerome.gardou@laposte.net>
Date: Sun, 25 Jan 2009 02:34:03 +0100
Subject: [PATCH] wined3d: take pixel buffers in account when calculating texture ram.

---
 dlls/wined3d/context.c |    4 ++++
 1 files changed, 4 insertions(+), 0 deletions(-)

diff --git a/dlls/wined3d/context.c b/dlls/wined3d/context.c
index da3053a..9fad82d 100644
--- a/dlls/wined3d/context.c
+++ b/dlls/wined3d/context.c
@@ -898,6 +898,9 @@ WineD3DContext *CreateContext(IWineD3DDeviceImpl *This, IWineD3DSurfaceImpl *tar
     }
     This->frag_pipe->enable_extension((IWineD3DDevice *) This, TRUE);
 
+    if(create_pbuffer)
+        WineD3DAdapterChangeGLRam(This, ((IWineD3DSurfaceImpl *)(ret->surface))->currentDesc.Width * ((IWineD3DSurfaceImpl *)(ret->surface))->currentDesc.Height * ((IWineD3DSurfaceImpl *)(ret->surface))->bytesPerPixel);
+
     return ret;
 
 out:
@@ -988,6 +991,7 @@ void DestroyContext(IWineD3DDeviceImpl *This, WineD3DContext *context) {
     if(context->isPBuffer) {
         GL_EXTCALL(wglReleasePbufferDCARB(context->pbuffer, context->hdc));
         GL_EXTCALL(wglDestroyPbufferARB(context->pbuffer));
+        WineD3DAdapterChangeGLRam(This, (-1) * ((IWineD3DSurfaceImpl *)(context->surface))->currentDesc.Width * ((IWineD3DSurfaceImpl *)(context->surface))->currentDesc.Height * ((IWineD3DSurfaceImpl *)(context->surface))->bytesPerPixel);
     } else ReleaseDC(context->win_handle, context->hdc);
     pwglDeleteContext(context->glCtx);
Jerome,
your patch must be sent to wine-patches, not to wine-devel.
A+
David
I think the basic idea of the patch is good but the calculation itself should take into account double buffering. In wine we don't use double buffering on pbuffers but we might be receiving a WGL pixel format which uses double buffering, so in that case the amount of video memory would be a factor 2 too low.
Something like this would give you the double buffering capability from a d3d device: This->adapter->cfgs[iPixelFormat]->doubleBuffer
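Something along these lines (just a sketch; whether the cfgs entry is accessed with '.' or '->' depends on how the array is declared):

    long size = target->currentDesc.Width * target->currentDesc.Height
                * target->bytesPerPixel;
    if (This->adapter->cfgs[iPixelFormat].doubleBuffer)
        size *= 2;   /* front and back buffer both take video memory */
    WineD3DAdapterChangeGLRam(This, size);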
Roderick
OK. Here is a new one, more readable I think. I'll send it to wine-patches once I manage to configure git-send-email :)
From 3f20a3570a99a57c3e065f1ef93277c28fb61a4e Mon Sep 17 00:00:00 2001
From: Jérôme Gardou <jerome.gardou@laposte.net>
Date: Sun, 25 Jan 2009 12:52:45 +0100
Subject: [PATCH] wined3d: take into account video ram that pixel buffers use.

---
 dlls/wined3d/context.c |   14 ++++++++++++++
 1 files changed, 14 insertions(+), 0 deletions(-)

diff --git a/dlls/wined3d/context.c b/dlls/wined3d/context.c
index da3053a..86d9db4 100644
--- a/dlls/wined3d/context.c
+++ b/dlls/wined3d/context.c
@@ -635,6 +635,7 @@ WineD3DContext *CreateContext(IWineD3DDeviceImpl *This, IWineD3DSurfaceImpl *tar
     HGLRC ctx = NULL, oldCtx;
     WineD3DContext *ret = NULL;
     int s;
+    long size = 0;
 
     TRACE("(%p): Creating a %s context for render target %p\n", This, create_pbuffer ? "offscreen" : "onscreen", target);
 
@@ -679,6 +680,9 @@ WineD3DContext *CreateContext(IWineD3DDeviceImpl *This, IWineD3DSurfaceImpl *tar
             goto out;
         }
         ReleaseDC(win_handle, hdc_parent);
+        size = target->currentDesc.Width * target->currentDesc.Height * target->bytesPerPixel;
+        if(This->adapter->cfgs[iPixelFormat].doubleBuffer)
+            size *= 2;
     } else {
         PIXELFORMATDESCRIPTOR pfd;
         int iPixelFormat;
@@ -898,6 +902,8 @@ WineD3DContext *CreateContext(IWineD3DDeviceImpl *This, IWineD3DSurfaceImpl *tar
     }
     This->frag_pipe->enable_extension((IWineD3DDevice *) This, TRUE);
 
+    WineD3DAdapterChangeGLRam(This, size);
+
     return ret;
 
 out:
@@ -958,6 +964,7 @@ static void RemoveContextFromArray(IWineD3DDeviceImpl *This, WineD3DContext *con
  *****************************************************************************/
 void DestroyContext(IWineD3DDeviceImpl *This, WineD3DContext *context) {
     struct fbo_entry *entry, *entry2;
+    long size = 0;
 
     TRACE("Destroying ctx %p\n", context);
 
@@ -986,11 +993,18 @@ void DestroyContext(IWineD3DDeviceImpl *This, WineD3DContext *context) {
     /* Cleanup the GL context */
     pwglMakeCurrent(NULL, NULL);
     if(context->isPBuffer) {
+        int iPixelFormat = GetPixelFormat(context->hdc);
+        IWineD3DSurfaceImpl *surf = (IWineD3DSurfaceImpl *) context->surface;
         GL_EXTCALL(wglReleasePbufferDCARB(context->pbuffer, context->hdc));
         GL_EXTCALL(wglDestroyPbufferARB(context->pbuffer));
+        size = surf->currentDesc.Width * surf->currentDesc.Height * surf->bytesPerPixel;
+        if(This->adapter->cfgs[iPixelFormat].doubleBuffer)
+            size *= 2;
     } else ReleaseDC(context->win_handle, context->hdc);
     pwglDeleteContext(context->glCtx);
 
+    WineD3DAdapterChangeGLRam(This, (-1) * size);
+
     HeapFree(GetProcessHeap(), 0, context->vshader_const_dirty);
     HeapFree(GetProcessHeap(), 0, context->pshader_const_dirty);
     RemoveContextFromArray(This, context);
Hi,
I think it would be best to add an attribute 'pbuffer_size' or something like that to the WineD3DContext, storing the size of the pbuffer. This is needed because GetPixelFormat won't work on a pbuffer: GetPixelFormat doesn't see 'offscreen pixel formats' (e.g. floating point ones, and there are others) which can be used for pbuffers. Further, I think it is nicer not to have to recalculate the size. I don't think there is a way to get the pixel format back from a pbuffer.
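Roughly, the bookkeeping would look something like this (sketch only; the 'pbuffer_size' field is hypothetical and not in the tree):

    /* in WineD3DContext: remember how much video memory the pbuffer
     * accounts for, so DestroyContext() doesn't have to query the
     * pixel format again */
    unsigned int pbuffer_size;

    /* CreateContext(), pbuffer path: */
    ret->pbuffer_size = size;
    WineD3DAdapterChangeGLRam(This, size);

    /* DestroyContext(): */
    if (context->isPBuffer)
        WineD3DAdapterChangeGLRam(This, -(long)context->pbuffer_size);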
Roderick
Patch sent...
The patch was sent, but not accepted. Does anyone have a clue why?
http://www.winehq.org/pipermail/wine-patches/2009-January/068339.html
On Sunday, 1 February 2009 at 22:02:15, Jérôme Gardou wrote:
The patch was sent, but not accepted. Does anyone have a clue why?
http://www.winehq.org/pipermail/wine-patches/2009-January/068339.html
The vidmem counting is faked, and only counts the size of the D3D objects, not the size the GL objects have.
Does this patch fix any bug? (I have missed the beginning of the thread)
2009/2/2 Stefan Dösinger stefan@codeweavers.com:
The vidmem counting is faked, and only counts the size of the D3D objects, not the size the GL objects have.
Does this patch fix any bug? (I have missed the beginning of the thread)
The original post indicates it fixes a performance issue in Supreme Commander, where the frame rate drops to < 10 FPS after a while, presumably because graphics memory is exhausted and system memory is used instead.
Also in the original post is:
2009/1/25 Jérôme Gardou jerome.gardou@gmail.com:
For those who are curious, try setting VideoMemorySize to 200 instead of 256. It works just like a charm.
I'm not sure, because I don't have Supreme Commander, but I assume this is a current work-around for the performance issue (setting VideoMemorySize to about 50MB less than what your card actually has).