Felix Nawothnig wrote:
> I suggested to do all the work server-side a while back:
> http://www.winehq.org/pipermail/wine-devel/2005-July/038695.html
> Especially see:
> http://www.winehq.org/pipermail/wine-devel/2005-July/038703.html
Well, this would mostly mirror the Windows architecture nowadays, where almost all of this lives in win32k.sys and GDI32 is mostly a user-space wrapper around it, a bit like ntdll is for kernel operations.
But I would be a bit concerned about performance if most calls had to be passed to the server side to be handled. Or maybe that is not really an issue? Didn't MS somehow move part of GDI back into user space when going from NT4 to 2K, precisely because of performance concerns?
The question would still remain where and how to hook this into DIB handling, but I think GDI would be the better place. Basically, call the DC driver and, if that fails, branch into the local DIB engine API for the appropriate functions.
Rolf Kalbermatter
On Saturday, 10 February 2007, 09:49, Rolf Kalbermatter wrote:
> Felix Nawothnig wrote:
> > I suggested to do all the work server-side a while back:
> > http://www.winehq.org/pipermail/wine-devel/2005-July/038695.html
> > Especially see:
> > http://www.winehq.org/pipermail/wine-devel/2005-July/038703.html
> Well, this would mostly mirror the Windows architecture nowadays, where almost all of this lives in win32k.sys and GDI32 is mostly a user-space wrapper around it, a bit like ntdll is for kernel operations.
> But I would be a bit concerned about performance if most calls had to be passed to the server side to be handled. Or maybe that is not really an issue? Didn't MS somehow move part of GDI back into user space when going from NT4 to 2K, precisely because of performance concerns?
I think performance could be OK if we used shared memory to get the DIBs in and out of the server, but I think Alexandre does not like shared memory for one reason or another. AFAIK MS had GDI in user space in NT 3.5, but for performance reasons they put it into the kernel in 4.0 and Win95. For Windows, though, some considerations regarding hardware acceleration also apply.
If I understand things correctly, performance problems occur when a DIB section is switched from server mode to app mode, or vice versa. As long as it stays in one mode or the other, it is fine. DirectDraw is a much simpler interface than GDI. DDraw currently draws mainly in app mode; problems occur if GetDC is used on a surface. This could maybe be avoided if the ddraw implementation were changed to use GDI for blitting too, instead of its own code. However, problems will still occur if the application Locks the surface. So I think this would only shift the problem to other apps, and it would not help Direct3D apps or OpenGL-accelerated DirectDraw.
This is probably a question for Roderick or Stefan or anybody that is familiar with the opengl code in wine.
It could be just a coincidence, but it seems that most users who see WoW hang after logging in have graphics cards with 64MB of memory. Is it possible that the OpenGL code does not handle the condition where memory runs out on the graphics card, and generates the following error?
The instruction at '0x00000000' referenced memory at '0x00000000'. The memory could not be "read".
Does the OpenGL code test for such a condition, or does it assume there is an infinite amount of graphics card memory and that the application should test for this anyway?
Regards, Nick Law
On Saturday 10 February 2007 06:24:04 am Nick Law wrote:
> It could be just a coincidence, but it seems that most users who see WoW hang after logging in have graphics cards with 64MB of memory. Is it possible that the OpenGL code does not handle the condition where memory runs out on the graphics card, and generates the following error?
> The instruction at '0x00000000' referenced memory at '0x00000000'. The memory could not be "read".
> Does the OpenGL code test for such a condition, or does it assume there is an infinite amount of graphics card memory and that the application should test for this anyway?
AFAIK, OpenGL doesn't really care how much video memory you have. If you run out, it'll just happily start swapping to your system RAM automatically (with a nice performance hit). If your system RAM runs out, then you can have problems.
That error, though, looks like it's trying to use an extension that the driver/card doesn't fully support: it gets a NULL pointer for some function, tries to call it without checking, and you get a nice NULL pointer dereference.
On Saturday, 10 February 2007, 15:44, Chris Robinson wrote:
> AFAIK, OpenGL doesn't really care how much video memory you have. If you run out, it'll just happily start swapping to your system RAM automatically (with a nice performance hit). If your system RAM runs out, then you can have problems.
Yes, OpenGL does not have the concept of video memory the way DirectX does. OpenGL implementations can work without any video memory at all, for example Mesa software rendering, or rendering via a remote OpenGL server.
On implementations with video memory, the OpenGL driver has to take care of managing it; the way it does so is transparent to Wine and to applications.
> That error, though, looks like it's trying to use an extension that the driver/card doesn't fully support: it gets a NULL pointer for some function, tries to call it without checking, and you get a nice NULL pointer dereference.
It doesn't have to be caused by OpenGL at all.