Hello All,
We have run into an interesting problem while trying to get the latest version of Stanford's Folding@Home GPU client to work in Linux via WINE. The program says it does not detect a compatible GPU, even when the user has installed the correct Nvidia drivers (with CUDA support) and has a compatible GPU. The problem, I believe, is that the program is never told there is an Nvidia 8800 installed; by the nature of WINE it instead sees "WINE" as the graphics card, since WINE first translates the Direct3D calls into OpenGL calls that are then passed on to the GPU. So the question is: is it possible to trick programs into believing they are running on the right hardware (as in fact they are)?
I remember a while ago the Steam system spec survey was used to see how many people run Steam via WINE; this was done by noting which graphics driver was installed and how the WINE one appeared in the survey. That is fine, but what we need is the opposite: a way to make the program see that it is actually running on Nvidia hardware. If the client would just start, the Direct3D calls could be translated into OpenGL calls, and the Nvidia Linux drivers could then handle them and run it all fine and dandy.
Here is the post with the error message about the wrong graphics card being detected: http://www.ocforums.com/showpost.php?p=5698997&postcount=19
Thanks,
Seth Shelnutt
Hi Seth,
2008/7/3 Seth Shelnutt shelnutt2@gmail.com:
Just wondering, why run the Windows client in Wine and not the native Linux client?
Cheers, Maarten.
On Fri, Jul 4, 2008 at 1:43 PM, Maarten Lankhorst m.b.lankhorst@gmail.com wrote:
The native Linux client doesn't support using the GPU.
I had a look: the Windows client calls IWineD3DImpl_GetAdapterIdentifier, for which we don't return the real adapter driver and description strings, just "Display" and "Direct3D HAL". Changing these in dlls/wined3d/directx.c to "NVIDIA GeForce 8800" gets past the error message, but the client needs cudart.dll, which isn't distributed with the download.
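For anyone who wants to reproduce this, it is just the two string assignments. A minimal sketch, assuming the 2008-era layout of dlls/wined3d/directx.c where the adapter strings are set directly (the exact surrounding code may differ):

    /* dlls/wined3d/directx.c (sketch): replace the generic adapter
     * strings with ones the Folding@Home client will accept. */
    Adapters[0].driver      = "NVIDIA GeForce 8800";  /* was "Display" */
    Adapters[0].description = "NVIDIA GeForce 8800";  /* was "Direct3D HAL" */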
Jeff
Ah, they are distributed, but under Documents and Settings\User\Application Data\Folding@home-gpu. There's not much sense in me attempting to run this, though, as I don't have the Linux CUDA drivers installed, so it falls back to acting as a regular CPU client.
Jeff
Interesting, so it is not too difficult. Now, is there any way to change these settings without compiling a custom version of WINE? I don't have a problem compiling WINE, but I'm sure not every person out there would want to. I think for now I will just compile a version and host it for people to try out the client.
Thanks for the help so far.
Actually we have quite a bit of code to tell the app more about the GPU, rather than just providing a generic Wine one. This is needed because some games insist on a proper GPU PCI ID. We don't report any GPU-specific renderer strings yet, but that should be rather easy to add if you look at the PCI ID reporting code. Currently you have to recompile for that, but you are welcome to write a patch that solves this problem in a generic way and send it to wine-patches.
The more troublesome problem is that Wine does not have any CUDA support at this point. The Windows CUDA DLL will not make you happy, because it talks to the Windows hardware drivers. Thus we need an implementation of this cudart.dll which calls the Linux CUDA runtime (libcudart.so) instead. (And then hope it works out.)
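To make the idea concrete, here is a minimal sketch of how such a forwarding cudart.dll could handle a single entry point. cudaGetDeviceCount and the zero-means-success cudaError_t convention are real CUDA runtime API; the wrapper structure, the soname, and the error value are assumptions:

    #include <dlfcn.h>
    #include <stddef.h>

    typedef int cudaError_t;  /* stand-in for the real enum; 0 == cudaSuccess */

    static void *libcudart;
    static cudaError_t (*p_cudaGetDeviceCount)(int *count);

    /* Load the native Linux CUDA runtime on first use. */
    static int load_native_cudart(void)
    {
        if (libcudart) return p_cudaGetDeviceCount != NULL;
        libcudart = dlopen("libcudart.so", RTLD_NOW);
        if (!libcudart) return 0;
        p_cudaGetDeviceCount = (cudaError_t (*)(int *))
                dlsym(libcudart, "cudaGetDeviceCount");
        return p_cudaGetDeviceCount != NULL;
    }

    /* Exported from the builtin cudart.dll under the same name the
     * Windows DLL uses; simply forwards to the Linux library. */
    cudaError_t cudaGetDeviceCount(int *count)
    {
        if (!load_native_cudart())
            return 1;  /* any nonzero cudaError_t to signal failure */
        return p_cudaGetDeviceCount(count);
    }

The real DLL exports many entry points, so the same dlopen/dlsym pattern would have to be repeated (or generated) for each of them.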
Writing a CUDA wrapper itself is likely not that hard, but CUDA can interact with OpenGL and Direct3D. In the case of OpenGL you can let it write to buffers (VBOs, PBOs) and textures, and likely something similar is possible for Direct3D. Nvidia is likely doing evil things behind Direct3D's back to make this possible, and I'm not sure we can support that.
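For reference, the OpenGL side of that interop looks roughly like this in the CUDA runtime API of that era (cudaGLRegisterBufferObject and friends are real functions; the fragment itself is a sketch, not a complete program):

    #include <GL/gl.h>
    #include <cuda_runtime.h>
    #include <cuda_gl_interop.h>

    /* Sketch: let a CUDA kernel write into an OpenGL vertex buffer
     * object.  'vbo' is assumed to be a VBO created in the current
     * GL context. */
    void fill_vbo_from_cuda(GLuint vbo)
    {
        void *dev_ptr;

        cudaGLRegisterBufferObject(vbo);       /* hand the VBO to CUDA */
        cudaGLMapBufferObject(&dev_ptr, vbo);  /* get a device pointer */
        /* ... launch a kernel that writes through dev_ptr ... */
        cudaGLUnmapBufferObject(vbo);          /* return the buffer to GL */
        cudaGLUnregisterBufferObject(vbo);
    }

The hard part for Wine is that a Windows app would call the equivalent Direct3D interop entry points, which a wrapper would somehow have to map onto the GL buffer objects wined3d creates internally.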
Roderick
What options do I need to change in order to compile WINE with support for the more GPU-specific information?
Also, when changing the following lines of code so that the output of IWineD3DImpl_GetAdapterIdentifier identifies the card, for now, as an 8800 GT with 173-series drivers, would the second pair of lines be correct? I just want to make sure that "driver" actually means "driver", which would be "Nvidia 173.14", and that the description is simply the card.
Adapters[0].driver = "Display";
Adapters[0].description = "Direct3D HAL";

Adapters[0].driver = "Nvidia 173.14";
Adapters[0].description = "Nvidia 8800 GT";
Also, if this is the case, would it not be easy to simply grab the driver version from the X server? At the least, the X server would give you the card and brand (Nvidia 8800 GT), but I am not sure how to get specific driver information. I'm looking for a command, but glxinfo gives only OpenGL info, and I've yet to find anything else.
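For what it's worth, the strings glxinfo prints come straight from OpenGL, and on the Nvidia binary driver the GL version string happens to embed the driver version, so something like the following (a sketch; it assumes an OpenGL context is current) recovers both the card and the driver version:

    #include <stdio.h>
    #include <GL/gl.h>

    /* Must be called with a GL context current.  On the Nvidia binary
     * driver GL_VERSION looks like "2.1.2 NVIDIA 173.14.05", so the
     * driver version can be parsed out of it, and GL_RENDERER names
     * the card, e.g. "GeForce 8800 GT/PCI/SSE2". */
    void print_gl_ids(void)
    {
        printf("vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
        printf("renderer: %s\n", (const char *)glGetString(GL_RENDERER));
        printf("version:  %s\n", (const char *)glGetString(GL_VERSION));
    }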
On Friday 04 July 2008 09:17:09 pm Seth Shelnutt wrote:
> I just want to make sure "driver" actually means "driver" Which would be "nvidia 173.14" and description simply the card correct?
>
> Adapters[0].driver = "Display";
> Adapters[0].description = "Direct3D HAL";
> Adapters[0].driver = "Nvidia 173.14";
> Adapters[0].description = "Nvidia 8800 GT";
IIRC, the 'driver' portion, when I looked on Windows, was the actual DLL used for the hardware (some nv*.dll for nVidia). For the second, I have the description set to a duplicate of what glGetString(GL_RENDERER) returns in my personal tree. Some people aren't keen on that change, though, because of the concern that some apps may make inappropriate assumptions based on the description string.
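In other words, something like this against the Adapters[0] lines quoted above (a sketch of the change in my tree, not upstream code; the nv4_disp.dll name is an assumption, and the exact nv*.dll varies by driver and Windows version):

    /* Sketch: report a Windows-style driver DLL name, and reuse the GL
     * renderer string as the adapter description.  Assumes a GL context
     * is current when the adapter info is gathered. */
    Adapters[0].driver      = "nv4_disp.dll";  /* hypothetical nv*.dll name */
    Adapters[0].description = (const char *)glGetString(GL_RENDERER);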
The only thing we aren't setting is proper strings; for the rest, we show an Nvidia driver version number and card when an Nvidia board is detected. Though we don't show the actual card in various cases, as we make an estimation based on glxinfo output and some other things. This is because we can't get PCI IDs from X, and relying on e.g. /proc/bus/pci/devices is not the way to go.
Roderick
Why only when an Nvidia board is detected? Should it not be possible to detect and display a driver version and card regardless of vendor? Also, is this support built into WINE, or do you have a set of patches to enable it?
On Sat, Jul 5, 2008 at 3:38 AM, Roderick Colenbrander thunderbird2k@gmx.net wrote:
There is similar code for ATI and other vendors as well. First we make a selection based on vendor, then we check what D3D version a card supports by looking at which OpenGL extensions are offered, and then we try to match it to a card in our database based on the OpenGL renderer string. This way, when e.g. a GeForce 11 arrives, it will be seen as, let's say, a GeForce 7 instead of an unknown card.
Most of the time apps only look for a driver version and the PCI ID (vendor/device ID), and those are reported by Wine; the driver and description strings are generic. Up to now this makes most apps happy. The problem is that we aren't the real Windows Direct3D drivers, so you could reason that reporting we are is wrong, but on the other hand some games need it to run. Further, some other games use the info to work around driver bugs, which again we don't have, as we aren't the Windows driver.
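A condensed sketch of that selection flow; the card names, the extension check, and the fallback choice are illustrative, not the actual wined3d code:

    #include <string.h>

    /* Illustrative card ids; the real list has many more entries. */
    enum wined3d_card
    {
        CARD_NVIDIA_GEFORCE_7800GT,
        CARD_NVIDIA_GEFORCE_8800GT,
    };

    /* The vendor has already been chosen from GL_VENDOR; now match the
     * renderer string against known cards, and when the board is unknown
     * fall back to a known card of the same D3D level. */
    static enum wined3d_card select_nvidia_card(const char *gl_renderer,
                                                int has_d3d10_level_exts)
    {
        if (strstr(gl_renderer, "8800"))
            return CARD_NVIDIA_GEFORCE_8800GT;
        /* ... more known renderer strings ... */

        /* Unknown board (say, a future GeForce 11): report the newest
         * known card of the same capability level. */
        return has_d3d10_level_exts ? CARD_NVIDIA_GEFORCE_8800GT
                                    : CARD_NVIDIA_GEFORCE_7800GT;
    }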
Roderick