There is similar code for ATI and other vendors as well. First we select based on vendor, then we check what D3D version a card supports by looking at the OpenGL extensions it offers, and then we try to match it to a card in our database based on the OpenGL renderer string. This way, when e.g. a GeForce 11 arrives, it will be reported as, say, a GeForce 7 instead of an unknown card.
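A minimal sketch of what that fallback matching could look like; the table entries and helper name are made up here for illustration, not Wine's actual code:

    #include <string.h>

    /* Illustrative sketch of renderer-string matching; the table entries
     * and helper name are hypothetical, not Wine's actual code. */
    struct card_entry
    {
        const char *match;    /* substring to look for in GL_RENDERER */
        unsigned device_id;   /* PCI device id to report for that card */
    };

    static const struct card_entry nvidia_cards[] =
    {
        { "8800", 0x0611 },   /* GeForce 8800 GT */
        { "7800", 0x0092 },   /* GeForce 7800 GT */
        { "6800", 0x0045 },   /* GeForce 6800 GT */
    };

    static unsigned select_nvidia_card(const char *gl_renderer)
    {
        unsigned i;
        for (i = 0; i < sizeof(nvidia_cards) / sizeof(nvidia_cards[0]); i++)
            if (strstr(gl_renderer, nvidia_cards[i].match))
                return nvidia_cards[i].device_id;
        /* Unknown (e.g. future) card: fall back to a known one. */
        return 0x0092;
    }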
Most of the time apps only look for a driver version and the PCI ID (vendor / device ID), and those are reported by Wine. The driver and description strings are generic. Up to now this has made most apps happy. The problem is that we aren't the real Windows Direct3D drivers, so you could argue that reporting that we are is wrong, but on the other hand some games need it to run. Furthermore, some other games use the info to work around driver bugs, which we don't have since we aren't the Windows driver.
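For reference, this is the app-side structure those checks go through (d3d9 API); the commented values are examples, not necessarily what Wine reports:

    #include <d3d9.h>

    /* Sketch: what a game typically inspects, assuming d3d is an
     * IDirect3D9 * obtained from Direct3DCreate9(D3D_SDK_VERSION). */
    D3DADAPTER_IDENTIFIER9 id;
    IDirect3D9_GetAdapterIdentifier(d3d, D3DADAPTER_DEFAULT, 0, &id);
    /* id.VendorId      -> 0x10de (Nvidia's PCI vendor id)           */
    /* id.DeviceId      -> 0x0611 (e.g. GeForce 8800 GT)             */
    /* id.DriverVersion -> packed 64-bit driver version              */
    /* id.Driver / id.Description -> the generic strings in our case */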
Roderick
Why only when an Nvidia board is detected? Should it not be possible to detect and report a driver version and card regardless of vendor? Also, is this support built into WINE, or do you have a set of patches to enable it?
On Sat, Jul 5, 2008 at 3:38 AM, Roderick Colenbrander <thunderbird2k@gmx.net> wrote:
The only thing we aren't setting are proper strings; for the rest we show an Nvidia driver version number and card when an Nvidia board is detected. Though we don't show the actual card in various cases, as we make an estimation based on glxinfo output and some other things. This is because we can't get PCI IDs from X; relying on e.g. /proc/bus/pci/devices is not the way to go.
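For reference, the data glxinfo reports comes straight from the GL strings, which can be queried directly once a GL context is current:

    #include <GL/gl.h>
    #include <stdio.h>

    /* The same information glxinfo prints; needs a current GL context. */
    static void dump_gl_strings(void)
    {
        printf("vendor:   %s\n", (const char *)glGetString(GL_VENDOR));   /* e.g. "NVIDIA Corporation" */
        printf("renderer: %s\n", (const char *)glGetString(GL_RENDERER)); /* e.g. "GeForce 8800 GT/PCI/SSE2" */
        printf("version:  %s\n", (const char *)glGetString(GL_VERSION));  /* e.g. "2.1.2 NVIDIA 173.14.09" */
    }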
Roderick
Date: Sat, 5 Jul 2008 00:17:09 -0400
From: "Seth Shelnutt" <shelnutt2@gmail.com>
To: wine-devel@winehq.org
Subject: Re: Tricking program into seeing actual gfx driver not WINE's
What options do I need to change in order to compile WINE with support for the more GPU-specific information?
Also, when changing the following lines of code in order to change the output of IWineD3DImpl_GetAdapterIdentifier to, for now, identify the card as an 8800 GT with 173 drivers, would the second pair of lines be correct? I just want to make sure that "driver" actually means the driver, which would be "Nvidia 173.14", and that the description is simply the card, correct?
    Adapters[0].driver = "Display";
    Adapters[0].description = "Direct3D HAL";

    Adapters[0].driver = "Nvidia 173.14";
    Adapters[0].description = "Nvidia 8800 GT";
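For comparison, a real Windows Nvidia driver reports something closer to the following; the exact strings vary by driver version, so treat these as approximations:

    /* Approximate values a real Windows Nvidia driver would report:
     * "driver" is the driver file name, not a version string. */
    Adapters[0].driver = "nv4_disp.dll";
    Adapters[0].description = "NVIDIA GeForce 8800 GT";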
Also, if this is the case, would it not be easy to simply grab the driver version from the X server? Or at least the X server would give you the card and brand, Nvidia 8800 GT, but I am not sure how to get specific driver information. I'm looking for a command, but glxinfo gives only OpenGL info, and I've yet to find anything else.
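One lead: on Nvidia's driver the OpenGL version string embeds the driver version (e.g. "2.1.2 NVIDIA 173.14.09"), so it can be parsed out. A rough sketch, assuming a current GL context:

    #include <GL/gl.h>
    #include <stdio.h>
    #include <string.h>

    /* Rough sketch: pull the Nvidia driver version out of GL_VERSION.
     * The string looks like "2.1.2 NVIDIA 173.14.09" on Nvidia's driver.
     * Returns 1 on success, 0 if not on the Nvidia driver. */
    static int get_nvidia_driver_version(int *major, int *minor)
    {
        const char *ver = (const char *)glGetString(GL_VERSION);
        const char *p = ver ? strstr(ver, "NVIDIA ") : NULL;
        if (!p)
            return 0; /* not running on the Nvidia driver */
        return sscanf(p + strlen("NVIDIA "), "%d.%d", major, minor) == 2;
    }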
On Fri, Jul 4, 2008 at 8:45 AM, Stefan Dösinger wrote:
Actually we have quite a bit of code to tell the app more about the GPU and not just provide a generic Wine one. This is needed because some games insist on a proper GPU PCI ID. We don't report any GPU-specific renderer strings yet, but that should be rather easy to add if you look at the PCI ID reporting code. Currently you have to recompile for that, but you are welcome to write a patch that solves this problem in a generic way and send it to wine-patches.
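In essence that code boils down to filling in the vendor and device fields of the adapter identifier. A simplified sketch; the surrounding names are illustrative, not Wine's actual code:

    /* Simplified sketch of the PCI id reporting path; variable names
     * here are illustrative, not Wine's actual code. */
    identifier->VendorId = 0x10de;   /* PCI vendor id for Nvidia */
    identifier->DeviceId = 0x0611;   /* e.g. GeForce 8800 GT, chosen from the card table */
    /* A GPU-specific renderer/description string would be filled in the same place: */
    strcpy(identifier->Description, "NVIDIA GeForce 8800 GT");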
The more troublesome problem is that Wine does not have any CUDA support at this point. The Windows CUDA DLL will not make you happy, because it talks to the Windows hardware drivers. Thus we need an implementation of this cudart.dll which calls the Linux cudart.so instead. (And then hope it works out.)
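Such a wrapper would be a Wine builtin cudart.dll whose exports just forward to the Linux runtime. A minimal sketch of one entry point, assuming libcudart.so is installed; the error value is a placeholder:

    #include <dlfcn.h>

    /* Minimal sketch of a forwarding cudart.dll: each Windows-side export
     * calls through to the Linux CUDA runtime. Only one entry point shown. */
    typedef int cudaError_t;
    static cudaError_t (*p_cudaGetDeviceCount)(int *count);

    static void load_linux_cudart(void)
    {
        void *lib = dlopen("libcudart.so", RTLD_NOW);
        if (lib)
            p_cudaGetDeviceCount =
                (cudaError_t (*)(int *))dlsym(lib, "cudaGetDeviceCount");
    }

    /* Exported Windows-side entry point: just forwards the call. */
    cudaError_t cudaGetDeviceCount(int *count)
    {
        if (!p_cudaGetDeviceCount)
            load_linux_cudart();
        if (!p_cudaGetDeviceCount)
            return 1; /* placeholder: some nonzero cudaError_t meaning failure */
        return p_cudaGetDeviceCount(count);
    }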
From: wine-devel-bounces@winehq.org [mailto:wine-devel-bounces@winehq.org] On Behalf Of Seth Shelnutt
Sent: Thursday, July 03, 2008 10:24 PM
To: wine-devel@winehq.org
Subject: Tricking program into seeing actual gfx driver not WINE's
Hello All,
We have run into an interesting problem while trying to get the latest version of Stanford's Folding at Home GPU client to work in Linux via WINE. The program says it does not detect a compatible GPU, even when the user has installed the correct Nvidia drivers (with CUDA support) and has a compatible GPU. The problem, I believe, lies in the fact that the program is not told that there is an Nvidia 8800 installed; instead, by the nature of WINE, it sees that "WINE" is the graphics card, as WINE first translates the Direct3D calls into OpenGL calls that are then passed on to the GPU. So the question is, is it possible to trick programs into believing they are running on the right hardware? (As in fact they are.)
I remember a while ago the Steam system spec survey was used to see how many people run Steam via WINE. This was done by noting the graphics driver installed and how the Wine one appeared when running WINE. Well, this is fine, but what we need is a way to make the program see that it is actually running on Nvidia hardware. Because if the client would just start, then the Direct3D calls could be translated into OpenGL calls, and the Nvidia Linux drivers could then handle them and run it all fine and dandy.
Here is the post, with the error message about the wrong graphics card being detected: http://www.ocforums.com/showpost.php?p=5698997&postcount=19
Thanks,
Seth Shelnutt