Hello! You can get the amount of video RAM from the X server log (Python script attached). This is a little fragile, since the video driver is responsible for logging this info and I don't know if my script handles the format of all drivers. Also, certain drivers (fbdev) don't output it at all. Then again, there is no hardware OpenGL support in that case anyway ;-). Another way to obtain the video RAM would be 'lspci -v'. This is not foolproof either, since the size reported by lspci is not necessarily correct.
Fabian
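The attached script is not reproduced here; a minimal sketch of the same idea in C might look like this. The log path and the exact "VideoRAM:" wording are assumptions that vary per driver and distribution:

#include <stdio.h>
#include <string.h>

/* Return the video RAM in kB as reported in the X server log,
 * or -1 if no matching line is found. */
static long videoram_from_xlog(const char *path)
{
    char line[512];
    long kb = -1;
    FILE *f = fopen(path, "r");
    if (!f) return -1;
    while (fgets(line, sizeof(line), f))
    {
        /* nv logs "VideoRAM: 65536 kBytes", fglrx "VideoRAM: 65536 kByte, ..." */
        const char *p = strstr(line, "VideoRAM:");
        if (p && sscanf(p, "VideoRAM: %ld", &kb) == 1) break;
    }
    fclose(f);
    return kb;
}

int main(void)
{
    long kb = videoram_from_xlog("/var/log/Xorg.0.log");
    if (kb > 0) printf("VideoRAM: %ld kB\n", kb);
    return kb > 0 ? 0 : 1;
}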
Aric Cyr wrote:
On 10/8/05, Jonathan Ernst Jonathan@ernstfamily.ch wrote:
On Saturday 08 October 2005 at 10:47 +0900, Aric Cyr wrote:
This is a simple patch to add a registry setting for a user's video RAM which is used by wined3d. Currently emulated_videoram is hardcoded at 64MB, which is not enough for some games and having to recompile wine to change it is a burden for end-users. The new registry key is called HKCU\Software\Wine\Direct3D\VideoRam and is an integer in megabytes.
Hello
I guess it is not possible to retrieve the real value from the system?
Jonathan,
OpenGL does not have any standard features which allow querying of total video memory. It does allow you to see if a texture allocation would succeed or fail, but that would mean allocating a lot of textures on startup to determine the actual value then deallocating them... not a particularly nice route. If wine already knows this value (which I doubt, since this code wouldn't exist if it did) then we could use that. I'm not familiar with any X calls that would give us what we want either. I'm open to suggestions though... Cedega also does it with a variable like this.
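For reference, the "would it succeed" probe Aric mentions is OpenGL's proxy-texture mechanism; a minimal sketch, which needs a current GL context and only answers per-texture feasibility, not total memory:

#include <GL/gl.h>

/* Ask the driver whether a w x h RGBA8 texture would be accepted,
 * without actually allocating it. Needs a current GL context. */
static int texture_would_fit(GLsizei w, GLsizei h)
{
    GLint got_width = 0;
    glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                             GL_TEXTURE_WIDTH, &got_width);
    return got_width != 0; /* width comes back 0 if it would fail */
}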
It would be nice to have this setting changeable in winecfg in the future.
Agreed. Since this is a registry key, I assume that it would be pretty easy to add an extra setting in winecfg to configure this. I haven't taken a look at the winecfg code, so I'm not really sure what would be required.
Cheers, Aric
-- Aric Cyr <Aric.Cyr at gmail dot com> (http://acyr.net)
You can get the amount of video RAM from the X server log (Python script attached). This is a little fragile, since the video driver is responsible for logging this info and I don't know if my script handles the format of all drivers. Also, certain drivers (fbdev) don't output it at all. Then again, there is no hardware OpenGL support in that case anyway ;-). Another way to obtain the video RAM would be 'lspci -v'. This is not foolproof either, since the size reported by lspci is not necessarily correct.
The script doesn't work for me with the fglrx driver. The VideoRAM line in Xorg.log is:
(--) fglrx(0): VideoRAM: 65536 kByte, Type: DDR SGRAM / SDRAM
I don't know how to program in Python, so I can't fix your script - sorry. Stefan
You can get the amount of video RAM from the X server log (Python script attached). This is a little fragile, since the video driver is responsible for logging this info and I don't know if my script handles the format of all drivers. Also, certain drivers (fbdev) don't output it at all. Then again, there is no hardware OpenGL support in that case anyway ;-). Another way to obtain the video RAM would be 'lspci -v'. This is not foolproof either, since the size reported by lspci is not necessarily correct.
Another update: The reason why the script is not working is the vendor string - it's "Gentoo (The X.Org Foundation 6.8.2, revision r4-0.1.10.2)" for me. Maybe a less strict check against it would do the trick.
Stefan
On Sunday 16 October 2005 13:06, Stefan Dösinger wrote:
Another update: The reason why the script is not working is the vendor string - it's "Gentoo (The X.Org Foundation 6.8.2, revision r4-0.1.10.2)" for me. Maybe a less strict check against it would do the trick.
This should fix that problem.
Another thing: would it be better to determine the amount of video RAM when installing Wine and store it in the registry, or to check it at runtime when GetAvailableTextureMem is called?
Fabian
Hello,
This should fix that problem.
Works nicely.
Another thing: would it be better to determine the amount of video RAM when installing Wine and store it in the registry, or to check it at runtime when GetAvailableTextureMem is called?
I'd suggest determining it in IWineD3D::CreateDevice and storing it in the IWineD3DDeviceImpl class.
How does this script work with multiple graphics cards?
Stefan
Another thing: would it be better to determine the amount of video RAM when installing Wine and store it in the registry, or to check it at runtime when GetAvailableTextureMem is called?
I'd suggest determining it in IWineD3D::CreateDevice and storing it in the IWineD3DDeviceImpl class.
How does this script work with multiple graphics cards?
Parsing the X log file is way too hacky in my opinion as a way to determine the amount of video memory. Depending on how many X servers you use, it can be a different file. In the case of Nvidia video cards I would prefer to use the NV-CONTROL extension for X (it is available on Linux/Solaris/FreeBSD). In other cases I would use the PCI header of the video card. Part of it describes which memory ranges are mapped, and one of those ranges corresponds to the size mapped for the framebuffer. For instance, 128MB on my system:
0000:01:00.0 VGA compatible controller: nVidia Corporation NV35 [GeForce FX 5900] (rev a1) (prog-if 00 [VGA])
        Flags: bus master, 66MHz, medium devsel, latency 248, IRQ 11
        Memory at de000000 (32-bit, non-prefetchable) [size=16M]
        Memory at d0000000 (32-bit, prefetchable) [size=128M]
        Expansion ROM at dfee0000 [disabled] [size=128K]
        Capabilities: <available only to root>
The only thing is that the PCI header can't be trusted in all cases, but I would say it is a lot better than nothing.
Regards, Roderick
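A minimal sketch of that PCI-header idea in C, parsing 'lspci -v' output; the format assumptions (device headers in column 0, a '[size=...M]' suffix on the memory lines) are taken from the output above, and as noted the result is only a hint:

#include <stdio.h>
#include <string.h>

/* Return the largest prefetchable BAR of the VGA device in MB,
 * or -1 if nothing matched. Only a hint, per the caveat above. */
static long vram_guess_from_lspci(void)
{
    char line[512], unit;
    long best = -1, size;
    int in_vga = 0;
    FILE *p = popen("lspci -v", "r");
    if (!p) return -1;
    while (fgets(line, sizeof(line), p))
    {
        /* Device headers start in column 0; indented lines belong
         * to the device above. */
        if (line[0] != ' ' && line[0] != '\t')
            in_vga = strstr(line, "VGA compatible controller") != NULL;
        /* " prefetchable" (with the space) skips "non-prefetchable" */
        else if (in_vga && strstr(line, " prefetchable"))
        {
            const char *s = strstr(line, "[size=");
            if (s && sscanf(s, "[size=%ld%c", &size, &unit) == 2
                && unit == 'M' && size > best)
                best = size;
        }
    }
    pclose(p);
    return best;
}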
As someone stuck with both nVidia and ATI cards, I'd vote against using NV-CONTROL. Detecting the RAM via a different method for each manufacturer sounds harder to maintain.
(--) fglrx(0): VideoRAM: 131072 kByte, Type: DDR SGRAM / SDRAM
Roderick Colenbrander wrote:
Parsing the X log file is way too hacky in my opinion as a way to determine the amount of video memory. Depending on how many X servers you use, it can be a different file. In the case of Nvidia video cards I would prefer to use the NV-CONTROL extension for X (it is available on Linux/Solaris/FreeBSD). In other cases I would use the PCI header of the video card. Part of it describes which memory ranges are mapped, and one of those ranges corresponds to the size mapped for the framebuffer. For instance, 128MB on my system:
0000:01:00.0 VGA compatible controller: nVidia Corporation NV35 [GeForce FX 5900] (rev a1) (prog-if 00 [VGA])
        Flags: bus master, 66MHz, medium devsel, latency 248, IRQ 11
        Memory at de000000 (32-bit, non-prefetchable) [size=16M]
        Memory at d0000000 (32-bit, prefetchable) [size=128M]
        Expansion ROM at dfee0000 [disabled] [size=128K]
        Capabilities: <available only to root>
The only thing is that the PCI header can't be trusted in all cases, but I would say it is a lot better than nothing.
Regards, Roderick
As someone stuck with both nVidia and ATI cards, I'd vote against using NV-CONTROL. Detecting the RAM via a different method for each manufacturer sounds harder to maintain.
(--) fglrx(0): VideoRAM: 131072 kByte, Type: DDR SGRAM / SDRAM
In my opinion, parsing the log file is very unreliable, and second, you aren't 100% sure that the card drawing the OpenGL stuff is the same card (an X session can be using multiple cards).
When using NV-CONTROL on Nvidia cards you are 100% sure that the detected amount of video RAM corresponds to the card currently rendering the OpenGL stuff. (NV-CONTROL is an X API and you need to provide it with an X display and screen number; this is the same info you already have somewhere in Wine.) I would prefer a generic way, but one doesn't exist. I doubt Alexandre would accept a log-file-parsing thing. I'm not even sure he would accept NV-CONTROL code either, but I think the chance is quite a bit bigger that he does.
Roderick
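For reference, a minimal sketch of that NV-CONTROL query using libXNVCtrl (the library behind nvidia-settings); the header paths vary by distribution and this is untested here. Link with -lXNVCtrl -lX11:

#include <stdio.h>
#include <X11/Xlib.h>
#include <NVCtrl/NVCtrl.h>
#include <NVCtrl/NVCtrlLib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    int screen, vram_kb = 0;
    if (!dpy) return 1;
    screen = DefaultScreen(dpy);
    /* Only ask on screens actually driven by the nvidia driver;
     * NV_CTRL_VIDEO_RAM reports kB for exactly that screen's card. */
    if (XNVCTRLIsNvScreen(dpy, screen) &&
        XNVCTRLQueryAttribute(dpy, screen, 0, NV_CTRL_VIDEO_RAM, &vram_kb))
        printf("VideoRAM: %d kB\n", vram_kb);
    XCloseDisplay(dpy);
    return 0;
}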
On Sunday, 16 October 2005 at 16:52, Evil wrote:
As someone stuck with both nVidia and ATI cards, I'd vote against using NV-CONTROL. Detecting the RAM via a different method for each manufacturer sounds harder to maintain.
AFAIK there's something similar for the fglrx driver: it ships a control applet which shows the video memory size. The tool is open source, so it's no problem to find out how it does this.
I don't have it installed at the moment, so I can't check.
Stefan
Now that I think about it, there's another issue, partly related to this, that should be addressed too: card detection. Right now wined3d picks the card based on several OpenGL strings. First it detects the vendor (ATI/Nvidia), and then it compares the renderer string (which contains 'GeForce FX 5900' and so on) against two or three models of each brand. If the card isn't found (which it usually isn't), it defaults to a GeForce4 Ti / Radeon 8500 depending on the brand. I know that for apps running through Wine the video card doesn't matter as much as on Windows, since roughly the same functionality is supported on recent ATI/Nvidia cards, but some apps use a different 3D backend depending on the GPU they detect, as a TNT supports fewer features than a GeForce 7.
The main issue is connecting OpenGL to low-level PCI stuff, as with the video RAM. In the case of a system containing one card this is no issue, but with multiple cards it is hard to detect which card is used for OpenGL...
Roderick
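A minimal sketch of the detection scheme described above, matching GL_VENDOR first and then GL_RENDERER; the model names and the fallback defaults are illustrative, not wined3d's actual tables, and a current GL context is required:

#include <string.h>
#include <GL/gl.h>

typedef enum { CARD_GEFORCE4_TI, CARD_GEFORCEFX_5900,
               CARD_RADEON_8500, CARD_RADEON_9700 } card_t;

/* Pick a card id from the GL strings; unknown models fall back to
 * a conservative default per vendor, as described above. */
static card_t detect_card(void)
{
    const char *vendor   = (const char *)glGetString(GL_VENDOR);
    const char *renderer = (const char *)glGetString(GL_RENDERER);
    if (!vendor || !renderer) return CARD_GEFORCE4_TI;
    if (strstr(vendor, "NVIDIA"))
    {
        if (strstr(renderer, "GeForce FX 5900")) return CARD_GEFORCEFX_5900;
        return CARD_GEFORCE4_TI;   /* default for unrecognized Nvidia cards */
    }
    if (strstr(renderer, "Radeon 9700")) return CARD_RADEON_9700;
    return CARD_RADEON_8500;       /* default for unrecognized ATI cards */
}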
Hi,
Another thing: would it be better to determine the amount of video RAM when installing Wine and store it in the registry, or to check it at runtime when GetAvailableTextureMem is called?
I'd suggest determining it in IWineD3D::CreateDevice and storing it in the IWineD3DDeviceImpl class.
How does this script work with multiple graphics cards?
ATI and NVidia both have extensions to X that return the total video memory on the card. Other options include using code from the DRI project, and I've got some very old DOS code for some Matrox cards. Since there is no standard way to find out the amount of video RAM available, we're going to have to maintain different versions for different cards until the day there's a standard X extension.
Parsing the X log file is way too hacky in my opinion as a way to determine the amount of video memory. Depending on how many X servers you use, it can be a different file. In the case of Nvidia video cards I would prefer to use the NV-CONTROL extension for X (it is available on Linux/Solaris/FreeBSD). In other cases I would use the PCI header of the video card. Part of it describes which memory ranges are mapped, and one of those ranges corresponds to the size mapped for the framebuffer. For instance, 128MB on my system:
0000:01:00.0 VGA compatible controller: nVidia Corporation NV35 [GeForce FX 5900] (rev a1) (prog-if 00 [VGA])
        Flags: bus master, 66MHz, medium devsel, latency 248, IRQ 11
        Memory at de000000 (32-bit, non-prefetchable) [size=16M]
        Memory at d0000000 (32-bit, prefetchable) [size=128M]
        Expansion ROM at dfee0000 [disabled] [size=128K]
        Capabilities: <available only to root>
The only thing is that the PCI header can't be trusted in all cases, but I would say it is a lot better than nothing.
The PCI option isn't a good one: for one thing, the memory mapped doesn't have to relate to the size of the on-board memory if the driver uses paging. I would say it's worse than nothing; letting the user select how much memory they have installed via winecfg is a better option.
Oliver.
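A minimal sketch of that winecfg/registry route, reading the HKCU\Software\Wine\Direct3D\VideoRam key described at the top of the thread; that the value is stored as a REG_DWORD is an assumption:

#include <windows.h>

/* Read the emulated video RAM size in MB, falling back to the old
 * hardcoded 64MB default when the key is absent or malformed. */
static DWORD get_emulated_videoram_mb(void)
{
    HKEY key;
    DWORD mb = 64, type, size = sizeof(mb);
    if (RegOpenKeyExA(HKEY_CURRENT_USER, "Software\\Wine\\Direct3D",
                      0, KEY_READ, &key) == ERROR_SUCCESS)
    {
        if (RegQueryValueExA(key, "VideoRam", NULL, &type,
                             (BYTE *)&mb, &size) != ERROR_SUCCESS
            || type != REG_DWORD)
            mb = 64;
        RegCloseKey(key);
    }
    return mb;
}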
Regards, Roderick
On Sunday 16 October 2005 17:22, Oliver Stieber wrote:
ATI and NVidia both have extensions to X that return the total video memory on the card. Other options include using code from the DRI project, and I've got some very old DOS code for some Matrox cards. Since there is no standard way to find out the amount of video RAM available, we're going to have to maintain different versions for different cards until the day there's a standard X extension.
So, for Nvidia: should I use the output of 'nvidia-settings -q VideoRam', or rather use the NV-CONTROL extension directly?
--- Fabian Bieler der.fabe@gmx.net wrote:
On Sunday 16 October 2005 17:22, Oliver Stieber wrote:
ATI and NVidia both have extensions to X that return the total video memory on the card. Other options include using code from the DRI project, and I've got some very old DOS code for some Matrox cards. Since there is no standard way to find out the amount of video RAM available, we're going to have to maintain different versions for different cards until the day there's a standard X extension.
So, for Nvidia: should I use the output of 'nvidia-settings -q VideoRam', or rather use the NV-CONTROL extension directly?
I would say it's better to use the extension if it exists (I expect all that calling 'nvidia-settings -q VideoRam' does is call the extension).
OK, here I go again: this is a small C program which should get the video RAM using the NV-CONTROL and ATIFGLEXTENSION extensions. As I only have an nVidia card, could someone with an ATI card (and the fglrx driver) please test this?
Fabian
On Sunday 16 October 2005 23:34, Fabian Bieler wrote:
OK, here I go again: this is a small C program which should get the video RAM using the NV-CONTROL and ATIFGLEXTENSION extensions. As I only have an nVidia card, could someone with an ATI card (and the fglrx driver) please test this?
Fabian
this time with one bug less... ;-)
On Mon, 17 Oct 2005 00:17:22 +0200, Fabian Bieler der.fabe@gmx.net wrote:
On Sunday 16 October 2005 23:34, Fabian Bieler wrote:
OK, here I go again: this is a small C program which should get the video RAM using the NV-CONTROL and ATIFGLEXTENSION extensions. As I only have an nVidia card, could someone with an ATI card (and the fglrx driver) please test this?
Fabian
this time with one bug less... ;-)
I have tried your program with my ATI card and it says "videoRam: 2048 kBytes". It should be "VideoRAM: 131072 kByte" like Xorg.log says. I think there is a bug in your program.
Lukas
For Nvidia systems at least, the script would probably be unnecessary, as 'nvidia-settings -q VideoRam | grep 0.0 | tail --bytes=8' *should* return the amount of video RAM. Would it make more sense to check at runtime whether the 'nvidia' driver is loaded and then pass the output from the line above into the registry, only reverting to the script if the system is non-Nvidia?
Just figured I'd throw this out there, I didn't see where anyone had mentioned it as an option yet.
Randall Walls
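A minimal sketch of that fallback in C, shelling out to nvidia-settings and scanning for the number instead of grepping byte offsets; the exact output line format is an assumption:

#include <stdio.h>
#include <string.h>

/* Return the video RAM in kB as reported by nvidia-settings,
 * or -1 if the tool is missing or the output didn't match. */
static long videoram_from_nvidia_settings(void)
{
    char line[256];
    long kb = -1;
    FILE *p = popen("nvidia-settings -q VideoRam 2>/dev/null", "r");
    if (!p) return -1;
    while (fgets(line, sizeof(line), p))
    {
        /* Assumed format: "  Attribute 'VideoRam' (host:0.0): 131072." */
        const char *s = strstr(line, "VideoRam' (");
        if (s && sscanf(s, "VideoRam' (%*[^)]): %ld", &kb) == 1) break;
    }
    pclose(p);
    return kb;
}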
Stefan Dösinger wrote:
Hello,
This should fix that problem.
Works nicely.
Another thing: would it be better to determine the amount of video RAM when installing Wine and store it in the registry, or to check it at runtime when GetAvailableTextureMem is called?
I'd suggest determining it in IWineD3D::CreateDevice and storing it in the IWineD3DDeviceImpl class.
How does this script work with multiple graphics cards?
Stefan
For Nvidia systems at least, the script would probably be unnecessary, as 'nvidia-settings -q VideoRam | grep 0.0 | tail --bytes=8' *should* return the amount of video RAM. Would it make more sense to check at runtime whether the 'nvidia' driver is loaded and then pass the output from the line above into the registry, only reverting to the script if the system is non-Nvidia?
Just figured I'd throw this out there, I didn't see where anyone had mentioned it as an option yet.
Randall Walls
Somewhere in this thread (before the last NV-CONTROL mail) I proposed using it too. Nvidia-settings itself is licensed under the GPL; some time ago I wrote an alternative NV-CONTROL library which I use in my program (NVClock). I could simplify it and allow it to be used in Wine, but as I said in my last email, connecting OpenGL to more low-level info is hard, as you don't have much info with which to 'combine' the two (for instance, the renderer string can't be used to check for the video card, as the device string is usually different; the only thing that can be the same is the vendor name ...).
Roderick