Hello! You can get the amount of video RAM from the X server log (Python script attached). This is a little unsafe, since the video driver is responsible for logging this info and I don't know whether my script handles the format of all drivers. Also, certain drivers (fbdev) don't output it at all. Then again, there is no hardware OpenGL support in that case anyway ;-). Another way to obtain the video RAM would be to use 'lspci -v'. This isn't foolproof either, though, since the size reported by lspci is not necessarily correct.
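For illustration only (this is not the attached script), a rough sketch in C of the kind of parsing involved; the log path and the exact "VideoRAM:" wording are driver-dependent assumptions, as the caveats above already point out:

/* Sketch only: scan an Xorg log for the driver's "VideoRAM: <n> kByte" line.
 * The log path and the exact message format vary by driver and distribution. */
#include <stdio.h>
#include <string.h>

static long videoram_from_xorg_log(const char *path)
{
    FILE *f = fopen(path, "r");
    char line[512];
    long kbytes = -1;

    if (!f) return -1;
    while (fgets(line, sizeof(line), f))
    {
        const char *p = strstr(line, "VideoRAM:");
        if (p && sscanf(p, "VideoRAM: %ld", &kbytes) == 1)
            break;   /* e.g. "(--) fglrx(0): VideoRAM: 65536 kByte, ..." */
    }
    fclose(f);
    return kbytes;   /* in kBytes, or -1 if no matching line was found */
}

int main(void)
{
    long kb = videoram_from_xorg_log("/var/log/Xorg.0.log");
    if (kb > 0) printf("VideoRAM: %ld kByte\n", kb);
    else        printf("VideoRAM line not found\n");
    return 0;
}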
Fabian
Aric Cyr wrote:
The script doesn't work for me, fglrx driver. The videoram line in Xorg.log is
(--) fglrx(0): VideoRAM: 65536 kByte, Type: DDR SGRAM / SDRAM
I don't know how to program Python, so I can't fix your script - sorry. Stefan
Another update: The reason the script is not working is the vendor string - it's "Gentoo (The X.Org Foundation 6.8.2, revision r4-0.1.10.2)" for me. Maybe a less strict check against it would do the trick.
Stefan
On Sunday 16 October 2005 13:06, Stefan Dösinger wrote:
This should fix that problem.
Another thing: Would it be better to determine the amount of video RAM when installing Wine and store it in the registry, or to check at runtime when GetAvailableTextureMem is called?
Fabian
Parsing the X log file is way too hacky in my opinion as a way to determine the amount of video memory. Depending on how many X servers you use, it can be a different file. In the case of Nvidia video cards I would prefer to use the NV-CONTROL extension for X (it is available on Linux/Solaris/FreeBSD). In other cases I would use the PCI header of the video card. Part of it describes which memory ranges are mapped, and one of those ranges corresponds to the size mapped for the framebuffer. For instance, 128MB on my system:
0000:01:00.0 VGA compatible controller: nVidia Corporation NV35 [GeForce FX 5900] (rev a1) (prog-if 00 [VGA])
        Flags: bus master, 66MHz, medium devsel, latency 248, IRQ 11
        Memory at de000000 (32-bit, non-prefetchable) [size=16M]
        Memory at d0000000 (32-bit, prefetchable) [size=128M]
        Expansion ROM at dfee0000 [disabled] [size=128K]
        Capabilities: <available only to root>
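For illustration, roughly the same BAR sizes could be read programmatically from sysfs on a 2.6 kernel instead of parsing lspci output. The sysfs layout, the hard-coded device address, and the "largest memory BAR approximates the framebuffer aperture" heuristic are all assumptions in this sketch:

/* Sketch: estimate the framebuffer aperture of a PCI video card from sysfs.
 * Assumes a 2.6 kernel with /sys/bus/pci and takes the largest BAR, which
 * usually (but not always) is the prefetchable framebuffer mapping. */
#include <stdio.h>

static unsigned long long largest_bar_size(const char *pci_addr)
{
    char path[128];
    unsigned long long start, end, flags, best = 0;
    FILE *f;

    snprintf(path, sizeof(path), "/sys/bus/pci/devices/%s/resource", pci_addr);
    if (!(f = fopen(path, "r"))) return 0;

    /* one "start end flags" triple per BAR */
    while (fscanf(f, "%llx %llx %llx", &start, &end, &flags) == 3)
    {
        unsigned long long size = end ? end - start + 1 : 0;
        if (size > best) best = size;
    }
    fclose(f);
    return best;
}

int main(void)
{
    /* hypothetical device address; in practice it has to be discovered first */
    unsigned long long size = largest_bar_size("0000:01:00.0");
    printf("largest BAR: %llu MB\n", size >> 20);
    return 0;
}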
The only thing is that the PCI header can't be trusted in all cases, but I would say it is a lot better than nothing.
Regards, Roderick
As someone stuck with both nVidia and ATI cards, I'd vote against using NV-CONTROL. Detecting the RAM for different manufacturers via different methods sounds harder to maintain.
(--) fglrx(0): VideoRAM: 131072 kByte, Type: DDR SGRAM / SDRAM
Roderick Colenbrander wrote:
In my opinion parsing the log file is very unreliable, and second, you aren't 100% sure that the card drawing the OpenGL stuff is the same card. (An X session can be using multiple cards.)
When using NV-CONTROL, only on Nvidia cards of course, you are 100% sure that the detected amount of video RAM corresponds to the card currently rendering the OpenGL stuff. (NV-CONTROL is an X API and you need to provide it with an X display and screen number; this is the same info you have somewhere in Wine.) I would prefer to use a generic way, but that way doesn't exist. I doubt Alexandre would accept a log-file parsing thing. I'm not even sure he would accept NV-CONTROL code either, but I think the chance that he does is quite a bit bigger.
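For reference, a minimal sketch of what such an NV-CONTROL query could look like, using the libXNVCtrl headers that ship with nvidia-settings; header locations and link flags may differ per distribution:

/* Sketch: query the video RAM of the screen's Nvidia card via NV-CONTROL.
 * Needs the libXNVCtrl headers/library shipped with nvidia-settings;
 * build roughly with: gcc nvram.c -lXNVCtrl -lXext -lX11 */
#include <stdio.h>
#include <X11/Xlib.h>
#include <NVCtrl/NVCtrl.h>
#include <NVCtrl/NVCtrlLib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    int screen, event_base, error_base, kbytes = 0;

    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }
    screen = DefaultScreen(dpy);

    if (!XNVCTRLQueryExtension(dpy, &event_base, &error_base) ||
        !XNVCTRLIsNvScreen(dpy, screen))
    {
        fprintf(stderr, "NV-CONTROL not available on this screen\n");
        return 1;
    }

    /* NV_CTRL_VIDEO_RAM is reported in kBytes by the driver */
    if (XNVCTRLQueryAttribute(dpy, screen, 0, NV_CTRL_VIDEO_RAM, &kbytes))
        printf("VideoRAM: %d kByte\n", kbytes);

    XCloseDisplay(dpy);
    return 0;
}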
Roderick
Am Sonntag, 16. Oktober 2005 16:52 schrieb Evil:
AFAIK there's something similar for the fglrx driver. The fglrx driver contains a control applet which shows the video memory size. This tool is open source, so it's no problem to find out how to do this.
I don't have it installed at the moment, so I can't check.
Stefan
Now that I think about it, there's another issue, partly related to this, that should be addressed too, namely card detection. Right now wined3d sets the card based on several OpenGL strings. First it detects the vendor (ATI/Nvidia), and then it compares the renderer string against two or three models of each brand (it contains 'GeForce FX 5900' and so on). If the card isn't found, which is the case most of the time, it defaults to a GeForce4 Ti / Radeon 8500 depending on the brand. I know that for apps running through Wine the video card doesn't matter as much as on Windows, since roughly the same functionality is supported on recent ATI/Nvidia cards, but some apps use a different 3D backend depending on the GPU they detect, as a TNT supports fewer features than a GeForce 7.
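To make the current approach concrete, the detection described above boils down to substring matching on the OpenGL vendor/renderer strings, roughly like the following sketch; the matched strings, enum names and fallback models are only illustrative, not the actual wined3d tables:

/* Sketch of wined3d-style card guessing from the OpenGL strings.
 * Assumes a GL context is already current; names and fallbacks are illustrative. */
#include <string.h>
#include <GL/gl.h>

typedef enum { CARD_UNKNOWN, CARD_NVIDIA_GEFORCE4_TI, CARD_NVIDIA_GEFORCEFX_5900,
               CARD_ATI_RADEON_8500, CARD_ATI_RADEON_9700 } wined3d_card;

static wined3d_card guess_card(void)
{
    const char *vendor   = (const char *)glGetString(GL_VENDOR);
    const char *renderer = (const char *)glGetString(GL_RENDERER);

    if (!vendor || !renderer) return CARD_UNKNOWN;

    if (strstr(vendor, "NVIDIA"))
    {
        if (strstr(renderer, "GeForce FX 5900")) return CARD_NVIDIA_GEFORCEFX_5900;
        return CARD_NVIDIA_GEFORCE4_TI;   /* default for unrecognized Nvidia cards */
    }
    if (strstr(vendor, "ATI"))
    {
        if (strstr(renderer, "9700")) return CARD_ATI_RADEON_9700;
        return CARD_ATI_RADEON_8500;      /* default for unrecognized ATI cards */
    }
    return CARD_UNKNOWN;
}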
The main issue is connecting OpenGL to the low-level PCI stuff, just as with the video RAM. In the case of a system containing one card this is no issue, but in the case of multiple cards it is hard to detect which card is used for OpenGL...
Roderick
Hi,
ATI and Nvidia both have extensions to X that return the total video memory on the card; other options include using code from the DRI project, and I've got some very old DOS code for some Matrox cards. Since there is no standard way to find out the amount of video RAM available, we're going to have to maintain different versions for different cards until the day there's a standard X extension.
The PCI option isn't a good one; for one thing, the memory mapped doesn't have to relate to the size of the on-board memory if the driver uses paging. I would say that it's worse than nothing; letting the user select how much memory they have installed via winecfg is a better option.
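A user override along those lines could be as simple as reading one registry value before falling back to any autodetection; the key and value names in this sketch are purely hypothetical, just to show the shape of the fallback:

/* Sketch: let the user override the detected video memory via the registry.
 * The "Software\\Wine\\Direct3D" / "VideoMemorySize" names are hypothetical;
 * winecfg would be the tool writing the value. */
#include <windows.h>
#include <stdlib.h>

static DWORD get_vidmem_override_mb(void)
{
    HKEY key;
    char buffer[32];
    DWORD size = sizeof(buffer), type, result = 0;

    if (RegOpenKeyExA(HKEY_CURRENT_USER, "Software\\Wine\\Direct3D", 0,
                      KEY_READ, &key) == ERROR_SUCCESS)
    {
        if (RegQueryValueExA(key, "VideoMemorySize", NULL, &type,
                             (BYTE *)buffer, &size) == ERROR_SUCCESS && type == REG_SZ)
            result = atoi(buffer);
        RegCloseKey(key);
    }
    return result;   /* 0 means "not set, fall back to autodetection" */
}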
Oliver.
On Sunday 16 October 2005 17:22, Oliver Stieber wrote:
So, for Nvidia: should I use the output of 'nvidia-settings -q VideoRam', or rather use the NV-CONTROL extension directly?
--- Fabian Bieler der.fabe@gmx.net wrote:
I would say it's better to use the extension if it exists. (I expect that all 'nvidia-settings -q VideoRam' does is call the extension anyway.)
OK, here I go again: this is a small C program which should get the video RAM using the NV-CONTROL and ATIFGLEXTENSION extensions. As I only have an Nvidia card, could someone with an ATI card (and the fglrx driver) please test this?
Fabian
Am Mon, 17 Oct 2005 00:17:22 +0200 schrieb Fabian Bieler der.fabe@gmx.net:
I have tried your program with my ATI card and it says "videoRam: 2048 kBytes". It should be "VideoRAM: 131072 kByte" like Xorg.log says. I think there is a bug in your program.
Lukas
For Nvidia systems at least, the script would probably be unnecessary, as 'nvidia-settings -q VideoRam | grep 0.0 | tail --bytes=8' *should* return the amount of video RAM. Would it make more sense to check at runtime whether the 'nvidia' driver is loaded and then pass the output of the line above into the registry, only reverting to the script if the system is non-Nvidia?
Just figured I'd throw this out there, I didn't see where anyone had mentioned it as an option yet.
Randall Walls
Stefan Dösinger wrote:
Somewhere in this thread (before the last NV-CONTROL mail) I proposed to use it too. nvidia-settings itself is licensed under the GPL; some time ago I wrote an alternative NV-CONTROL library which I use in my program (NVClock). I could simplify it and allow it to be used in Wine, but as I said in my last email, connecting OpenGL to more low-level info is hard, as you don't have much info with which to 'combine' the two. (For instance, the renderer string can't be used to check for the video card, as the device string is usually different; the only thing that can be the same is the vendor name...)
Roderick