--- Raphael fenix@club-internet.fr wrote:
There are a couple of other places that report the amount of video memory: GetAvailableVideoMemory in d3d, and the static DDHALINFO hal_info in x11ddraw.c.
Is there somewhere we can centralise the tracking of video memory, and read the initial setting from either a configuration file or the video card's i2c or registers if available, instead of hard-coding it in three+ different places?
I've got some agpgart code for detecting AGP memory usage, and the DRI guys will probably have code for getting the 'real' video memory figures for some cards.
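Something along these lines, say (a minimal sketch only, assuming the standard <linux/agpgart.h> frontend; note that only one client can acquire /dev/agpgart at a time, so this fails with EBUSY if the X server already holds it):

/* Minimal sketch: query AGP aperture usage through the Linux agpgart
 * frontend.  Needs permission to open /dev/agpgart. */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/agpgart.h>

int main(void)
{
    agp_info info;
    int fd = open("/dev/agpgart", O_RDWR);
    if (fd < 0) { perror("open /dev/agpgart"); return 1; }

    /* Become the controller; most agpgart ioctls require this. */
    if (ioctl(fd, AGPIOC_ACQUIRE) < 0) { perror("AGPIOC_ACQUIRE"); close(fd); return 1; }

    if (ioctl(fd, AGPIOC_INFO, &info) == 0)
        printf("aperture: %zu MB, pages used/total: %zu/%zu\n",
               info.aper_size, info.pg_used, info.pg_total);

    ioctl(fd, AGPIOC_RELEASE);
    close(fd);
    return 0;
}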
> Hi,
>
> Changelog:
>   - Set the default video memory to 64MB (was 16MB), as many d3d9
>     demos use d3d7 code to get the available video memory size
>     (e.g. ConfigSystem.exe in the d3d9 SDK).
>
> Todo:
>   - Autodetect this value (better than hardcoding it) and share it
>     using wined3d.
>
> Regards,
> Raphael
> Index: main.c
> ===================================================================
> RCS file: /home/wine/wine/dlls/ddraw/ddraw/main.c,v
> retrieving revision 1.58
> diff -u -r1.58 main.c
> --- main.c	9 Jan 2005 17:35:44 -0000	1.58
> +++ main.c	27 Jan 2005 00:22:42 -0000
> @@ -118,7 +118,7 @@
>      /* This is for the moment here... */
>      This->free_memory = free_memory;
>      This->allocate_memory = allocate_memory;
> -    This->total_vidmem = 16 * 1024 * 1024;
> +    This->total_vidmem = 64 * 1024 * 1024;
>      This->available_vidmem = This->total_vidmem;
>
>      return DD_OK;
> Is there somewhere we can centralise the tracking of video memory,
> and read the initial setting from either a configuration file or the
> video card's i2c or registers if available, instead of hard-coding
> it in three+ different places?
While I agree that we should store the amount we report to the application in a centralised place, to be sure we are coherent, I wonder whether taking the trouble to report the exact amount we have is really needed (the memory management of GL and D3D9 differs anyway, so the memory usage patterns may be completely different between the two worlds)...
Lionel
> While I agree that we should store the amount we report to the
> application in a centralised place, to be sure we are coherent, I
> wonder whether taking the trouble to report the exact amount we have
> is really needed (the memory management of GL and D3D9 differs
> anyway, so the memory usage patterns may be completely different
> between the two worlds)...
>
> Lionel
Well, tracking approximate usage is quite important, because some games (and possibly applications) will allocate textures until they run out of video memory. Under OpenGL we would just run out of system memory and die if we didn't track memory usage.

As an example, let's say I'm a game with a working set of 1000 textures, all mipmapped. If I know that the system only has 32MB of memory, then I can drop the high-level mipmaps so that all 1000 textures fit into RAM. If I don't know how much memory I have, then I'm going to keep the high-level mipmaps, which may push some of the textures into swap space.
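To put rough numbers on that (an illustrative back-of-the-envelope calculation, nothing more):

#include <stdio.h>

/* Bytes used by a square mipmapped texture: the full chain costs
 * base + base/4 + base/16 + ... ~= 4/3 of the base level, so dropping
 * the largest level saves roughly three quarters of the memory. */
static unsigned long mip_chain_bytes(unsigned long dim, unsigned bytes_pp,
                                     unsigned levels_dropped)
{
    unsigned long total = 0;
    while (levels_dropped--) dim /= 2;       /* skip the largest levels */
    for (; dim; dim /= 2) total += dim * dim * bytes_pp;
    return total;
}

int main(void)
{
    /* 1000 textures of 512x512 at 4 bytes/pixel. */
    unsigned long full    = 1000 * mip_chain_bytes(512, 4, 0);
    unsigned long dropped = 1000 * mip_chain_bytes(512, 4, 1);
    printf("full chains: %lu MB, top level dropped: %lu MB\n",
           full >> 20, dropped >> 20);       /* ~1333 MB vs ~333 MB */
    return 0;
}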
It would be nice to have a semi-automatic system instead of making the user set up their video card size, but I don't see why people who want to run 3D Studio MAX on a 2MB card shouldn't be able to, just because their card only has 2MB of RAM.
It's not a 'huge' amount of work to find out how much memory a graphics card has under Linux; it should just be a few lines to interface with the kernel module and read a register on the card. But this may not be portable: there are a lot of graphics cards out there, and kernel modules have a habit of changing.
It's also very easy to retrieve the correct AGP memory (and other stats) under Linux, and I don't see why we shouldn't report correct information when we can (unless it's the amount of free space on the C drive when I have c:\program files as a symlink).
--- Lionel Ulmer lionel.ulmer@free.fr wrote:
> > Is there somewhere we can centralise the tracking of video memory,
> > and read the initial setting from either a configuration file or
> > the video card's i2c or registers if available, instead of
> > hard-coding it in three+ different places?
>
> While I agree that we should store the amount we report to the
> application in a centralised place, to be sure we are coherent, I
> wonder whether taking the trouble to report the exact amount we have
> is really needed (the memory management of GL and D3D9 differs
> anyway, so the memory usage patterns may be completely different
> between the two worlds)...
>
> Lionel
Now that I've actually read your email properly, what I meant to say is:
I've tried using glPrioritizeTextures to force OpenGL to load the textures into GPU RAM, but when I asked OpenGL how many of them were in RAM, it said that none of them were.
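For the record, the sort of thing I tried looks like this (a sketch; it assumes a current GL context, and drivers are free to handle priorities and residency however they like, which may be exactly the problem):

#include <stdlib.h>
#include <GL/gl.h>

/* Bump all texture priorities to the max, then ask the driver which
 * ones are actually resident.  Both calls are core OpenGL 1.1. */
static int count_resident(GLsizei count, const GLuint *textures)
{
    GLclampf *priorities = malloc(count * sizeof *priorities);
    GLboolean *resident  = malloc(count * sizeof *resident);
    int i, n = 0;

    if (!priorities || !resident) { free(priorities); free(resident); return -1; }

    for (i = 0; i < count; i++) priorities[i] = 1.0f;
    glPrioritizeTextures(count, textures, priorities);

    /* Returns GL_TRUE (leaving the array untouched) only when *all*
     * the textures are resident; otherwise the array says which. */
    if (glAreTexturesResident(count, textures, resident) == GL_TRUE)
        n = count;
    else
        for (i = 0; i < count; i++)
            if (resident[i]) n++;

    free(priorities);
    free(resident);
    return n;
}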
So instead I have a global variable, accessed through globalAdjustGLMemory, that I use to track how much memory DirectX would expect the textures to be using.
I only track POOL_DEFAULT textures at the moment: POOL_MANAGED behaves just like OpenGL anyway, so I don't bother tracking it, and POOL_SYSTEMMEM and POOL_SCRATCH should never be on the GPU. This works fine for Axis and Allies, and it gets Half-Life past the allocate-until-failure stage.
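The shape of the tracking is roughly this (a simplified sketch; the real signature and accounting may differ):

#include <windows.h>

/* One process-wide counter of POOL_DEFAULT bytes, adjusted on every
 * create/release.  Sketch only. */
static LONG vidmem_used;                          /* bytes in use */
static LONG vidmem_total = 64 * 1024 * 1024;      /* reported total */

LONG globalAdjustGLMemory(LONG delta)             /* +alloc / -free */
{
    return InterlockedExchangeAdd(&vidmem_used, delta) + delta;
}

DWORD availableVidMem(void)                       /* what we'd report */
{
    LONG used = vidmem_used;
    return used >= vidmem_total ? 0 : (DWORD)(vidmem_total - used);
}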
I don't send the information to ddraw, the X11 driver or anywhere else at the moment, so ddraw, the X11 driver and DirectX will all report different things. That could upset some applications, and confuse the user as to why dxdiag says he's got 16MB, Half-Life says he's got 256 and Prince of Persia fails with 0 memory.
> So instead I have a global variable, accessed through
> globalAdjustGLMemory, that I use to track how much memory DirectX
> would expect the textures to be using.
Yeah, this sounds like the hack there is in DDraw to make SS2 happy (basically, the game allocated textures and checked that the space left was diminishing between calls, and aborted if it wasn't).
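The pattern is roughly this (illustrative code against the IDirectDraw7 interface, not the actual SS2 or Wine code):

#include <string.h>
#include <ddraw.h>

/* Keep creating video memory surfaces and expect GetAvailableVidMem()
 * to shrink after each one; if it stops shrinking, assume the driver
 * is lying.  The probe surfaces are leaked here for brevity. */
static HRESULT probe_vidmem(IDirectDraw7 *dd)
{
    DDSCAPS2 caps = { DDSCAPS_TEXTURE | DDSCAPS_VIDEOMEMORY, 0, 0, {0} };
    DDSURFACEDESC2 desc;
    IDirectDrawSurface7 *surf;
    DWORD total, free_now, free_before;
    HRESULT hr;

    hr = IDirectDraw7_GetAvailableVidMem(dd, &caps, &total, &free_before);
    if (FAILED(hr)) return hr;

    while (free_before > (4u << 20))              /* keep some headroom */
    {
        memset(&desc, 0, sizeof(desc));
        desc.dwSize   = sizeof(desc);
        desc.dwFlags  = DDSD_CAPS | DDSD_WIDTH | DDSD_HEIGHT;
        desc.ddsCaps  = caps;
        desc.dwWidth  = 256;
        desc.dwHeight = 256;
        if (FAILED(IDirectDraw7_CreateSurface(dd, &desc, &surf, NULL)))
            break;                                /* genuinely out of memory */

        hr = IDirectDraw7_GetAvailableVidMem(dd, &caps, &total, &free_now);
        if (FAILED(hr)) return hr;
        if (free_now >= free_before)              /* not diminishing: bail */
            return E_FAIL;
        free_before = free_now;
    }
    return S_OK;
}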
The only problem now is to get the initial value on all the cards supported under Linux: first the total space on the GPU, and then, harder, the space still free once you have counted all the memory that X11 eats 'naturally' before GL has even started.
Lionel
--- Lionel Ulmer lionel.ulmer@free.fr wrote:
> > So instead I have a global variable, accessed through
> > globalAdjustGLMemory, that I use to track how much memory DirectX
> > would expect the textures to be using.
>
> Yeah, this sounds like the hack there is in DDraw to make SS2 happy
> (basically, the game allocated textures and checked that the space
> left was diminishing between calls, and aborted if it wasn't).
>
> The only problem now is to get the initial value on all the cards
> supported under Linux: first the total space on the GPU, and then,
> harder, the space still free once you have counted all the memory
> that X11 eats 'naturally' before GL has even started.
>
> Lionel
I suggest putting something in the Wine registry for starters; then it can optionally be filled in on startup if Wine knows how to get the correct information for the card.
Does the registry code have enough performance for us to track the memory status there, or should we put it somewhere else?
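Something like this, for example (the key and value names here are made up for illustration; whatever we settle on would do):

#include <stdlib.h>
#include <windows.h>

/* Sketch: read a video memory override from the registry, falling
 * back to the current hardcoded default. */
static DWORD get_vidmem_bytes(void)
{
    char buf[32];
    DWORD size = sizeof(buf), type, mb = 64;      /* default: 64MB */
    HKEY key;

    if (RegOpenKeyExA(HKEY_CURRENT_USER, "Software\\Wine\\Direct3D",
                      0, KEY_READ, &key) == ERROR_SUCCESS)
    {
        if (RegQueryValueExA(key, "VideoMemorySize", NULL, &type,
                             (BYTE *)buf, &size) == ERROR_SUCCESS
            && type == REG_SZ)
        {
            buf[sizeof(buf) - 1] = 0;             /* ensure termination */
            mb = atoi(buf);
        }
        RegCloseKey(key);
    }
    return mb * 1024 * 1024;
}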