Jesse Allen <the3dfxdude <at> gmail.com> writes:
On 7/14/06, Stefan Dösinger <stefandoesinger <at> gmx.at> wrote:
> > I tried the sample program on my Linux box (Radeon M9, 64 MB VRAM) and I don't think it reported correct values. It said 32 MB of textures, not all resident. Even though I run at 1400x1050, I don't think that eats 32 MB of video memory.
> Yeah, not at all. Graphics card video RAM these days far exceeds what's needed for video modes. A quick number crunch shows 1400x1050 @ 32bpp only requires approx 5.6 MB. Most of the RAM is used for textures, I believe.
For a single-buffered root X server, yes. But for a 1400x1050 @ 32bpp game we usually have a 24-bit depth buffer and an 8-bit stencil buffer, plus a double-buffered framebuffer, so we would be at something like 16.8 MB of video RAM. That is only the OpenGL memory; the normal X server framebuffer is still the 5.6 MB you mentioned, so roughly 22.4 MB of video RAM is gone (I'm pretty sure the X server doesn't/can't share its framebuffer with OpenGL's; correct me if I'm wrong). I haven't looked into that test program yet, so I'm not sure how it arrives at "32 MB" of texture space for you, but it seems a fairly reasonable number for your setup.
> I think the worst-case fallback for detecting the amount of video RAM is stuffing it with textures and guessing how much went in.
This is probably the only reasonably accurate and cross-platform way to do it without resorting to hacks or platform- or card-specific methods. It is unfortunate that OpenGL doesn't portably expose this information directly; texture stuffing is a workaround, although not necessarily a nice one.
It should be noted that GetAvailableTextureMem() returns video plus AGP memory, and even MSDN recommends against using it for small-scale allocations. So as long as the reported value is "close" (but less than or equal to the physical maximum), it should be good enough for most games.
I still say a registry entry is a simple alternative, with a default of 32 MB or 64 MB. (Didn't someone supply a patch for this a while ago?)
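For illustration, such an override could look something like the following registry fragment (the key path and value name here mirror the VideoMemorySize setting Wine later used under Software\Wine\Direct3D, but treat them as an assumption for the purposes of this sketch; the value is megabytes as a string):

```ini
[HKEY_CURRENT_USER\Software\Wine\Direct3D]
"VideoMemorySize"="64"
```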
- Aric