On Wed, Jul 21, 2010 at 4:02 AM, Henri Verbeet <hverbeet@gmail.com> wrote:
> On 21 July 2010 05:08, Roderick Colenbrander <thunderbird2k@gmail.com> wrote:
>> I think 'select_card_*' should just return the PCI device id, and the default amount of video memory should be stored in, say, the driver version table (a different name for that table might make sense).
> Yeah, this should be based on the card id, like the description string. You can probably also do much of the gl_renderer matching based on a table. Ultimately you may want to move it out of the source code completely, into a file, or perhaps the registry. The card db could then be updated independently of Wine itself, though at that point you'll have to start worrying about compatibility between Wine versions.
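
Something along these lines, perhaps? Just a rough sketch; the struct and function names and the example device ids / memory sizes below are purely illustrative, not what is in the current code:

#include <stddef.h>
#include <string.h>

struct card_entry
{
    const char *renderer;   /* substring matched against GL_RENDERER */
    unsigned int device_id; /* PCI device id to report */
    unsigned int vidmem_mb; /* default amount of video memory, in MB */
};

static const struct card_entry nvidia_cards[] =
{
    /* Example entries only. */
    {"GTX 470",  0x06cd, 1280},
    {"8800 GTX", 0x0191,  768},
    {"7800 GT",  0x0092,  256},
};

static const struct card_entry *select_card_nvidia(const char *gl_renderer)
{
    unsigned int i;

    for (i = 0; i < sizeof(nvidia_cards) / sizeof(*nvidia_cards); ++i)
    {
        if (strstr(gl_renderer, nvidia_cards[i].renderer))
            return &nvidia_cards[i];
    }
    return NULL; /* caller falls back to some safe default */
}

The nice part is that adding a new card is then a one-line change, and the same table could later be loaded from a file or the registry as you suggest.
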
> Also note that while accurately detecting the total amount of video memory is nice, GetAvailableTextureMem() is probably at least as important.
Sure, the GL_*_memory_info extensions could help there, but I think we should be cautious about just adding such support. Over the past few months I have seen various reports of users running games which claim to require, say, 128MB of video memory and which just don't work on Wine unless you select a higher amount of video memory in the registry. I haven't looked into this at all yet, but perhaps the current code has some tracking bugs, or perhaps the fact that we don't track the video memory of managed resources confuses some games. It could also be something else entirely, but perhaps you have seen things like this.
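
For reference, querying those extensions would look roughly like this. The enum values are the ones from the published GL_NVX_gpu_memory_info and GL_ATI_meminfo specs; the helper and its parameters are made up for illustration and don't exist in wined3d:

#include <GL/gl.h>

#ifndef GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX
#define GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX 0x9049
#endif
#ifndef GL_TEXTURE_FREE_MEMORY_ATI
#define GL_TEXTURE_FREE_MEMORY_ATI 0x87FC
#endif

static unsigned int query_available_vidmem_kb(int has_nvx_meminfo, int has_ati_meminfo)
{
    GLint info[4] = {0};

    if (has_nvx_meminfo)
    {
        /* NVX reports a single value, in kilobytes. */
        glGetIntegerv(GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX, info);
        return info[0];
    }
    if (has_ati_meminfo)
    {
        /* ATI reports four values; info[0] is the total free
         * texture pool memory, in kilobytes. */
        glGetIntegerv(GL_TEXTURE_FREE_MEMORY_ATI, info);
        return info[0];
    }
    return 0; /* caller keeps using the tracked / registry value */
}

Even with that, we'd still want to understand why the current tracking makes some games fall short before wiring it into GetAvailableTextureMem().
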
Roderick