http://bugs.winehq.org/show_bug.cgi?id=33597
Bug #: 33597
Summary: Enhancing memory video detection
Product: Wine
Version: 1.5.29
Platform: x86-64
OS/Version: Linux
Status: UNCONFIRMED
Severity: normal
Priority: P2
Component: opengl
AssignedTo: wine-bugs@winehq.org
ReportedBy: sworddragon2@aol.com
Classification: Unclassified
I'm using a GeForce 8600 GT with the following attributes:
sworddragon@ubuntu:~$ nvidia-settings -q all | grep -E "Attribute '(TotalDedicatedGPUMemory|VideoRam)'"
  Attribute 'VideoRam' (ubuntu:0.0): 524288.
  Attribute 'VideoRam' (ubuntu:0[gpu:0]): 524288.
  Attribute 'TotalDedicatedGPUMemory' (ubuntu:0[gpu:0]): 512.
The PCI device ID is 0x0402 and the PCI vendor ID is 0x10de. On starting a game, Wine reports that it is emulating only 256 MiB of memory for my graphics card:
sworddragon@ubuntu:~$ WINEDEBUG=+d3d '/wine/drive_c/Program Files (x86)/Warcraft III/Frozen Throne.exe' -opengl 2>&1 | grep Emulating
trace:d3d:wined3d_adapter_init Emulating 256 MB of texture ram.
sworddragon2@aol.com changed:
What     |Removed |Added
----------------------------------------------------------------------------
Severity |normal  |enhancement
sworddragon2@aol.com changed:
What    |Removed                 |Added
----------------------------------------------------------------------------
Summary |Enhancing memory video  |Enhancing video memory
        |detection               |detection
Jarkko K jarkko_korpi@hotmail.com changed:
What |Removed |Added
----------------------------------------------------------------------------
CC   |        |jarkko_korpi@hotmail.com
--- Comment #1 from Jarkko K jarkko_korpi@hotmail.com --- I can confirm your issue by looking at the source code:
http://source.winehq.org/git/wine.git/blob/b01fc1aa6e12b426b3f539c44196c9233...
They should somehow improve this, but there is a workaround for you:
http://wiki.winehq.org/UsefulRegistryKeys
+-> VideoMemorySize
|   [Sets the amount of reported video memory (in megabytes). The default is a simple
|   auto-detection based on the card type guessed from OpenGL capabilities.]
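As a sketch of that workaround, a registry file like the following can be imported with `wine regedit` (the value is in megabytes, per the wiki text above; "512" here is just an example matching the 512 MiB card from the original report):

```
REGEDIT4

[HKEY_CURRENT_USER\Software\Wine\Direct3D]
"VideoMemorySize"="512"
```

Alternatively, the same string value can be created by hand in regedit under HKEY_CURRENT_USER\Software\Wine\Direct3D.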
--- Comment #2 from Austin English austinenglish@gmail.com --- This is your friendly reminder that there has been no bug activity for over a year. Is this still an issue in current (1.7.51 or newer) wine?
--- Comment #3 from sworddragon2@aol.com --- With Wine 1.8 I'm getting this on a GeForce GTX 650:
sworddragon@ubuntu:/wine/drive_c/Program Files (x86)/Warcraft III$ nvidia-settings -q all | grep -E "Attribute '(TotalDedicatedGPUMemory|VideoRam)'"
  Attribute 'VideoRam' (ubuntu:0.0): 2097152.
  Attribute 'VideoRam' (ubuntu:0[gpu:0]): 2097152.
  Attribute 'TotalDedicatedGPUMemory' (ubuntu:0[gpu:0]): 2040.
sworddragon@ubuntu:/wine/drive_c/Program Files (x86)/Warcraft III$ WINEDEBUG=+d3d './Frozen Throne.exe' -opengl 2>&1 | grep Emulating
trace:d3d:wined3d_adapter_init Emulating 0x40000000 bytes of video ram.
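To make the mismatch above concrete, here is a small unit-conversion check (assuming nvidia-settings reports 'VideoRam' in KiB and wined3d logs raw bytes, as the magnitudes suggest):

```python
KIB = 1024
MIB = 1024 * 1024

video_ram_kib = 2097152      # nvidia-settings 'VideoRam' attribute
detected_bytes = 0x40000000  # wined3d "Emulating ... bytes of video ram"

actual_mib = video_ram_kib * KIB // MIB    # memory actually on the card
detected_mib = detected_bytes // MIB       # memory wined3d reports

print(actual_mib)    # 2048
print(detected_mib)  # 1024
```

So the GTX 650 has 2 GiB of VRAM, but wined3d's guess exposes only 1 GiB to applications.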
Using OpenGL capabilities is an improvement, but it is still a problem if vendor-specific graphics cards have a different amount of RAM than the reference graphics cards.
Nikolay Borodin monsterovich@gmail.com changed:
What |Removed |Added
----------------------------------------------------------------------------
CC   |        |monsterovich@gmail.com
--- Comment #4 from Nikolay Borodin monsterovich@gmail.com --- This is still a problem in 2022 in games that use Vulkan, not just OpenGL. In Red Dead Redemption 2, if there is not enough video memory, the game will not let you choose high graphics settings or even the full screen resolution. Other games may work fine but constantly pop up a warning that there is not enough VRAM.
wine --version
wine-7.14 (Staging)
The GPU table lacks NVIDIA 30-series GPUs like "NVIDIA GeForce RTX 3060 Ti".
https://github.com/wine-mirror/wine/blob/wine-7.14/dlls/wined3d/directx.c#L3...
--- Comment #5 from Nikolay Borodin monsterovich@gmail.com --- GPU tables suck because your GPU (like my RTX 3060 Ti) may not be listed, and then you have to set the VRAM manually via VideoMemorySize in regedit.