https://bugs.winehq.org/show_bug.cgi?id=17233
--- Comment #16 from hidefromkgb@gmail.com ---
(In reply to LinuxInTrouble from comment #15)
> Thank you for this patch, it really works. I do have a question, though. I have
> an Optimus laptop, and if I use the nvidia-prime package everything works just
> fine (apart from a rather long startup and some rare FPS drops in menus), but
> due to performance issues (NVIDIA 310M) I am forced to use Bumblebee, which
> actually works better with my video card (I don't know why). However, if I try
> to start the game with the optirun or primusrun commands, the game starts and
> all sounds work, but there is no image on the screen at all. Is there any way
> to fix that?
It seems that the 310M does not support BGR/888. I haven't used NVIDIA hardware since 2007, so I can't even guess which format it might need. The four main things in question are its RGBA bitmasks; knowing these, I would be able to forge a suitable DDPIXELFORMAT structure. Can you please add my printf() to this code, compile and run it on your machine, and then post a screenshot along with its output?
https://www.opengl.org/wiki/Programming_OpenGL_in_Linux:_GLX_and_Xlib
printf("\nscreen = %d, depth = %d, class = %d\nR = %08X, G = %08X, B = %08X\nmap = %d, bits = %d\n\n", vi->screen, vi->depth, vi->class, vi->red_mask, vi->green_mask, vi->blue_mask, vi->colormap_size, vi->bits_per_rgb);
The code and its build line are given at the bottom of that wiki page. The printf() I need is just above; it has to be inserted right before glEnable(GL_DEPTH_TEST). (Note that the RGB masks are unsigned long in XVisualInfo, hence the %lX.) I haven't tested this line, but I hope it works as expected.
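If it is easier, here is a stripped-down sketch of just the part that matters. This is NOT the wiki program, only my guess at its relevant piece: it opens the display, asks GLX for the same kind of double-buffered RGBA visual, and prints the fields in question without creating a GL context or window. Build with something like "gcc visinfo.c -o visinfo -lX11 -lGL" and run it both plainly and through optirun/primusrun, so we can see whether the masks differ.

/* visinfo.c -- minimal sketch: dump the GLX visual layout (untested here) */
#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);          /* connect to the X server */
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    /* Ask GLX for a double-buffered RGBA visual with a depth buffer,
       the same kind of request the wiki example makes. */
    int attrs[] = { GLX_RGBA, GLX_DEPTH_SIZE, 24, GLX_DOUBLEBUFFER, None };
    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attrs);
    if (!vi) {
        fprintf(stderr, "no appropriate visual found\n");
        return 1;
    }

    /* The diagnostic line requested above: dump the visual's layout. */
    printf("\nscreen = %d, depth = %d, class = %d\n"
           "R = %08lX, G = %08lX, B = %08lX\n"
           "map = %d, bits = %d\n\n",
           vi->screen, vi->depth, vi->class,
           vi->red_mask, vi->green_mask, vi->blue_mask,
           vi->colormap_size, vi->bits_per_rgb);

    XFree(vi);
    XCloseDisplay(dpy);
    return 0;
}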
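And purely to illustrate what I meant by "forging" the structure: once I have the real masks from your output, they would be plugged into a DDPIXELFORMAT roughly like this. The values below assume an ordinary X8R8G8B8 visual and are placeholders, not what the 310M actually reports; depending on the headers the bitcount/mask members may need the u1..u5 union names instead of direct access.

/* Rough illustration only -- placeholder masks, to be replaced with the
   values printed by the test program above. */
#include <string.h>
#include <ddraw.h>

static void fill_pixel_format(DDPIXELFORMAT *fmt)
{
    memset(fmt, 0, sizeof(*fmt));
    fmt->dwSize        = sizeof(*fmt);
    fmt->dwFlags       = DDPF_RGB;     /* plain RGB surface, no FourCC      */
    fmt->dwRGBBitCount = 32;           /* bits per pixel of the visual      */
    fmt->dwRBitMask    = 0x00FF0000;   /* vi->red_mask   from the output    */
    fmt->dwGBitMask    = 0x0000FF00;   /* vi->green_mask from the output    */
    fmt->dwBBitMask    = 0x000000FF;   /* vi->blue_mask  from the output    */
}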