https://bugs.winehq.org/show_bug.cgi?id=17233
--- Comment #17 from LinuxInTrouble petrovgeorguu@yandex.ru ---
(In reply to hidefromkgb from comment #16)

Thanks for the fast answer. There are some warnings about types, but as I understand they are not important in this case:

quad.c:77:1: warning: format ‘%X’ expects argument of type ‘unsigned int’, but argument 5 has type ‘long unsigned int’ [-Wformat=]
 printf("\nscreen = %d, depth = %d, class = %d\nR = %08X, G = %08X, B = %08X\nmap = %d, bits = %d\n\n", vi->screen, vi->depth, vi->class, vi->red_mask, vi->green_mask, vi->blue_mask, vi->colormap_size, vi->bits_per_rgb);
 ^
quad.c:77:1: warning: format ‘%X’ expects argument of type ‘unsigned int’, but argument 6 has type ‘long unsigned int’ [-Wformat=]
quad.c:77:1: warning: format ‘%X’ expects argument of type ‘unsigned int’, but argument 7 has type ‘long unsigned int’ [-Wformat=]
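For reference, the clean fix is to print the masks with %lX rather than %X, since Xlib declares red_mask, green_mask and blue_mask in XVisualInfo as unsigned long. A minimal standalone sketch of the corrected call (print_visual_info and the enumeration loop are illustrative names, not taken from quad.c):

#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/Xutil.h>

/* Print the fields of an XVisualInfo. The mask fields are unsigned long
   in Xlib, so they need %lX (or an explicit cast to unsigned int). */
static void print_visual_info(const XVisualInfo *vi)
{
    printf("\nscreen = %d, depth = %d, class = %d\n"
           "R = %08lX, G = %08lX, B = %08lX\n"
           "map = %d, bits = %d\n\n",
           vi->screen, vi->depth, vi->class,
           vi->red_mask, vi->green_mask, vi->blue_mask,
           vi->colormap_size, vi->bits_per_rgb);
}

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    /* Enumerate all visuals on the default screen and dump each one. */
    XVisualInfo tmpl;
    tmpl.screen = DefaultScreen(dpy);
    int n = 0;
    XVisualInfo *list = XGetVisualInfo(dpy, VisualScreenMask, &tmpl, &n);
    for (int i = 0; i < n; i++)
        print_visual_info(&list[i]);

    XFree(list);
    XCloseDisplay(dpy);
    return 0;
}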
Output of "./quad":

visual 0x20 selected
screen = 0, depth = 24, class = 4
R = 00FF0000, G = 0000FF00, B = 000000FF
map = 256, bits = 8
Output for "optirun ./quad" visual 0x20 selected
screen = 0, depth = 24, class = 4
R = 00FF0000, G = 0000FF00, B = 000000FF
map = 256, bits = 8
As far as I can see, they are exactly the same. Maybe the problem is with Bumblebee, since it uses VirtualGL: Bumblebee renders frames for the Optimus NVIDIA card on an invisible X server with VirtualGL and transports them back to the visible X server. Frames are compressed before they are transported; this saves bandwidth and can be used to tune Bumblebee's performance.
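For example (assuming Bumblebee's optirun still accepts the -c/--vgl-compress option for choosing the VirtualGL transport; I have not verified this against the installed version), the transports could be compared like this:

optirun -c jpeg ./quad    # lossy JPEG transport, lowest bandwidth
optirun -c proxy ./quad   # uncompressed X11 proxy transport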