On Tue, 18 Oct 2005, Michael Carlson wrote: [...]
> For example, in Civilization 3, some 98% of the CPU usage in x11drv is
> taken up by two functions: convert_555_to_565_asis and
> convert_565_to_555_asis. This summer I had a similar problem on my
> laptop, where FCE Ultra was spending most of its CPU time in
> convert_888_to_0888_asis.
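For context on why those routines hurt so much: a format conversion like this has to rewrite every pixel of every blit. Here is a minimal sketch of the kind of work a 555 -> 565 pass does (my own illustration of the technique, not Wine's actual code):

    #include <stddef.h>
    #include <stdint.h>

    /* Illustrative 555 -> 565 pixel conversion: red and the high green
     * bits shift up one position, green widens from 5 to 6 bits by
     * replicating its top bit, blue stays where it is. */
    void convert_555_to_565(const uint16_t *src, uint16_t *dst, size_t count)
    {
        size_t i;
        for (i = 0; i < count; i++)
        {
            uint16_t p = src[i];
            dst[i] = ((p & 0x7fe0) << 1) |  /* red + high green bits */
                     ((p & 0x0200) >> 4) |  /* green MSB -> new LSB  */
                     ( p & 0x001f);         /* blue unchanged        */
        }
    }

Nothing in that loop is individually slow; it's the per-pixel, per-blit cost that adds up to profiles like the 98% above.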
My understanding is that the right fix for this kind of problem would be to write a DIB engine.
A quick workaround would be to make sure the X image format matches the format of the DIBs that the program manipulates, i.e. to get X's screen into 555 mode if 555 DIBs are what your application uses. Of course, if you then run an application that uses 888 DIBs you'd have to switch X to 888 mode for optimal performance, etc. That's why it's only a quick workaround/hack.
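For anyone who wants to try that workaround: the screen format follows the server's depth, which is set in the X config. Assuming an XFree86/X.org style config file (option names from memory, so double-check against your setup; the identifiers are placeholders):

    Section "Screen"
        Identifier   "Screen0"
        Device       "Card0"
        Monitor      "Monitor0"
        DefaultDepth 15    # depth 15 = 555, 16 = 565, 24 = 888
    EndSection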
> I'm looking at a group of related functions in Wine, in
> dlls/x11drv/dib.c. It came to my attention that in X11DRV_SetDIBits and
> X11DRV_SetDIBitsToDevice, Wine always seems to end up calling
> X11DRV_DIB_SetImageBits, which calls (in my case, probably because my
> desktop runs in 16-bit color) X11DRV_DIB_SetImageBits_16, which in
> every case calls a convert function of one type or another (all of
> which are CPU hogs). These are the core functions that seem to be
> related to the problem.
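To make that dispatch concrete, here is a much-simplified, self-contained model of the call chain described above. The real functions take X11DRV_DIB_IMAGEBITS_DESCR and friends; the types and bodies below are my own invention, only the shape matches:

    #include <stdint.h>
    #include <stdio.h>

    struct dib_descr {
        int      x_depth;  /* depth of the X visual, e.g. 16 */
        int      dib_bpp;  /* bits per pixel of the DIB      */
        uint32_t rmask;    /* red channel mask of the DIB    */
    };

    /* Stands in for X11DRV_DIB_SetImageBits_16: on a 16 bpp desktop the
     * DIB's channel masks decide whether a conversion is needed. */
    static void set_image_bits_16(const struct dib_descr *d)
    {
        if (d->dib_bpp == 16 && d->rmask == 0x7c00)
            puts("555 DIB on 565 screen: convert every pixel");  /* hog */
        else
            puts("formats match: straight copy");
    }

    /* Stands in for X11DRV_DIB_SetImageBits: dispatch on the X depth. */
    static void set_image_bits(const struct dib_descr *d)
    {
        switch (d->x_depth)
        {
        case 15:
        case 16: set_image_bits_16(d); break;
        default: puts("some other depth handler"); break;
        }
    }

    int main(void)
    {
        struct dib_descr d = { 16, 16, 0x7c00 };  /* 555 DIB, 16 bpp X */
        set_image_bits(&d);
        return 0;
    }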
I thought Wine already kept a cache of the X pixmaps for a given DIB, only converting from DIB to pixmap or back when needed, which can be pretty often if the application alternates reading the DIB bits and using GDI operations on it.
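If such a cache exists, the expensive part is exactly that alternation. A sketch of the idea (the state names and functions are made up, this is not Wine's code) shows why each switch of access pattern costs a conversion:

    #include <stdio.h>

    enum dib_state { IN_SYNC, DIB_CURRENT, PIXMAP_CURRENT };

    struct cached_dib {
        enum dib_state state;
        /* the DIB bits and the X pixmap handle would live here */
    };

    /* The application is about to touch the DIB bits directly. */
    static void lock_dib_bits(struct cached_dib *d)
    {
        if (d->state == PIXMAP_CURRENT)
            puts("pixmap -> DIB conversion (expensive)");
        d->state = DIB_CURRENT;  /* writes may invalidate the pixmap */
    }

    /* A GDI operation wants to draw via the X pixmap. */
    static void lock_pixmap(struct cached_dib *d)
    {
        if (d->state == DIB_CURRENT)
            puts("DIB -> pixmap conversion (expensive)");
        d->state = PIXMAP_CURRENT;
    }

    int main(void)
    {
        struct cached_dib d = { IN_SYNC };
        lock_dib_bits(&d);  /* no conversion yet        */
        lock_pixmap(&d);    /* DIB -> pixmap conversion */
        lock_dib_bits(&d);  /* pixmap -> DIB conversion */
        return 0;
    }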
Anyway, grepping for 'DIB Engine' in the list archives and on the web should turn up useful information.