Let me start off with an introduction. I'm a veteran C programmer, but I'm unfamiliar with both X11 programming and wine's internals, although I've been looking into both recently to correct a problem I'm having with wine. I posted about the same problem on this mailing list in August, suggesting that certain function parameters be made const. That was a simplistic solution, but I suggested it only because I didn't feel I had the time to familiarize myself with all the inner workings of wine. Now, I believe I do have the time.
To recap the situation: in several of the games I like to run, the same problem is making wine slow. Profiling with oprofile, I find that nearly all of wine's CPU usage lies in x11drv, and 98-99% of the CPU usage in x11drv is taken up by calls to some of the convert functions in dib_convert.c. For example, in Civilization 3, some 98% of the CPU usage in x11drv goes to two functions: convert_555_to_565_asis and convert_565_to_555_asis. This summer I had a similar problem on my laptop, where FCE Ultra was spending most of its CPU time in convert_888_to_0888_asis.
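For reference, my understanding is that these helpers boil down to a per-pixel bit shuffle run over the whole surface on every transfer. Roughly something like this (a simplified sketch of the idea, not the actual dib_convert.c code):

    #include <stdint.h>

    /* Sketch of a 555 -> 565 conversion: shift red and green up one
       bit and replicate the green MSB into the new low green bit.
       The point is that this runs over every pixel, every time the
       image crosses the DIB/X boundary. */
    static void convert_555_to_565(const uint16_t *src, uint16_t *dst,
                                   unsigned int count)
    {
        while (count--)
        {
            uint16_t p = *src++;
            *dst++ = ((p & 0x7fe0) << 1)   /* red + green, shifted up    */
                   | ((p & 0x0200) >> 4)   /* green MSB -> low green bit */
                   |  (p & 0x001f);        /* blue unchanged             */
        }
    }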
Essentially, it seems that most of the work wine does when running these programs is just converting from one pixel format to another and back, and I hope to find a good solution to this speed problem, preferably through caching. I'm writing to this list in the hope that more experienced wine hackers can offer advice on what is possible and what the best solution would be, before I go too far on my own with this.
I'm looking at a group of related functions in wine, in dlls/x11drv/dib.c. It came to my attention that in X11DRV_SetDIBits and X11DRV_SetDIBitsToDevice, wine always seems to end up calling X11DRV_DIB_SetImageBits, which calls (in my case, probably because my desktop is 16-bit color) X11DRV_DIB_SetImageBits_16, which in every case calls a convert function of one type or another (all of which are CPU hogs). These are the core functions that seem to be related to the problem.
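To spell out the chain as I traced it (simplified; the real code in dlls/x11drv/dib.c handles many more cases):

    X11DRV_SetDIBits / X11DRV_SetDIBitsToDevice
      -> X11DRV_DIB_SetImageBits            (creates a temporary XImage)
           -> X11DRV_DIB_SetImageBits_16    (16bpp visual, in my case)
                -> convert_555_to_565_asis et al.   (the hot loops)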
Here is my early analysis of the problem: X11DRV_DIB_SetImageBits always creates and destroys an XImage during the life of the function, and calls X11DRV_DIB_SetImageBits_16 (or the variant for another bit depth), which ends up converting the bitmap to the proper color format / bit depth. It seems to me the best solution is some kind of caching that saves the converted bitmap, preferably as an XImage. X_PHYSBITMAP stores the HBITMAP, windows' unique handle for that particular bitmap. So, what if we stored the converted XImage per HBITMAP and per bpp, so that it doesn't need to be created, converted, and destroyed on every call? All cached XImages corresponding to an HBITMAP could then be dropped from the cache when DeleteObject() is called for that handle. A rough sketch follows.
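Concretely, I'm imagining something like the following (hypothetical names and layout; nothing like this exists in the tree yet, so treat it as a sketch of the idea, not a patch):

    #include <X11/Xlib.h>   /* XImage */
    #include "windef.h"     /* HBITMAP (Wine header) */

    /* One cached, already-converted XImage per (HBITMAP, bpp) pair. */
    struct cached_image
    {
        HBITMAP              hbitmap;  /* windows handle owning the bits   */
        int                  bpp;      /* depth the image was converted to */
        XImage              *image;    /* ready to hand to XPutImage       */
        struct cached_image *next;
    };

    /* Return the cached conversion for (hbm, bpp); on a miss, convert
       once via the existing X11DRV_DIB_SetImageBits path and remember
       the result. */
    XImage *get_cached_image( HBITMAP hbm, int bpp );

    /* Called when DeleteObject() destroys hbm: free every cached
       XImage for that handle. */
    void purge_cached_images( HBITMAP hbm );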
I'm doing a lot of guesswork on the inner workings of wine, X, and the windows GDI here. Please tell me: am I on the right track for eliminating these unnecessary conversions? If not, where should I be looking? Again, I'm new to wine and X, and fairly new to GDI programming, but I would like to learn. GDI/X11 experts, please give me some advice on the best solution to this problem!
On Tue, 18 Oct 2005, Michael Carlson wrote: [...]
> For example, in Civilization 3, some 98% of the CPU usage in x11drv
> goes to two functions: convert_555_to_565_asis and
> convert_565_to_555_asis. This summer I had a similar problem on my
> laptop, where FCE Ultra was spending most of its CPU time in
> convert_888_to_0888_asis.
My understanding is that the right fix for this kind of problem would be to write a DIB engine.
A quick workaround would be to make sure the X image format matches the format of the DIBs the program manipulates, i.e. to get X's screen into 555 mode if 555 DIBs are what your application uses. Of course, if you then run an application that uses 888 DIBs you'd have to switch X to 888 mode for optimal performance, and so on. That's why it's only a quick workaround/hack.
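For what it's worth, with XFree86/X.org drivers that support multiple 16bpp formats, the weighting can be requested in the server config; something along these lines (whether "Weight" is honored depends on the driver):

    Section "Screen"
        Identifier   "Screen0"
        # ... usual Device/Monitor lines ...
        DefaultDepth 16
        SubSection "Display"
            Depth  16
            Weight 555    # ask for 5-5-5 instead of the usual 5-6-5
        EndSubSection
    EndSection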
> I'm looking at a group of related functions in wine, in
> dlls/x11drv/dib.c. It came to my attention that in X11DRV_SetDIBits
> and X11DRV_SetDIBitsToDevice, wine always seems to end up calling
> X11DRV_DIB_SetImageBits, which calls (in my case, probably because my
> desktop is 16-bit color) X11DRV_DIB_SetImageBits_16, which in every
> case calls a convert function of one type or another (all of which
> are CPU hogs). These are the core functions that seem to be related
> to the problem.
I thought Wine already kept a cache of the X pixmaps for a given DIB, only converting from DIB to pixmap or back when needed, which can be pretty often if the application alternates between reading the DIB bits and using GDI operations on it.
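That is, the pathological pattern would be something like this on the application side (illustrative only; write_pixels/read_pixels are stand-ins for whatever the application does with the DIB bits):

    #include <windows.h>

    void write_pixels( void *bits );   /* stand-in: app writes the DIB */
    void read_pixels( void *bits );    /* stand-in: app reads it back  */

    /* Each pass forces a DIB->pixmap conversion for the BitBlt, then a
       pixmap->DIB conversion before the app can read the bits again. */
    void render_loop( HDC hdc_screen, HDC hdc_dib, void *dib_bits,
                      int w, int h )
    {
        for (;;)
        {
            write_pixels( dib_bits );          /* app scribbles on DIB */
            BitBlt( hdc_screen, 0, 0, w, h,
                    hdc_dib, 0, 0, SRCCOPY );  /* needs the X pixmap   */
            read_pixels( dib_bits );           /* app reads bits back  */
        }
    }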
Anyway, grepping for 'DIB Engine' in the list archives and on the web should turn up useful information.