The approach taken so far consisted of having 2 device pointers inside GDI32, one for the dib engine and the other for the normal display driver. This had the disadvantage of having to keep the DC in sync with the right driver depending on the selected bitmap, which led to many small changes spread throughout the gdi code; as development went deeper, this approach showed many limits and problems.
So I decided to start again from scratch with a completely different approach, which is giving good results and is considerably less invasive on the gdi32 code. Instead of doing:
          / -- DIB ENGINE
GDI32 ---
          \ -- X11 DRIVER
I took this approach :
GDI32 -- DIB ENGINE -- X11 DRIVER
The (X11) display driver is loaded from inside the engine, which replaces it from gdi32's point of view. The changes in gdi32 are *very* limited, just in the driver.c code, making it load (if desired) the engine instead of the normal x11 driver. No other changes are needed. I added, as usual, the code allowing the engine to be enabled/disabled on request via registry and/or environment variable. If the engine is not present or is disabled, the driver loader falls back to the usual behaviour. The engine then loads the X11 driver in its init phase and acts as a gate depending on the selected bitmap, forwarding to the X11 driver all requests for DDB bitmaps and doing the job itself for DIB ones.
This approach has shown many advantages, and I'm almost ready to convert all the old code to it. For now I'm posting 3 patches showing the approach; the posted driver is a simple pass-through one, so it just forwards all calls to the X11 driver. The last patch of the series shows the forking behaviour of DIB/DDB processing from inside the engine, while still forwarding everything to X11.
I'd like some comments on the approach taken; in the next days I'll post more complete code with most of the engine implemented.
Ciao
Max
On Tue, Apr 14, 2009 at 1:22 AM, Massimo Del Fedele max@veneto.com wrote:
> The approach taken so far consisted of having 2 device pointers inside GDI32, one for the dib engine and the other for the normal display driver. [...] Instead of doing:
>
>           / -- DIB ENGINE
> GDI32 ---
>           \ -- X11 DRIVER
>
> I took this approach:
>
> GDI32 -- DIB ENGINE -- X11 DRIVER
>
> [...]
What about other drivers? Is the DIB driver going to know how to handle the others then?
Does it even make sense to keep the DIB engine a driver anymore?
Jesse
Jesse Allen wrote:
> What about other drivers? Is the DIB driver going to know how to handle the others then?
The engine acts as a filter between gdi32 and the DISPLAY driver; other stuff is untouched. The changes to gdi32 just make it "prefer" loading the engine instead of the normal display driver, IF the engine is available and IF it's enabled in the registry/environment. When loaded, the engine resumes the normal display driver loading, as was previously done in gdi32: identical stuff, defaulting to winex11.drv if not changed by a registry key. The engine then forwards to X11 (or any other display driver) all calls related to DDBs, and processes the DIBs directly. It doesn't need to know anything about which display driver is loaded or its internals... it just makes calls to it, as gdi32 does. This approach has 2 big advantages:
1) you don't have to fiddle with bitmap and dc function pointers inside gdi32, which turned out to be very complicated and error-prone. Now gdi32 is unchanged.
2) having the DIB engine act as a display driver, it can be extended step by step to handle DDBs too, if wished. Or it can be made to handle only the DIBs whose format differs from the X11 display's, which are the true speed problem.
> Does it even make sense to keep the DIB engine a driver anymore?
The alternative is, as usual, to rewrite a big part of gdi32 and the x11 driver, and this can't be done step by step. The fact that X11 keeps a DDB copy of each DIB inside it makes it almost impossible.
Ciao
Max
"Massimo Del Fedele" max@veneto.com wrote:
> The approach taken so far consisted of having 2 device pointers inside GDI32, one for the dib engine and the other for the normal display driver.
Please don't post huge attachments to the mailing list in the future; post a URL for them instead.
On Tue, Apr 14, 2009 at 1:22 AM, Massimo Del Fedele max@veneto.com wrote:
> [...] So I decided to start again from scratch with a completely different approach. Instead of doing:
>
>           / -- DIB ENGINE
> GDI32 ---
>           \ -- X11 DRIVER
>
> I took this approach:
>
> GDI32 -- DIB ENGINE -- X11 DRIVER
I'm trying to understand what the problem is that makes you think there needs to be a change. Are most of the problems with Blt-related functions?
It is my understanding the DIB engine should actually be able to call the display driver and vice-versa. So I think we shouldn't abandon the two driver approach, just add the capability to call each other. I believe GDI on windows has infrastructure for this too.
Jesse Allen wrote:
> I'm trying to understand what the problem is that makes you think there needs to be a change. Are most of the problems with Blt-related functions?
No
> It is my understanding the DIB engine should actually be able to call the display driver and vice-versa. So I think we shouldn't abandon the two driver approach, just add the capability to call each other. I believe GDI on windows has infrastructure for this too.
As I said before, I firmly think that the DIB engine belongs in GDI and *not* in an external driver. That said, I was told that the "right" approach would have been:
1) a gradual introduction of the engine inside wine
2) it shouldn't break anything on the way
3) it should be done in small patches
I guess you've seen too, developing your engine, that all those requirements are impossible to meet at once. DIBs are by now too tightly integrated inside X11. The 2-driver approach was a compromise and, trying to use it, I saw that it was cumbersome to maintain and very error-prone. You have 2 DC function pointers, 2 DC physical devices and, the really bad part, the function pointers inside the bitmap structure that must be kept in sync with the DC ones. Even worse, the gdi code uses the bitmap function pointers to tell which kind of DC the bitmap belongs to. SetOwnerDC() alone needed to be patched because of that, and it still wasn't enough. You can look at my previous code... many, many small hacks here and there just to be sure that all pointers were kept in sync. The bitblt stuff was one of the easier parts... you can also see it in my new code; it's completely working now, besides the pattern blitting, which needs some more work.

Then I thought... all that stuff for what? Just to have a compromise engine semi-embedded into gdi32? An engine that, to make it "right", would mean another almost full gdi and x11 rewrite? An engine on which every small unrelated change in gdi32 could bring nasty bugs in the engine code? And something that had to be manually rebased on each change inside gdi32, hopelessly waiting to see it embedded in the wine main tree? No, thanks.

The "right" way to do it would be, in my humble opinion, to "fork" the X11 and gdi32 parts and rewrite them, moving all dib processing from x11 to gdi32, besides keeping, maybe, a pixmap copy of the dib inside X11. But even being (maybe) able to do it, I guess that almost nobody would take the job without being sure that it will enter the main tree some day...
at least, not me :-) So, mostly because I *need* the engine for my job, because I was tired of manually rebasing the stuff and, last but not least, because I think it's the only viable way to prepare for the "right" migration of DIBs inside gdi, I rewrote it with the new approach. Advantages:
1) Almost no changes, by now, to the gdi32 code or the X11 code. The "almost" could also be "none at all", if wished: just insert "winedib.drv" among the registry driver names and the job is done. I put the small driver-load patch in gdi32 because I wanted it working without fiddling with the registry, and it's about 5 patched lines and some 30 lines added. No more, no less. No more hassles maintaining it on each git release.
2) It can't, by design, break anything. If disabled, it's just as if it didn't exist.
3) Once set up and tested, it can be a good starting point for the "right" approach, i.e. moving the DIB X11 code inside gdi32. When you have all the DIB code inside winedib.drv, you could then really, step by step, add some X11 optimizations and then move the DIB code inside gdi32. When finished, the temporary winedib.drv would simply disappear.
I can tell you, it took me more than 2 months to make it "almost working" with the 2-driver approach, and a couple of days (!!!) to make it *fully* working with the new approach... and it's much more stable.
Ciao
Max