Roderick Colenbrander wrote:
> Having an environment variable is a minor detail. Sure, it should
> be used transparently, but it will take a lot of time (once the architecture is correct) to enable it by default.
IMHO the engine will (if it ever enters the main trunk...) stay disabled by default for a long, long time. Getting it to work as well as gdi32/winex11 do now will take a lot of time and patches.
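For illustration, here is a minimal C sketch of how such an opt-in switch could look; the WINEDIB variable name and the helper function are assumptions of mine, not the actual patch code:

#include <stdlib.h>
#include <string.h>

/* Hypothetical opt-in check: the engine stays off unless the user
 * explicitly sets WINEDIB=on in the environment. */
static int dib_engine_enabled(void)
{
    const char *env = getenv("WINEDIB");
    return env && !strcmp(env, "on");
}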
> A lot of performance tuning would be needed as it could seriously decrease performance in a lot of cases.
I agree completely on this.
> The gain / loss depends on what type of calls the program is making. If it draws, let's say, on a line-by-line basis, then software rendering could help. But if it uses a smarter method, a single X call might be more efficient.
More precisely, it will depend on the degree of optimization of the DIB engine code, which for now is close to zero. When blitting is optimized, for example, the performance increase will be huge.
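As a rough illustration of the kind of optimization meant here, compare a naive per-pixel copy with a scanline-wise memcpy for a same-format blit; this is a generic sketch assuming 32-bit pixels, not the engine's actual code:

#include <stdint.h>
#include <string.h>

/* Naive blit: one assignment per pixel; pitches are in pixels. */
static void blit_slow(uint32_t *dst, const uint32_t *src,
                      int width, int height, int dst_pitch, int src_pitch)
{
    for (int y = 0; y < height; y++)
        for (int x = 0; x < width; x++)
            dst[y * dst_pitch + x] = src[y * src_pitch + x];
}

/* Optimized same-format blit: one memcpy per scanline. */
static void blit_fast(uint32_t *dst, const uint32_t *src,
                      int width, int height, int dst_pitch, int src_pitch)
{
    for (int y = 0; y < height; y++)
        memcpy(dst + y * dst_pitch, src + y * src_pitch,
               width * sizeof(uint32_t));
}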
> Further, there are also other tradeoffs, e.g. when the latest version of the drawable is in X, it might not be smart at all to copy it back to the DIB engine and let it do the rendering just because we have a DIB engine. The cost of the download and re-upload to the video card might be much higher.
The engine, as it is now, only renders to DIBs when they are in memory, not to DDBs, so it shouldn't penalize the rendering of drawables already owned by X at all. It just avoids the double copy from the DIB to X and back again that happens today just to, for example, render some text. That currently happens in AutoCAD, for instance, and slows it down by a factor of 100.

The only "mixed" behaviour is when you're blitting a DIB onto a DDB (or the other way around), but even then the code converts the source bitmap to the destination format and then uses the best way to blit it. It could be optimized further by blitting without converting first, but that would need some heavy intervention in the GDI code, which is what I wanted to avoid.
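A schematic sketch of that mixed DIB/DDB path, with hypothetical types and helpers (the placeholder conversion assumes 32-bit pixels); it only shows the shape of the decision, not the engine's real code:

#include <stdlib.h>
#include <string.h>

/* Hypothetical bitmap descriptor, for illustration only. */
typedef struct
{
    int format;        /* pixel format id */
    int width, height;
    unsigned char *bits;
} bitmap;

/* Placeholder: in the real engine this would translate the source
 * pixels into the destination's pixel format. */
static bitmap *convert_to_format(const bitmap *src, int format)
{
    bitmap *out = malloc(sizeof(*out));
    *out = *src;
    out->format = format;
    out->bits = malloc((size_t)src->width * src->height * 4);
    /* ... pixel format translation would go here ... */
    return out;
}

/* Same-format copy: the fast path. */
static void same_format_blit(bitmap *dst, const bitmap *src)
{
    memcpy(dst->bits, src->bits, (size_t)src->width * src->height * 4);
}

/* Cross-type blit (DIB <-> DDB): convert the source to the destination
 * format first, then use the plain same-format path.  Blitting without
 * the up-front conversion would be faster still, but would need deeper
 * changes to the GDI code. */
static void cross_blit(bitmap *dst, const bitmap *src)
{
    if (src->format != dst->format)
    {
        bitmap *tmp = convert_to_format(src, dst->format);
        same_format_blit(dst, tmp);
        free(tmp->bits);
        free(tmp);
    }
    else same_format_blit(dst, src);
}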
Ciao
Max