"Glenn Wurster" glenn@electric.ath.cx wrote:
     ret = dc->funcs->pExtTextOut(dc->physDev,x,y,flags|ETO_IGNORELANGUAGE,
-                                 lprect,lpReorderedString,count,lpDx,dc->breakExtra);
+                                 lprect,lpReorderedString,count,lpDx,dc->breakExtra,antialias);
I think that it should be the ExtTextOut's backend responsibility whether to use antialiasing or not, taking into account ANTIALIASED_QUALITY and CLEARTYPE_QUALITY value of the LOGFONT.lfQuality field.
Actually, we cannot just look at the font for this. The issue is that even though the font may support antialiasing, the display we are writing to may not. The first copy of the patch I sent put the decision-making in X11DRV_ExtTextOut, but Alexandre did not like that. He said "You have to add checks for a palette directly at the places that handle the anti-aliasing (and your palette check is wrong too, you shouldn't peek into the internals of the DC bitmap)."
Now, at the lower levels, we don't know what sort of BITMAP we are writing into. Here's the situation in which the problem appears:
1) We have a paletted bitmap (offscreen), let's say 8 bits. If X is running in 24-bit colour, then the corresponding bitmap in X is going to be 24 bits.
2) The application writes text to the offscreen bitmap.
3) The application takes the data in the offscreen bitmap and extracts/copies it to the screen. (A rough sketch of such an application follows.)
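To make the scenario concrete, here is a rough sketch of what such an application does, using plain Win32 GDI calls and nothing Wine-specific (the bitmap size, the text and the function name are made up for illustration):

#include <windows.h>

static void offscreen_text_example(HDC screen_dc)
{
    char buffer[sizeof(BITMAPINFOHEADER) + 256 * sizeof(RGBQUAD)] = {0};
    BITMAPINFO *bmi = (BITMAPINFO *)buffer;
    void *bits;
    HBITMAP dib;
    HDC mem_dc;

    /* Step 1: create an 8-bit (paletted) offscreen DIB.  Under Wine the
     * backing X pixmap may still be allocated at the X server's depth,
     * e.g. 24 bits. */
    bmi->bmiHeader.biSize     = sizeof(BITMAPINFOHEADER);
    bmi->bmiHeader.biWidth    = 256;
    bmi->bmiHeader.biHeight   = 64;
    bmi->bmiHeader.biPlanes   = 1;
    bmi->bmiHeader.biBitCount = 8;   /* the app expects a 256-colour palette */
    /* (a real application would also fill in bmi->bmiColors here) */

    dib    = CreateDIBSection(screen_dc, bmi, DIB_RGB_COLORS, &bits, NULL, 0);
    mem_dc = CreateCompatibleDC(screen_dc);
    SelectObject(mem_dc, dib);

    /* Step 2: write text into the offscreen bitmap. */
    TextOutA(mem_dc, 0, 0, "Hello", 5);

    /* Step 3: extract/copy the result to the screen. */
    BitBlt(screen_dc, 0, 0, 256, 64, mem_dc, 0, 0, SRCCOPY);

    DeleteDC(mem_dc);
    DeleteObject(dib);
}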
The problem is step #2. We have fonts capable of antialiasing, and we have a 24-bit pixmap allocated in X, but the application expects it to be 8-bit. Therefore, there is already code in there to shrink it back down to a paletted pixmap so that the application can deal with it. If the only colour values in the bitmap are colours that the application wrote there directly, then we don't have a problem, since the only colours in the pixmap will be colours already in the palette. If, however, we have used antialiasing, then the story is different.
Let's say one pixel line of our font character is ...$*****$... (where the $ pixels are antialiased, * pixels are full on, and . pixels are full off). When we write the text to the pixmap, we are going to get colour values at $ which are not in the palette of the application (they'll be a mixture of the values for . and *). * and . will be in the palette, but $ will not.
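To illustrate why the $ pixels end up outside the palette, here is a hypothetical blend of foreground and background by the antialiased coverage value (the types and the function are invented for illustration, not taken from the actual rendering code):

#include <stdint.h>

typedef struct { uint8_t r, g, b; } rgb_t;

/* An antialiased glyph gives each pixel a coverage value 0..255:
 * '.' pixels have coverage 0, '*' pixels 255, and '$' pixels something
 * in between, so the blended result is a brand-new colour. */
static rgb_t blend(rgb_t fg, rgb_t bg, uint8_t coverage)
{
    rgb_t out;
    out.r = (fg.r * coverage + bg.r * (255 - coverage)) / 255;
    out.g = (fg.g * coverage + bg.g * (255 - coverage)) / 255;
    out.b = (fg.b * coverage + bg.b * (255 - coverage)) / 255;
    return out;
}

/* blend(black, white, 128) gives roughly (127,127,127): a grey that an
 * application palette containing only black and white never defined. */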
Then we go back and try to convert it back to a paletted image. The pixels at $ are not in the palette, so the mapping algorithm maps them to black. That means if we were drawing a character like an I, which has ...$$... along its vertical line, we just lost that vertical line. Mapping to the closest colour instead would push the running time of the mapping algorithm past acceptable limits.
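Here is a simplified sketch of the two mapping strategies, assuming a 256-entry palette (invented names again; the real conversion code in Wine is more involved):

#include <limits.h>
#include <stdint.h>

typedef struct { uint8_t r, g, b; } rgb_t;   /* as in the previous sketch */

/* Exact lookup: any colour not literally in the palette falls through to
 * entry 0 (typically black), which is how the '$' pixels of the glyph
 * vanish. */
static int exact_index(const rgb_t *pal, int pal_size, rgb_t c)
{
    int i;
    for (i = 0; i < pal_size; i++)
        if (pal[i].r == c.r && pal[i].g == c.g && pal[i].b == c.b)
            return i;
    return 0;                        /* not found: silently becomes black */
}

/* Closest-colour lookup: correct, but an extra 256-entry distance search
 * for every single pixel, which is what pushes the conversion past
 * acceptable running time. */
static int nearest_index(const rgb_t *pal, int pal_size, rgb_t c)
{
    int i, best = 0, best_dist = INT_MAX;
    for (i = 0; i < pal_size; i++)
    {
        int dr = pal[i].r - c.r, dg = pal[i].g - c.g, db = pal[i].b - c.b;
        int dist = dr * dr + dg * dg + db * db;
        if (dist < best_dist) { best_dist = dist; best = i; }
    }
    return best;
}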
Therefore, we need to make sure that $ never gets into the pixmap in the first place. How do we do this? Disable antialiasing. It's not dependent on the font, it's dependent on what colour depth the application thinks the pixmap is (which is different from the colour depth the pixmap has in X).
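For what it's worth, the shape of the check at the GDI level is roughly the following; the function name is invented here and this is not the actual patch, just an illustration of the idea:

#include <windows.h>

/* Antialiasing is only safe when the bitmap the application selected into
 * the DC is deep enough to hold blended colours without going through a
 * palette. */
static BOOL can_antialias(HDC hdc)
{
    HGDIOBJ hbm = GetCurrentObject(hdc, OBJ_BITMAP);
    BITMAP  bm;

    if (!GetObjectA(hbm, sizeof(bm), &bm))
        return FALSE;

    /* A <=8 bpp bitmap is paletted from the application's point of view,
     * even if the X pixmap behind it is 24 bit, so don't antialias. */
    return bm.bmBitsPixel * bm.bmPlanes > 8;
}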
That's why the patch passes the antialias flag in. Alexandre did not like checking the DC BITMAP internals in X11DRV_ExtTextOut, but it needs to be done somewhere (since even apps that run in 24 bit may allocate an 8-bit BITMAP). I moved the check on the DC into the gdi code, since it fits there, but that means I have to notify the lower levels of the result (the rest of the patch splits up the cache for the antialiased and non-antialiased glyphs so we don't clobber anything in X11DRV).
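To give an idea of why the cache has to be split, here is a hypothetical glyph cache keyed on both the glyph index and the antialias flag (names and layout invented, not the actual X11DRV structures):

#include <windows.h>

#define GLYPH_CACHE_SIZE 256

struct cached_glyph
{
    UINT  glyph_index;
    BOOL  antialias;      /* rendered with or without antialiasing */
    void *image;          /* rendered glyph bits */
};

static struct cached_glyph cache[GLYPH_CACHE_SIZE];

/* A cache hit requires both the index *and* the antialias flag to match,
 * so the antialiased and non-antialiased variants of the same glyph never
 * clobber each other. */
static struct cached_glyph *find_glyph(UINT index, BOOL antialias)
{
    int i;
    for (i = 0; i < GLYPH_CACHE_SIZE; i++)
        if (cache[i].image && cache[i].glyph_index == index
            && cache[i].antialias == antialias)
            return &cache[i];
    return NULL;
}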
I can send pictures if requested, but I'm not at the computer with my source right now. Again, this check for antialiasing is independent of the font's capabilities (and indeed the font capabilities are checked elsewhere).
Further comments, suggestions? I hope I explained the problem clearly enough.
Glenn.