On Tue Nov 21 08:13:16 2023 +0000, Matteo Bruni wrote:
Font rendering isn't exactly my specialty so I want to be sure I understand the issue here. BTW, you don't need to go and find an answer to all my questions, I'm just trying to dump your brain WRT what you already found out :sweat:

In particular, you mention font quality affecting d3dx9 output with native but not with our implementation. How does that work exactly? Notice that we pass the quality setting over to CreateFontW(). Is that not enough? Is GGO_GRAY8_BITMAP just not right for non-antialiased fonts? Or is it a case where things don't work properly unless you draw the whole string at the same time? The DRAFT_QUALITY thing seems mostly orthogonal to d3dx9, as I understand it.

I expect these tests to generally have room for improvement. From what I understand from your results, right now they're effectively passing by chance. So, while adding a few todo_wine to keep the tests passing might be okay for the time being, I'd prefer to fix the tests (which might even mean getting rid of a few of them) or at least figure out what needs to be fixed. It would be unfortunate to mark some test as todo_wine, implying that the implementation is broken, when in fact it's the test that's "too strict", for some definition of it.

Looking at the whole MR, would it make sense to pass GGO_GRAY8_BITMAP to the GetGlyphOutlineW() calls in the d3dx9 tests? Or is that how you found out that the implementation is not correct in that regard?
WRT font quality (essentially, antialiasing): here is how it works in GDI (at least in Wine, but based on what I tested on Windows on various occasions it is functionally similar, apart from some bugs). CreateFontW() does almost nothing by itself; it merely stores the LOGFONT structure under a handle. Things happen when you SelectFont() into an hdc (GDI device context): that is where the antialiasing settings are actually decided. Several things affect this: the quality setting in the LOGFONT (the CreateFont parameter), the font's own settings, and preferences coming from the device behind the device context. If it is a screen-compatible device, the defaults come from system settings (specifically for DRAFT_QUALITY; both on Windows and in Wine, at least on x11, this is affected by the desktop settings). That (rather convoluted) process ends with the antialiasing settings being decided in the hdc, but nothing else happens yet. What these hdc settings affect are the user32 / gdi32 font rendering functions: DrawText and ExtTextOut will effectively use the antialiasing decided in the hdc.

d3dx9 currently uses an explicit GetGlyphOutlineW(GGO_GRAY8_BITMAP), which is different. I don't know a way to get a glyph bitmap with the "hdc-selected font default" from it: the format you request stipulates the antialiasing (and metrics). So GetGlyphOutlineW(GGO_GRAY8_BITMAP) is going to return an antialiased glyph (at least if the font supports that at all) regardless of any settings, including the quality hint passed to CreateFont. GetGlyphOutlineW(GGO_METRICS) is special here: its result may depend on the "font in hdc" settings, but it doesn't return the glyph. I also didn't find a documented way to extract the font antialiasing info from an hdc. So, to handle d3dx9 font antialiasing the same way as on Windows, the only straightforward way I see is to stop using the glyph data directly at all and use user32 or gdi32 functions to draw the glyphs instead. I tested that d3dx9 glyph drawing on Windows follows the GDI / hdc rules. I can attach an ad-hoc test which displays the letters on screen from d3dx9 rendering if that helps.
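To make the distinction concrete, here is a minimal standalone sketch (not part of the MR, error handling omitted) illustrating the point above: even when the font is created with NONANTIALIASED_QUALITY, GetGlyphOutlineW() with GGO_GRAY8_BITMAP is expected to hand back 8-bit grayscale glyph data, i.e. the quality hint is effectively ignored by that path. The face name and height are arbitrary choices for the example.

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Identity transform for GetGlyphOutlineW(): eM11.value = eM22.value = 1. */
    static const MAT2 identity = { {0,1}, {0,0}, {0,0}, {0,1} };
    GLYPHMETRICS gm;
    HDC hdc = CreateCompatibleDC(NULL);
    /* Explicitly ask for a non-antialiased font... */
    HFONT font = CreateFontW(-32, 0, 0, 0, FW_NORMAL, FALSE, FALSE, FALSE,
                             DEFAULT_CHARSET, OUT_DEFAULT_PRECIS,
                             CLIP_DEFAULT_PRECIS, NONANTIALIASED_QUALITY,
                             DEFAULT_PITCH | FF_DONTCARE, L"Tahoma");
    HFONT old = (HFONT)SelectObject(hdc, font);
    /* ...yet GGO_GRAY8_BITMAP still stipulates a grayscale (antialiased)
     * glyph bitmap; the quality hint from CreateFontW() does not change
     * the format of the returned data. */
    DWORD size = GetGlyphOutlineW(hdc, 'A', GGO_GRAY8_BITMAP, &gm, 0, NULL, &identity);

    printf("buffer size %lu, origin (%ld, %ld), cell inc %d\n",
           size, gm.gmptGlyphOrigin.x, gm.gmptGlyphOrigin.y, gm.gmCellIncX);

    SelectObject(hdc, old);
    DeleteObject(font);
    DeleteDC(hdc);
    return 0;
}
```

By contrast, drawing through ExtTextOut() or DrawText() onto a DIB would pick up whatever antialiasing was decided in the hdc, which is what the suggestion to draw glyphs via gdi32 instead of consuming the glyph data directly is about.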
Looking at the whole MR, would it make sense to pass GGO_GRAY8_BITMAP to the GetGlyphOutlineW() calls in the d3dx9 tests? Or is that how you found out that the implementation is not correct in that regard?
That doesn't succeed on Windows: the tests fail there when GGO_GRAY8_BITMAP is passed.
WRT the tests, yes, I think they are passing by chance on Windows (on Wine not quite, since the cellinc calculation in the implementation is identical to the one in the test). E.g., changing the font from "Tahoma" to "Arial" makes the affected tests fail for some characters, and even more so when changing the font antialiasing settings. I spent some time trying to get more out of this test, but it is not straightforward; from those attempts I concluded that the relation of cellinc to gmptGlyphOrigin is a bit off in general (across various fonts and antialiasing settings the error is more than 1 pixel on Windows). In the present test with "Tahoma", gmptGlyphOrigin.x is in fact always 0. If we want to match that, we probably need to re-explore how the d3dx9 font metrics are estimated across different fonts, and how that can be linked to the glyph metrics. I am not entirely sure that is worth it, as exactly matching Windows accuracy is not achieved in GDI now either, and it might be very hard / impractical to do, if even possible.
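For reference, a quick way to see how these metrics shift between faces is to dump them directly. This is a hypothetical helper, not the d3dx9 test code; the font height is an arbitrary assumption. It just prints gmptGlyphOrigin.x and gmCellIncX for a few characters of Tahoma and Arial, which illustrates why a fixed relation between the two only happens to hold for the font the test uses.

```c
#include <windows.h>
#include <stdio.h>

static void dump_metrics(const WCHAR *face)
{
    static const MAT2 identity = { {0,1}, {0,0}, {0,0}, {0,1} };
    HDC hdc = CreateCompatibleDC(NULL);
    HFONT font, old;
    char c;

    font = CreateFontW(-12, 0, 0, 0, FW_NORMAL, FALSE, FALSE, FALSE,
                       DEFAULT_CHARSET, OUT_DEFAULT_PRECIS, CLIP_DEFAULT_PRECIS,
                       DEFAULT_QUALITY, DEFAULT_PITCH | FF_DONTCARE, face);
    old = (HFONT)SelectObject(hdc, font);

    printf("%ls:\n", face);
    for (c = 'a'; c <= 'e'; ++c)
    {
        GLYPHMETRICS gm;
        /* GGO_METRICS returns only the metrics, no glyph bitmap. */
        GetGlyphOutlineW(hdc, c, GGO_METRICS, &gm, 0, NULL, &identity);
        printf("  '%c': gmptGlyphOrigin.x %ld, gmCellIncX %d, gmBlackBoxX %u\n",
               c, gm.gmptGlyphOrigin.x, gm.gmCellIncX, gm.gmBlackBoxX);
    }

    SelectObject(hdc, old);
    DeleteObject(font);
    DeleteDC(hdc);
}

int main(void)
{
    dump_metrics(L"Tahoma");
    dump_metrics(L"Arial");
    return 0;
}
```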
I realize that such changes to the test (when the test itself doesn't reflect what it is supposed to check) are a bit weird. Maybe we would rather remove these specific tests?