On Sun Mar 22 22:14:53 2026 +0000, Nikolay Sivov wrote:
> I don't think this was ever supported at this level, on Windows. Also it's not clear to me why WPF would be using any of that.

You are right. I checked the Windows pipeline output and it confirms your point. GetGlyphIndicesW returns 0xffff for surrogate pairs even with Segoe UI Emoji:
font.c:1745: Test failed: surrogate pair high: expected valid glyph, got 0xffff
font.c:1754: Test failed: first pair high: expected valid glyph, got 0xffff
font.c:1758: Test failed: second pair high: expected valid glyph, got 0xffff
font.c:1763: Test failed: two different emoji should have different glyph indices: 0xffff vs 0xffff

So Windows does not handle surrogate pairs at the GDI level in GetGlyphIndicesW. The function treats each WCHAR independently, just like Wine currently does.

The reason I initially looked at GDI was that WPF text rendering under Wine falls back to GDI paths (nulldrv_ExtTextOut, get_total_extents), which is where the "two squares per emoji" artifact appears. On Windows, WPF uses DirectWrite for text rendering, so the surrogates are handled at a higher level and never reach GetGlyphIndicesW.

I will close this MR and investigate where the fix belongs in the DirectWrite layer (dlls/dwrite/) instead. If there is a better place to look, I would appreciate any pointers. Sorry for the noise.

--
https://gitlab.winehq.org/wine/wine/-/merge_requests/10422#note_133249