On Thu, 7 Mar 2019 17:15:52 +0300, Nikolay Sivov wrote:
On 3/7/19 4:45 PM, Akihiro Sagawa wrote:
+static BOOL get_cjk_font_margins(HDC hdc, BOOL unicode, SHORT *left, SHORT *right)
+{
+    ABC abc[256];
+    UINT i;
+
+    *left = *right = 0;
+    if (!unicode) {
+        if (!GetCharABCWidthsA(hdc, 0, 255, abc))
+            return FALSE;
+    } else {
+        if (!GetCharABCWidthsW(hdc, 0, 255, abc))
+            return FALSE;
+    }
+    for (i = 0; i < ARRAY_SIZE(abc); i++) {
+        if (-abc[i].abcA > *right) *right = -abc[i].abcA;
+        if (-abc[i].abcC > *left)  *left  = -abc[i].abcC;
+    }
+    return TRUE;
+}
Is it possible to demonstrate this with some font, specifically modified to have one glyph significantly off?
Thanks for reviewing. I attached an archive to Bug 46685[1]; it contains a test case and a modified Tahoma font. Could you look into that?
[1] https://bugs.winehq.org/show_bug.cgi?id=46685#c5
It seems to me it's much more likely it would be using some averaged metric instead, specified by the font.
Regarding the test results, I don't think that's the case. If WM_SETFONT used font metrics for the margins, there would be no difference between the A and W versions.
Akihiro Sagawa