Kaixo,
I'm planning to create/fix/check all the possible NLS files.
There is, however, one entry in those files that is problematic for me, namely LOCALE_FONTSIGNATURE.
From the Microsoft docs I understand it is made of three bitfields:
one 128-bit field for Unicode coverage, and two 64-bit ones for code pages ("default" and "supported").
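To be explicit about the layout I'm assuming, here it is as a small sketch with stdint types (it should mirror the LOCALESIGNATURE structure declared in wingdi.h, with members lsUsb, lsCsbDefault and lsCsbSupported):

    #include <stdint.h>

    /* Layout of the data behind LOCALE_FONTSIGNATURE, as I read the docs. */
    typedef struct {
        uint32_t usb[4];           /* 128-bit Unicode subset (coverage) bitfield */
        uint32_t csb_default[2];   /*  64-bit default code page bitfield         */
        uint32_t csb_supported[2]; /*  64-bit supported code page bitfield       */
    } locale_signature;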
For those last two there is no problem; after the usual endianness switching I get a hex number whose bits make sense. E.g. for the rus.nls file I have a font signature of:
L"\x0203\x8000\x3848\x0000\x0000\x0000\x0000\x0000\x0004\x0000\x0000\x0002\x0004\x0000\x0000\x0202"
the default and supported codepage parts are respectively:
\x0004\x0000\x0000\x0002 \x0004\x0000\x0000\x0202
after switching \x(1)\x(2)\x(3)\x(4) to 0x(4)(3)(2)(1) I get:
0x0002000000000004 and 0x0202000000000004
whose set bits (2 and 49 for the default, plus 57 for the supported one) perfectly match code pages 1251, 866 and 855.
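In code, the assembly I'm doing for those two parts looks like this (just a sketch to show the word order I assume; csb_from_words is my own helper name):

    #include <stdint.h>
    #include <stdio.h>

    /* Assemble four 16-bit words, in the order they appear in the .nls file,
     * into one 64-bit value, the first word being the least significant one. */
    static uint64_t csb_from_words(const uint16_t w[4])
    {
        return (uint64_t)w[0]
             | (uint64_t)w[1] << 16
             | (uint64_t)w[2] << 32
             | (uint64_t)w[3] << 48;
    }

    int main(void)
    {
        /* the two code page parts quoted above for rus.nls */
        const uint16_t def[4] = { 0x0004, 0x0000, 0x0000, 0x0002 };
        const uint16_t sup[4] = { 0x0004, 0x0000, 0x0000, 0x0202 };
        uint64_t d = csb_from_words(def);  /* 0x0002000000000004 */
        uint64_t s = csb_from_words(sup);  /* 0x0202000000000004 */
        int bit;

        for (bit = 0; bit < 64; bit++)
            if (s & (1ULL << bit))
                printf("bit %d set%s\n", bit,
                       (d & (1ULL << bit)) ? " (also in default)" : "");
        return 0;
    }

which prints bits 2, 49 and 57.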
But for the Unicode coverage part I'm lost...
In the LOCALE_FONTSIGNATURE it is: \x00a3\x8000\x7878\x38c9\x0016\x0000\x0000\x0000
In order to get a 128-bit number that makes sense I had to do the following switching:
\x(1)\x(2)\x(3)\x(4)\x(5)\x(6)\x(7)\x(8) to 0x(2)(7)(6)(5)(4)(3)(8)(1)
but it seems strange to me.
Also, the font signatures in other .nls files don't make sense with that substitution.
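Spelled out in code, the reordering that happens to work for rus.nls would be something like this (the function name is mine, and as said it looks arbitrary and doesn't hold for the other files):

    #include <stdint.h>

    /* Ad-hoc reordering described above: the eight stored 16-bit words
     * w(1)..w(8) are assembled as 0x(2)(7)(6)(5)(4)(3)(8)(1), split here
     * into the high and low 64-bit halves of the 128-bit value. */
    static void usb_from_words_adhoc(const uint16_t w[8], uint64_t *hi, uint64_t *lo)
    {
        *hi = (uint64_t)w[1] << 48   /* (2) */
            | (uint64_t)w[6] << 32   /* (7) */
            | (uint64_t)w[5] << 16   /* (6) */
            | (uint64_t)w[4];        /* (5) */
        *lo = (uint64_t)w[3] << 48   /* (4) */
            | (uint64_t)w[2] << 32   /* (3) */
            | (uint64_t)w[7] << 16   /* (8) */
            | (uint64_t)w[0];        /* (1) */
    }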
So: how should I make the conversion between the format used in the *.nls files and the 128-bit number used as a bitfield to calculate the flags? Are the values in the *.nls files trustworthy?
Thanks