+/* 0x00..0x1f chars glyph map */
+static const WCHAR glyph_xlat[32] = {
+    0x0000, 0x263A, 0x263B, 0x2665, 0x2666, 0x2663, 0x2660, 0x2219,
+    0x25D8, 0x25CB, 0x25D9, 0x2642, 0x2640, 0x266A, 0x266B, 0x263C,
+    0x25BA, 0x25C4, 0x2195, 0x203C, 0x00B6, 0x00A7, 0x25AC, 0x21A8,
+    0x2191, 0x2193, 0x2192, 0x2190, 0x221F, 0x2194, 0x25B2, 0x25BC
+};
+
+/* adds glyphs to the string */
+static inline void add_glyphs( WCHAR *str, unsigned int length )
+{
+    unsigned int i;
+    for (i = 0; i != length; i++) {
+        if (str[i] < 0x20) str[i] = glyph_xlat[str[i]];
+    }
+}
I propose making this routine extern, because that is the simplest way for the console renderer (i.e. the user backend of wineconsole) to display characters below 0x20 correctly.
The test:

#include <windows.h>

int main(void)
{
    DWORD d;
    char str[32];
    int i;

    GetConsoleMode(GetStdHandle(STD_OUTPUT_HANDLE), &d);
    d &= ~ENABLE_PROCESSED_OUTPUT;
    SetConsoleMode(GetStdHandle(STD_OUTPUT_HANDLE), d);
    for (i = 0; i < 32; i++) str[i] = i;
    WriteConsoleA(GetStdHandle(STD_OUTPUT_HANDLE), str, 32, &d, 0);
    d |= ENABLE_PROCESSED_OUTPUT;
    SetConsoleMode(GetStdHandle(STD_OUTPUT_HANDLE), d);
    return 0;
}
Under Wine we get strange symbols; under Windows we get the correct glyphs.
If we perform the same character conversion in wineconsole/user.c just before the actual text output, we get almost the same result as Windows. The only difference is the handling of the \0 character.
-- Kirill
+/* adds glyphs to the string */
+static inline void add_glyphs( WCHAR *str, unsigned int length )
+{
- unsigned int i;
- for (i=0; i!=length; i++) {
- if (str[i]<0x20) str[i]=glyph_xlat[str[i]];
- }
+}
The page http://blogs.msdn.com/michkap/archive/2005/02/26/381020.aspx claims that character 0x7F should be converted too, but I have not checked that.
-- Kirill