Hi, Alexandre,
Is there something wrong with my patch? Did I miss something important?
> I'm not at all convinced that the conversion has to happen at that point. Do you have a test app that demonstrates this?
I've written a test application that writes control characters and then reads them back for comparison.
Its behaviour depends heavily on the console font:
1) TrueType (Lucida Console): control characters are displayed as square boxes, and ReadConsoleOutputCharacter[A|W] both return the codes [0..31].
2) Raster (FixedSys): control characters are correctly displayed as glyphs; ReadConsoleOutputCharacterA returns the codes [0..31], while ReadConsoleOutputCharacterW returns the Unicode glyphs.
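In essence the test does something like this (a simplified sketch; the exact coordinates, buffer size and output formatting are my assumptions here and not identical to the actual test app):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE out = GetStdHandle(STD_OUTPUT_HANDLE);
    WCHAR written[32], readW[32];
    char readA[32];
    DWORD count, i;
    COORD pos = {0, 0};

    /* the control characters 0..31 */
    for (i = 0; i < 32; i++) written[i] = (WCHAR)i;

    /* write them directly into the screen buffer */
    WriteConsoleOutputCharacterW(out, written, 32, pos, &count);

    /* read them back through both the ANSI and the Unicode API */
    ReadConsoleOutputCharacterA(out, readA, 32, pos, &count);
    ReadConsoleOutputCharacterW(out, readW, 32, pos, &count);

    for (i = 0; i < 32; i++)
        printf("%2lu: wrote %#06x, readA %#04x, readW %#06x\n",
               i, written[i], (unsigned char)readA[i], readW[i]);
    return 0;
}

Running this under Windows with the two console fonts produces the two different behaviours described above.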
It seems I've stumbled upon the same strange console behaviour I observed while working on console codepages (http://www.winehq.org/pipermail/wine-devel/2007-May/056511.html).
Since Wine renders fonts in its own way (through FreeType, AFAIK), I propose the following:
1) Display control characters as glyphs (like FixedSys does).
2) ReadConsoleOutputCharacter[A|W] should both return the characters unmodified: if I wrote [0..31], I expect [0..31] back; if I wrote Unicode glyphs, I expect Unicode glyphs back (like the TrueType case).
My patch assumes this behaviour.
Of course, I wanted to write a proper Wine test case, but that is impossible because the default console font differs between Windows installations, so the test would fail on some of them.
-- Kirill