On Fri, 28 May 2004 11:11:32 +0200, Rein Klazes wrote:
What do you mean by (wrong)? Do you not get the dpi values used by X? Or is it that it uses the X values, but those are wrong?
Well, I talked a bit with Huw about this, so apologies to him if I mangled what he said, but ... it seems that the functions we use to get the size of the screen don't always return correct values. I don't know why.
Here X gets it from the monitor at start-up (via the video card's logic). I did not realize that my situation is not the normal one.
If you mean via DDC then I guess this should work in most places, but for whatever reason it does not work here :( In particular, xdpyinfo says:
  screen #0:
    dimensions:    1024x768 pixels (321x241 millimeters)
    resolution:    81x81 dots per inch
So it appears to know the screen size (I measured with a ruler, and 321x241 is right), but the DPI it reports is too small anyway. At 81 dpi the text is hardly readable.
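For what it's worth, the 81x81 figure is just the pixel counts divided by the physical size X reports, so the two numbers are consistent with each other. A quick sketch of the arithmetic (plain Python, the function name is mine):

```python
# Reproduce the DPI figure from the geometry xdpyinfo reported:
# 1024x768 pixels on a 321x241 mm screen.

MM_PER_INCH = 25.4

def dpi(pixels: int, millimeters: int) -> float:
    """Dots per inch along one axis of the screen."""
    return pixels * MM_PER_INCH / millimeters

horizontal = dpi(1024, 321)
vertical = dpi(768, 241)

print(f"{horizontal:.0f}x{vertical:.0f} dpi")  # prints "81x81 dpi"
```

So the "wrong" DPI here is faithfully derived from the monitor's reported physical size; the problem is that 81 dpi is simply too low a default for readable text.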
I agree with you that 95 dpi is about the minimum being used today. But I would keep it as a minimum only: if the DPI is set higher, that is probably intentional, either monitor magic or careful system administration.
Yep, this only affects the default used when nothing is specified in the config file (in fact, the default config file always specified this anyway, so this really just restores the old behaviour in the absence of the config file).
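If I remember the old config format correctly, the entry in question looked something like the following (section and key names from memory, so treat this as a sketch rather than an exact excerpt):

```ini
;; ~/.wine/config -- hypothetical excerpt, from memory
[fonts]
"Resolution" = "96"
```

With that line present, the value from the config file wins and the fallback default discussed here never comes into play.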
thanks -mike