On Fri, May 28, 2004 at 01:59:02PM +0100, Mike Hearn wrote:
On Fri, 28 May 2004 11:11:32 +0200, Rein Klazes wrote:
What do you mean (wrong)? Do you not get the dpi values used by X? Or is it that it uses the X values, but those are wrong?
Well, I talked a bit with Huw about this so apologies to him if I mangled what he said, but ... it seems that the functions we use to get the size of the screen don't always return correct values. I don't know why.
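For reference, the kind of query I mean looks roughly like this (just a throwaway Xlib sketch using DisplayWidthMM() and friends; the exact calls in our code may well differ):

    /* Print the screen geometry the X server advertises.  DisplayWidthMM()
     * only reports whatever physical size the server was configured with,
     * so if that value is bogus, any DPI derived from it is bogus too. */
    #include <stdio.h>
    #include <X11/Xlib.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);   /* connect to $DISPLAY */
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

        int scr = DefaultScreen(dpy);
        printf("%dx%d pixels (%dx%d millimeters)\n",
               DisplayWidth(dpy, scr),   DisplayHeight(dpy, scr),
               DisplayWidthMM(dpy, scr), DisplayHeightMM(dpy, scr));

        XCloseDisplay(dpy);
        return 0;
    }

(Compile with -lX11; it prints much the same numbers xdpyinfo shows below.)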
Here X gets it from the monitor at start-up (via the video card's logic). I did not realize that my situation is not the normal one.
If you mean via DDC then I guess this should work in most places, but for whatever reason it does not work here :( In particular, xdpyinfo says:
screen #0:
  dimensions:    1024x768 pixels (321x241 millimeters)
  resolution:    81x81 dots per inch
So it appears to know the screen size (I measured with a ruler, and 321x241 is right), but the DPI it comes up with is still too small. At 81 dpi the text is hardly readable.
Well, actually 1024 pixels in 321mm is about 81 dpi, so the X server is doing the sums correctly. I guess what we need to know is what resolution Windows returns on this same machine.
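For reference, the sum xdpyinfo does is just pixels divided by (millimetres / 25.4); a quick standalone check with the numbers above:

    /* DPI = pixels / (millimetres / 25.4), per axis. */
    #include <stdio.h>

    int main(void)
    {
        const double mm_per_inch = 25.4;
        printf("x: %.1f dpi, y: %.1f dpi\n",
               1024.0 / (321.0 / mm_per_inch),   /* ~81.0 */
                768.0 / (241.0 / mm_per_inch));  /* ~80.9 */
        return 0;
    }

So 81x81 is exactly what you'd expect from a 321x241mm screen at 1024x768.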
Huw.