https://bugs.winehq.org/show_bug.cgi?id=51420
--- Comment #48 from Henri Verbeet <hverbeet@gmail.com> ---
(In reply to Sveinar Søpler from comment #45)
> (In reply to Henri Verbeet from comment #44)
> > Of course it would be even better if the NVIDIA drivers would synthesise the standard display modes when using RandR 1.4+ on configurations like this, like pretty much all the other Linux GPU drivers do, but I'm not holding my breath.
> Is this a particular DVI-D problem that does not rear its ugly head with HDMI or DP configurations, or should I see the same issue there?
My understanding is that it largely depends on whether the display in question has its own (hardware) scaler or relies on the GPU for that. In the past, that largely correlated with whether the display was internal or external. Internal (typically LVDS) displays in e.g. laptops usually didn't have scaler hardware, while external monitors connected by VGA or DVI typically did. It's a little less straightforward with modern external displays that may not support VGA/DVI-I.
(Note incidentally that the analog part of DVI-I is essentially the same as VGA, and DVI-D is essentially the same as HDMI; converters between these are typically passive.)
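
For reference, a minimal sketch of how a client can enumerate the modes a driver advertises through RandR (it assumes an X11 session with libXrandr available, and omits error handling). On an affected configuration the list for the external output would only contain the native mode(s) from the EDID, whereas drivers that synthesise the standard modes would also report entries like 1280x720 or 1024x768:

/* list_modes.c: print the modes each connected RandR output advertises.
 * Build with: cc list_modes.c -o list_modes -lX11 -lXrandr */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    Window root = DefaultRootWindow(dpy);
    XRRScreenResources *res = XRRGetScreenResourcesCurrent(dpy, root);

    for (int i = 0; i < res->noutput; ++i)
    {
        XRROutputInfo *output = XRRGetOutputInfo(dpy, res, res->outputs[i]);
        if (output->connection != RR_Connected)
        {
            XRRFreeOutputInfo(output);
            continue;
        }
        printf("%s:\n", output->name);
        for (int j = 0; j < output->nmode; ++j)
        {
            /* Look up the mode details in the screen resources. */
            for (int k = 0; k < res->nmode; ++k)
            {
                const XRRModeInfo *mode = &res->modes[k];
                if (mode->id != output->modes[j])
                    continue;
                double refresh = mode->hTotal && mode->vTotal
                        ? (double)mode->dotClock / (mode->hTotal * mode->vTotal) : 0.0;
                printf("    %ux%u @ %.2f Hz\n", mode->width, mode->height, refresh);
            }
        }
        XRRFreeOutputInfo(output);
    }

    XRRFreeScreenResources(res);
    XCloseDisplay(dpy);
    return 0;
}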