On Thu Feb 20 21:41:59 2025 +0000, William Horvath wrote:
Well, for one, I'd like to see an example of a real game that doesn't call timeBeginPeriod(1), but that's beside the point. We already simulate a 1ms timer period by having the tick count increase every millisecond, and this is absolutely not intended to change that. What this is addressing is unrelated server calls from other threads pushing the **global** `current_time` and `monotonic_time` (updated on each timeout) forward. That causes the timeouts for alertable `SleepEx` (used in those games' frame limiters) to expire up to 1ms later than requested, relative to that last update, because the expiry is ceil'd to the next millisecond to avoid spinning with a 0 timeout. On average this adds an extra 0.5ms of waiting, depending on where we land on the tick boundary. A rough sketch of the rounding arithmetic is below.
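To make the rounding concrete, here is a minimal sketch of the arithmetic (the names `ceil_to_next_ms` and the standalone `current_time` global are illustrative only, not Wine's actual server code):

```c
#include <stdio.h>

typedef long long timeout_t;      /* 100-ns units, as Wine uses for timeouts */
#define TICKS_PER_MS 10000LL

/* Global time, advanced whenever any server request updates it. */
static timeout_t current_time;

/* Round an absolute expiry up to the next millisecond boundary,
 * so a sub-millisecond remainder never becomes a 0 (spinning) timeout. */
static timeout_t ceil_to_next_ms( timeout_t when )
{
    return ((when + TICKS_PER_MS - 1) / TICKS_PER_MS) * TICKS_PER_MS;
}

int main(void)
{
    /* An unrelated request just pushed current_time forward, mid-millisecond. */
    current_time = 1234567;

    /* A 5ms alertable wait computed from that updated time... */
    timeout_t requested = current_time + 5 * TICKS_PER_MS;
    timeout_t expiry    = ceil_to_next_ms( requested );

    /* ...expires up to 1ms late (about 0.5ms on average), depending on
     * where current_time landed inside the millisecond. */
    printf( "extra wait: %lld ticks (%.2f ms)\n",
            expiry - requested, (expiry - requested) / (double)TICKS_PER_MS );
    return 0;
}
```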
Sorry, I still don't understand what's going on with the game. timeBeginPeriod() can only set 1ms resolution at best (does the game do that?), so SleepEx() will run with 1ms sleep granularity; as I understand from your message, that's also the case on Wine? If the change helps the game but doesn't match what happens on Windows, the actual problem might be elsewhere.