Removing the Yield fixes a regression in Need For Speed III where the loader and server consume 100% of the CPU. This is on RH 9, which ships a 2.4 kernel.
When did that regression first start? The mmtime and ntdll/sync.c code has been this way since late last fall.
The Yield is, imho, correct, except that it exposes the fact that we don't correctly implement Windows priority schemes. That can create nasty conditions.
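
To make the failure mode concrete, here's a minimal sketch (not the actual mmtime/ntdll code, just an illustration of the pattern, assuming the Yield ends up as something like sched_yield() on the Unix side): a loop that waits by yielding never blocks, so the scheduler keeps handing the CPU straight back to it, whereas a loop that sleeps (or waits on an event) drops to near-zero CPU.

#include <sched.h>
#include <signal.h>
#include <unistd.h>

static volatile sig_atomic_t work_ready = 0;

static void on_alarm(int sig) { (void)sig; work_ready = 1; }

/* Waiting by yielding: the thread stays runnable, so the yield just
 * reschedules it almost immediately.  With correct Windows priority
 * handling other threads would get a turn; without it, this spins
 * and eats 100% CPU. */
static void wait_by_yielding(void)
{
    while (!work_ready)
        sched_yield();
}

/* Waiting by sleeping: the thread actually blocks between polls, so CPU
 * use stays near zero (at the cost of up to 10 ms of extra latency). */
static void wait_by_sleeping(void)
{
    while (!work_ready)
        usleep(10000);
}

int main(void)
{
    signal(SIGALRM, on_alarm);
    alarm(1);                 /* pretend the work arrives after a second */
    wait_by_yielding();       /* swap in wait_by_sleeping() to compare */
    return 0;
}

Run either variant under top and the difference between the two waits is obvious.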
But I wonder if something else changed that is causing this.
I'm particularly surprised to find it on a 2.4 kernel; the 2.4 scheduler quantum was ~40 ms, which made Linux a bit less sensitive to CPU-hogging threads.
Cheers,
Jeremy