http://bugs.winehq.org/show_bug.cgi?id=27298
--- Comment #19 from DL <dredgingthelake@gmail.com> 2012-10-26 03:01:42 CDT ---
(In reply to comment #18)
> The final tally amounted to a 54% gain in a given scene over standard launch.
I tried it out and couldn't reproduce the difference on my machine. Are you using a composite manager? I'm using Ratpoison, which is quite a minimal window manager, so perhaps that is where the difference lies. However, there are a couple of things that may be confounding your results:
1) Did you load nvidia-settings when running in the different X server?
If you don't, you might not get the same settings. I lost my AA and VSync settings, which could account for some performance difference (although unlikely one of the magnitude you found).
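To make point 1 concrete, here is a rough sketch of how the second X server could be launched so the saved nvidia-settings config is re-applied there. This is an illustration, not the exact setup from this bug: the display number, VT, and game path are placeholder examples.

```shell
# Start a second X server on display :1 (VT8 is an example), load the
# saved nvidia-settings config there, then launch the game under Wine.
# xinit sets DISPLAY for the client it spawns.
xinit /bin/sh -c '
    nvidia-settings --load-config-only  # re-apply ~/.nvidia-settings-rc (AA, VSync, etc.)
    wine game.exe                       # placeholder game path
' -- :1 vt8
```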
2) What are you using to track FPS? If you are using WINEDEBUG=+fps, you'll find that your FPS jumps to double what it was once you switch back to your main X server. However, you should see a string of lower FPS counts before the jump, which indicates the real FPS.
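For what it's worth, a small helper along these lines can average the real FPS samples out of a captured +fps log, so the post-switch jump doesn't skew the number. The "approx NNfps" line shape is an assumption based on the Wine builds I've seen, and `avg_fps` is just a name I made up; adjust the pattern to whatever your build actually prints.

```shell
# Capture the log first (example invocation, game path is a placeholder):
#   WINEDEBUG=+fps wine game.exe 2> fps.log
# Then average the fps samples found in it.
avg_fps() {
    grep -o 'approx[^0-9]*[0-9.]*' "$1" |  # keep the "approx NN" fragments
    grep -o '[0-9.]*$' |                   # strip down to the number
    awk '{ s += $1; n++ } END { if (n) printf "%.1f\n", s / n }'
}
```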
Anyway, I'm betting it's a composite manager issue. In my case I do get a larger FPS decrease with The Witcher 2 than in other games, but I'm not sure it's outside the normal range: i.e. 40-45% of the Windows frame rate with The Witcher 2, vs 50-65% of the Windows frame rate with other games. Because I am GPU-limited, though, that may be causing additional issues with The Witcher 2.
BTW, the threaded optimisation only makes a small difference for me with this game, although it makes a large difference with some other games, like The Witcher 1.
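In case it helps anyone else reproduce this: the NVIDIA driver's threaded optimisation is toggled per-process with the `__GL_THREADED_OPTIMIZATIONS` environment variable (documented in the NVIDIA driver README); the game path below is a placeholder.

```shell
# Enable the NVIDIA driver's CPU-side threaded optimisation for one run.
__GL_THREADED_OPTIMIZATIONS=1 wine witcher.exe  # placeholder game path
```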