http://bugs.winehq.org/show_bug.cgi?id=11203
--- Comment #54 from Ian Goddard iang@austonley.org.uk 2012-11-27 12:03:14 CST --- I don't know. I've stuck with 1.0.1, hacked to remove what I consider to be a bug & what the devs seemed to consider a feature (see the comments from Aug 2009).
The problem seems to be a deliberate design decision: to assume that a graphics driver which reports only 24-bit capability can, in fact, handle 32 bits, & to act on that assumption in the interest of running games. If an application then sends a 24-bit image the program will crash. I have had several boards with 24-bit drivers and that assumption has been false for all of them.
Last time I looked - about a year ago - the code had been substantially changed but was still based on that assumption. Until that decision is rescinded the problem will remain. If it is rescinded the change required will be trivial.
It did seem to me quite feasible to add a switch in the registry so that the assumption could be applied or not depending on the H/W; I'd have been prepared to code it myself on the original version of the code. However I gained the impression that supporting Intel graphics wasn't of interest to the project and, having a quick hack that works, I don't need a more elaborate version myself.
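For what it's worth, this is roughly the shape of the switch I had in mind. It's only a sketch against the plain Win32 registry API: the value name ("PromoteDepthTo32"), the function name and the screen_bpp fragment at the end are placeholders, not the identifiers actually used in winex11.drv.

#include <windows.h>

/* Sketch only: decide whether to keep the current behaviour of promoting
 * a 24bpp screen to 32bpp.  The value name below is hypothetical. */
static BOOL promote_24bpp_to_32bpp(void)
{
    HKEY hkey;
    char buffer[16] = "";
    DWORD type, count = sizeof(buffer);
    BOOL ret = TRUE;  /* default: today's behaviour, assume the board can do 32bpp */

    if (!RegOpenKeyExA( HKEY_CURRENT_USER, "Software\\Wine\\X11 Driver",
                        0, KEY_READ, &hkey ))
    {
        if (!RegQueryValueExA( hkey, "PromoteDepthTo32", NULL, &type,
                               (BYTE *)buffer, &count ) && type == REG_SZ)
            ret = (buffer[0] == 'y' || buffer[0] == 'Y');
        RegCloseKey( hkey );
    }
    return ret;
}

/* ...and wherever the driver currently promotes unconditionally, something like:
 *     if (screen_bpp == 24 && promote_24bpp_to_32bpp()) screen_bpp = 32;        */

Defaulting to the current behaviour would keep the games use case working out of the box while giving 24bpp-only hardware a way out.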
I agree that, when I last checked against a more recent Wine build, 9.x builds of EA were OK, but this is a feature of the later EA builds, not of Wine. As comment 53 from A L-H shows, the problem is still there with the original EA. However, as there were duplicate bug reports against various other applications (VB was one IIRC), it may still be a problem for some Wine users who have different requirements.
I only have that one application which needs Wine. Although the current version of that no longer has a problem, I've stuck with what already works for me: 1.0.1 with a one-line change. I have long since stopped testing more recent builds of Wine. I'd be happy to retest if & when the devs decide to revisit the issue.
At present the code supports the following use case:
A board which reports 24bpp but can handle 32 is used to run an application which insists on having 32bpp.
The use case which fails is:
A board which reports 24bpp and can't handle 32 is used to run an application which will fall back to 24bpp if that's what's reported, but will attempt to use 32bpp if 32 is reported.
How prevalent are these two use cases? At present the 2nd is being sacrificed to support the 1st. If the 1st isn't valid then I suggest a trivial code change to report 24bpp back to the application. Otherwise, make a decision to either add a registry switch to support both of them or mark as won't fix.
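And for completeness, the "trivial" option is essentially my one-line hack: stop promoting and hand the application whatever depth the X server actually reports. Again just a sketch; the variable names aren't the real ones:

    /* instead of the unconditional promotion...           */
    /*     if (screen_bpp == 24) screen_bpp = 32;          */
    /* ...simply pass the real depth through:              */
    screen_bpp = screen_depth;   /* 24 stays 24, 32 stays 32 */

That supports the 2nd use case directly, at the cost of the 1st unless a registry switch like the one sketched above is added as well.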