I'm new to Wine development. Today I decided to have a better look at how Wine is built rather than just blindly running 'configure' and hoping that all would work out. I'm a FreeBSD developer, so I tend not to trust 'configure' to DTRT. Don't shoot me for that, please. 8-)
I came across code in libs/wine/config.c that does a runtime check on the size of a couple of fields in 'struct stat' and executes code conditionally when the field is wider than 32 bits. Since I was building with -Wall and -Werror, the compiler barfed when it saw an attempt to shift by at least the number of bits in the variable. If I had my way, this sort of thing should be coded as a compile time test and only the code for the appropriate size compiled in. Is there some sort of direction for/from Wine developers for this kind of thing?
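(For concreteness, the kind of construct I'm talking about looks roughly like this -- my own reconstruction for illustration, not the actual config.c code:)

    #include <sys/stat.h>
    #include <stdio.h>

    void print_dev( const struct stat *st )
    {
        /* On a platform where st_dev is only 32 bits wide, gcc flags the
         * constant shift below ("right shift count >= width of type")
         * even though that branch can never be taken, and -Werror turns
         * the warning into a build failure. */
        if (sizeof(st->st_dev) > sizeof(unsigned long) && st->st_dev > ~0UL)
            printf( "%lx%08lx\n",
                    (unsigned long)(st->st_dev >> 32),
                    (unsigned long)st->st_dev );
        else
            printf( "%lx\n", (unsigned long)st->st_dev );
    }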
On Sun, 15 Feb 2004 14:47:28 +1100, John Birrell wrote:
If I had my way, this sort of thing should be coded as a compile time test and only the code for the appropriate size compiled in. Is there some sort of direction for/from Wine developers for this kind of thing?
We typically favour runtime checks as Wine is intended to be binary portable (compile once, run on lots of different distros etc).
I wouldn't consider a compiler warning a valid reason for removing this sort of thing; the users benefit considerably from it.
On Sun, Feb 15, 2004 at 01:13:13PM +0000, Mike Hearn wrote:
We typically favour runtime checks as Wine is intended to be binary portable (compile once, run on lots of different distros etc).
Runtime checks are fine for things that are evaluated at runtime.
In this case, the code that I am questioning is:
if (sizeof(st.st_dev) > sizeof(unsigned long) && st.st_dev > ~0UL)
sizeof() is evaluated at compile time based on the compiler and the header files you are compiling the sources against. If, as you say, "compile once, run on lots of distros", then those distros had better have a consistent definition of struct stat or the code won't function as the programmer intended.
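To illustrate the point (nothing Wine-specific here, just the language rule):

    #include <sys/stat.h>

    /* sizeof() yields an integer constant expression, so its value is
     * fixed as soon as the headers are parsed -- it can even appear
     * where the language demands a compile-time constant, such as an
     * enumerator. */
    enum { ST_DEV_BYTES = sizeof(((struct stat *)0)->st_dev) };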
Note that the code I am questioning does not produce a compiler warning. The warning comes from code in the if-true path.
The compiler warning attracted my attention to code that I believe would be better as a #if.
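Something along these lines is what I have in mind -- purely hypothetical, since it assumes configure were taught to define SIZEOF_DEV_T and SIZEOF_UNSIGNED_LONG, which it does not do today:

    #include <sys/stat.h>
    #include <stdio.h>

    void print_dev( const struct stat *st )
    {
    /* Hypothetical sketch: the wide-field branch is only compiled in
     * when the build-time headers say st_dev is wider than unsigned
     * long, so the 32-bit shift never appears in a build where it
     * would warn. */
    #if SIZEOF_DEV_T > SIZEOF_UNSIGNED_LONG
        if (st->st_dev > ~0UL)
        {
            printf( "%lx%08lx\n",
                    (unsigned long)(st->st_dev >> 32),
                    (unsigned long)st->st_dev );
            return;
        }
    #endif
        printf( "%lx\n", (unsigned long)st->st_dev );
    }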
John Birrell jb@cimlogic.com.au writes:
Note that the code I am questioning does not produce a compiler warning. The warning comes from code in the if-true path.
The compiler warning attracted my attention to code that I believe would be better as a #if.
No, using normal code is always better than #ifdefs. This way all the code gets compiled and checked for errors, plus you don't need to write autoconf magic for things that the compiler can do itself.
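For what it's worth, the runtime-check style can also be kept -Werror-clean without any preprocessor conditionals by splitting the shift so its count never reaches the width of the type -- a common idiom, sketched here for illustration rather than as the actual Wine code:

    #include <sys/stat.h>
    #include <stdio.h>

    void print_dev( const struct stat *st )
    {
        /* Both branches are compiled and type-checked on every platform;
         * (x >> 16) >> 16 computes x >> 32 where st_dev is 64 bits wide,
         * yet each individual shift count stays below 32, so gcc has
         * nothing to complain about where st_dev is only 32 bits. */
        if (sizeof(st->st_dev) > sizeof(unsigned long) && st->st_dev > ~0UL)
            printf( "%lx%08lx\n",
                    (unsigned long)((st->st_dev >> 16) >> 16),
                    (unsigned long)st->st_dev );
        else
            printf( "%lx\n", (unsigned long)st->st_dev );
    }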
On Sunday 15 February 2004 21:11, Alexandre Julliard wrote:
The compiler warning attracted my attention to code that I believe would be better as a #if.
No, using normal code is always better than #ifdefs. This way all the code gets compiled and checked for errors, plus you don't need to write autoconf magic for things that the compiler can do itself.
And a half-decent compiler will figure out that those checks are constant anyway and optimize the dead code away.
(But then there's always the issue of trusting your compiler to do the job right...)
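That's easy enough to check, though: compile a toy version and look at the assembly (illustrative only -- the file name is made up):

    /* gcc -O2 -S devsize.c && less devsize.s
     * Only one of the two strings should survive in the output, because
     * the sizeof comparison is a compile-time constant and the dead
     * branch gets eliminated. */
    #include <sys/stat.h>
    #include <stdio.h>

    void report( const struct stat *st )
    {
        if (sizeof(st->st_dev) > sizeof(unsigned long))
            puts( "st_dev is wider than unsigned long" );
        else
            puts( "st_dev fits in an unsigned long" );
    }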
Florian