So, one of the things one learns when writing a patch robot is that flaky tests are very annoying.
Each time it gets a new git tree, the robot does five baseline "make -k test" runs, remembers the tests that fail, and doesn't penalize patches for failing any of those tests. See http://code.google.com/p/winezeug/source/browse/trunk/patchwatcher/patchwatc...
Annoyingly, that's not enough. Some tests stubbornly refuse to fail during the baseline test runs, only to fail later. So I added a second, manual blacklist for those tests; see http://code.google.com/p/winezeug/source/browse/trunk/patchwatcher/patchwatc... The list is currently user32:msg.c user32:input.c d3d9:visual.c ddraw:visual.c urlmon:protocol.c kernel32:thread.c and will keep growing as I plug away at keeping the patch robot happy.
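The resulting skip logic is just a set union: a failure is ignored if the test appears either among the failures collected during the five baseline runs or in the manual blacklist. A minimal sketch in C, if anyone's curious (the file names and the one-test-per-line format here are hypothetical, not patchwatcher's actual ones):

#include <stdio.h>
#include <string.h>

static int in_list(const char *path, const char *test)
{
    char line[256];
    FILE *f = fopen(path, "r");
    int found = 0;
    if (!f) return 0;                   /* a missing list means nothing to skip */
    while (!found && fgets(line, sizeof(line), f))
    {
        line[strcspn(line, "\n")] = 0;  /* strip the trailing newline */
        found = !strcmp(line, test);
    }
    fclose(f);
    return found;
}

int main(int argc, char **argv)
{
    const char *test = argc > 1 ? argv[1] : "user32:msg.c";
    /* baseline_failures.txt: union of failures from the five baseline runs
       blacklist.txt: the manual list quoted above */
    if (in_list("baseline_failures.txt", test) || in_list("blacklist.txt", test))
        printf("%s: known flaky, failure not held against the patch\n", test);
    else
        printf("%s: failure counts against the patch\n", test);
    return 0;
}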
Is anybody else seeing this kind of flakiness? (If you're not, try running patchwatcher for a while :-)
FWIW, I'm running the tests on hardy with a fresh metacity (as described in http://wiki.winehq.org/MakeTestFailures ).
On Tuesday, 2008-08-12 at 10:58 -0700, Dan Kegel wrote:
The list is currently user32:msg.c user32:input.c
Same problems here. Metacity 2.23.21, compiled myself.
d3d9:visual.c ddraw:visual.c
The last two are really nasty. Take a look at ddraw/tests/visual.c:2624. It performs a kind of "basic assurance test" to see whether the real tests have any chance of working; if this basic test fails, the whole visual test is skipped. On my machine the test only passes if it already fails that sanity check (so everything is skipped); otherwise I get some failures. Rerunning make test after a failed visual test usually runs into a sanity check failure. [That is on Intel 945 graphics hardware]
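For readers without the source handy, the shape of that check is roughly the following. This is only a sketch: get_pixel_color, the color values, and the coordinates are illustrative stand-ins, not copied from ddraw/tests/visual.c:

#include <stdio.h>

/* Hypothetical stand-in for the real helper, which reads a pixel
 * back from the device's render target. */
static unsigned int get_pixel_color(int x, int y)
{
    return 0x00ff0000; /* pretend the clear worked */
}

int main(void)
{
    /* "Basic assurance test": clear to a known color, read one pixel
     * back.  If even this fails, every later color comparison would
     * be noise, so the whole visual test is skipped. */
    unsigned int color = get_pixel_color(1, 1);
    if (color != 0x00ff0000)
    {
        printf("skip: sanity check failed (got %08x), skipping visual tests\n", color);
        return 0;
    }
    printf("sanity check passed, running visual tests\n");
    /* ... the real tests would follow here ... */
    return 0;
}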
urlmon:protocol.c kernel32:thread.c
I don't remember these, but most of the test runs I did were around 1.0, and things might have changed since then.
Is anybody else seeing this kind of flakiness? (If you're not, try running patchwatcher for a while :-)
Yes, I do, see above.
Regards, Michael Karcher
2008/8/12 Dan Kegel dank@kegel.com:
d3d9:visual.c
What kind of failures are you seeing there, and with which drivers? (The test should at least pass with recent Mesa versions as long as the GLSL extensions are disabled)
On Wed, Aug 13, 2008 at 12:18 AM, H. Verbeet hverbeet@gmail.com wrote:
d3d9:visual.c
What kind of failures are you seeing there, and with which drivers? (The test should at least pass with recent Mesa versions as long as the GLSL extensions are disabled)
I guess this is http://bugs.winehq.org/show_bug.cgi?id=10221
Here's the hardware and software (same as in that bug):
$ lspci | grep -i vga
00:10.0 VGA compatible controller: nVidia Corporation GeForce 7100/nForce 630i (rev a2)
$ cat /proc/driver/nvidia/version
NVRM version: NVIDIA UNIX x86 Kernel Module 169.12 Thu Feb 14 17:53:07 PST 2008
I tried updating my nvidia driver, but it was a disaster; I may just have to wait until Intrepid comes out, since I failed the nvidia driver installation IQ test.
To reproduce the failure, try
#!/bin/sh
set -e
set -x
while true
do
    rm *.ok
    make test
done
in dlls/ddraw/tests. That fails in well under an hour for me with
../../../tools/runtest -q -P wine -M ddraw.dll -T ../../.. -p ddraw_test.exe.so visual.c && touch visual.ok
fixme:win:EnumDisplayDevicesW ((null),0,0x32ed78,0x00000000), stub!
fixme:d3d:WineD3D_ChoosePixelFormat Add OpenGL context recreation support to SetDepthStencilSurface
fixme:d3d_draw:drawPrimitive Using software emulation because manual fog coordinates are provided
fixme:d3d:WineD3D_ChoosePixelFormat Add OpenGL context recreation support to SetDepthStencilSurface
visual.c:1194: Test failed: Got color 00efebe7, expected 00000080 or near
visual.c:1200: Test failed: Got color 00efebe7, expected 000000ff or near
...
make: *** [visual.ok] Error 26
Annoyingly, this happens quite often under patchwatcher.
The d3d9 one is probably a driver bug; not sure about the ddraw one. If it's much of an issue, it might be worth running the tests using Mesa instead.
I guess this is http://bugs.winehq.org/show_bug.cgi?id=10221
Yeah, as Henri said, this is most likely a driver bug and we can't do anything about it.
../../../tools/runtest -q -P wine -M ddraw.dll -T ../../.. -p ddraw_test.exe.so visual.c && touch visual.ok
fixme:win:EnumDisplayDevicesW ((null),0,0x32ed78,0x00000000), stub!
fixme:d3d:WineD3D_ChoosePixelFormat Add OpenGL context recreation support to SetDepthStencilSurface
fixme:d3d_draw:drawPrimitive Using software emulation because manual fog coordinates are provided
fixme:d3d:WineD3D_ChoosePixelFormat Add OpenGL context recreation support to SetDepthStencilSurface
visual.c:1194: Test failed: Got color 00efebe7, expected 00000080 or near
visual.c:1200: Test failed: Got color 00efebe7, expected 000000ff or near
I have run the tests in valgrind, and there's a crash somewhere. Could be related.
visual.c:1194: Test failed: Got color 00efebe7, expected 00000080 or near
visual.c:1200: Test failed: Got color 00efebe7, expected 000000ff or near
I have run the tests in valgrind, and there's a crash somewhere. Could be related.
It seems that the tests crash in libglcore.so when run under valgrind (jump to an invalid location). The crash occurs in a glSecondaryColor3ubEXT call. A crash in that call doesn't make sense: right before it, a call with the same parameters succeeds, and it is for the 2nd vertex of a triangle, so drawing shouldn't commence yet.
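To illustrate why that is surprising: inside a glBegin()/glEnd() pair the driver only accumulates per-vertex state, and rasterization cannot start before the primitive is complete. A minimal fragment of the kind of sequence involved (assuming a current GL context and EXT_secondary_color support; the colors and coordinates here are made up, not taken from the test):

#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>

static void draw_triangle(void)
{
    glBegin(GL_TRIANGLES);
    glSecondaryColor3ubEXT(0xff, 0x00, 0x00);  /* vertex 1: this call succeeds */
    glVertex3f(-1.0f, -1.0f, 0.1f);
    glSecondaryColor3ubEXT(0xff, 0x00, 0x00);  /* vertex 2: same parameters, crash here */
    glVertex3f(1.0f, -1.0f, 0.1f);
    glSecondaryColor3ubEXT(0xff, 0x00, 0x00);  /* vertex 3: only after this vertex (or
                                                  glEnd) could the triangle be drawn */
    glVertex3f(0.0f, 1.0f, 0.1f);
    glEnd();
}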
The ddraw visual test performs 3 independent tests: one using Direct3D 7, one using Direct3D 1, and one using Direct3D 3. The crash occurs in the d3d1 test. If I comment out the d3d7 test, the d3d1 test works and the crash occurs in the d3d3 test instead. After the d3d7 test the device, window, etc. are destroyed and new ones are created, so the tests should be isolated; the rough structure is sketched below.
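A sketch of that structure (the pass names are the real ones; run_pass is a hypothetical stand-in for the create/run/destroy sequence in ddraw/tests/visual.c). The crash always hitting whichever pass runs second is what points at leaked driver state rather than at one particular test:

#include <stdio.h>

/* Hypothetical stand-in for one full pass: create window and device,
 * run the tests, destroy window and device again. */
static void run_pass(const char *name)
{
    printf("create window/device, run %s tests, destroy window/device\n", name);
}

int main(void)
{
    run_pass("d3d7"); /* commenting this pass out...        */
    run_pass("d3d1"); /* ...makes this one succeed...       */
    run_pass("d3d3"); /* ...and moves the crash to this one */
    return 0;
}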
My guess is that something breaks in the driver when creating or releasing contexts. I can't tell if it is our fault or the driver's though. As far as I can see both the window and the context are really destroyed, so there's no refcounting bug hidden somewhere.
To work around the issue, we could put the three tests into separate files. The issue is worth investigating, though; I have no idea what could be going on here.