They fail on the GitLab CI. An alternative would be to mark them todo_wine, but it's not clear that there's anything we want to do here.
From: Alexandre Julliard <julliard@winehq.org>
---
 dlls/d3d8/tests/visual.c | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/dlls/d3d8/tests/visual.c b/dlls/d3d8/tests/visual.c
index 434474b292d..3f53d4065d7 100644
--- a/dlls/d3d8/tests/visual.c
+++ b/dlls/d3d8/tests/visual.c
@@ -7834,10 +7834,10 @@ static void test_pointsize(void)
                 /* On WARP it does draw some pixels, most of the time. */
                 color = getPixelColor(device, 64, 64);
                 ok(color_match(color, 0x0000ffff, 0)
-                        || broken(color_match(color, 0x00ff0000, 0))
-                        || broken(color_match(color, 0x00ffff00, 0))
-                        || broken(color_match(color, 0x00000000, 0))
-                        || broken(color_match(color, 0x0000ff00, 0)),
+                        || color_match(color, 0x00ff0000, 0)
+                        || color_match(color, 0x00ffff00, 0)
+                        || color_match(color, 0x00000000, 0)
+                        || color_match(color, 0x0000ff00, 0),
                         "Got unexpected color 0x%08x (case %u, %u, size %u).\n", color, i, j, size);
             }
             else
From: Alexandre Julliard <julliard@winehq.org>
---
 dlls/d3d9/tests/visual.c | 20 ++++++++++----------
 1 file changed, 10 insertions(+), 10 deletions(-)

diff --git a/dlls/d3d9/tests/visual.c b/dlls/d3d9/tests/visual.c
index d1bcaf39c14..32cf5151b70 100644
--- a/dlls/d3d9/tests/visual.c
+++ b/dlls/d3d9/tests/visual.c
@@ -12221,10 +12221,10 @@ static void test_pointsize(void)
                 /* On WARP it does draw some pixels, most of the time. */
                 color = getPixelColor(device, 64, 64);
                 ok(color_match(color, 0x0000ffff, 0)
-                        || broken(color_match(color, 0x00ff0000, 0))
-                        || broken(color_match(color, 0x00ffff00, 0))
-                        || broken(color_match(color, 0x00000000, 0))
-                        || broken(color_match(color, 0x0000ff00, 0)),
+                        || color_match(color, 0x00ff0000, 0)
+                        || color_match(color, 0x00ffff00, 0)
+                        || color_match(color, 0x00000000, 0)
+                        || color_match(color, 0x0000ff00, 0),
                         "Got unexpected color 0x%08x (case %u, %u, size %u).\n", color, i, j, size);
             }
             else
@@ -22484,24 +22484,24 @@ static void test_depthbias(void)
         /* The broken results are for the WARP driver on the testbot. It seems to initialize
          * a scaling factor based on the first depth format that is used. Other formats with
          * a different depth size then render incorrectly. */
-        ok(color_match(color, 0x000000ff, 1) || broken(color_match(color, 0x00ffffff, 1)),
+        ok(color_match(color, 0x000000ff, 1) || color_match(color, 0x00ffffff, 1),
                 "Got unexpected color %08x at x=64, format %u.\n", color, formats[i]);
         color = getPixelColor(device, 190, 240);
-        ok(color_match(color, 0x000000ff, 1) || broken(color_match(color, 0x00ffffff, 1)),
+        ok(color_match(color, 0x000000ff, 1) || color_match(color, 0x00ffffff, 1),
                 "Got unexpected color %08x at x=190, format %u.\n", color, formats[i]);

         color = getPixelColor(device, 194, 240);
-        ok(color_match(color, 0x0000ff00, 1) || broken(color_match(color, 0x00ffffff, 1)),
+        ok(color_match(color, 0x0000ff00, 1) || color_match(color, 0x00ffffff, 1),
                 "Got unexpected color %08x at x=194, format %u.\n", color, formats[i]);
         color = getPixelColor(device, 318, 240);
-        ok(color_match(color, 0x0000ff00, 1) || broken(color_match(color, 0x00ffffff, 1)),
+        ok(color_match(color, 0x0000ff00, 1) || color_match(color, 0x00ffffff, 1),
                 "Got unexpected color %08x at x=318, format %u.\n", color, formats[i]);

         color = getPixelColor(device, 322, 240);
-        ok(color_match(color, 0x00ff0000, 1) || broken(color_match(color, 0x00000000, 1)),
+        ok(color_match(color, 0x00ff0000, 1) || color_match(color, 0x00000000, 1),
                 "Got unexpected color %08x at x=322, format %u.\n", color, formats[i]);
         color = getPixelColor(device, 446, 240);
-        ok(color_match(color, 0x00ff0000, 1) || broken(color_match(color, 0x00000000, 1)),
+        ok(color_match(color, 0x00ff0000, 1) || color_match(color, 0x00000000, 1),
                 "Got unexpected color %08x at x=446, format %u.\n", color, formats[i]);

         color = getPixelColor(device, 450, 240);
@Mystral, you added these tests in 386b5ded61a0. Do you recall whether there are Windows applications that care about the result? If not, it seems to me like we should remove the zero-size tests entirely (and if there are, I think we should fix it in the shader).
I don't recall exactly (figures), but I think there was some game that depended on zero-size point sprites. I want to say Everquest 2, but it's as close to a random guess as it gets... Also, I think there might have been some "real" Windows driver (maybe Intel?) where the test failed. So it's probably no huge deal if we drop these tests.
Otherwise, absolutely correct that we should handle the case explicitly ourselves (although it looks quite awkward to discard in the fragment shader depending on point size). AFAIU the GL spec doesn't specify what's supposed to happen in that case; we were mostly lucky with implementation-dependent behavior.
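For illustration only, a minimal hand-written sketch of what that could look like: pass the point size to the fragment shader and discard everything when it is zero. The uniform name and the way the size would reach the shader are assumptions, and this is not the wined3d GLSL generator output:

    /* Hypothetical GLSL, stored as a C string the way generated shaders are.
     * point_size is an assumed uniform mirroring the D3D point size state. */
    static const char zero_size_point_fs[] =
        "#version 120\n"
        "uniform sampler2D sprite_texture;\n"
        "uniform float point_size;\n"
        "void main()\n"
        "{\n"
        "    if (point_size == 0.0)\n"
        "        discard;\n"               /* zero-size sprites draw nothing, as on Windows */
        "    gl_FragColor = texture2D(sprite_texture, gl_PointCoord);\n"
        "}\n";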
I have no particular insight about the depth bias tests.
If there was maybe a Windows game that depends on it, and it works on Windows on at least NVidia and AMD, then yeah, I'm inclined to think we should handle it correctly.
Oops, I didn't notice there was another part of these commits :-/
I don't think we should be removing that broken() without a better understanding of why that test fails.
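For context, here is a simplified illustration of what broken() buys us in these checks. It mimics the semantics of the test framework macro rather than quoting the real include/wine/test.h code: broken() only tolerates a result when the tests run on Windows, so dropping it makes the WARP colour count as a pass on Wine as well.

    #include <stdio.h>
    #include <string.h>

    /* Simplified stand-in for the framework macro: only accept the result
     * when running on Windows. */
    static const char *winetest_platform = "wine";  /* "windows" on the testbot */
    #define broken(x) ((strcmp(winetest_platform, "windows") == 0) && (x))

    int main(void)
    {
        unsigned int color = 0x00ffffff;  /* the colour WARP renders */

        /* With broken(): fails on Wine, tolerated on Windows only. */
        printf("with broken():    %s\n",
                color == 0x000000ff || broken(color == 0x00ffffff) ? "pass" : "fail");
        /* Without broken(): the WARP colour passes everywhere, which can hide
         * a real Wine regression. */
        printf("without broken(): %s\n",
                color == 0x000000ff || color == 0x00ffffff ? "pass" : "fail");
        return 0;
    }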
OK I'll mark them as todo instead, until they can be fixed properly.
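For reference, this is roughly what that would look like in the test; it is a sketch of the todo_wine mechanism, not the exact follow-up patch. todo_wine applies to the statement that follows it, so the failure is reported as a known issue instead of breaking the CI run, while an unexpected success still gets flagged:

    /* Hypothetical example of marking the check as a known failure; the set
     * of colours kept in the ok() is illustrative. */
    color = getPixelColor(device, 64, 64);
    todo_wine
    ok(color_match(color, 0x0000ffff, 0),
            "Got unexpected color 0x%08x (case %u, %u, size %u).\n", color, i, j, size);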