Today I saw two similar projects related to OpenGL:
[1]:
glean is a suite of tools for evaluating the quality of an OpenGL implementation and diagnosing any problems that are discovered. glean also has the ability to compare two OpenGL implementations and highlight the differences between them.
It seems to have a win32 port as well.
[2]:
Piglit is a collection of automated tests for OpenGL implementations.
The goal of Piglit is to help improve the quality of open source OpenGL drivers by providing developers with a simple means to perform regression tests.
Current status is that the framework is working (though rough at the edges). It contains the Glean tests, some tests adapted from Mesa as well as some specific regression tests for certain bugs. HTML summaries can be generated (see below), including the ability to compare different test runs.
Could these be of any use for our graphics guys -- Stefan and co.?
Then there is the PerceptualDiff utility I found some time ago [3]. I wondered: could it also be useful for finding visual regressions in Wine? Probably not, as it seems to be used for testing video codecs (but I may be wrong):
PerceptualDiff is an image comparison utility that makes use of a computational model of the human visual system to compare two images.
So why would I use a program to tell me if two images are similar if I can tell the difference myself by eyeballing it?
Well the utility of this program really shines in the context of QA of rendering algorithms.
During regression testing of a renderer, hundreds of images are generated from an older version of the renderer and are compared with a newer version of the renderer. This program drastically reduces the number of false positives (failures that are not actually failures) caused by differences in random number generation, OS or machine architecture differences. Also, you do not want a human looking at hundreds of images when you can get the computer to do it for you nightly on a cron job.
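If it were ever hooked up for Wine, the driving side could be as trivial as the sketch below. Everything in it is made up by me for illustration -- the test names, the reference/current directory layout and the assumption that perceptualdiff signals a visible difference through a non-zero exit status -- so check pdiff's own documentation before relying on any of it:

/* Illustrative only: test names, directory layout and the exit-status
 * convention of perceptualdiff are guesses, not taken from pdiff's docs. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    static const char *cases[] = { "clear", "triangle", "texture" }; /* hypothetical test names */
    unsigned int i, failures = 0, count = sizeof(cases) / sizeof(cases[0]);
    char cmd[512];

    for (i = 0; i < count; ++i)
    {
        snprintf(cmd, sizeof(cmd), "perceptualdiff reference/%s.tif current/%s.tif",
                 cases[i], cases[i]);
        if (system(cmd)) /* assumed: non-zero means a visible difference (or an error) */
        {
            printf("FAIL: %s\n", cases[i]);
            ++failures;
        }
    }
    printf("%u of %u images differ\n", failures, count);
    return failures ? 1 : 0;
}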
[1] http://glean.sourceforge.net/whatis.html
[2] http://people.freedesktop.org/~nh/piglit/
[3] http://pdiff.sourceforge.net/
2009/9/17 Saulius Krasuckas saulius2@ar.fi.lt:
Today I saw two similar projects related to OpenGL:
[1]:
glean is a suite of tools for evaluating the quality of an OpenGL implementation and diagnosing any problems that are discovered. glean also has the ability to compare two OpenGL implementations and highlight the differences between them.
It seems to have a win32 port as well.
[2]:
Piglit is a collection of automated tests for OpenGL implementations.
The goal of Piglit is to help improve the quality of open source OpenGL drivers by providing developers with a simple means to perform regression tests.
Current status is that the framework is working (though rough at the edges). It contains the Glean tests, some tests adapted from Mesa as well as some specific regression tests for certain bugs. HTML summaries can be generated (see below), including the ability to compare different test runs.
Could these be of any use for our graphics guys -- Stefan and co.?
Well, they're mostly useful when you're maintaining an OpenGL driver. Mesa already uses these.
Then there is the PerceptualDiff utility I found some time ago [3]. I wondered: could it also be useful for finding visual regressions in Wine? Probably not, as it seems to be used for testing video codecs (but I may be wrong):
Possibly, but it would have to be in the context of a larger framework like e.g. CxTest or Appinstall.
* On Thu, 17 Sep 2009, Henri Verbeet wrote:
- 2009/9/17 Saulius Krasuckas saulius2@ar.fi.lt:
Could these be of any use for our graphics guys -- Stefan and co.?
Well, they're mostly useful when you're maintaining an OpenGL driver. Mesa already uses these.
And what about checking that our tests (run against Windows drivers) aren't really broken?
For example, one check fails with these OpenGL 1.[34].x and 2.[12].x drivers for the following adapters:
2.1.8870   ATI Radeon HD 4200
2.0.0      Intel 965/963 Graphics Media Accelerator
1.4.1      GeForce4 MX 440/AGP/SSE
1.4.0      Intel 915GM
1.3.0      Intel Brookdale-G
1.3.4145   MOBILITY RADEON 7500 DDR x86/SSE2
opengl.c:328: Test failed: Sharing of display lists failed for a context which already shared lists before
But it doesn't fail on these:
3.1.0   GeForce 8600 GTS/PCI/SSE2
3.0.0   GeForce 9600M GT/PCI/SSE2
2.1.2   GeForce FX 5200/AGP/SSE2
2.1.2   GeForce 7300 LE/PCI/SSE2/3DNOW!
2.1.1   GeForce 8600M GS/PCI/SSE2
2.1.1   GeForce 8400M GS/PCI/SSE2
2.0     Chrom Chromium 1.9
1.5     Chrom Chromium 1.9
1.1.0   GDI Generic (old w9x or virtual boxes)
Aren't you guys having a hard time deciding whether this statement:
322 /* Test 3: Share display lists with a context which already shares display lists with another context.
323  * According to MSDN the second parameter cannot share any display lists but some buggy drivers might allow it */
is OK? (No offence.) I thought a driver test suite would give a more thorough answer.
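For reference, the scenario that check exercises is roughly the following (my own sketch, not the actual code from opengl.c, and whether the second call should succeed or fail is exactly the point in question):

/* Rough sketch of the display-list sharing scenario, not the actual Wine
 * test.  Assumes hdc is a window DC with a pixel format already set. */
#include <windows.h>
#include <stdio.h>

static void share_lists_scenario(HDC hdc)
{
    HGLRC ctx1 = wglCreateContext(hdc);
    HGLRC ctx2 = wglCreateContext(hdc);
    HGLRC ctx3 = wglCreateContext(hdc);
    BOOL res;

    /* After this call ctx1 and ctx2 share display lists. */
    res = wglShareLists(ctx1, ctx2);
    printf("first wglShareLists: %d\n", res);

    /* The second parameter now already shares lists; MSDN says this should
     * fail, but the drivers listed above clearly don't agree on it. */
    res = wglShareLists(ctx3, ctx2);
    printf("second wglShareLists: %d\n", res);

    wglDeleteContext(ctx3);
    wglDeleteContext(ctx2);
    wglDeleteContext(ctx1);
}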
Then there is the PerceptualDiff utility I found some time ago [3]. I wondered: could it also be useful for finding visual regressions in Wine? Probably not, as it seems to be used for testing video codecs (but I may be wrong):
Possibly, but it would have to be in the context of a larger framework like e.g. CxTest or Appinstall.
And what about D3D rendering discrepancies?
On Thu, Sep 17, 2009 at 3:16 PM, Henri Verbeet hverbeet@gmail.com wrote:
2009/9/17 Saulius Krasuckas saulius2@ar.fi.lt:
Then there is the PerceptualDiff utility I found some time ago [3]. I wondered: could it also be useful for finding visual regressions in Wine? Probably not, as it seems to be used for testing video codecs (but I may be wrong):
Possibly, but it would have to be in the context of a larger framework like e.g. CxTest or Appinstall.
Would it actually be useful? I can take a look, but not if it's going to be a lot of effort for little gain... don't the d3d conformance tests already (mostly) handle this?
2009/9/18 Austin English austinenglish@gmail.com:
On Thu, Sep 17, 2009 at 3:16 PM, Henri Verbeet hverbeet@gmail.com wrote:
Possibly, but it would have to be in the context of a larger framework like e.g. CxTest or Appinstall.
Would it actually be useful? I can take a look, but not if it's going to be a lot of effort for little gain... don't the d3d conformance tests already (mostly) handle this?
I imagine it could potentially be used for testing actual games by taking screenshots at specific times during a timedemo and comparing those against a reference. I certainly imagine it would take quite a bit of effort to set up; the potential gain is probably harder to quantify. The d3d conformance tests work mostly fine, but they're API-level tests.
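A rough sketch of the screenshot half of such a setup could look like the code below: grab the desktop with GDI at fixed points in the timedemo and write it out as a 32-bit BMP, which a tool like PerceptualDiff could then compare against a reference image. The function and all its details are purely illustrative, not existing Wine code.

/* Hypothetical sketch: capture the whole desktop with GDI and save it as a
 * 32-bit BMP so it can be compared against a reference image. */
#include <windows.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static BOOL save_screenshot(const char *filename)
{
    int width = GetSystemMetrics(SM_CXSCREEN), height = GetSystemMetrics(SM_CYSCREEN);
    HDC screen_dc = GetDC(NULL);
    HDC mem_dc = CreateCompatibleDC(screen_dc);
    HBITMAP bmp = CreateCompatibleBitmap(screen_dc, width, height);
    HGDIOBJ old = SelectObject(mem_dc, bmp);
    DWORD size = width * height * 4;
    void *bits = malloc(size);
    BITMAPFILEHEADER file_header;
    BITMAPINFO info;
    BOOL ret = FALSE;
    FILE *f;

    BitBlt(mem_dc, 0, 0, width, height, screen_dc, 0, 0, SRCCOPY);
    SelectObject(mem_dc, old);  /* GetDIBits wants the bitmap deselected */

    memset(&info, 0, sizeof(info));
    info.bmiHeader.biSize = sizeof(info.bmiHeader);
    info.bmiHeader.biWidth = width;
    info.bmiHeader.biHeight = height;  /* positive: bottom-up rows, as BMP expects */
    info.bmiHeader.biPlanes = 1;
    info.bmiHeader.biBitCount = 32;
    info.bmiHeader.biCompression = BI_RGB;
    GetDIBits(mem_dc, bmp, 0, height, bits, &info, DIB_RGB_COLORS);

    memset(&file_header, 0, sizeof(file_header));
    file_header.bfType = 0x4d42;  /* "BM" */
    file_header.bfOffBits = sizeof(file_header) + sizeof(info.bmiHeader);
    file_header.bfSize = file_header.bfOffBits + size;

    if ((f = fopen(filename, "wb")))
    {
        fwrite(&file_header, sizeof(file_header), 1, f);
        fwrite(&info.bmiHeader, sizeof(info.bmiHeader), 1, f);
        fwrite(bits, size, 1, f);
        fclose(f);
        ret = TRUE;
    }

    free(bits);
    DeleteObject(bmp);
    DeleteDC(mem_dc);
    ReleaseDC(NULL, screen_dc);
    return ret;
}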
On Thu, 17 Sep 2009, Saulius Krasuckas wrote: [...]
Could these be of any use for our graphics guys -- Stefan and co.?
It might be the other way around; that is, maybe our graphics guys can help the Glean and Piglit developers.
Quite often they discover some OpenGL driver bug. But if neither Glean nor Piglit tests for these specific bugs, then it's no wonder that one OpenGL driver or another gets it wrong. So they may have ideas for tests to add to Glean and Piglit.
A number of issues are with OpenGL extensions. I don't know what Glean's and Piglit's policies are with regard to those. Hopefully they test them too. Maybe, just maybe, OpenGL driver developers will feel a bit less free to ignore a given OpenGL extension if it has a test suite.
Porting it to Mac OS X (either through Apple's X server or natively) would probably also expose a bunch of bugs.
All that would help us indirectly of course: through hopefully better quality OpenGL drivers. So one day they would only have to deal with D3D bugs and no longer with OpenGL ones (one can dream).
But it's more work in the short term. Also, it all assumes that Glean and Piglit are actually used by driver developers (though they could also be useful for reporting bugs).