Hi,
While I'm still busy with getting DLL information onto the page, I'm also adding other stuff.
This next iteration will add todo information to the pages.
Current situation:
http://test.winehq.org/data/200708221000/
There are several todos for the Wine tests, but this is not shown on the page.
New situation:
http://www.xs4all.nl/~pvriens/200708221000_new/
The main difference is the yellow border on the left for some of the Wine tests (for the group shown in the Main Summary and individually at the Group 'Wine differences').
I deliberately didn't add all the todo tests at the Group level (as is done for the skips), as todos have more to do with tests/implementations than with the actual system running the test.
If there are no objections, I'll send the necessary patches in a few days.
Bit off topic: one thing that struck me was the difference in test results between the Wine runs. This most likely has to do with old Wine installations running new tests. I will soon add the Wine version to the infrastructure so that these outcomes make more sense.
Hello,
Paul Vriens wrote:
> While I'm still busy with getting DLL information onto the page, I'm also adding other stuff.
> This next iteration will add todo information to the pages.
> Current situation:
> http://test.winehq.org/data/200708221000/
> There are several todos for the Wine tests, but this is not shown on the page.
> New situation:
> http://www.xs4all.nl/~pvriens/200708221000_new/
> The main difference is the yellow border on the left for some of the Wine tests (for the group shown in the Main Summary and individually at the Group 'Wine differences').
Dumb question: how do you see if one of the "Some tests fail in some reports" entries has some todos too? Would orange be a better color for the left border? I picked orange only as it is a mix of yellow and red (in subtractive color mixing </nitpick>); not sure if somebody would get eye cancer from looking at the result.
> I deliberately didn't add all the todo tests at the Group level (as is done for the skips), as todos have more to do with tests/implementations than with the actual system running the test.
> If there are no objections, I'll send the necessary patches in a few days.
> Bit off topic: one thing that struck me was the difference in test results between the Wine runs. This most likely has to do with old Wine installations running new tests. I will soon add the Wine version to the infrastructure so that these outcomes make more sense.
bye michael
Michael Stefaniuc wrote:
> Hello,
> Paul Vriens wrote:
>> While I'm still busy with getting DLL information onto the page, I'm also adding other stuff.
>> This next iteration will add todo information to the pages.
>> Current situation:
>> http://test.winehq.org/data/200708221000/
>> There are several todos for the Wine tests, but this is not shown on the page.
>> New situation:
>> http://www.xs4all.nl/~pvriens/200708221000_new/
>> The main difference is the yellow border on the left for some of the Wine tests (for the group shown in the Main Summary and individually at the Group 'Wine differences').
> Dumb question: how do you see if one of the "Some tests fail in some reports" entries has some todos too? Would orange be a better color for the left border? I picked orange only as it is a mix of yellow and red (in subtractive color mixing </nitpick>); not sure if somebody would get eye cancer from looking at the result.
Hi Michael,
Thanks for taking the time to have a look. The answer is: you can't with yellow. I've experimented with some colors and went back to yellow every time. (This mixed-results AND todo case only applies to Wine results, by the way.)
Here's the one with orange:
http://www.xs4all.nl/~pvriens/200708221000_new_orange
The old one is now:
http://www.xs4all.nl/~pvriens/200708221000_new_yellow/
I just hope we get more reports in the future; otherwise we will never have mixed results ;-).
I'm also going to change the text in the legend from "some tests need some work" to something like "implementation needs some work".
Cheers,
Paul
> Bit off topic: one thing that struck me was the difference in test results between the Wine runs. This most likely has to do with old Wine installations running new tests. I will soon add the Wine version to the infrastructure so that these outcomes make more sense.
In the same vein of thinking, perhaps there should be a registry key or a file that records what version of Wine created the .wine directory, and that information should get recorded into the tests as well.
-John Klehm
John Klehm wrote:
>> Bit off topic: one thing that struck me was the difference in test results between the Wine runs. This most likely has to do with old Wine installations running new tests. I will soon add the Wine version to the infrastructure so that these outcomes make more sense.
> In the same vein of thinking, perhaps there should be a registry key or a file that records what version of Wine created the .wine directory, and that information should get recorded into the tests as well.
> -John Klehm
The idea was to use the output of "wine --version" for this. When I said 'infrastructure', I meant 'test results'.
Paul Vriens wrote:
> John Klehm wrote:
>>> Bit off topic: one thing that struck me was the difference in test results between the Wine runs. This most likely has to do with old Wine installations running new tests. I will soon add the Wine version to the infrastructure so that these outcomes make more sense.
>> In the same vein of thinking, perhaps there should be a registry key or a file that records what version of Wine created the .wine directory, and that information should get recorded into the tests as well.
>> -John Klehm
> The idea was to use the output of "wine --version" for this. When I said 'infrastructure', I meant 'test results'.
Hi,
Just to show you the opposite as well:
http://test.winehq.org/data/200708241000/wine_2000_0.9.43-431-g0a485b3/cabin...
Here you can see that winetest still has the old tests, whereas my box was updated to the latest Git just an hour ago.