On Monday 24 January 2005 04:01, Ivan Leo Puoti wrote:
I think quite a few regressions get into releases, and while a few are tracked down, some stay in the code for months if not years (I've heard of at least one game that worked some 2 years ago but doesn't now). While regression testing may seem trivial to a developer, it can be a challenging task for most users, especially if they are new to Wine/Linux. So maybe we should recruit some people to get regression reports from users, find the patch that caused the regression, and report back to the developers. Am I just too tired at the time of writing, or could this be a good idea?
Ivan.
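[Editor's note: the regression hunting Ivan describes boils down to a binary search over an ordered list of builds for the first one where the application breaks. The sketch below is purely illustrative; the build names and the `app_works` predicate are hypothetical stand-ins for actually installing a snapshot and testing the application.]

```python
def find_first_broken(builds, app_works):
    """Binary-search an ordered list of builds for the first one where
    app_works(build) is False.  Assumes every build before the regression
    passes and every build after it fails (a single breaking change)."""
    lo, hi = 0, len(builds) - 1
    # Sanity check: the oldest build must work, the newest must be broken.
    assert app_works(builds[lo]) and not app_works(builds[hi])
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if app_works(builds[mid]):
            lo = mid  # regression is after mid
        else:
            hi = mid  # regression is at or before mid
    return builds[hi]

# Hypothetical example: daily snapshots named by date; pretend the
# regression landed in the 20041103 snapshot.
snapshots = ["20041101", "20041102", "20041103", "20041104", "20041105"]
first_bad = find_first_broken(snapshots, lambda b: b < "20041103")
print(first_bad)
```

With n snapshots this needs only about log2(n) install-and-test cycles instead of n, which is what makes the job feasible for a small group of volunteer testers.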
Hi Ivan,
this idea sounds quite good, though I've had a different concept in mind for, let's say, more "mature" Wine releases. The whole release process would need changes, though. I was thinking of, e.g., a wine beta1-rc release. Now we want to make sure no new regressions have been introduced - but how?
I thought of creating a global list of applications which are known to be quite "tricky" and have been broken several times over the last years. Think of a big feature matrix showing all these applications and indicating their status: "works" / "broken forever" / "broken since last release".
And a new Wine release would only be given out once all of these apps have been tested. This would require a large number of people with real motivation behind it, and probably a much longer time until the release is "ready". Do you think this idea is "overengineered"?
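[Editor's note: the release gate Niko proposes can be pictured as a simple check over the status matrix - release only when every tracked application has been retested and none is newly broken. A minimal sketch, with hypothetical application names and statuses:]

```python
# The three statuses from the proposed feature matrix.
STATUSES = {"works", "broken forever", "broken since last release"}

# Hypothetical matrix: application name -> current status.
matrix = {
    "Game A":   "works",
    "App B":    "broken forever",              # known-broken, not a regression
    "Editor C": "broken since last release",   # a new regression
}

def release_ready(matrix):
    """A release candidate is 'ready' only when no tracked application
    is newly broken.  Long-standing breakage does not block the release."""
    assert all(s in STATUSES for s in matrix.values()), "untested app in matrix"
    return not any(s == "broken since last release" for s in matrix.values())

print(release_ready(matrix))  # Editor C regressed, so this gate fails
```

Note the deliberate distinction: "broken forever" does not hold up a release, only fresh regressions do, so the gate measures whether the candidate is at least as good as the previous release.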
Bye Bye Niko