http://yokozar.org/blog/archives/48 is a fun little look at using simulation to see how various strategies might affect Wine development. The one that worked out best was to pick some random user who's almost happy, fix the last few bugs that are keeping his apps from working, and then once he's happy, move on to the next such user.
- Dan
Dan Kegel wrote:
http://yokozar.org/blog/archives/48 is a fun little look at using simulation to see how various strategies might affect Wine development. The one that worked out best was to pick some random user who's almost happy, fix the last few bugs that are keeping his apps from working, and then once he's happy, move on to the next such user.
- Dan
Thank you Dan, you reminded me to forward my blog post to the list ;)
Coincidentally, I found an essay by Paul Graham where he says a very similar thing: http://www.paulgraham.com/13sentences.html
-- Better to make a few users love you than a lot ambivalent.
Ideally you want to make large numbers of users love you, but you can't expect to hit that right away. Initially you have to choose between satisfying all the needs of a subset of potential users, or satisfying a subset of the needs of all potential users. Take the first. It's easier to expand userwise than satisfactionwise. And perhaps more importantly, it's harder to lie to yourself. If you think you're 85% of the way to a great product, how do you know it's not 70%? Or 10%? Whereas it's easy to know how many users you have. --
There's a lot to learn from the model I made, and it's an easy-to-modify Python script that you can just run in the background while doing real work. For instance, you can test a strategy that tries to maximize collateral damage (working on the bugs that affect the most apps) against one that tries to maximize specific damage (working on almost-working apps). It turns out the collateral damage strategy isn't very good: you fix a few bugs in a lot of apps, but most of them remain broken due to some small problem that hardly affects anything else.
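For anyone who'd rather skim code than read the post, the heart of it is something like this (a boiled-down sketch, not the real script; the app and user counts, bug distributions, and the "half the users happy" stopping point are all made up for illustration):

```python
# A minimal sketch of this kind of model -- NOT the actual script.
import random

N_BUGS, N_APPS, N_USERS = 500, 200, 300
random.seed(42)

# Each app depends on a random handful of bugs; each user runs a few apps.
apps = [set(random.sample(range(N_BUGS), random.randint(1, 15)))
        for _ in range(N_APPS)]
users = [random.sample(range(N_APPS), random.randint(1, 5))
         for _ in range(N_USERS)]

def happy_users(fixed):
    """A user is happy when every app they use has all of its bugs fixed."""
    return sum(all(apps[a] <= fixed for a in u) for u in users)

def collateral_damage(fixed):
    """Fix the unfixed bug that appears in the most apps."""
    counts = {}
    for app in apps:
        for b in app - fixed:
            counts[b] = counts.get(b, 0) + 1
    return max(counts, key=counts.get)

def almost_working(fixed):
    """Fix a bug in the not-yet-working app with the fewest bugs left."""
    candidates = [app - fixed for app in apps if app - fixed]
    return next(iter(min(candidates, key=len)))

for name, strategy in [("collateral damage", collateral_damage),
                       ("almost-working apps", almost_working)]:
    fixed = set()
    while happy_users(fixed) < N_USERS // 2:
        fixed.add(strategy(fixed))
    print(f"{name}: {len(fixed)} fixes to make half the users happy")
```

Swapping in a new strategy is just a matter of writing another function that picks the next bug to fix.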
Thanks, Scott Ritchie
On Fri, Apr 17, 2009 at 8:38 PM, Scott Ritchie scott@open-vote.org wrote:
http://yokozar.org/blog/archives/48 is a fun little look at using simulation to see how various strategies might affect Wine development.
Thank you Dan, you reminded me to forward my blog post to the list ;)
YokoZar, eh? No wonder I didn't notice it was you :-)
2009/4/18 Scott Ritchie scott@open-vote.org:
Thank you Dan, you reminded me to forward my blog post to the list ;)
I'm not sure how to put this into your simulation as described, but there's another effect that's important: the good-enough-to-be-beta effect.
I'd say there was a significant upturn in Wine's quality around 0.9. That's where Wine crossed from being an interesting idea into something good enough to actually use - where it was good enough for actual users, so more users meant more bug reports. Yay bug reports!
I got a similar feeling around Mozilla 0.9 - where this fat, lumbering, bug-riddled, crash-prone browser that was nevertheless very important crossed some line and ... was more usable than not. I believe it was the stability push between 0.8.1 and 0.9 that did that.
Another important comment on your post links to an idea Wine needs: crash reporting. Just like Windows does.
https://winqual.microsoft.com/help/About_Windows_Error_Reporting_for_Hardwar... http://www.codinghorror.com/blog/archives/001239.html
Note that the latter post effectively advocates the Wine development model of fixing bugs as they prove to be a problem in practice: "Although I remain a fan of test driven development, the speculative nature of the time investment is one problem I've always had with it. If you fix a bug that no actual user will ever encounter, what have you actually fixed? While there are many other valid reasons to practice TDD, as a pure bug fixing mechanism it's always seemed far too much like premature optimization for my tastes. I'd much rather spend my time fixing bugs that are problems in practice rather than theory."
I wonder how much work crash reporting would be to add to Wine.
The importance of automatic reporting, of course, is that if you rely on your users to complain actively then you've already lost.
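Most of the value is in the aggregation rather than the client, too. Something like the following is roughly what the server side would have to do; the field names and the "top two frames" signature heuristic are made up here, loosely in the style of how WER/Breakpad-like systems bucket crashes, and the sample data is invented:

```python
# Hypothetical sketch: bucket incoming crash reports by a signature and rank
# the buckets, so developers see which crashes actually hit users most often.
from collections import Counter, defaultdict

def signature(report):
    """Collapse a report to a bucket key: faulting module + top stack frames."""
    return (report["module"], tuple(report["backtrace"][:2]))

def rank(reports):
    """Return (signature, report count, affected apps), most frequent first."""
    buckets = Counter(signature(r) for r in reports)
    affected = defaultdict(set)
    for r in reports:
        affected[signature(r)].add(r["app"])
    return [(sig, n, sorted(affected[sig])) for sig, n in buckets.most_common()]

reports = [
    {"app": "notepad.exe", "module": "gdiplus",
     "backtrace": ["GdipCreateBitmapFromStream", "GdipLoadImageFromStream", "main"]},
    {"app": "photoshop.exe", "module": "gdiplus",
     "backtrace": ["GdipCreateBitmapFromStream", "GdipLoadImageFromStream", "LoadUI"]},
    {"app": "itunes.exe", "module": "msvcrt",
     "backtrace": ["free", "CoUninitialize", "WinMain"]},
]
for sig, n, hit in rank(reports):
    print(n, sig[0], "->", ", ".join(hit))
```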
- d.
On Saturday 18 April 2009 05:21:20 Dan Kegel wrote:
http://yokozar.org/blog/archives/48 is a fun little look at using simulation to see how various strategies might affect Wine development.
Interesting, but largely academic.
The one that worked out best was to pick some random user who's almost happy, fix the last few bugs that are keeping his apps from working, and then once he's happy, move on to the next such user.
The problem seems to be identifying these people. The model assumes that you can tell which piece of software almost works, and that you know the almost happy users. In reality, you only seem to hear from the pretty unhappy users and the occasional really happy user. Susan Cragin is about the only user I can think of off the top of my head who's almost happy and could be made completely happy by fixing all of the remaining bugs in DNS (Dragon NaturallySpeaking).
Also, in reality we have to deal with the fact that new applications are added while we're working on the old ones, and looking at the graphs, we're only going to make a significant number of users happy when we're about 98% done fixing the bugs. I realize that it's a bit hard to model the "rate of new applications with new bugs being added", but that's what happens in real life.
Cheers, Kai
2009/4/18 Kai Blin kai.blin@gmail.com:
On Saturday 18 April 2009 05:21:20 Dan Kegel wrote:
The one that worked out best was to pick some random user who's almost happy, fix the last few bugs that are keeping his apps from working, and then once he's happy, move on to the next such user.
The problem seems to be identifying these people. The model assumes that you can tell which piece of software almost works, and that you know the almost happy users. In reality, you only seem to hear from the pretty unhappy users and the occasional really happy user. Susan Cragin is about the only user I can think of off the top of my head who's almost happy and could be made completely happy by fixing all of the remaining bugs in DNS.
You could pick some games as well. A lot of the Oberon Media casual games work (probably around 50-60%), but there are bugs in the launch page that mean you don't get a seamless experience. The Game Socks versions of the games have a seamless experience with the splash loader (not sure about purchasing games there, though).
Other popular games and games platforms like WoW and Steam will also help your gamer users.
CodeWeavers are doing this to some extent - now branching out to fix other applications. There was a big push a while back to get Photoshop usable.
AppDB or something similar is useful for determining the most popular applications that people are using. This does not cover the users, though -- a user may be happy with one app, but unhappy with another because that one is obscure/unpopular (think in-house applications).
Also, reality has us deal with the fact that new applications are added while we're working on the old ones, and looking at the graphs, we're only going to make a significant number of users happy when we're about 98% done fixing the bugs. I realize that it's a bit hard to model "rate of new applications with new bugs being added", but that's what happens in real life.
Not just new applications, but upgrades as well. iTunes is a constantly shifting target, from what I understand. IE6, 7 and 8 each use more of the Windows API. Only earlier versions of Photoshop work. There have been updates to fix Office 2007 SP1 issues.
Some of the major pain points I can see in the future for getting Wine to run applications are:
1. Applications that use the newer Windows Vista and 7 APIs -- this is OK if the application is also designed to run on XP or earlier, but applications will start targeting XP and later, or Vista and later.
2. Applications that use .NET, WinForms and/or WPF -- these should be OK on Wine if the Microsoft and/or Mono runtimes can support them.
3. Applications that start using more of the unimplemented or partially implemented XP and earlier APIs.
So a happy user today could be an unhappy user tomorrow if they try and upgrade one of their applications. But such is the nature of playing continual catchup.
- Reece
Reece Dunn wrote:
2009/4/18 Kai Blin kai.blin@gmail.com:
On Saturday 18 April 2009 05:21:20 Dan Kegel wrote:
The one that worked out best was to pick some random user who's almost happy, fix the last few bugs that are keeping his apps from working, and then once he's happy, move on to the next such user.
The problem seems to be identifying these people. The model assumes that you can tell which piece of software almost works, and that you know the almost happy users. In reality, you only seem to hear from the pretty unhappy users and the occasional really happy user. Susan Cragin is about the only user I can think of off the top of my head who's almost happy and could be made completely happy by fixing all of the remaining bugs in DNS.
You could also pick some games as well. A lot of the Oberon Media casual games work (probably around 50-60%), but there are bugs in the launch page that means you don't get a seamless experience. The Game Socks versions of the games have a seamless experience with the splash loader (not sure about purchasing games there, though).
Other popular games and games platforms like WoW and Steam will also help your gamer users.
The one trouble with gamer users is that they tend to have a lot of games that they want working. Probably the major exception is WoW -- and it's a very wise choice for us to make sure it works really well, since many users want that and only that.
CodeWeavers are doing this to some extent - now branching out to fix other applications. There was a big push a while back to get Photoshop usable.
AppDB or something similar is useful for determining the most popular applications that people are using. This does not cover the users, though -- a user may be happy with one app, but unhappy with another because that one is obscure/unpopular (think in-house applications).
Also, reality has us deal with the fact that new applications are added while we're working on the old ones, and looking at the graphs, we're only going to make a significant number of users happy when we're about 98% done fixing the bugs. I realize that it's a bit hard to model "rate of new applications with new bugs being added", but that's what happens in real life.
Not just new applications, but upgrades as well. iTunes is a constantly shifting landmark, from what I understand. IE6, 7 and 8 use more of the Windows API. Photoshop only works with earlier versions. There have been updates to fix Office 2007 SP1 issues.
Some of the major pain points I can see in the future for getting Wine to run applications are:
1. Applications that use the newer Windows Vista and 7 APIs -- this is OK if the application is also designed to run on XP or earlier, but applications will start targeting XP and later, or Vista and later.
2. Applications that use .NET, WinForms and/or WPF -- these should be OK on Wine if the Microsoft and/or Mono runtimes can support them.
3. Applications that start using more of the unimplemented or partially implemented XP and earlier APIs.
So a happy user today could be an unhappy user tomorrow if they try and upgrade one of their applications. But such is the nature of playing continual catchup.
Perhaps this implies we shouldn't try so hard to get the apps that are about to break on upgrade, even if they are big ones. We've spent a lot of effort chasing Photoshop, but if only the Photoshop of three years ago works then we haven't really gained much directly.
Thanks, Scott Ritchie
Scott Ritchie wrote:
Reece Dunn wrote:
2009/4/18 Kai Blin kai.blin@gmail.com:
On Saturday 18 April 2009 05:21:20 Dan Kegel wrote:
The one that worked out best was to pick some random user who's almost happy, fix the last few bugs that are keeping his apps from working, and then once he's happy, move on to the next such user.
The problem seems to be identifying these people. The model assumes that you can tell which piece of software almost works, and that you know the almost happy users. In reality, you only seem to hear from the pretty unhappy users and the occasional really happy user. Susan Cragin is about the only user I can think of off the top of my head who's almost happy and could be made completely happy by fixing all of the remaining bugs in DNS.
You could also pick some games as well. A lot of the Oberon Media casual games work (probably around 50-60%), but there are bugs in the launch page that means you don't get a seamless experience. The Game Socks versions of the games have a seamless experience with the splash loader (not sure about purchasing games there, though).
Other popular games and games platforms like WoW and Steam will also help your gamer users.
The one trouble with gamer users is that they tend to have a lot of games that they want working. Probably the major exception is WoW -- and it's a very wise choice for us to make sure it works really well, since many users want that and only that.
CodeWeavers are doing this to some extent - now branching out to fix other applications. There was a big push a while back to get Photoshop usable.
AppDB or something similar is useful for determining the most popular applications that people are using. This does not cover the users, though -- a user may be happy with one app, but unhappy with another because that one is obscure/unpopular (think in-house applications).
Also, reality has us deal with the fact that new applications are added while we're working on the old ones, and looking at the graphs, we're only going to make a significant number of users happy when we're about 98% done fixing the bugs. I realize that it's a bit hard to model "rate of new applications with new bugs being added", but that's what happens in real life.
Not just new applications, but upgrades as well. iTunes is a constantly shifting landmark, from what I understand. IE6, 7 and 8 use more of the Windows API. Photoshop only works with earlier versions. There have been updates to fix Office 2007 SP1 issues.
Some of the major pain points I can see in the future for getting Wine to run applications are:
1. Applications that use the newer Windows Vista and 7 APIs -- this is OK if the application is also designed to run on XP or earlier, but applications will start targeting XP and later, or Vista and later.
2. Applications that use .NET, WinForms and/or WPF -- these should be OK on Wine if the Microsoft and/or Mono runtimes can support them.
3. Applications that start using more of the unimplemented or partially implemented XP and earlier APIs.
So a happy user today could be an unhappy user tomorrow if they try and upgrade one of their applications. But such is the nature of playing continual catchup.
Perhaps this implies we shouldn't try so hard to get the apps that are about to break on upgrade, even if they are big ones. We've spent a lot of effort chasing Photoshop, but if only the Photoshop of three years ago works then we haven't really gained much directly.
There are big bugs and smaller ones. If an app is in the high 90s for working and is actively used under Wine (smaller programs and utilities come to mind), then concentrating on those would seem to have a bigger satisfaction payback. All the little glitches and problems are magnified, because users expect that if it isn't a Word or a Photoshop, it should be small enough to just work in Wine. Another way of looking at it is that these bugs and missing functionality need to be addressed sometime, and if the effort is put in early we start getting happy users sooner.
Jeff
Kai Blin wrote:
On Saturday 18 April 2009 05:21:20 Dan Kegel wrote:
http://yokozar.org/blog/archives/48 is a fun little look at using simulation to see how various strategies might affect Wine development.
Interesting, but largely academic.
Fair enough. The fact that growth in working applications and happy users is roughly exponential is interesting, though, and I think that reflects reality.
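There's a back-of-the-envelope way to see why the curve has that shape (an idealization, not what the script literally does): if bugs get fixed more or less at random and an app needs b specific bugs fixed, it only works with probability about p**b once a fraction p of all bugs are fixed, so almost nothing works until p is large and then everything snaps into place at once. The 8 bugs per app and 3 apps per user below are arbitrary.

```python
# Hockey-stick intuition: p = fraction of bugs fixed.
for p in (0.50, 0.70, 0.85, 0.95, 0.99):
    app_ok = p ** 8            # app depending on 8 bug fixes
    user_ok = app_ok ** 3      # user needing 3 such apps, independently
    print(f"p={p:.2f}  app works: {app_ok:6.1%}  user happy: {user_ok:6.1%}")
```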
The one that worked out best was to pick some random user who's almost happy, fix the last few bugs that are keeping his apps from working, and then once he's happy, move on to the next such user.
The problem seems to be identifying these people. The model assumes that you can tell which piece of software almost works, and that you know the almost happy users. In reality, you only seem to hear from the pretty unhappy users and the occasional really happy user. Susan Cragin is about the only user I can think of off the top of my head who's almost happy and could be made completely happy by fixing all of the remaining bugs in DNS.
Well, she seems like a nice enough person, why not pick her ;)
But, yes, it's not very helpful when identifying which bugs are affecting a particular app is half the work (and then figuring out how that API is supposed to work is most of what's left). Still, it's nice to know that, when we do know, it's not a bad strategy to just go ahead with it rather than work on something else.
Also, reality has us deal with the fact that new applications are added while we're working on the old ones, and looking at the graphs, we're only going to make a significant number of users happy when we're about 98% done fixing the bugs. I realize that it's a bit hard to model "rate of new applications with new bugs being added", but that's what happens in real life.
I don't think it's too inaccurate if we imagine the model starting today rather than 16 years ago when the project began. That way we don't have to worry about the moving target quite so much.
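That said, bolting a crude "new apps keep arriving" term onto the model wouldn't be hard if someone wanted to. Roughly like this, where all the rates and distributions are invented purely to illustrate the bookkeeping:

```python
# While bugs are being fixed, new applications keep arriving, each depending
# on a mix of already-known bugs and brand-new ones.
import random

random.seed(1)
n_bugs = 500
apps = [set(random.sample(range(n_bugs), random.randint(1, 15))) for _ in range(200)]
fixed = set()

NEW_APP_EVERY = 5          # one new app per five bug fixes (assumption)

def add_new_app():
    global n_bugs
    old = set(random.sample(range(n_bugs), random.randint(1, 10)))   # existing bugs
    fresh = set(range(n_bugs, n_bugs + random.randint(0, 3)))        # brand-new bugs
    n_bugs += len(fresh)
    apps.append(old | fresh)

for step in range(1, 1001):
    outstanding = set().union(*apps) - fixed
    if not outstanding:
        break
    # any strategy from the model could slot in here; pick a random bug for now
    fixed.add(random.choice(sorted(outstanding)))
    if step % NEW_APP_EVERY == 0:
        add_new_app()

working = sum(app <= fixed for app in apps)
print(f"{working}/{len(apps)} apps fully working after {len(fixed)} fixes")
```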
Thanks, Scott Ritchie
On Sat, Apr 18, 2009 at 1:23 AM, Kai Blin kai.blin@gmail.com wrote:
The one that worked out best was to pick some random user who's almost happy, fix the last few bugs that are keeping his apps from working, and then once he's happy, move on to the next such user.
The problem seems to be identifying these people. The model assumes that you can tell which piece of software almost works, and that you know the almost happy users. In reality, you only seem to hear from the pretty unhappy users and the occasional really happy user.
I suspect that one can come pretty close by fixing one bug in each application that a particular complaining user uses. If that makes the app work (or if it seems clear that the app is really, really close to gold), bingo, you've found one. If not, you can move on without having wasted too much time.
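In model terms, that probing heuristic looks something like the sketch below, run against the same sort of toy model: pick a complaining user, fix one bug in each of their broken apps, and only commit to finishing their apps if the probe leaves everything within a couple of bugs of working. All the sizes and the "close enough" threshold are invented for illustration.

```python
import random

random.seed(7)
apps = [set(random.sample(range(400), random.randint(1, 12))) for _ in range(150)]
users = [random.sample(range(len(apps)), random.randint(1, 4)) for _ in range(200)]
fixed = set()

CLOSE_ENOUGH = 2    # "really, really close to gold" (assumption)

def probe(user):
    """Fix one bug per broken app; say whether this user now looks almost happy."""
    for a in user:
        left = apps[a] - fixed
        if left:
            fixed.add(next(iter(left)))
    return all(len(apps[a] - fixed) <= CLOSE_ENOUGH for a in user)

def finish(user):
    """Commit: fix everything left in this user's apps."""
    for a in user:
        fixed.update(apps[a])

for user in random.sample(users, 50):    # triage 50 complaining users
    if probe(user):                      # cheap probe first...
        finish(user)                     # ...only then invest fully

happy = sum(all(apps[a] <= fixed for a in u) for u in users)
print(f"{happy} of {len(users)} users happy after triaging 50 complaints")
```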
Photoshop CS2/CS3 was kind of a special case. It was the #1 requested application, it was a convenient rallying point, and I think having it work as well as it does now is a significant point in Wine's favor. Plus a number of the fixes benefited other apps (especially Adobe apps). There were other big thrusts during the same time period (MSI and gdiplus, for example) that were more broadly helpful. Plus all the while I was looking for individual users with complaints, triaging their bugs, and trying to make them happy. So we were really following three strategies at once:
1) fix the most popular app
2) fix key components that many apps need
3) find individual users and make them happy
I think it was a good mix.
- Dan