This is a little bit off topic, but I thought I would bring it up, since we have users who are trying to run the Windows BOINC client under Wine.
The Wine project has had a SETI@home team for some time, so if you run the Windows or Linux BOINC/SETI@home client, you can join our team and apply your credits towards the team's total. Just go to http://setiathome.berkeley.edu/team_display.php?teamid=38091 and click the Join link.
Tom
Just thought I would throw out the point that it isn't likely that SETI@home will ever find any intelligent signals, and that it is mostly a waste of energy to look for them given how far away even nearby planets and galaxies are. A more productive use of the same CPU time is likely to be protein folding, something that can be applied to science in the near term.
Chris
On 4/14/06, Tom Spear speeddymon@gmail.com wrote:
This is a little bit off topic, but I thought I would bring it up, since we have users who are trying to run the Windows BOINC client under Wine.
The Wine project has had a SETI@home team for some time, so if you run the Windows or Linux BOINC/SETI@home client, you can join our team and apply your credits towards the team's total. Just go to http://setiathome.berkeley.edu/team_display.php?teamid=38091 and click the Join link.
Tom
On Fri, 2006-04-14 at 15:25 -0400, Chris Morgan wrote:
Just thought I would throw out the point that it isn't likely that SETI@home will ever find any intelligent signals
What about the Wow! signal?
I think "isn't likely" is a bit pessimistic, but the probability that an intelligent species which is more advanced than us using radio is very unlikely, they'd probably use gravity waves, or something else which is faster. I mean, I know if I was part of a space fairing species that google may come in handy from time to time, so speedy internet links without a time to live in the region of hundreds of years would be a must have.
I still think it's a waste of CPU time for my machine, but I wouldn't discourage others from joining. Same as I wouldn't discourage users from joining any BOINC project.
K,
On Friday 14 April 2006 15:25, Chris Morgan wrote:
Just thought I would throw out the point that it isn't likely that SETI@home will ever find any intelligent signals, and that it is mostly a waste of energy to look for them given how far away even nearby planets and galaxies are. A more productive use of the same CPU time is likely to be protein folding, something that can be applied to science in the near term.
This is all for fun anyway. Probably SETI@home wastes less energy than running Doom3 on a high-end GPU :)
Besides, something being unlikely is not an excuse for not trying. People win the lottery, survive cancer and develop Windows replacements from scratch, after all ;)
Cheers, Kuba
Kuba Ober wrote:
On Friday 14 April 2006 15:25, Chris Morgan wrote:
Just thought I would throw out the point that it isn't likely that SETI@home will ever find any intelligent signals, and that it is mostly a waste of energy to look for them given how far away even nearby planets and galaxies are. A more productive use of the same CPU time is likely to be protein folding, something that can be applied to science in the near term.
This is all for fun anyway. Probably SETI@home wastes less energy than running Doom3 on a high-end GPU :)
Besides, something being unlikely is not an excuse for not trying. People win the lottery, survive cancer and develop Windows replacements from scratch, after all ;)
Cheers, Kuba
There's a difference: there *is* a chance of winning the lottery, etc. Even if you did find ET with the SETI@home thing, it would be completely useless; except for some people freaking out, it wouldn't make any difference whatsoever. And the chance of finding ET is zero too (finding ET? using radio signals? I don't think so). Why not spend idle computer time usefully, if you are going to spend it at all?
One related question: wouldn't running CPU hardware at maximum load reduce its lifetime?
regards,
Joris
that SETI@home will ever find any intelligent signals and that it is mostly a waste of energy to look for them given how far away even nearby planets and galaxies are
[. . .]
This is all for fun anyway. Probably SETI@home wastes less energy than running Doom3 on a high-end GPU :)
Besides, something being unlikely is not an excuse for not trying. People win the lottery, survive cancer and develop Windows replacements from scratch, after all ;)
There's a difference: there *is* a chance of winning the lottery, etc. Even if you did find ET with the SETI@home thing, it would be completely useless; except for some people freaking out, it wouldn't make any difference whatsoever. And the chance of finding ET is zero too (finding ET? using radio signals? I don't think so).
The chance of finding ET is zero only if you believe it to be; we don't know any better than that. There are people who believe their chance of winning the lottery is zero as well. Go figure what people believe in . . .
Why not spend idle computer time usefully, if you are going to spend it at all?
To you it might not be useful, but someone else's viewpoint might differ. It's all a matter of opinion, after all. It's hard to be objective about it, because "usefulness" is not something you can really measure and have everyone agree on the measurement method :)
I've just tested it: running SETI@home (with a "blank" screensaver active) draws about 3 W less from the outlet than running complex Doom3 battle scenes. That's on Linux, on some funky six-month-old Nvidia hardware and a similarly funky Shuttle motherboard with the fastest Pentium 4 that would fit in there and 1 GB of DDR. I used a fairly good wattmeter that I'm sure handles this PSU just fine.
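Just to put that 3 W difference into perspective (my own quick arithmetic, assuming the box runs around the clock all year):

    # What a constant 3 W difference adds up to over a year of 24/7 operation.
    watts = 3.0
    hours_per_year = 24 * 365
    kwh_per_year = watts * hours_per_year / 1000.0
    print(f"about {kwh_per_year:.0f} kWh per year")   # roughly 26 kWh/year

Not much, next to what a loaded P4 box draws in total anyway.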
One related question: wouldn't running CPU hardware at maximum load reduce its lifetime?
If the hardware is correctly cooled, it shouldn't matter. It may actually be worse to subject the chips to thermal stresses when the temperature changes; i.e. turning SETI@home on and off, or turning the computer itself on and off, might be worse than just keeping it on (or off) with SETI@home always enabled :).
There are two things that kill semiconductors (simplifying a bit): electromigration, i.e. migration of metal ions that is accelerated at higher temperatures, and cracking (and other mechanical defects, like metal delamination, etc.) due to thermal stresses. I'm not talking about ESD/overvoltage/overcurrent and handling issues here, just about what would affect a typical CPU/GPU inside your PC, assuming everything else is correctly designed and works without problems. I'm also dismissing radiation-induced effects as negligible -- you're not roaming around Chernobyl, I hope.
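To put rough numbers on the temperature part (a sketch only; the 0.7 eV activation energy is just a textbook ballpark for aluminium electromigration, not a measurement of any particular chip), the usual Arrhenius-style acceleration factor works out like this:

    import math

    # Arrhenius acceleration factor between two junction temperatures:
    # AF = exp(Ea/k * (1/T_low - 1/T_high)), with temperatures in kelvin.
    K_BOLTZMANN_EV = 8.617e-5   # Boltzmann constant, eV/K
    EA_EV = 0.7                 # assumed activation energy (ballpark for Al electromigration)

    def acceleration_factor(t_low_c, t_high_c):
        t_low = t_low_c + 273.15
        t_high = t_high_c + 273.15
        return math.exp(EA_EV / K_BOLTZMANN_EV * (1.0 / t_low - 1.0 / t_high))

    print(f"{acceleration_factor(60, 80):.1f}x")   # ~4x faster wear at 80 C vs 60 C

So running the silicon 20 degrees hotter all the time roughly quadruples the electromigration wear rate under those assumptions -- which is why cooling, not load as such, is what matters.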
Cheers, Kuba