that SETI at home will ever find any intelligent signals and that it is mostly a waste of energy to look for them given our long distance
[. . .]
This is all for fun anyway. Probably SETI@home wastes less energy than running Doom3 on a high-end GPU :)
Besides, something being unlikely is no excuse for not trying. People win the lotto, survive cancer and develop Windows replacements from scratch, after all ;)
there's a difference: there *is* a chance to win the lotto etc. Even if you found ET with the SETI@home thing, it would be completely useless; except for some people freaking out, it wouldn't make any difference whatsoever. And the chance of finding ET is zero anyway (finding ET? using radio signals? I don't think so)
Chance of finding ET is zero if you believe it to be. We don't know any better than that. There are people who believe their chance of winning the lotto is zero as well. Go figure what people believe in . . .
why not spend idle computer time usefully if you are going to spend it anyway?
To you it might not be useful, but someone else's viewpoint might differ. It's all a matter of opinion, after all. It's hard to be objective about it, because "usefulness" is not something you can really measure and have everyone agree on the measurement method :)
I've just tested: running SETI@home (with the "blank" screensaver active) draws about 3W less at the outlet than running complex Doom3 battle scenes. That's on Linux running on some funky 6-month-old Nvidia hardware, on a similarly funky Shuttle mobo with the fastest PIV that would fit in there and 1GB of DDR. I used a fairly good wattmeter that I'm sure handles this PSU just fine.
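For scale, a back-of-the-envelope sketch of what a constant 3W difference adds up to over a year (the electricity price is an assumed example figure, not anything from the measurement):

```python
# Rough yearly energy for a constant 3 W extra draw, running 24/7.
extra_watts = 3
hours_per_year = 24 * 365
kwh_per_year = extra_watts * hours_per_year / 1000  # watt-hours -> kWh

price_per_kwh = 0.10  # assumed example price, adjust for your utility
yearly_cost = kwh_per_year * price_per_kwh

print(f"{kwh_per_year:.2f} kWh/year, ~{yearly_cost:.2f}/year at the assumed rate")
```

So the difference between the two workloads is on the order of 26 kWh a year, i.e. pocket change either way.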
one related question -- wouldn't running the CPU hardware at maximum load reduce its lifetime?
If the hardware is correctly cooled, it shouldn't matter. It may be worse to subject the chips to thermal stresses when the temperature changes. I.e. toggling SETI@home on/off, and just turning the computer on/off, might be worse than keeping it on (or off) with SETI@home always enabled :)
There are two things that kill semiconductors (simplifying a bit): migration of ions, which is accelerated at higher temperatures, and cracking (and other mechanical defects, like metal delamination etc.) due to thermal stresses. I'm not talking about ESD/overvoltage/overcurrent and handling issues here, just about what would affect a typical CPU/GPU inside your PC, assuming everything else is correctly designed and works w/o problems. I'm also dismissing radiation-induced effects as negligible -- you're not roaming around Chernobyl, I hope.
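To make the temperature dependence of that first mechanism concrete, here's a hedged sketch using the Arrhenius term of Black's equation for electromigration lifetime. The activation energy and die temperatures below are illustrative assumptions, not measurements of any real chip:

```python
import math

# Relative electromigration lifetime from the Arrhenius factor of
# Black's equation: MTTF is proportional to exp(Ea / (k * T)).
# All numeric values here are illustrative assumptions only.
K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def relative_mttf(t_cool_c, t_hot_c, ea_ev=0.7):
    """How many times longer the cooler die lasts, other factors equal."""
    t_cool = t_cool_c + 273.15  # Celsius -> Kelvin
    t_hot = t_hot_c + 273.15
    return math.exp(ea_ev * (1.0 / t_cool - 1.0 / t_hot) / K_BOLTZMANN_EV)

# A die held at 50 degC vs. one at 80 degC (assumed example temperatures):
print(f"{relative_mttf(50, 80):.1f}x longer expected lifetime")
```

With these assumed numbers a 30-degree difference changes the expected electromigration lifetime by nearly an order of magnitude, which is why "correctly cooled" is doing the heavy lifting in the answer above. The cracking/delamination mechanism depends on the number and depth of temperature *cycles* rather than the steady temperature, which is the argument for leaving the load constant.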
Cheers, Kuba