The cost should only increase if you draw more wattage. A 480W PSU doesn't use more power than a 350W one unless the computer actually needs more than 350W, which I doubt.
Then there's the difference in efficiency between brands of PSUs, which makes them draw different wattage from the mains at a given load. But that's impossible to judge from maximum-load specs alone.
What Zedric said is not true. Larger power supplies waste more electricity even under no load, and PSU efficiency drops drastically at lighter loads.
-Full load, efficiency ~70%: 350W supply losses = 105W, 500W supply losses = 150W
-Half load, efficiency ~50%: 350W supply losses = 87.5W, 500W supply losses = 125W
In both cases you use ~250W but you waste less electricity with the smaller supply.
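If anyone wants to play with those figures, here's the rough model behind them as a Python sketch. It treats (rated wattage × load fraction) as the power going through the supply and everything not converted at the given efficiency as waste heat; that's my back-of-envelope assumption, not a datasheet formula.

```python
def psu_loss(rated_w, load_fraction, efficiency):
    """Back-of-envelope waste heat: power through the supply times
    the fraction that doesn't make it out at the given efficiency."""
    return rated_w * load_fraction * (1.0 - efficiency)

# Full load at ~70% efficiency
print(f"350W: {psu_loss(350, 1.0, 0.70):.1f} W wasted")  # ~105 W
print(f"500W: {psu_loss(500, 1.0, 0.70):.1f} W wasted")  # ~150 W
# Half load at ~50% efficiency
print(f"350W: {psu_loss(350, 0.5, 0.50):.1f} W wasted")  # ~87.5 W
print(f"500W: {psu_loss(500, 0.5, 0.50):.1f} W wasted")  # ~125 W
```

Swap in your own supply's rating and efficiency numbers if you have them; the gap between the two supplies is what matters, not the exact figures.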
Now the second-order effects:
-In the winter this is not a big deal; you just get a room that's a little warmer. You could readjust the heat flow to take advantage of this, but nobody likes to give up that warm room, lol.
-In the summer you have to remove the extra heat, which means you use about 75% more electricity to pump the heat out, assuming an air conditioner COP of ~0.6.
The analysis gets really complicated if the computer is in a sleep mode, so I'll do the case where you leave the computer on all the time and run folding in the background.
With the additional waste of 50W, and assuming a $0.10/kWh electricity rate (it may be higher now with oil going from $30 to $45 a barrel; I try not to look at the bills, they scare me):
- 24*50*365/1000 * $0.10 = $43.80 penalty per year for buying a 500W PSU
- Adding air conditioning costs for 4 months = $76.65 size penalty.
- Now for grins multiply that by 50 years = $3832.50 size penalty.
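Here's that arithmetic as a quick Python sketch, using the $0.10/kWh rate assumed above. One caveat: the 1.75 factor applies the 75% AC surcharge to the whole year; a strict 4-month cooling season would give base * (1 + 0.75 * 4/12), roughly $54.75, so treat $76.65 as the high end.

```python
rate = 0.10          # $/kWh, assumed above
extra_waste_w = 50   # extra loss of the bigger supply

base = 24 * extra_waste_w * 365 / 1000 * rate  # yearly penalty, ~$43.80
with_ac = base * 1.75                          # +75% AC surcharge, ~$76.65
fifty_years = with_ac * 50                     # ~$3832.50

print(f"${base:.2f} / ${with_ac:.2f} / ${fifty_years:.2f}")
```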
Check my sig. I run a 350W Antec for a reason.
PS The above analysis leaves out the environmental costs of increased air pollution, global warming, acid rain, depletion of oil reserves, etc.
PPS Take that $3832.50 and multiply it by 500 million computers and you see why Intel started an initiative for a new, more efficient power supply standard. Waste heat from computers is now a major environmental and economic issue.
Power supply losses come in three varieties:
-Fixed losses: what it takes to run the supply's control circuitry, even with no load.
-I²R losses: proportional to the square of the load current.
-Switching losses: partly related to load current and directly to operating frequency.
With bigger supplies the fixed losses go up, and because manufacturers build them as cheaply as they can, the switching losses go up too. Since neither of those is heavily affected by the load on the supply, you pay a penalty at full load and an even bigger one at light load.
You can make a bigger supply with lower losses than a smaller one, but that drives up the cost, weight and volume. So generally the better (i.e. more expensive) supplies waste less power.
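To see why light load hurts, here's a toy model of those three loss buckets. All the parameter values are made-up illustrations, not datasheet numbers; the shape of the result is the point.

```python
def supply_losses(i_load, fixed_w, r_eff, sw_fixed_w, sw_per_amp):
    """Toy three-bucket loss model (illustrative values only):
      fixed_w    -- housekeeping/control losses, there even at no load
      r_eff      -- effective series resistance, gives I^2 * R losses
      sw_fixed_w -- switching losses tied to operating frequency
      sw_per_amp -- the load-dependent part of switching losses
    """
    return fixed_w + i_load ** 2 * r_eff + sw_fixed_w + sw_per_amp * i_load

# A bigger, cheaper supply: higher fixed and switching losses,
# slightly lower effective resistance (made-up numbers).
def small(i):
    return supply_losses(i, fixed_w=5.0, r_eff=0.05, sw_fixed_w=3.0, sw_per_amp=0.2)

def big(i):
    return supply_losses(i, fixed_w=9.0, r_eff=0.04, sw_fixed_w=6.0, sw_per_amp=0.2)

for amps in (1.0, 10.0, 25.0):
    print(f"{amps:5.1f} A  small: {small(amps):6.2f} W  big: {big(amps):6.2f} W")
```

With numbers like these the big supply wastes almost twice as much at 1A but only slightly more at 25A: the fixed and switching terms dominate at light load, which is exactly the light-load penalty described above.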
Are you sorry you brought it up yet?
Back to the cost-of-use question (that was what this thread was originally about, right?).
Considering that going with a big, inexpensive supply can cost you $40 a year, it's smarter to pay the money for a better quality, more efficient supply. It will more than pay for itself over its 3-year operational life.
Zedric, see how smart you were picking that Fortron/Source PSU. Good decision making.
All of this said, I wouldn't want to run a PSU at 100% load, or even close. It was a while ago and I'd have to dig the material back up, but I remember that somewhere around 70%-85% load is ideal. It has been a few years, however.
I've got a 400 watt PSU in this system (which I've had since I got an Athlon 700 back in 1999 or so). It's been through a few upgrades, but I wouldn't consider putting an A64 on this thing. I also have 2 Seagate Cheetahs (10k RPM SCSI drives), 7 or 8 fans, 3 CD/DVD drives (including a burner), and a bunch of other stuff in here.
Still, in terms of overall power usage my computer isn't that bad, and I leave it running 24/7. The real problem is the monitor and its power usage. If I leave it running all month, the power bill is like $44 a month, and about $32 a month if I turn it off when it's not in use in my apartment. My monitor's a 19" CRT. The CRT is where you can really see the draw on your power...
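Quick sanity check on those bills, assuming the ~$0.10/kWh rate used earlier in the thread (your actual rate may differ): the $12/month difference implies roughly 120 kWh a month, which works out to a substantial average draw, plausible for a big CRT running around the clock.

```python
monthly_diff = 44 - 32   # $ saved when the monitor gets switched off
rate = 0.10              # $/kWh, assumed from earlier in the thread

kwh_saved = monthly_diff / rate            # ~120 kWh per month
avg_watts = kwh_saved * 1000 / (30 * 24)   # ~167 W average draw

print(f"{kwh_saved:.0f} kWh/month, {avg_watts:.0f} W average")
```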
Well, I can say I never gave the electrical cost of my computer being on much thought. I run an Antec 480W PSU and my system stays on 24/7 (must find the aliens [SETI], you know), but the cost is relative to where you live and your electrical rates. My electric bill here in Green Bay, WI is about $35-$50 a month depending on whether it's summer or winter, and I changed all the lights to super-high-efficiency fluorescent bulbs. Yet at my house in CA the electric bill is about $125 a month for similar usage; that's a guess, as I have many lights for reptile cages that can run anytime during the day or night depending on temp.
My guess would be that it costs me about $3.00 a month to run my computer here in WI (I turn off my monitor when I'm not using it), and about $10.00 a month in CA.