What Zedric said is not true. The larger power supply wastes more electricity even when it's barely loaded, because PSU efficiency drops off sharply at lighter loads.
Figure losses as input power * (1 - efficiency), taking the nameplate rating as the input draw at full load:
-Full load, efficiency ~70%: the 350W PSU wastes 105W, the 500W PSU wastes 150W
-Half load, efficiency ~50%: the 350W PSU wastes 87.5W, the 500W PSU wastes 125W
In both cases you're using roughly 250W, but you waste less electricity with the smaller supply (quick sketch below).
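Here's a quick Python sketch of that loss model if you want to check the numbers. The load fractions and efficiency figures are round-number assumptions (mine, not measurements), and the nameplate wattage is treated as the input draw at full load:

# Loss model used above. Assumptions, not measurements:
# nameplate wattage = input draw at full load, losses = input * (1 - efficiency).
def psu_losses(rated_watts, load_fraction, efficiency):
    """Wasted watts for a PSU at the given load fraction and efficiency."""
    input_watts = rated_watts * load_fraction
    return input_watts * (1 - efficiency)

for rated in (350, 500):
    full = psu_losses(rated, 1.0, 0.70)  # full load, ~70% efficient
    half = psu_losses(rated, 0.5, 0.50)  # half load, ~50% efficient
    print(f"{rated}W PSU: {full:.1f}W wasted at full load, {half:.1f}W at half load")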
Now the second-order effects:
-In the winter this is not a big deal; you just get a room that is a little warmer.
You could redirect heat flow to take advantage of this, but nobody likes to give up that warm room, lol.
-In the summer you have to remove the extra heat, and assuming an air conditioner COP of ~0.6, getting it out takes about 1.67W at the AC for every watt of waste heat (1/0.6), i.e. roughly 167% more electricity.
Cost
The analysis gets really complicated if the computer spends time in a sleep mode, so I'll take the case where you leave the computer on all the time and run folding in the background.
With the additional waste of 50W, and assuming a $0.10/kWh electricity rate (it may be higher now, with oil going from $30 to $45 a barrel; I try not to look at the bills, they scare me):
- 50W * 24h * 365 / 1000 * $0.10 = $43.80 penalty per year for buying a 500W PSU
- Adding air-conditioning costs for a 4-month cooling season at that COP (438 kWh / 3 / 0.6 = 243.3 kWh, about $24.33) brings it to roughly a $68.13/year size penalty.
- Now for grins, multiply that by 50 years = a ~$3,407 size penalty (the sketch below replays the arithmetic).
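Here's a short Python sketch of the cost model if you want to replay it. Every input (the 50W waste delta, the $0.10/kWh rate, the 4-month cooling season, the 0.6 COP) is an assumption from this post, not a measured value:

# Cost model from the post. All inputs are assumptions, not measurements.
EXTRA_WASTE_W = 50       # extra waste heat of the 500W PSU vs. the 350W
RATE = 0.10              # $/kWh
HOURS_PER_YEAR = 24 * 365
AC_COP = 0.6             # assumed (pessimistic) seasonal air-conditioner COP
AC_MONTHS = 4            # months per year the AC must pump the waste heat out

waste_kwh = EXTRA_WASTE_W * HOURS_PER_YEAR / 1000   # 438 kWh/yr extra waste
base_cost = waste_kwh * RATE                        # $43.80/yr
ac_kwh = waste_kwh * (AC_MONTHS / 12) / AC_COP      # 243.3 kWh/yr at the AC
ac_cost = ac_kwh * RATE                             # $24.33/yr
annual = base_cost + ac_cost                        # $68.13/yr
print(f"${base_cost:.2f}/yr waste, ${ac_cost:.2f}/yr AC, "
      f"${annual:.2f}/yr total, ${annual * 50:.2f} over 50 years")

Plug in your own electricity rate, climate, and COP; the conclusion (the smaller supply wins) doesn't move.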
Check my sig. I run a 350W Antec for a reason.
PS The above analysis leaves out the environmental costs of increased air pollution, global warming, acid rain, depletion of oil reserves, etc.
PPS Take that ~$3,407 and multiply it by 500 million computers and you see why Intel started an initiative for a new, more efficient power supply standard. Waste heat from computers is now a major environmental and economic issue.
Some additional info:
http://www6.tomshardware.com/howto/20030609/power_supplies-02.html