Old 05-30-2012, 06:44 PM   #63
belaud
Quote:
Originally Posted by dryjesus View Post
Yeah, but you still have to take into account when the video card IS on full load. If a computer pulls 450 W at full load, you're not going to buy a 400 W unit and think "most of the time it won't be at 450 W, so I'll just go a little lower".

You should always leave a good amount of headroom. As I said before, if you run close to the maximum wattage of your power supply, the fan will ramp up and get louder. Then there's capacitor aging: the capacitors in your PSU degrade over time, and the effective wattage of your power supply drops with them.

Most power supplies hit their best efficiency at around 50% load. Shorn's computer will probably idle at around 200 W. It would be annoying if the PSU fan ramped up every time he opened something.

And if he ever decides to get a new video card that uses a little more power, he won't need to worry about whether his PSU can handle it.

Don't get me wrong, you're probably fine with the 400 W, but I would prefer a 500 W to be worry-free about upgrading and to avoid a loud fan.



Hope that's enough to convince you? You don't have to trust me... after all, it's just advice! You probably don't trust a guy with a 4 post count. I understand... :'(
Capacitor decay is very limited now, since nearly all name-brand power supplies use solid Japanese capacitors. However, decay sets in immediately once they are run outside their rated load (say, 450 W drawn from a 420 W unit), and that can damage the power supply right away.
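To put rough numbers on the advice above, here's a quick back-of-the-envelope calculation. The headroom and derating figures (20% margin, 10% aging loss) are my own illustrative assumptions, not specs from any PSU datasheet:

```python
# Rough PSU sizing sketch. The 20% headroom and 10% aging derate
# are assumed illustrative values, not manufacturer figures.
def recommended_psu_watts(peak_load_w, headroom=0.2, aging_derate=0.1):
    """Size a PSU so peak load still fits under its rating
    even after capacitor aging shrinks its effective capacity."""
    required = peak_load_w * (1 + headroom)   # margin for spikes/upgrades
    return required / (1 - aging_derate)      # compensate for aging loss

# A build that peaks at 450 W would call for roughly a 600 W unit:
print(round(recommended_psu_watts(450)))  # 600
```

As a bonus, a unit sized this way puts a ~200 W idle near the 30-50% load band where most supplies are at their most efficient and quietest.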