Power Supply Electricity Consumption

infernal008

New member
Hello, I'm fairly new to how computer components work, especially power supplies.

I would like to know: is the wattage printed on a power supply the actual, constant amount of power it draws from my power outlet (e.g. a wall outlet)?

For example, I have a 600 W power supply. Say it's a branded unit that complies with the 80 Plus Bronze standard. Does this mean that when I plug the power supply into an outlet it will constantly consume 600 W even when my system is idle (CPU and GPU at idle)? Or do power supplies reduce the amount of power they draw from the outlet depending on the needs of the system?

Thanks! :D
 
The PSU will feed the components only as much power as they ask for. So if your system is idling at, say, 100 W, it will supply just that to your PC, but it will draw a bit more than 100 W from the wall outlet because of the roughly 20% inefficiency (in the case of plain 80 Plus). The same goes for 100% system load.
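
If it helps to see it as code, here's a rough Python sketch of that relationship (the 100 W load and the flat 80% efficiency are just example numbers; a real PSU's efficiency varies with load):

def wall_draw_w(load_w, efficiency):
    # What the PSU pulls from the outlet = what the components need / efficiency
    return load_w / efficiency

# Example: 100 W idle load on a PSU that is 80% efficient at that load
print(wall_draw_w(100, 0.80))  # -> 125.0 W drawn from the wall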
 

Hey, thanks! So the rating of the power supply doesn't matter for power consumption; it actually depends on what the components require, right? And the wattage rating of a power supply is just more or less the maximum it can supply?
 
The 80 Plus rating tells you how efficient a power supply is. A plain 80 Plus PSU will waste more energy converting mains electricity into the forms the computer can use than a 90%-efficient unit would.

If a PSU draws 200 W from the socket it won't provide 200 W to the computer, because some of it is wasted in the process (lost as heat or used to spin the fan).

A simple example, if your PC needs 200 W to run at a given workload:
an 80%-efficient PSU would draw 200 / 0.8 = 250 W from the wall
a 90%-efficient PSU would only need to draw 200 / 0.9 ≈ 222 W

You'll also find that a PSU's efficiency varies across the range of wattage it can supply. Often the packaging will have a graph on the side like this:

[Image: efficiency vs. load curve, peaking around the middle of the PSU's output range]

So running them at a sensible percentage of their rated load gives you the best efficiency, keeps them quiet (the fan won't need to spin up to keep the PSU cool) and will also make them last longer.
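
To put rough numbers on that curve, here's a small Python sketch. It uses the published 80 Plus Bronze minimums (82% / 85% / 82% efficiency at 20% / 50% / 100% of rated load, at 115 V) with straight-line interpolation in between, which is only an approximation of the smooth curve on the box; the 600 W rating is an assumed example:

# 80 Plus Bronze minimums at 115 V: (fraction of rated load, efficiency)
bronze_points = [(0.20, 0.82), (0.50, 0.85), (1.00, 0.82)]

def efficiency_at(load_fraction, points=bronze_points):
    # Linear interpolation between the published threshold points
    # (a crude stand-in for the smooth curve printed on the box).
    for (x0, e0), (x1, e1) in zip(points, points[1:]):
        if x0 <= load_fraction <= x1:
            return e0 + (e1 - e0) * (load_fraction - x0) / (x1 - x0)
    # Below 20% or above 100% load, just clamp to the nearest published point
    return points[0][1] if load_fraction < points[0][0] else points[-1][1]

rated_w = 600
for load_w in (120, 300, 450, 600):
    eff = efficiency_at(load_w / rated_w)
    print(f"{load_w:3d} W load -> {eff:.0%} efficient -> {load_w / eff:.0f} W from the wall")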
 

This is very informative. Thanks!

So the actual power draw from the socket depends on both the power requirement of the components and the PSU's efficiency, right?

Relating to your example: if I have a PSU capable of supplying 425 W (an example figure, accounting for efficiency) and it has a load of 200 W, does this mean my PSU will draw from the socket only the roughly 200 W required by my components, or will it constantly draw 425 W from the socket and provide 200 W of that to the components?

Thanks!
 
Yes, exactly. The way to think of it is this; it goes for any other electrical system too.

If your components are drawing 500 W, that's the output power of the PSU. If the efficiency is 90%, then the actual draw from the wall (the input power) will be:
500 / 0.9 ≈ 555.56 W

What's burned off internally (the PSU itself gets hot) is the difference between the two: 55.56 W.
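
The same arithmetic in Python, just to make the steps explicit (500 W load and 90% efficiency as in the example above):

load_w = 500         # what the components demand (the PSU's output power)
efficiency = 0.90    # the PSU's efficiency at this load

input_w = load_w / efficiency  # what's actually drawn from the wall
heat_w = input_w - load_w      # what's burned inside the PSU as heat

print(f"Drawn from the wall: {input_w:.2f} W")  # 555.56 W
print(f"Lost as heat:        {heat_w:.2f} W")   # 55.56 W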
 
The first interpretation is correct: the PSU draws only what the components need (plus the efficiency losses). It's like having a car with 425 BHP. If you are driving at 30 mph down a quiet street the car won't use all 425 BHP; it will only need about 50 BHP.

If your computer only needs to draw 50 W at idle, then any PSU will only give it 50 W (while drawing a bit more from the wall based on its efficiency, as already discussed).
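
One last sketch to drive that home (hypothetical numbers; it assumes every unit happens to be 85% efficient at this load, which in reality would differ a little between models):

idle_load_w = 50
assumed_efficiency = 0.85  # pretend all three units are equally efficient here

# The wattage rating caps what the PSU *can* deliver; it doesn't set what it draws.
for rated_w in (425, 600, 1000):
    wall_w = idle_load_w / assumed_efficiency
    print(f"{rated_w:4d} W PSU, {idle_load_w} W load -> {wall_w:.1f} W from the wall")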
 
Thanks everyone for your answers! :D It has shed some light on this for me, and I finally understand how PSUs consume electricity. :rock:
 