Does a higher wattage power supply mean a higher electricity drain?


please

I'm debating whether to use a 400 watt or a 500 watt power supply for my
Athlon XP system. This is a name brand, AMD approved power supply. I know
the 400 watt is enough power for my system, but I was wondering: if I were
to buy the 500 watt power supply instead, would it drain more electricity
than the 400 watt one just because it is 100 watts more powerful? I want to
keep my electric bills down, but would also like to have a power supply I
can use on a later system. So I guess, simply, I am asking: does the higher
wattage of the power supply affect how much electricity is drained?
Thanks!
 

philo

I'm debating whether to use a 400 watt or a 500 watt power supply for my
Athlon XP system. This is a name brand, AMD approved power supply. I know
the 400 watt is enough power for my system, but I was wondering: if I were
to buy the 500 watt power supply instead, would it drain more electricity
than the 400 watt one just because it is 100 watts more powerful? I want to
keep my electric bills down, but would also like to have a power supply I
can use on a later system. So I guess, simply, I am asking: does the higher
wattage of the power supply affect how much electricity is drained?
Thanks!


a power supply just delivers what's demanded of it...
so your machine will use *approximately* the same amount of power
with a 500 watt supply as it would with a 400 watt supply

the 100 watt difference in rating is nowhere near enough
to make any significant difference on your electric bill
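
a quick back-of-the-envelope sketch in python makes the point (the 200 watt
load, 70% efficiency and 10 cents/kWh are assumed illustration numbers, not
measurements from any real system):

# wall draw is set by what the computer demands and by the PSU efficiency,
# not by the number printed on the PSU label
load_w = 200.0        # assumed DC load the computer actually pulls
efficiency = 0.70     # assumed PSU efficiency at that load
price_per_kwh = 0.10  # assumed electricity price in dollars

wall_w = load_w / efficiency               # power drawn from the outlet
monthly_kwh = wall_w * 24 * 30 / 1000.0    # running 24 hours a day, 30 days
print("wall draw: %.0f W" % wall_w)                           # ~286 W
print("monthly cost: $%.2f" % (monthly_kwh * price_per_kwh))  # ~$20.57

note that the 400 or 500 watt label never enters the calculation.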
 

Lane Lewis

I'm debating whether to use a 400 watt or a 500 watt power supply for my
Athlon XP system. This is a name brand, AMD approved power supply. I know
the 400 watt is enough power for my system, but I was wondering: if I were
to buy the 500 watt power supply instead, would it drain more electricity
than the 400 watt one just because it is 100 watts more powerful? I want to
keep my electric bills down, but would also like to have a power supply I
can use on a later system. So I guess, simply, I am asking: does the higher
wattage of the power supply affect how much electricity is drained?
Thanks!

Maybe a buck or two a month from heat losses of the larger components in a
500 watt. Your AC unit would also be taxed slightly, but all in all it's
insignificant. It would make a real difference comparing a 1000 watt PSU to
a 400.

Lane
 

kony

I'm debating whether to use a 400 watt or a 500 watt power supply for my
Athlon XP system. This is a name brand, AMD approved power supply. I know
the 400 watt is enough power for my system, but I was wondering: if I were
to buy the 500 watt power supply instead, would it drain more electricity
than the 400 watt one just because it is 100 watts more powerful? I want to
keep my electric bills down, but would also like to have a power supply I
can use on a later system. So I guess, simply, I am asking: does the higher
wattage of the power supply affect how much electricity is drained?


No.
 

ric

Lane said:
Maybe a buck or two a month from heat losses of the larger components in a
500 watt.

LOL.

In reality, the 500w would have, given identical design, larger components
and heatsinks and thus run cooler (running at the same output wattage as
the 400w), and have *less* heat loss than the 400w PSU.
 

Lane Lewis

ric said:
LOL.

In reality, the 500w would have, given identical design, larger components
and heatsinks and thus run cooler (running at the same output wattage as
the 400w), and have *less* heat loss than the 400w PSU.

Nope. Given that neither is overloaded and supplying the same amount of
power, the 500 watt will use more. Imagine a 50 watt draw on a 100 watt PSU
and a 50 watt draw on a 1000 watt PSU. The thousand watt has larger
components and will draw more. This is one of the reasons for matching up
components. There is not much difference in a 400 and a 500 watt, but the
idea holds true.

Lane
 

philo

Nope. Given that neither is overloaded and supplying the same amount of
power, the 500 watt will use more. Imagine a 50 watt draw on a 100 watt PSU
and a 50 watt draw on a 1000 watt PSU. The thousand watt has larger
components and will draw more. This is one of the reasons for matching up
components. There is not much difference in a 400 and a 500 watt, but the
idea holds true.



hold on a minute...

if, for example, you have a 1 watt, 500 ohm resistor
and replace it with a 5 watt, 500 ohm resistor,
the 5 watt resistor will *not* draw any more power than the 1 watt
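
a quick check of the arithmetic in python, assuming 12 volts across the
resistor (any voltage makes the same point):

voltage = 12.0      # assumed voltage across the resistor
resistance = 500.0  # same 500 ohm value for both parts

power = voltage ** 2 / resistance  # P = V^2 / R, about 0.288 W
print("dissipation: %.3f W" % power)
# the 1 W and 5 W ratings only say how much heat each part can survive;
# the power actually drawn is identical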


so if you have a supply with components rated at higher power...
it does not necessarily mean the supply will draw any more power


*however* there are variations in overall efficiency depending on how much
of the rated output a psu is actually producing.

typically supplies operate most efficiently at the higher end of their
overall rating, so there is some truth that a 150 watt supply operating at
100 watts, for example, could be more efficient than a 1000 watt supply
operating at 100 watts. but it is not really due to the power ratings of
the components per se; it is more due to the power factor and the
characteristics of the transformer, plus the high-frequency conversion
circuitry itself.

this whole thing can get quite complicated, but suffice it to say that
there would be no significant difference in electrical costs to operate a
400 vs a 500 watt supply
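
to put a rough number on it, here is a purely hypothetical comparison in
python (the 200 watt load, the 72% and 75% efficiencies and the 8 cents/kWh
are made-up illustration values, not the specs of any real supply):

load_w = 200.0              # assumed computer load
eff_a, eff_b = 0.75, 0.72   # hypothetical efficiencies of two supplies
price_per_kwh = 0.08        # assumed electricity price in dollars

wall_a = load_w / eff_a                 # ~267 W at the wall
wall_b = load_w / eff_b                 # ~278 W at the wall
extra_w = wall_b - wall_a               # ~11 W difference
extra_per_year = extra_w * 8760 / 1000.0 * price_per_kwh
print("extra draw: %.1f W" % extra_w)
print("extra cost: $%.2f per year, running 24/7" % extra_per_year)

even with a three point efficiency gap, which two supplies from the same
family are unlikely to show, the difference is well under a dollar a month.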
 

Lane Lewis

philo said:



hold on a minute...

if, for example, you have a 1 watt, 500 ohm resistor
and replace it with a 5 watt, 500 ohm resistor,
the 5 watt resistor will *not* draw any more power than the 1 watt


so if you have a supply with components rated at higher power...
it does not necessarily mean the supply will draw any more power


*however* there are variations in overall efficiency depending on how much
of the rated output a psu is actually producing.

typically supplies operate most efficiently at the higher end of their
overall rating, so there is some truth that a 150 watt supply operating at
100 watts, for example, could be more efficient than a 1000 watt supply
operating at 100 watts. but it is not really due to the power ratings of
the components per se; it is more due to the power factor and the
characteristics of the transformer, plus the high-frequency conversion
circuitry itself.

this whole thing can get quite complicated, but suffice it to say that
there would be no significant difference in electrical costs to operate a
400 vs a 500 watt supply

Additional heavier circuitry, larger MOSFETs and other components all use
more power when turned on. It gives you more power availability, but at a
price. I don't think my statement of a buck or two a month is out of line.
It's easily checked, though, with an AC ammeter.

Lane
 

kony

Additional heavier circuitry, larger MOSFETs and other components all use
more power when turned on. It gives you more power availability, but at a
price. I don't think my statement of a buck or two a month is out of line.
It's easily checked, though, with an AC ammeter.


Many of the larger PSU manufacturers provide efficiency specs; they
should be helpful in determining the power usage.

Generally, larger components will not consume more power. The only
time a switching power supply's efficiency is significantly increased
is if it is powering a known, fixed load, allowing tuning. Otherwise the
difference isn't worth consideration, especially not when comparing a
400W and a 500W PSU from the same manufacturer; if made for the same
target use, the two may be quite similar, if not almost identical,
inside and in efficiency.



Dave
 

ric

Lane said:
Nope. Given that neither is overloaded and supplying the same amount of
power, the 500 watt will use more. Imagine a 50 watt draw on a 100 watt PSU
and a 50 watt draw on a 1000 watt PSU. The thousand watt has larger
components and will draw more. This is one of the reasons for matching up
components. There is not much difference in a 400 and a 500 watt, but the
idea holds true.

Your statement "The thousand watt has larger components and will draw
more" has no basis in fact. Upon which law of electronics is this based?
 

philo

i've read some of your posts here before and you seem like a sensible
person...however you are quite wrong here...
just because a component is "larger" does not mean it will draw more
current...
please do a little reading on basic electronics before you embarrass
yourself any further
thank you
 

Guest

I'm debating whether to use a 400 watt or a 500 watt power supply
for my Athlon XP system. This is a name brand, AMD approved power
supply.

Everything has a name brand, and almost everything is AMD approved,
even some of the worst power supplies you can find, like
Leadman/Powmax/Raidmax and the numerous Deer brands, such as Foxconn,
Codegen, Allied, L&C, etc. Some of the better retail brands are
Fortron (Sparkle, Powerman, Aopen) and Antec TruePower.
Manufacturers' power ratings vary greatly in accuracy, and some 500W
supplies have been found to be weaker than other 400W and even 350W
supplies.
I know the 400 watt is enough power for my system, but I was
wondering: if I were to buy the 500 watt power supply instead,
would it drain more electricity than the 400 watt one

Switching mode power supplies maintain about the same efficiency
regardless of load, except at extremes (low and high extremes), so a
500W supply won't draw more power than a 400W one. One person here said
the opposite, but he's completely wrong and doesn't seem to understand
enough about electronics: the transistors and diodes used in
more powerful supplies have the same or lower on-resistance or voltage
drop ratings, so they should be just as efficient. The only things
that may make a more powerful supply less efficient are extra fans,
since fans use power, too.
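
A rough sketch of the on-resistance point in Python; the 5 A current and
the two Rds(on) values are made-up illustration numbers, not taken from any
datasheet:

current = 5.0     # assumed current through the switching transistor
rds_small = 0.20  # hypothetical on-resistance of a smaller MOSFET (ohms)
rds_large = 0.10  # hypothetical on-resistance of a larger MOSFET (ohms)

# conduction loss is I^2 * R, so the bigger (lower resistance) part wastes less
print("smaller FET: %.1f W lost" % (current ** 2 * rds_small))  # 5.0 W
print("larger FET:  %.1f W lost" % (current ** 2 * rds_large))  # 2.5 W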
 

Guest

Lane Lewis said:
Additional heavier circuitry, larger MOSFETs and other components
all use more power when turned on. It gives you more power
availability, but at a price.

Why would larger MOSFETs need more power? If anything, they have
lower on-resistance and therefore should be more efficient, as do the
diodes used in the output section (forward voltage drop). The only
things that take more power at turn-on are the high voltage filter
capacitors, but that's only for about a second, when the back panel
switch or power strip is switched on, and there's a thermistor in series
with the capacitors to minimize that current.
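
A rough illustration of that inrush limiting in Python; the 170 V peak, the
half-ohm series resistance and the 5 ohm cold thermistor resistance are
assumed example values:

v_peak = 170.0   # approximate peak of a 120 V AC line
r_series = 0.5   # assumed wiring and bridge resistance (ohms)
ntc_cold = 5.0   # assumed cold resistance of the inrush thermistor (ohms)

# worst-case first-cycle surge into the discharged filter capacitors
print("without thermistor: about %.0f A" % (v_peak / r_series))               # ~340 A
print("with thermistor:    about %.0f A" % (v_peak / (r_series + ntc_cold)))  # ~31 A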

So exactly what causes worse efficiency when a 500W supply instead of
a 400W supply is connected to a typical 200W computer? There's only
one thing I can think of: any additional fans on the larger supply.
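
The fan penalty is easy to bound in Python; the 0.15 A figure is an assumed
typical fan current, not a measurement:

fan_volts = 12.0        # fans run from the 12 V rail
fan_amps = 0.15         # assumed current draw of one extra fan
price_per_kwh = 0.08    # assumed electricity price in dollars

fan_w = fan_volts * fan_amps                    # ~1.8 W
yearly = fan_w * 8760 / 1000.0 * price_per_kwh  # running 24/7
print("one extra fan: %.1f W, about $%.2f per year" % (fan_w, yearly))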

Please look at the efficiency measurements done for various power
supplies at www.silentpcreview.com.
 

philo

do_not_spam_me said:
"Lane Lewis" <[email protected]> wrote in message

Why would larger MOSFETs need more power? If anything, they have
lower on-resistance and therefore should be more efficient, as do the
diodes used in the output section (forward voltage drop). The only
things that take more power at turn-on are the high voltage filter
capacitors, but that's only for about a second, when the back panel
switch or power strip is switched on, and there's a thermistor in series
with the capacitors to minimize that current.

thanks for the additional clarification

a silicon junction has a voltage drop (textbook) of 0.7 volts

the larger the device (viz: size of the silicon wafer, physical package &
heat sink)...the more heat it can dissipate...however a silicon junction
is a silicon junction and the voltage drop across it will be the same.
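
the loss in a junction scales with the current through it, not with the
package size. a small sketch in python, assuming the textbook 0.7 volt drop
and a hypothetical 10 amp output current:

v_drop = 0.7     # textbook silicon junction voltage drop
current = 10.0   # assumed current through the rectifier

loss_w = v_drop * current  # heat dissipated in the junction
print("junction loss: %.1f W" % loss_w)
# 7 W whether the part is in a small package or a big one...
# the bigger package just spreads the same heat over more metal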

an add'l point...concerning heat

if (for example) a 1 watt, 500 ohm resistor was replaced with a 10 watt,
500 ohm resistor, the total amount of heat dissipated will be exactly the
same

the reason the larger resistor feels cooler is simply a matter of the same
amount of heat being dissipated over a larger surface area
 

philo

Switching mode power supplies maintain about the same efficiency
regardless of load, except at extremes (low and high extremes), so a
500W supply won't draw more power than a 400W one. One person here said
the opposite, but he's completely wrong and doesn't seem to understand
enough about electronics: the transistors and diodes used in
more powerful supplies have the same or lower on-resistance or voltage
drop ratings, so they should be just as efficient. The only things
that may make a more powerful supply less efficient are extra fans,
since fans use power, too.

hello:

just for laughs i decided to go down into my workshop and run a few
tests, since i have a 150 watt and a 300 watt supply down there

i loaded them down (one at a time) with the same two hard drives...
presenting a load low enough to probably be in the range of the psu's
worst efficiency

the 300 watt supply actually did draw about 0.05 A more than the 150 watt
supply (i used a regulated 115 V a.c. supply).
being that they were not of the same design and had different cooling fans,
i think that the 50 mA could be considered insignificant...
especially considering that if in use 24/7 it would be a 15 cent difference
in the electric bill *per year*
 

V W Wall

philo said:
hello:

just for laughs i decided to go down into my workshop and run a few
tests, since i have a 150 watt and a 300 watt supply down there

i loaded them down (one at a time) with the same two hard drives...
presenting a load low enough to probably be in the range of the psu's
worst efficiency

the 300 watt supply actually did draw about 0.05 A more than the 150 watt
supply (i used a regulated 115 V a.c. supply).
being that they were not of the same design and had different cooling fans,
i think that the 50 mA could be considered insignificant...
especially considering that if in use 24/7 it would be a 15 cent difference
in the electric bill *per year*

Great to see someone actually *measure* something rather than argue about it!

Virg Wall
 

philo

Great to see someone actually *measure* something rather than argue about it!

well i've got all the tools to do it,
since i've been a service engineer in the field of power delivery for 29
years now.
although after all that time...it's still amazing what i don't know...
so i figured it would not hurt to actually run a test...

i carry all my test equipment from work in my van at all times,
and it is required that it pass a calibration test once a year...
of course at the range i was using the ammeter it probably was not 100%
accurate...but i think it was close enough

the equipment i work on is actually industrial batteries and battery
chargers...but even many of the battery chargers today in industrial use
are of the high-frequency conversion type...so not really too different in
principle from the computer power supplies...other than that they are three
phase and rated at up to 12 kilowatts!
 

V W Wall

philo said:
well i've got all the tools to do it,
since i've been a service engineer in the field of power delivery for 29
years now.
although after all that time...it's still amazing what i don't know...
so i figured it would not hurt to actually run a test...

i carry all my test equipment from work in my van at all times,
and it is required that it pass a calibration test once a year...
of course at the range i was using the ammeter it probably was not 100%
accurate...but i think it was close enough

I've got a PC line cord with a 2 ohm resistor in series with the hot line.
(2 ohms because I could select five 10 ohm resistors in parallel to get a
precise 2 ohms.) I use an inexpensive DVM to measure the voltage drop and
the voltage to the PS to calculate power usage. Probably good to +/- 5%,
but better than guessing.
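
The arithmetic behind that method, as a Python sketch; the 2.3 V drop is an
assumed example reading, not an actual measurement:

r_series = 2.0    # series resistor in the hot line (ohms)
v_drop = 2.3      # assumed DVM reading across the resistor (volts)
v_line = 115.0    # assumed line voltage reaching the power supply

current = v_drop / r_series      # I = V / R, here 1.15 A
apparent = current * v_line      # ~132 VA drawn by the PC
print("line current: %.2f A" % current)
print("apparent draw: %.0f VA" % apparent)
# strictly this gives volt-amperes; true watts would also need the power factor
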
the equipment i work on is actually industrial batteries and battery
chargers...but even many of the battery chargers today in industrial use
are of the high-frequency conversion type...so not really too different in
principle from the computer power supplies...other than that they are three
phase and rated at up to 12 kilowatts!

You're probably old enough to remember the old mercury vapor rectifiers.
I worked on a few chargers that used magnetic amplifiers as control units.
Thank God, or Bell Labs (my first employer), for solid state devices!

As a radar magnetron designer, I worked with supplies at that many kilo*volts*.
They put out megawatts, but only for a few microseconds several thousand times
a second.

Messing around with the line supply voltage is probably the most dangerous
part of computer work. It's not even present in the case with ATX supplies.

Virg Wall
 

philo

I've got a PC line cord with a 2 ohm resistor in series with the hot line.
(2 ohms because I could select five 10 ohm resistors in parallel to get a
precise 2 ohms.) I use an inexpensive DVM to measure the voltage drop and
the voltage to the PS to calculate power usage. Probably good to +/- 5%,
but better than guessing.

well i was using a Fluke clamp-on with resolution down to 0.1 A

i got my 0.05 A reading because the display was fluctuating slowly
and steadily by ±0.1 A...since it normally holds steady, that was a
pretty good way of interpolating my resultant reading
You're probably old enough to remember the old mercury vapor rectifiers.

yep...i sure do remember those good old mercury vapor rectifiers...
we used them in some of our ham radio transmitters...
the first time i used such a transmitter and hit the transmit switch,
i dropped the mic. the blue flash startled me, to say the least!
I worked on a few chargers that used magnetic amplifiers as control units.
Thank God, or Bell Labs (my first employer), for solid state devices!

oh i still see a few of those magnetic amplifier (and saturable reactor)
chargers out there...and plenty of ferroresonant transformers too

but for the most part it's SCR, though hi-freq. conversion seems to be
the next up-and-coming thing
As a radar magnetron designer, I worked with supplies at that many kilo*volts*.
They put out megawatts, but only for a few microseconds several thousand times
a second.
well in the 29 years i've been doing this...i only shorted out a battery
one time, but it was one time too many! fortunately i was wearing all my
safety equipment...and the battery was fused...however it was a 1000 amp
fuse i blew!
my socket set was a little bit melted!
Messing around with the line supply voltage is probably the most dangerous
part of computer work. It's not even present in the case with ATX supplies.
safer than industrial batteries i think :)


btw: i have a nice automotive battery charger that is from the 1920s

it uses a 6 amp tube rectifier...

the filament accounts for about half of the unit's total current draw!
it still works fine...but not too efficiently

philo
 
