A computer components & hardware forum. HardwareBanter

why don't card manufacturers say how much power it uses?



 
 
#1  June 30th 13, 11:52 AM, posted to alt.comp.periphs.videocards.nvidia
[email protected]
Posts: 60

Is it the case that the GPU model and frequency and amount of RAM
will fix the power draw, regardless of brand name? Then I suppose
you could trawl through data sheets and figure something out.

One might make assumptions of maximum possible power if it is a
slot-powered card of PCI express version 1, 2 or 3.
And if it has no fan, then it would be more miserly.
#2  June 30th 13, 04:56 PM, posted to alt.comp.periphs.videocards.nvidia
Paul
Posts: 13,364

[email protected] wrote:
Is it the case that the GPU model and frequency and amount of RAM
will fix the power draw, regardless of brand name? Then I suppose
you could trawl through data sheets and figure something out.

One might make assumptions of maximum possible power if it is a
slot-powered card of PCI express version 1, 2 or 3.
And if it has no fan, then it would be more miserly.


Nothing really fixes the draw, but we'll get to that
in a moment.

For an engineer, giving power estimates can be a CLM
(career limiting move). At my former employer, you
couldn't even get the department located physically
next to mine to give out power estimates :-) That's
how close to the chest such things are held.

I can give an example. One group designs a chip. They
tell our engineer he will need a 6W power source.
Our engineer duly complies. One day, the brand new chips
(eng prototypes) are delivered. The chip actually
draws 9W. The onboard power supply collapses under
the load. There are many phone calls, labels such as
"idiot" are exchanged, and so on. And as you can imagine,
some of the manager-to-manager phone calls involve the
loudest yelling. Now the program is set back, because
the person who did the power converter didn't plan
for this at all (and should have).

*******

What can we observe as customers? Cards
come in classes, and the classes are an (imprecise)
admission of power draw. So you're right, we're not
completely in the dark. The physical design of the
card is an admission of the power draw.

1) Card with no fan.
2) Card with fan, but no PCI Express cable. (12V @ 4.3A is
the largest slot power observed to date, for this class.
I call that a "50W card", in nice round numbers.)
3) Card with fan, and one 6-pin (2x3) PCI Express cable.
4) Card with fan, and two PCI Express cable connections.
I gather at this point, we could be up around 225W.
But since I can't afford cards like this, I hardly care :-)

So those are the card classes. You could use a copy of the
PCI Express power spec to place a number against each
class. The slot power is limited, and the 4.3A number
is as brave as that particular engineer got. If you get
too close to the limit, maybe a few cards will burn
their users' motherboards, and you don't want that.
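To hang rough numbers on those classes, here is a small Python sketch using the PCI Express spec ceilings (75W from the slot, 75W per 6-pin cable, 150W per 8-pin cable). The 30W cap for fanless cards is my own assumption, not anything from the spec, and a spec ceiling is an upper bound, not a measured draw.

```python
SLOT_W = 75        # max a x16 slot may deliver to a graphics card
SIX_PIN_W = 75     # max per 6-pin (2x3) auxiliary cable
EIGHT_PIN_W = 150  # max per 8-pin (2x4) auxiliary cable

def class_ceiling(fan, six_pin=0, eight_pin=0):
    """Upper bound on card power, judged from the physical class."""
    if not fan and six_pin == 0 and eight_pin == 0:
        return 30  # fanless: assumed ~30 W ceiling, not a spec value
    return SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

print(class_ceiling(fan=False))            # class 1: fanless -> 30
print(class_ceiling(fan=True))             # class 2: slot only -> 75
print(class_ceiling(fan=True, six_pin=1))  # class 3: one 6-pin -> 150
print(class_ceiling(fan=True, six_pin=2))  # class 4: two cables -> 225
```

Note that two 6-pin cables plus the slot gives exactly the 225W mentioned for class 4.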

*******

Say I'm the engineer at ATI, and I need to work out a power
estimate.

1) If I use a pathological test case, I can end up with
a power number so high that the number is useless to
anyone: not to the power converter designer, not to the ODM,
and so on. Nobody wants this number. In a pathological case,
I make as many nodes toggle at once as possible. For example,
put an alternating 0x0000/0xFFFF pattern on the internal buses
to make them burn up. Or run all the FP64 units with dense
instructions (wall-to-wall code). There are lots of ways
to burn a GPU (or other chips, for that matter).
2) I may realize the GPU needs power management, such as
active throttling, overload detection in the power
converter, and so on. Perhaps I use something like Furmark,
to trigger this level of current, and make sure we can
handle it. Maybe I turn down the clock on you if the power
actually gets too high. The power converter design is a
measure of what we let you get away with.
3) Games come in slightly lower in power. Maybe I can set up
a simulation test case, while the GPU is still under design,
that models the node toggle rates seen when running a popular
benchmark (Crysis). Then the power estimate software available
to chip designers will give a number accurate to 10% (from cell
library characterization), but with large error bars surrounding
the construction of the test case in the first place (which
version of Crysis?).
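The "make every node toggle" case in point 1 corresponds to the standard CMOS dynamic power formula, P ≈ α·C·V²·f, where α is the fraction of nodes switching each cycle. A sketch with made-up, GPU-ish numbers; the switched capacitance, voltage, and clock are illustrative assumptions, not data for any real chip:

```python
def dynamic_power_w(alpha, cap_farads, volts, freq_hz):
    """CMOS dynamic power: P = alpha * C * V^2 * f.

    alpha is the activity factor -- the fraction of nodes that
    toggle each cycle. A pathological test pattern pushes it
    toward 1.0; real game code sits far lower.
    """
    return alpha * cap_farads * volts**2 * freq_hz

# Illustrative, made-up numbers: 250 nF of switched capacitance,
# 1.0 V core, 600 MHz clock.
C, V, F = 250e-9, 1.0, 600e6
print(dynamic_power_w(0.15, C, V, F))  # game-like workload, ~22.5 W
print(dynamic_power_w(1.00, C, V, F))  # 0x0000/0xFFFF pattern, ~150 W
```

The same silicon spans a 7x power range here purely on activity factor, which is why the pathological number is useless for sizing anything.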

When all is said and done, let's say the number is 120W. Or
maybe as the GPU or card designer, we set the limit to 120W,
throttling as needed.
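That throttling can be sketched as a toy control loop: while estimated power sits above a 120W board limit, step the clock down. The limit, step size, and the assumption that power scales linearly with clock at fixed voltage are all simplifications for illustration:

```python
def throttle_clock(power_w, clock_mhz, limit_w=120.0, step_mhz=50.0):
    """Step the core clock down until power fits under the board limit.

    Assumes power scales linearly with clock at fixed voltage --
    a toy model of the throttling described above.
    """
    base_power, base_clock = power_w, clock_mhz
    while power_w > limit_w and clock_mhz > step_mhz:
        clock_mhz -= step_mhz
        power_w = base_power * clock_mhz / base_clock
    return clock_mhz

# A card estimated at 150 W settles at 800 MHz, where the linear
# model puts it exactly at the 120 W limit.
print(throttle_clock(power_w=150.0, clock_mhz=1000.0))  # 800.0
```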

Now, if I go to the Xbitlabs site, where they do actual video
card measurements, my 120W estimated card is being measured as
70W. And guys like me, out on the Internet, are telling people
to plan for a 70W load. Which isn't strictly accurate. The
120W number might be a 3-sigma tail estimate, to cover the
"worst case card". Maybe 30% of the cards draw 70W, but
a few stinkers go all the way to 120W. Since Xbitlabs measures
only a card or two in their lab, their measurement is not
statistically significant.
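The gap between a 120W spec number and a 70W review measurement is easy to simulate. Below, sigma is back-solved from the 70W typical / 120W 3-sigma figures, and a two-card "lab sample" stands in for a review site; every number here is illustrative:

```python
import random

random.seed(1)  # fixed seed so the illustration is repeatable

MEAN_W = 70.0                 # typical card
SIGMA_W = (120.0 - 70.0) / 3  # chosen so 3 sigma lands at 120 W

# A big production run of cards, with Gaussian unit-to-unit spread.
population = [random.gauss(MEAN_W, SIGMA_W) for _ in range(100_000)]
lab_sample = population[:2]   # a review site measures one or two cards

print(f"spec (3-sigma)  : {MEAN_W + 3 * SIGMA_W:.0f} W")
print(f"lab sample max  : {max(lab_sample):.0f} W")
print(f"production max  : {max(population):.0f} W")
```

A two-card sample will almost always report something near 70W, while a handful of real cards out of 100,000 land near (or past) 120W: exactly the mismatch between the review number and the spec number.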

See what a mess this is? And why nobody in their right mind
wants to tell you the power? :-)

OK, so I tell you the power is 70W, the card has one cable
(third of the four classes), and I'm probably not too far off :-)
Chances are, there's enough over-estimation in the selection
of your power supply, that you or I will never know the
difference. Only the thermal result ("my computer case is
too hot") remains as a potential issue.

Only when a user comes here, with a Shuttle with a 200W
power supply in it, are we in serious trouble as estimators.
For those, you *really* need to know your stuff. And the
user has to be prepared to send a video card back, if the
PSU shuts off :-)
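For a small-PSU machine like that Shuttle, the estimator's arithmetic is a simple headroom check. The component draws and the 25% margin below are rough assumptions for illustration, not measurements:

```python
def psu_ok(psu_watts, draws, margin=0.25):
    """True if the summed draws, plus a headroom margin, fit the PSU."""
    total = sum(draws.values())
    return total * (1 + margin) <= psu_watts

# Hypothetical Shuttle build; the numbers are rough guesses.
shuttle = {"cpu": 65, "board+ram+disk": 40, "video_card": 70}

print(psu_ok(200, shuttle))  # False: 175 W * 1.25 = 218.75 W > 200 W
print(psu_ok(300, shuttle))  # True: plenty of headroom
```

With a 200W supply there is no margin left for a 70W card, which is why the card may have to go back if the PSU shuts off under load.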

*******

Xbitlabs has stopped measuring card power numbers.

Enjoy this last summary, as an indication of the measured
values of a few cards.

http://www.xbitlabs.com/articles/gra...0_3.html#sect0

Note the Geforce 210 drawing 8.7W while playing Crysis.
That's how you get away without a fan for cooling.

(No cables, no fan, Geforce 210, gutless)
http://images17.newegg.com/is/image/newegg/14-130-606-Z03?$S640W$

Of course, the Geforce 210 only runs Crysis at three frames
per second, so it's no surprise that it doesn't
use much power doing so. Selecting the antialiasing levels
that they did, did not help matters by any stretch of the
imagination. A person who could only afford a 210 would
have turned off the antialiasing.

http://www.xbitlabs.com/articles/gra..._11.html#sect0

When Xbitlabs doesn't list your card, I go here. The power
numbers are "3-sigma crap", so don't get too carried away.

http://www.gpureview.com/GeForce-210-card-621.html

Max Power Draw: 30.5 W --- Ummm, OK... Sure...

See the size of the error bars involved? Wetting a
finger and sticking it into the air will do you as much good.
If the card really used 30.5W, that fanless heatsink is
going to be scorchingly hot.

HTH,
Paul
#3  July 1st 13, 04:27 AM, posted to alt.comp.periphs.videocards.nvidia
Robert Miles[_2_]
Posts: 20

On 6/30/2013 10:56 AM, Paul wrote:
[snip]


Nvidia, at least, gives a recommended minimum power supply
rating for graphics cards based on their GPU chips, except
for some of the lower-end cards. Sometimes they give power
use ratings for those cards as well. However, they make them
hard to find; you may have to read all their many web pages
on those cards to find where they put the power ratings.

http://www.nvidia.com/object/geforce_family.html

Hint - start by looking at one of the recent high-end cards
first, to get an idea of where they put that information;
it's often near the end of the specifications section. For
example, the GTX Titan card uses 250 watts for the card alone.

I've read of some graphics cards rated at using 300 watts
for the card alone (about the limit for many PC cases), but
don't remember for which cards.

A quick look over at the AMD/ATI site didn't find any
similar power information.

http://www.amd.com/US/PRODUCTS/DESKT...n-hd-6000.aspx

For the higher-power cards, try to get one where the card
has a fan and a fan casing designed so that the fan blows the
card's hot air out of the computer's case.
 



