A computer components & hardware forum. HardwareBanter


Lower Power Utilization for High End Video Card?



 
 
  #1  
Old November 25th 12, 12:51 PM posted to alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati
W[_3_]
external usenet poster
 
Posts: 118
Default Lower Power Utilization for High End Video Card?

I have an older XP computer in a living room on which I installed an nVidia
GeForce 8800 Ultra. The card performs well, but to my disbelief, with the
monitor turned off and the computer doing nothing but displaying an inactive
Windows desktop, the nVidia card is consuming about 160 watts continuously.
Since the system is only used to run a few virtual machines about 99% of the
time, that is a lot of wasted energy. I want a card that can stop burning
watts when it is in a low-use mode.

Does anyone make a top-tier video card that can drop itself into a minimal
power state when it is not being used heavily? I read somewhere that some
newer AMD cards with Eyefinity could get power utilization in an unused mode
down under 20 watts. What are the details on that?

--
W


  #2  
Old November 25th 12, 06:38 PM posted to alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati
Paul
external usenet poster
 
Posts: 13,364
Default Lower Power Utilization for High End Video Card?

W wrote:
[...] with the monitor turned off and the computer doing nothing but
displaying an inactive Windows desktop, the nVidia card is consuming about
160 watts continuously. [...] Does anyone make a top-tier video card that
can drop itself into a minimal power state when it is not being used
heavily?


That's true of newer cards from either company.

The ratio of 3D_max to Idle is improving. Your card could be 70W
at idle (measured at the card), and newer cards have actually
improved on that.

Xbitlabs.com used to do per-rail power measurement, using
a specially modified motherboard, but they've stopped doing
that, so we no longer have those measurements available
for newer cards. All they do now is system power measurements,
which are useless for determining the exact 3D_max to Idle ratio.
(If they had a "system power with no video card present" measurement,
then their measurements would have some value.)

All I can tell you is that a newer card will *likely* be lower
at idle. The 8800 dates back to the "bad" days.

This is another one of those sites that only does system power.
HD 7970 "system idle" 113W, "system 3D Max" 391W. So the idle
is better there. Your card is around 70W idle, 131W max, which
means that, ratio-wise, it doesn't do that well at idle.

http://www.anandtech.com/show/5261/a...7970-review/27
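
If you want to play with numbers like these yourself, here's a rough
sketch (Python). The per-card figures are the ones quoted above; the
"no video card" baseline is a made-up placeholder, since system-only
reviews don't publish one:

    # Rough idle/load comparison from review numbers.
    # Per-card figures are the 8800 non-Ultra numbers quoted above; the
    # system baseline is a hypothetical placeholder, not a published value.
    card_idle_w, card_max_w = 70.0, 131.0
    print(f"8800 non-Ultra: max/idle ratio {card_max_w / card_idle_w:.1f}x")

    # To estimate a card's idle draw from a system-level review you need a
    # baseline: card_idle ~= system_idle_with_card - system_idle_without_card.
    system_idle_w = 113.0       # HD 7970 "system idle" from the review above
    baseline_guess_w = 80.0     # assumed no-GPU baseline (placeholder)
    print(f"HD 7970 idle estimate: ~{system_idle_w - baseline_guess_w:.0f} W "
          "(only as good as the baseline guess)")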

There was an era when silicon gates were relatively leaky.
Intel Prescott was an example of that, where 25% of DC power
was just wasted as heat and did nothing for you. While chips
still leak, more development work has gone into making gate
structures that don't leak quite as badly as that. (The geometry
of the gates shrank, and the gates and silicon structures had to
be redesigned to keep leakage from rising even worse than in the
Prescott era.) The other improvement comes from clock gating:
desktop cards are now much closer to how mobile graphics work
in that respect.

There's a good chance that, no matter what card you buy,
it'll do better than your 70W-idle 8800 family card.

Paul
  #3  
Old November 25th 12, 09:21 PM posted to alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati
Tom
external usenet poster
 
Posts: 6
Default Lower Power Utilization for High End Video Card?

Hey Paul... do you have a web site that collects your learned answers? I
certainly get a lot from your explanations.

T2

"Paul" wrote in message ...

W wrote:
I have an older XP computer in a living room on which I installed an
nVidia
GEForce 8800 Ultra. The card performs well, but to my disbelief with the
monitor turned off and the computer doing nothing but displaying an
inactive
Windows desktop, the nVidia card is consuming about 160 watts of energy
continuously. Since the system is only used to run a few virtual
machines
about 99% of the time, that is a lot of wasted energy. I want a card
that can stop burning watts when it is in a low use mode.

Does anyone make a top tier video card that can power itself to a minimum
power utilization mode when the card is not being used heavily? I read
somewhere that some newer version of AMD Eyefinity could get power
utilization in an unused mode down under 20 watts. What are details on
that?


This is true of newer cards from either company.

The ratio of 3D_max to Idle is improving. Your card could be 70W
at idle (measured at the card), and newer cards have actually
improved on that.

Xbitlabs.com used to do per-rail power measurement, using
a specially modified motherboard, but they've stopped doing
that, and so we no longer have those measurements available
for newer cards. All they do now is system power measurements,
which are useless for determining the exact 3D_max to Idle ratio.
(If they had a "system power with no video present" measurement,
then, their measurements would have some value.)

All I can tell you, is a newer card will *likely* be lower
at idle. The 8800 is still back in the "bad" days.

This is another one of those sites that only does system power.
HD 7970 "system idle" 113W, "system 3D Max" 391W. So the idle
is better there. Your card is around 70W idle, 131W max, which
means ratio-wise, it doesn't do that well at idle.

http://www.anandtech.com/show/5261/a...7970-review/27

There was an era, when silicon gates were relatively leaky.
Intel Prescott was an example of that, where 25% of DC power
was just wasted as heat, and did nothing for you. While chips
still leak, more development work has gone into making
structures for gates, which don't leak quite as bad as that.
(The geometry of the gates shrunk, and the gates and silicon
structures had to be redesigned to prevent leakage from them
rising worse than the Prescott era.) The other improvement
comes from clock gating - where desktop cards are now closer
to how mobile graphics work, in terms of clock gating.

There's a good chance, that no matter what card you buy,
it'll do better than your 70W idle 8800 family card.

Paul

  #4  
Old November 26th 12, 01:02 AM posted to alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati
W[_3_]
external usenet poster
 
Posts: 118
Default Lower Power Utilization for High End Video Card?

"Paul" wrote in message
...
W wrote:
I have an older XP computer in a living room on which I installed an

nVidia
GEForce 8800 Ultra. The card performs well, but to my disbelief with

the
monitor turned off and the computer doing nothing but displaying an

inactive
Windows desktop, the nVidia card is consuming about 160 watts of energy
continuously. Since the system is only used to run a few virtual

machines
about 99% of the time, that is a lot of wasted energy. I want a card
that can stop burning watts when it is in a low use mode.

Does anyone make a top tier video card that can power itself to a

minimum
power utilization mode when the card is not being used heavily? I read
somewhere that some newer version of AMD Eyefinity could get power
utilization in an unused mode down under 20 watts. What are details on
that?


This is true of newer cards from either company.

The ratio of 3D_max to Idle is improving. Your card could be 70W
at idle (measured at the card), and newer cards have actually
improved on that.

Xbitlabs.com used to do per-rail power measurement, using
a specially modified motherboard, but they've stopped doing
that, and so we no longer have those measurements available
for newer cards. All they do now is system power measurements,
which are useless for determining the exact 3D_max to Idle ratio.
(If they had a "system power with no video present" measurement,
then, their measurements would have some value.)

All I can tell you, is a newer card will *likely* be lower
at idle. The 8800 is still back in the "bad" days.

This is another one of those sites that only does system power.
HD 7970 "system idle" 113W, "system 3D Max" 391W. So the idle
is better there. Your card is around 70W idle, 131W max, which
means ratio-wise, it doesn't do that well at idle.

http://www.anandtech.com/show/5261/a...7970-review/27

There was an era, when silicon gates were relatively leaky.
Intel Prescott was an example of that, where 25% of DC power
was just wasted as heat, and did nothing for you. While chips
still leak, more development work has gone into making
structures for gates, which don't leak quite as bad as that.
(The geometry of the gates shrunk, and the gates and silicon
structures had to be redesigned to prevent leakage from them
rising worse than the Prescott era.) The other improvement
comes from clock gating - where desktop cards are now closer
to how mobile graphics work, in terms of clock gating.

There's a good chance, that no matter what card you buy,
it'll do better than your 70W idle 8800 family card.


It's not clear what the data you report means if it is a measurement of
total power used by the system. You would have to subtract out the system
power used when no video card is installed to get any kind of proxy for the
power used by the video card alone, wouldn't you?

This article:


http://us.digitalversus.com/graphics...4621/test.html

in its "Power Use" section, suggests that the ATI 7850 can go into an idle
mode that uses 3 watts; effectively, the card turns itself off:

"Better still, the excellent ZeroCore Power feature gives a 16% reduction in
energy consumption at idle and allows you to turn the card's fan off. For
this, the computer has to be configured so that it switches the screen off
after a given period of time. As soon as the screen goes on standby, the
card is almost entirely switched off and only consumes 3 watts of power,
bringing the overall consumption of our test computer down to 74 watts."

On my system, the nVidia 8800 Ultra is consuming 160 watts *just for the
video card*, and that is while the system is in an idle state. 3 watts
versus 160 watts would be a huge difference.
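
To put that gap in perspective, here is a quick back-of-the-envelope
calculation (the $0.12/kWh electricity rate is just an assumed figure):

    # Annual energy for a card that idles essentially all the time (Python).
    # The electricity rate is an assumed figure; the wattages are the ones
    # being compared in this thread.
    hours_per_year = 24 * 365
    rate_usd_per_kwh = 0.12                     # assumption

    for label, watts in [("8800 Ultra idle", 160), ("ZeroCore idle", 3)]:
        kwh = watts * hours_per_year / 1000
        print(f"{label}: {kwh:.0f} kWh/year, ~${kwh * rate_usd_per_kwh:.0f}/year")

    # 160 W works out to ~1400 kWh (~$168) a year; 3 W to ~26 kWh (~$3).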

--
W


  #5  
Old November 26th 12, 01:19 AM posted to alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati
Paul
external usenet poster
 
Posts: 13,364
Default Lower Power Utilization for High End Video Card?

Tom wrote:
Hey Paul... do you have a web site that may collect your learned
answers? I certainly get a lot from your explanations.

T2


Google Groups archives the contents of the newsgroups.

Most of the info I gather is "out there". It's available
on enthusiast sites, where occasionally someone from the
factory might mention some of this stuff. In cases like
Intel redesigning their silicon, there have been articles
in the public domain about that. (Intel took a lot more
chances during its evolution than AMD did. Intel turned
their transistors "upside-down", for example, when they
redid their smaller-geometry processes. AMD has one tenth
the staff, and can't afford that level of research.)

I have experience at a silicon fab, but that's back
in the days when leakage current was precisely "zero".
So my experience doesn't count for anything. My old fab
is gone now, and a drug company uses the building.
Anything silicon related has long since been thrown away.

This is the article I was looking for earlier but
couldn't find again: some per-rail power
measurements from 2010. Some of the cards have
pretty low power, like the HD 5450 at 3.2 watts (idle)
and the GeForce 210 at 3.9 watts (idle). The problem
now is getting an article of this quality in the
year 2012.

http://www.xbitlabs.com/articles/gra...0_3.html#sect0

The HD 5970 there is 44.4 watts (idle) and 240.7 watts (3D_max),
so that's like a factor of 5 between the two.
(I don't count OCCT, as it's one of a few synthetic tests
that I wouldn't normally run here. In fact, some graphics
drivers have features to detect things like OCCT or Furmark
and detune things so the card doesn't get damaged.)

http://www.generation-gpu.fr/UserImg...D5870/OCCT.jpg

So if a person can stand the crappy performance of a low-end
card (for gaming), its idle power is exceptionally low.
A card like my old 9800 Pro might be around 35 watts by
comparison. Your room isn't going to get very
warm with a 3.2 watt card.
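
For a feel of how much heat that puts into the room, watts convert
directly to BTU per hour. A rough sketch, using the card-only idle
numbers mentioned in this thread:

    # Heat dumped into the room: watts -> BTU/h.
    # 1 W of electrical draw ends up as about 3.412 BTU/h of heat.
    W_TO_BTU_PER_H = 3.412

    for label, watts in [("8800 non-Ultra idle (card only)", 70.0),
                         ("HD 5970 idle", 44.4),
                         ("HD 5450 idle", 3.2)]:
        print(f"{label}: ~{watts * W_TO_BTU_PER_H:.0f} BTU/h")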

Paul
  #6  
Old November 26th 12, 05:12 AM posted to alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati
Paul
external usenet poster
 
Posts: 13,364
Default Lower Power Utilization for High End Video Card?

W wrote:
[...] You would have to subtract out the system power used when no video
card is installed to get any kind of proxy for the power used by the video
card alone, wouldn't you?
[...] On my system, the nVidia 8800 Ultra is consuming 160 watts *just for
the video card*, and that is while the system is in an idle state. 3 watts
versus 160 watts would be a huge difference.


The Xbitlabs numbers for an 8800 non-Ultra were 70W idle and 131W busy.

In these kinds of articles, as far as I know, the "Idle" power is measured
with the desktop still visible and the user no longer pushing the mouse
around. These are not system power numbers; they are video-card-only
numbers, measured with current shunts in the 3.3V_slot, 12V_slot,
12V_PCIE#1 and 12V_PCIE#2 rails (where they exist). Xbitlabs has stopped
doing it this way; it looks like they got another motherboard and aren't
interested in fitting the shunts.

http://www.xbitlabs.com/articles/gra...0_3.html#sect0
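
For what it's worth, the card-only figure is just the sum of volts times
amps across those rails. A minimal sketch; the readings below are
placeholder values, not figures from any review:

    # Card-only power from per-rail shunt measurements (Python sketch).
    # Rail names follow the ones listed above; the voltage/current readings
    # are placeholder values, not measurements from any review.
    readings = {
        "3.3V_slot":  (3.3, 0.4),    # (volts, amps) - placeholders
        "12V_slot":   (12.0, 1.8),
        "12V_PCIE#1": (12.0, 2.4),
        "12V_PCIE#2": (12.0, 1.6),
    }

    card_power_w = sum(volts * amps for volts, amps in readings.values())
    print(f"Card-only power: {card_power_w:.1f} W")   # roughly 71 W here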

The idle power of the card varies with the card's processing power in those
charts. The HD 5970, for example, is still 44.4W at idle. A low-end card
like the HD 5450 is 3.2W idle.

Turning off the screen is good for servers, but it isn't the best choice
for a desktop, mainly because a desktop is more interactive; if you aren't
using it, chances are you've put it into S3 sleep or S4 hibernate.

Paul
  #7  
Old November 26th 12, 08:58 AM posted to alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati
W[_3_]
external usenet poster
 
Posts: 118
Default Lower Power Utilization for High End Video Card?

"Paul" wrote in message
...
W wrote:
"Paul" wrote in message
...
W wrote:
I have an older XP computer in a living room on which I installed an

nVidia
GEForce 8800 Ultra. The card performs well, but to my disbelief with

the
monitor turned off and the computer doing nothing but displaying an

inactive
Windows desktop, the nVidia card is consuming about 160 watts of

energy
continuously. Since the system is only used to run a few virtual

machines
about 99% of the time, that is a lot of wasted energy. I want a

card
that can stop burning watts when it is in a low use mode.

Does anyone make a top tier video card that can power itself to a

minimum
power utilization mode when the card is not being used heavily? I

read
somewhere that some newer version of AMD Eyefinity could get power
utilization in an unused mode down under 20 watts. What are details

on
that?

This is true of newer cards from either company.

The ratio of 3D_max to Idle is improving. Your card could be 70W
at idle (measured at the card), and newer cards have actually
improved on that.

Xbitlabs.com used to do per-rail power measurement, using
a specially modified motherboard, but they've stopped doing
that, and so we no longer have those measurements available
for newer cards. All they do now is system power measurements,
which are useless for determining the exact 3D_max to Idle ratio.
(If they had a "system power with no video present" measurement,
then, their measurements would have some value.)

All I can tell you, is a newer card will *likely* be lower
at idle. The 8800 is still back in the "bad" days.

This is another one of those sites that only does system power.
HD 7970 "system idle" 113W, "system 3D Max" 391W. So the idle
is better there. Your card is around 70W idle, 131W max, which
means ratio-wise, it doesn't do that well at idle.

http://www.anandtech.com/show/5261/a...7970-review/27

There was an era, when silicon gates were relatively leaky.
Intel Prescott was an example of that, where 25% of DC power
was just wasted as heat, and did nothing for you. While chips
still leak, more development work has gone into making
structures for gates, which don't leak quite as bad as that.
(The geometry of the gates shrunk, and the gates and silicon
structures had to be redesigned to prevent leakage from them
rising worse than the Prescott era.) The other improvement
comes from clock gating - where desktop cards are now closer
to how mobile graphics work, in terms of clock gating.

There's a good chance, that no matter what card you buy,
it'll do better than your 70W idle 8800 family card.


It's not clear what the data you report means if it is a measurement of
total power used by the system. You would have to subtract out the

system
power use when no video card is installed to get any kind of proxy for

power
used by the video card alone?

This article:



http://us.digitalversus.com/graphics...4621/test.html

in the section named "Power Use" is suggesting that the ATI 7850 can go

into
an idle mode that uses 3 watts. Effectively the card turns itself off:

"Better still, the excellent ZeroCore Power feature gives a 16%

reduction in
energy consumption at idle and allows you to turn the card's fan off.

For
this, the computer has to be configured so that it switches the screen

off
after a given period of time. As soon as the screen goes on standby, the
card is almost entirely switched off and only consumes 3 watts of power,
bringing the overall consumption of our test computer down to 74 watts."

On my system, the nVidia 8800 Ultra is consuming 160 watts *just for the
power card* and when the system is in idle state. 3 watts versus 160
watts is a huge difference?


The Xbitlabs numbers for an 8800 non-Ultra were 70W idle and 131W busy.

In these kinds of articles, as far as I know, the "Idle" power is with
desktop still visible and the user has stopped pushing the mouse around.
These are not system power numbers, these are video card only, measured

with
current shunt in 3.3V_slot, 12V_slot, 12V_PCIE#1 and 12V_PCIE#2 (if they

exist).
Xbitlabs have stopped doing it this way, because it looks like they got
another motherboard, and aren't interested in fitting the shunts.


http://www.xbitlabs.com/articles/gra...0_3.html#sect0

The idle power of the card varies with the card's processing power in

those
charts. The HD 5970 for example, is still 44.4W for the card. A low
end card like the HD 5450 is 3.2W idle.


If I believe the AMD web site, the Radeon ZeroCore Power technology will
put the video card into a sleep state that uses less than 4 watts while the
system is running with the screen off. Good or bad, my computer acts as a
server and the system will not sleep. But the screen will be resting 99% of
the time, and during that rest time I want to minimize the power draw.

What is the most powerful AMD video card that fully implements the ZeroCore
technology today?


Turning off the screen is good for servers, but it isn't the best choice
for a desktop, mainly because a desktop is more interactive; if you aren't
using it, chances are you've put it into S3 sleep or S4 hibernate.


It is not that unusual for a computer to act like a server and run virtual
machines in the background. In my case they run a home Active Directory
domain and some other administrative servers.

--
W


  #8  
Old November 26th 12, 06:12 PM posted to alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati
Homer Jay Simpson[_3_]
external usenet poster
 
Posts: 26
Default Lower Power Utilization for High End Video Card?

"W" wrote in message
...
"Paul" wrote in message
...
W wrote:
"Paul" wrote in message
...
W wrote:
I have an older XP computer in a living room on which I installed an
nVidia
GEForce 8800 Ultra. The card performs well, but to my disbelief with
the
monitor turned off and the computer doing nothing but displaying an
inactive
Windows desktop, the nVidia card is consuming about 160 watts of

energy
continuously. Since the system is only used to run a few virtual
machines
about 99% of the time, that is a lot of wasted energy. I want a

card
that can stop burning watts when it is in a low use mode.

Does anyone make a top tier video card that can power itself to a
minimum
power utilization mode when the card is not being used heavily? I

read
somewhere that some newer version of AMD Eyefinity could get power
utilization in an unused mode down under 20 watts. What are details

on
that?

This is true of newer cards from either company.

The ratio of 3D_max to Idle is improving. Your card could be 70W
at idle (measured at the card), and newer cards have actually
improved on that.

Xbitlabs.com used to do per-rail power measurement, using
a specially modified motherboard, but they've stopped doing
that, and so we no longer have those measurements available
for newer cards. All they do now is system power measurements,
which are useless for determining the exact 3D_max to Idle ratio.
(If they had a "system power with no video present" measurement,
then, their measurements would have some value.)

All I can tell you, is a newer card will *likely* be lower
at idle. The 8800 is still back in the "bad" days.

This is another one of those sites that only does system power.
HD 7970 "system idle" 113W, "system 3D Max" 391W. So the idle
is better there. Your card is around 70W idle, 131W max, which
means ratio-wise, it doesn't do that well at idle.

http://www.anandtech.com/show/5261/a...7970-review/27

There was an era, when silicon gates were relatively leaky.
Intel Prescott was an example of that, where 25% of DC power
was just wasted as heat, and did nothing for you. While chips
still leak, more development work has gone into making
structures for gates, which don't leak quite as bad as that.
(The geometry of the gates shrunk, and the gates and silicon
structures had to be redesigned to prevent leakage from them
rising worse than the Prescott era.) The other improvement
comes from clock gating - where desktop cards are now closer
to how mobile graphics work, in terms of clock gating.

There's a good chance, that no matter what card you buy,
it'll do better than your 70W idle 8800 family card.

It's not clear what the data you report means if it is a measurement of
total power used by the system. You would have to subtract out the

system
power use when no video card is installed to get any kind of proxy for

power
used by the video card alone?

This article:



http://us.digitalversus.com/graphics...4621/test.html

in the section named "Power Use" is suggesting that the ATI 7850 can go

into
an idle mode that uses 3 watts. Effectively the card turns itself
off:

"Better still, the excellent ZeroCore Power feature gives a 16%

reduction in
energy consumption at idle and allows you to turn the card's fan off.

For
this, the computer has to be configured so that it switches the screen

off
after a given period of time. As soon as the screen goes on standby,
the
card is almost entirely switched off and only consumes 3 watts of
power,
bringing the overall consumption of our test computer down to 74
watts."

On my system, the nVidia 8800 Ultra is consuming 160 watts *just for
the
power card* and when the system is in idle state. 3 watts versus 160
watts is a huge difference?


The Xbitlabs numbers for an 8800 non-Ultra were 70W idle and 131W busy.

In these kinds of articles, as far as I know, the "Idle" power is with
desktop still visible and the user has stopped pushing the mouse around.
These are not system power numbers, these are video card only, measured

with
current shunt in 3.3V_slot, 12V_slot, 12V_PCIE#1 and 12V_PCIE#2 (if they

exist).
Xbitlabs have stopped doing it this way, because it looks like they got
another motherboard, and aren't interested in fitting the shunts.


http://www.xbitlabs.com/articles/gra...0_3.html#sect0

The idle power of the card varies with the card's processing power in

those
charts. The HD 5970 for example, is still 44.4W for the card. A low
end card like the HD 5450 is 3.2W idle.


If I believe the AMD web site, the RADEON ZeroCore power technology will
put the video card into a sleep state that uses less than 4 watts while
the
system is running with the screen off. Good or bad, my computer will act
as a server and the system will not sleep. But the screen will be resting
99% of the time and during that rest time I want to minimize the power
draw.

What is the most powerful AMD video card that fully implements the
ZeroCore
technology today?


Turning off the screen is good for servers, but for a desktop
isn't the best choice. Mainly because a desktop is more
interactive, and if you aren't using it, chances are you've
used S3 sleep or S4 Hibernate.


It is not that unusual for a computer to act like a server and run virtual
machines in the background. In my case those run a home active directory
and some other administrative servers.

--
W


On AMD's web page for the AMD Radeon HD 7970 GHz Edition:

http://www.amd.com/us/products/deskt...7970GHz.aspx#3

AMD ZeroCore Power technology*
- Ultra-low idle power when the system's display is off
- Efficient low power mode for desktop work
- Secondary GPUs in an AMD CrossFire™ technology configuration power down
  when unneeded

* AMD PowerPlay™, AMD PowerTune and AMD ZeroCore Power are technologies
offered by certain AMD Radeon™ products, which are designed to intelligently
manage GPU power consumption in response to certain GPU load conditions.
Not all products feature all technologies - check with your component or
system manufacturer for specific model capabilities.

It seems to be up to the add-in board partner whether or not they want to
implement the feature.



  #9  
Old November 27th 12, 02:11 PM posted to alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati
W[_3_]
external usenet poster
 
Posts: 118
Default Lower Power Utilization for High End Video Card?

"Homer Jay Simpson" wrote in message
...
"W" wrote in message
...
"Paul" wrote in message
...
W wrote:
"Paul" wrote in message
...
W wrote:
I have an older XP computer in a living room on which I installed

an
nVidia
GEForce 8800 Ultra. The card performs well, but to my disbelief

with
the
monitor turned off and the computer doing nothing but displaying an
inactive
Windows desktop, the nVidia card is consuming about 160 watts of

energy
continuously. Since the system is only used to run a few virtual
machines
about 99% of the time, that is a lot of wasted energy. I want a

card
that can stop burning watts when it is in a low use mode.

Does anyone make a top tier video card that can power itself to a
minimum
power utilization mode when the card is not being used heavily? I

read
somewhere that some newer version of AMD Eyefinity could get power
utilization in an unused mode down under 20 watts. What are

details
on
that?

This is true of newer cards from either company.

The ratio of 3D_max to Idle is improving. Your card could be 70W
at idle (measured at the card), and newer cards have actually
improved on that.

Xbitlabs.com used to do per-rail power measurement, using
a specially modified motherboard, but they've stopped doing
that, and so we no longer have those measurements available
for newer cards. All they do now is system power measurements,
which are useless for determining the exact 3D_max to Idle ratio.
(If they had a "system power with no video present" measurement,
then, their measurements would have some value.)

All I can tell you, is a newer card will *likely* be lower
at idle. The 8800 is still back in the "bad" days.

This is another one of those sites that only does system power.
HD 7970 "system idle" 113W, "system 3D Max" 391W. So the idle
is better there. Your card is around 70W idle, 131W max, which
means ratio-wise, it doesn't do that well at idle.

http://www.anandtech.com/show/5261/a...7970-review/27

There was an era, when silicon gates were relatively leaky.
Intel Prescott was an example of that, where 25% of DC power
was just wasted as heat, and did nothing for you. While chips
still leak, more development work has gone into making
structures for gates, which don't leak quite as bad as that.
(The geometry of the gates shrunk, and the gates and silicon
structures had to be redesigned to prevent leakage from them
rising worse than the Prescott era.) The other improvement
comes from clock gating - where desktop cards are now closer
to how mobile graphics work, in terms of clock gating.

There's a good chance, that no matter what card you buy,
it'll do better than your 70W idle 8800 family card.

It's not clear what the data you report means if it is a measurement

of
total power used by the system. You would have to subtract out the

system
power use when no video card is installed to get any kind of proxy

for
power
used by the video card alone?

This article:




http://us.digitalversus.com/graphics...4621/test.html

in the section named "Power Use" is suggesting that the ATI 7850 can

go
into
an idle mode that uses 3 watts. Effectively the card turns itself
off:

"Better still, the excellent ZeroCore Power feature gives a 16%

reduction in
energy consumption at idle and allows you to turn the card's fan off.

For
this, the computer has to be configured so that it switches the

screen
off
after a given period of time. As soon as the screen goes on standby,
the
card is almost entirely switched off and only consumes 3 watts of
power,
bringing the overall consumption of our test computer down to 74
watts."

On my system, the nVidia 8800 Ultra is consuming 160 watts *just for
the
power card* and when the system is in idle state. 3 watts versus

160
watts is a huge difference?


The Xbitlabs numbers for an 8800 non-Ultra were 70W idle and 131W busy.

In these kinds of articles, as far as I know, the "Idle" power is with
desktop still visible and the user has stopped pushing the mouse

around.
These are not system power numbers, these are video card only, measured

with
current shunt in 3.3V_slot, 12V_slot, 12V_PCIE#1 and 12V_PCIE#2 (if

they
exist).
Xbitlabs have stopped doing it this way, because it looks like they got
another motherboard, and aren't interested in fitting the shunts.



http://www.xbitlabs.com/articles/gra...0_3.html#sect0

The idle power of the card varies with the card's processing power in

those
charts. The HD 5970 for example, is still 44.4W for the card. A low
end card like the HD 5450 is 3.2W idle.


If I believe the AMD web site, the RADEON ZeroCore power technology

will
put the video card into a sleep state that uses less than 4 watts while
the
system is running with the screen off. Good or bad, my computer will

act
as a server and the system will not sleep. But the screen will be

resting
99% of the time and during that rest time I want to minimize the power
draw.

What is the most powerful AMD video card that fully implements the
ZeroCore
technology today?


Turning off the screen is good for servers, but for a desktop
isn't the best choice. Mainly because a desktop is more
interactive, and if you aren't using it, chances are you've
used S3 sleep or S4 Hibernate.


It is not that unusual for a computer to act like a server and run

virtual
machines in the background. In my case those run a home active

directory
and some other administrative servers.

--
W


On AMD's web page for the AMD Radeon HD 7970 GHz Edition:


http://www.amd.com/us/products/deskt...7970GHz.aspx#3

AMD ZeroCore Power technology*
. Ultra-low idle power when the system's display is off
. Efficient low power mode for desktop work
. Secondary GPUs in an AMD CrossFireT technology configuration power down
when unneeded

* AMD PowerPlayT, AMD PowerTune and AMD ZeroCore Power are technologies
offered by certain AMD RadeonT products, which are designed to

intelligently
manage GPU power consumption in response to certain GPU load conditions.
Not all products feature all technologies - check with your component or
system manufacturer for specific model capabilities.

It seems to be up to the add-in board partner whether or not they want to
implement the feature.


Right. Which leads back to my question: what is the most powerful AMD video
card that fully implements the ZeroCore technology today?

I want a card that is in the top 10% of performance and that uses under 4
watts when the video card is idle.

--
W


  #10  
Old November 27th 12, 09:51 PM posted to alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati
Paul
external usenet poster
 
Posts: 13,364
Default Lower Power Utilization for High End Video Card?

W wrote:
"Homer Jay Simpson" wrote in message
...


It seems to be up to the add-in board partner whether or not they want to
implement the feature.


Right. Which leads back to my question: what is the most powerful AMD video
card that fully implements the ZeroCore technology today?

I want a card that is in the top 10% of performance and that uses under 4
watts when the video card is idle.


Most of the designs out there use information from a
reference implementation. Video card designers don't just
run amok by themselves; they need lots of help.

If you need to turn off the core power, all you need is a
core switching regulator with a "zero volts" VID setting,
since current video cards send a VID code to the regulator.
A zero setting would be interpreted by the regulator as a
request to turn off. This was done years ago on CPU VCore
regulators, which switch off when the VID lines are left
floating. So it all depends on whether the regulators used
(Volterra parts, for example) support a feature like that.
The rest of the support comes from the design of the GPU
itself (separate power planes for the appropriate subsystems,
for instance, since it would be useful to maintain some state
information while in the ZeroCore state - you need to drive
the VID lines, for example).
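
As a toy illustration of the idea only (the code-to-voltage mapping below
is entirely made up, not taken from any real VRM or GPU specification):

    # Toy model of a VID-controlled core regulator with an "off" code.
    # The mapping is hypothetical; it is not from any real VRM spec.
    def regulator_output(vid_code: int) -> float:
        """Return core voltage for a VID code; code 0 means 'rail off'."""
        if vid_code == 0:
            return 0.0                      # ZeroCore-style request: rail off
        return 0.8 + vid_code * 0.00625     # hypothetical 6.25 mV steps

    print(regulator_output(0))    # 0.0   -> core rail powered down
    print(regulator_output(40))   # ~1.05 -> a normal running voltage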

No regular website is going to be measuring the ZeroCore condition.
(And since Xbitlabs "got lazy", they'd have been the ones best technically
equipped to do such work. But they don't have the motherboard any more.)
I'd never heard of ZeroCore until you mentioned it. It requires that the
chip be split into power domains, such that the PCI Express portion keeps
running while the core is powered down. (Otherwise, the user is going to
see side effects resembling hot-insertion behavior.) On a non-ZeroCore
card, I would expect two regulators: one for the core, one for the memory
and memory interface. Perhaps the PCI Express logic can draw power from
the same one as the memory? You'd probably want to maintain video card
memory state (self-refresh) while in the ZeroCore condition, as otherwise
there'd be a noticeable delay if the video card's memory had to be flushed
to system memory.

This sounds like a question that only someone in tech support
at ATI or Nvidia could answer, and would likely require consultation
with engineering.

********

Using ZeroCore as a search term, I can see a user having problems with it.
And the problems are visible with the 12.10 driver (that's like a
month ago).

http://devforums.amd.com/game/messag...hreadid=161791

"I called AMD and told him about my problem. He assured me they know
about the ZeroCore problem and have been looking into it. The first
thing he said to try is installing the 12.11 beta drivers. If the
problem is still occuring then he wanted me to run msconfig and choose
Selective Startup, unchecking both the Load Services and Load Startup
options. If ZeroCore works then it means that either a Startup Service
or Startup Application is causing ZeroCore to fail. He gave me a
workaround, just turn off the monitor-sleep function, since that is the
functionality that turns ZeroCore on. If that's disabled ZeroCore
doesn't turn on, so it will just run at 20% until the whole system goes
into hibernate mode. It have been doing that and just turning off the
monitor with the power button since he said that ZeroCore is activated
when the monitor is told to go to sleep by the O.S."

That would be selecting S1 sleep state, as far as I know.

But at least I got a link to an article on when it was introduced.
It confirms my basic ideas on how you'd implement it (make an
island out of core, leave some peripheral stuff powered).

http://www.anandtech.com/show/5261/a...7970-review/11

HD 7970 was introduced a year ago (2011-12-22), according to this.
You'd expect it as a feature on any ATI card more modern than that
(unless a card is introduced using older silicon of course).

http://www.gpureview.com/videocards.php

The only practical way to watch ZeroCore is with an external power
meter, as expecting the card to answer probes while in the ZeroCore
state is expecting a lot. With many low-power states on a computer,
like C6, the mere act of probing the device upsets the power
state and gives the wrong answer. It would take careful engineering
of the ZeroCore feature to ensure you could actively
monitor the thing while it's drawing only 3 watts. Using external
monitoring removes all uncertainty. Hearing the fan spin
does *not* mean it is broken. Even at 3 watts dissipation,
the fan might need to run occasionally, and it would be stupid
to turn off the fan entirely while in ZeroCore. The cooling
system should be ready for action at any time, as temperatures
require. You don't want the GPU to overheat in any circumstance.
Hearing the fan does suggest something is still drawing power, though.
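
If you do log the wall meter, verifying the drop is simple arithmetic.
A sketch, assuming a hypothetical CSV of "timestamp,watts" samples
exported from a logging meter (file names and format are placeholders):

    # Checking for the ZeroCore drop from an external wall-meter log.
    # Assumes a hypothetical CSV with "timestamp,watts" columns; the file
    # names and format are placeholders for whatever your meter exports.
    import csv

    def average_watts(path: str) -> float:
        with open(path, newline="") as f:
            samples = [float(row["watts"]) for row in csv.DictReader(f)]
        return sum(samples) / len(samples)

    idle_desktop = average_watts("desktop_idle.csv")   # screen on, desktop idle
    screen_off = average_watts("screen_off.csv")       # monitor asleep
    print(f"Drop when the screen sleeps: {idle_desktop - screen_off:.1f} W")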

Paul
 



