A computer components & hardware forum. HardwareBanter

Lower Power Utilization for High End Video Card?



 
 
#1 - November 25th 12, 11:51 AM - W
(posted to alt.comp.periphs.videocards.nvidia, alt.comp.periphs.videocards.ati)

I have an older XP computer in a living room on which I installed an nVidia
GeForce 8800 Ultra. The card performs well, but to my disbelief, with the
monitor turned off and the computer doing nothing but displaying an inactive
Windows desktop, the nVidia card is drawing about 160 watts continuously.
Since the system does nothing but run a few virtual machines about 99% of
the time, that is a lot of wasted energy. I want a card that can stop
burning watts when it is in a low-use mode.

Does anyone make a top-tier video card that can drop itself into a minimal
power state when it is not being used heavily? I read somewhere that some
newer version of AMD Eyefinity could get power draw in an unused mode down
under 20 watts. What are the details on that?

--
W


#2 - November 25th 12, 05:38 PM - Paul

W wrote:
[quoted text snipped]


This is true of newer cards from either company.

The ratio of 3D_max to Idle is improving. Your card could be 70W
at idle (measured at the card), and newer cards have actually
improved on that.

Xbitlabs.com used to do per-rail power measurement, using
a specially modified motherboard, but they've stopped doing
that, and so we no longer have those measurements available
for newer cards. All they do now is system power measurements,
which are useless for determining the exact 3D_max to Idle ratio.
(If they had a "system power with no video present" measurement,
then, their measurements would have some value.)
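
As a rough illustration of what those per-rail numbers mean, here is a
minimal sketch (Python, not Xbitlabs' actual tooling) of the arithmetic:
each supply rail gets a current shunt, and the card-only figure is the sum
of volts times amps over the PCIe slot and auxiliary connector rails. The
rail names reflect that split (detailed later in the thread); the current
readings are invented for illustration.

    # Card-only power from per-rail shunt readings (illustrative sketch).
    RAIL_VOLTAGES = {          # nominal rail voltages
        "3.3V_slot": 3.3,
        "12V_slot": 12.0,
        "12V_PCIE#1": 12.0,
        "12V_PCIE#2": 12.0,
    }

    def card_power(currents_amps):
        """Sum V * I over every measured rail to get card-only watts."""
        return sum(RAIL_VOLTAGES[rail] * amps
                   for rail, amps in currents_amps.items())

    # Hypothetical idle readings for an 8800-class card (invented numbers).
    idle = {"3.3V_slot": 0.5, "12V_slot": 2.1,
            "12V_PCIE#1": 2.2, "12V_PCIE#2": 1.4}
    print("Idle, card only: %.1f W" % card_power(idle))   # about 70 W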

All I can tell you is that a newer card will *likely* be lower
at idle. The 8800 is still from the "bad" days.

This is another one of those sites that only reports system power:
HD 7970 "system idle" 113W, "system 3D max" 391W. So the idle is
better there. Your card is around 70W idle, 131W max (card only),
which means that, ratio-wise, it doesn't do that well at idle.

http://www.anandtech.com/show/5261/a...7970-review/27

There was an era when silicon gates were relatively leaky.
Intel Prescott was an example of that, where 25% of DC power
was simply wasted as heat and did nothing for you. While chips
still leak, more development work has gone into gate structures
that don't leak quite as badly as that. (The geometry of the
gates shrank, and the gates and silicon structures had to be
redesigned to keep leakage from getting even worse than in the
Prescott era.) The other improvement comes from clock gating:
desktop cards now behave much more like mobile graphics parts
in that respect.

There's a good chance that no matter what card you buy,
it'll do better than your 70W-idle 8800-family card.

Paul
#3 - November 25th 12, 08:21 PM - Tom

Hey Paul... do you have a web site that collects your learned answers? I
certainly get a lot from your explanations.

T2

"Paul" wrote in message ...

[quoted text snipped]

#4 - November 26th 12, 12:19 AM - Paul

Tom wrote:
Hey Paul... do you have a web site that may collect your learned
answers? I certainly get a lot from your explanations.

T2


Google Groups archives the contents of the newsgroups.

Most of the info I gather is "out there". It's available
on enthusiast sites, where occasionally someone from the
factory might mention some of this stuff. In cases like
Intel redesigning its silicon, there have been articles
in the public domain about that. (Intel took a lot more
chances during its evolution than AMD did. Intel turned
its transistors "upside-down", for example, when it redid
its smaller-geometry processes. AMD has one tenth the
staff and can't afford that level of research.)

I have experience at a silicon fab, but that's back
in the days when leakage current was precisely "zero".
So my experience doesn't count for anything. My old fab
is gone now, and a drug company uses the building.
Anything silicon related has long since been thrown away.

This is the article I was looking for earlier but couldn't
find at the time. Some per-rail power measurements from 2010.
Some of the cards have pretty low power, like the HD 5450 at
3.2 watts (idle) and the GeForce 210 at 3.9 watts (idle). The
problem now is getting an article of this quality in the
year 2012.

http://www.xbitlabs.com/articles/gra...0_3.html#sect0

The HD 5970 there is 44 watts (idle) and 240.7 watts (3D_max),
so that's roughly a factor of 5 between the two.
(I don't count OCCT, as it's one of a few synthetic tests
that I wouldn't normally run here. In fact, some graphics
drivers have features to detect things like OCCT or FurMark
and detune things so the card doesn't get damaged.)

http://www.generation-gpu.fr/UserImg...D5870/OCCT.jpg

So if a person can stand the crappy performance of a low-end
card (for gaming), its idle power is exceptionally low. A card
like my old 9800 Pro might be around 35 watts by comparison.
Your room isn't going to get very warm with a 3.2 watt card.

Paul
#5 - November 26th 12, 12:02 AM - W

"Paul" wrote in message
...
W wrote:
I have an older XP computer in a living room on which I installed an

nVidia
GEForce 8800 Ultra. The card performs well, but to my disbelief with

the
monitor turned off and the computer doing nothing but displaying an

inactive
Windows desktop, the nVidia card is consuming about 160 watts of energy
continuously. Since the system is only used to run a few virtual

machines
about 99% of the time, that is a lot of wasted energy. I want a card
that can stop burning watts when it is in a low use mode.

Does anyone make a top tier video card that can power itself to a

minimum
power utilization mode when the card is not being used heavily? I read
somewhere that some newer version of AMD Eyefinity could get power
utilization in an unused mode down under 20 watts. What are details on
that?


This is true of newer cards from either company.

The ratio of 3D_max to Idle is improving. Your card could be 70W
at idle (measured at the card), and newer cards have actually
improved on that.

Xbitlabs.com used to do per-rail power measurement, using
a specially modified motherboard, but they've stopped doing
that, and so we no longer have those measurements available
for newer cards. All they do now is system power measurements,
which are useless for determining the exact 3D_max to Idle ratio.
(If they had a "system power with no video present" measurement,
then, their measurements would have some value.)

All I can tell you, is a newer card will *likely* be lower
at idle. The 8800 is still back in the "bad" days.

This is another one of those sites that only does system power.
HD 7970 "system idle" 113W, "system 3D Max" 391W. So the idle
is better there. Your card is around 70W idle, 131W max, which
means ratio-wise, it doesn't do that well at idle.

http://www.anandtech.com/show/5261/a...7970-review/27

There was an era, when silicon gates were relatively leaky.
Intel Prescott was an example of that, where 25% of DC power
was just wasted as heat, and did nothing for you. While chips
still leak, more development work has gone into making
structures for gates, which don't leak quite as bad as that.
(The geometry of the gates shrunk, and the gates and silicon
structures had to be redesigned to prevent leakage from them
rising worse than the Prescott era.) The other improvement
comes from clock gating - where desktop cards are now closer
to how mobile graphics work, in terms of clock gating.

There's a good chance, that no matter what card you buy,
it'll do better than your 70W idle 8800 family card.


It's not clear what the data you report means if it is a measurement of
total power used by the system. Wouldn't you have to subtract out the
system's power use with no video card installed to get any kind of proxy
for the power used by the video card alone?
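
As a rough sketch of that subtraction idea (the 113 W and 391 W echo the
whole-system figures quoted earlier for the HD 7970 test rig; the 75 W
baseline is an invented placeholder, since a real baseline still needs some
working display device installed):

    # Rough proxy for card-only power from whole-system wall measurements.
    baseline_idle = 75.0        # system idle without the card under test (assumed)
    with_card_idle = 113.0      # system idle with the card installed
    with_card_3d_max = 391.0    # system under a 3D load

    card_idle = with_card_idle - baseline_idle
    card_3d_max = with_card_3d_max - baseline_idle   # overstates the card a bit:
                                                     # CPU and PSU losses rise under load too
    print("card idle  ~ %.0f W" % card_idle)
    print("card 3Dmax ~ %.0f W (upper bound)" % card_3d_max)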

This article:


http://us.digitalversus.com/graphics...4621/test.html

in the section named "Power Use" suggests that the ATI 7850 can go into
an idle mode that uses 3 watts. Effectively, the card turns itself off:

"Better still, the excellent ZeroCore Power feature gives a 16% reduction in
energy consumption at idle and allows you to turn the card's fan off. For
this, the computer has to be configured so that it switches the screen off
after a given period of time. As soon as the screen goes on standby, the
card is almost entirely switched off and only consumes 3 watts of power,
bringing the overall consumption of our test computer down to 74 watts."

On my system, the nVidia 8800 Ultra is consuming 160 watts *just for the
video card*, and that is with the system in an idle state. 3 watts versus
160 watts is a huge difference.
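
To put that difference in perspective, a back-of-the-envelope estimate of
the energy at stake over a year (the electricity rate is an assumed
example, not a quoted price):

    # Yearly cost of the idle-draw gap; the $/kWh rate is an assumption.
    idle_old_watts = 160.0     # reported 8800 Ultra idle draw
    idle_new_watts = 3.0       # claimed ZeroCore sleep draw
    hours_per_year = 24 * 365
    rate_per_kwh = 0.12        # assumed electricity price, USD/kWh

    saved_kwh = (idle_old_watts - idle_new_watts) * hours_per_year / 1000.0
    print("~%.0f kWh/year saved, about $%.0f/year"
          % (saved_kwh, saved_kwh * rate_per_kwh))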

--
W


#6 - November 26th 12, 04:12 AM - Paul

W wrote:
[quoted text snipped]


The Xbitlabs numbers for an 8800 non-Ultra were 70W idle and 131W busy.

In these kinds of articles, as far as I know, the "Idle" power is with the
desktop still visible and the user no longer pushing the mouse around.
These are not system power numbers; they are video card only, measured with
current shunts in 3.3V_slot, 12V_slot, 12V_PCIE#1 and 12V_PCIE#2 (if they exist).
Xbitlabs has stopped doing it this way; it looks like they got another
motherboard and aren't interested in fitting the shunts.

http://www.xbitlabs.com/articles/gra...0_3.html#sect0

The idle power of the card varies with the card's processing power in those
charts. The HD 5970, for example, is still 44.4W for the card alone. A
low-end card like the HD 5450 is 3.2W idle.

Turning off the screen is good for servers, but it isn't the best choice
for a desktop, mainly because a desktop is more interactive, and if you
aren't using it, chances are you've put it into S3 sleep or S4 hibernate.

Paul
#7 - November 26th 12, 07:58 AM - W

"Paul" wrote in message
...
W wrote:
"Paul" wrote in message
...
W wrote:
I have an older XP computer in a living room on which I installed an

nVidia
GEForce 8800 Ultra. The card performs well, but to my disbelief with

the
monitor turned off and the computer doing nothing but displaying an

inactive
Windows desktop, the nVidia card is consuming about 160 watts of

energy
continuously. Since the system is only used to run a few virtual

machines
about 99% of the time, that is a lot of wasted energy. I want a

card
that can stop burning watts when it is in a low use mode.

Does anyone make a top tier video card that can power itself to a

minimum
power utilization mode when the card is not being used heavily? I

read
somewhere that some newer version of AMD Eyefinity could get power
utilization in an unused mode down under 20 watts. What are details

on
that?

This is true of newer cards from either company.

The ratio of 3D_max to Idle is improving. Your card could be 70W
at idle (measured at the card), and newer cards have actually
improved on that.

Xbitlabs.com used to do per-rail power measurement, using
a specially modified motherboard, but they've stopped doing
that, and so we no longer have those measurements available
for newer cards. All they do now is system power measurements,
which are useless for determining the exact 3D_max to Idle ratio.
(If they had a "system power with no video present" measurement,
then, their measurements would have some value.)

All I can tell you, is a newer card will *likely* be lower
at idle. The 8800 is still back in the "bad" days.

This is another one of those sites that only does system power.
HD 7970 "system idle" 113W, "system 3D Max" 391W. So the idle
is better there. Your card is around 70W idle, 131W max, which
means ratio-wise, it doesn't do that well at idle.

http://www.anandtech.com/show/5261/a...7970-review/27

There was an era, when silicon gates were relatively leaky.
Intel Prescott was an example of that, where 25% of DC power
was just wasted as heat, and did nothing for you. While chips
still leak, more development work has gone into making
structures for gates, which don't leak quite as bad as that.
(The geometry of the gates shrunk, and the gates and silicon
structures had to be redesigned to prevent leakage from them
rising worse than the Prescott era.) The other improvement
comes from clock gating - where desktop cards are now closer
to how mobile graphics work, in terms of clock gating.

There's a good chance, that no matter what card you buy,
it'll do better than your 70W idle 8800 family card.


It's not clear what the data you report means if it is a measurement of
total power used by the system. You would have to subtract out the

system
power use when no video card is installed to get any kind of proxy for

power
used by the video card alone?

This article:



http://us.digitalversus.com/graphics...4621/test.html

in the section named "Power Use" is suggesting that the ATI 7850 can go

into
an idle mode that uses 3 watts. Effectively the card turns itself off:

"Better still, the excellent ZeroCore Power feature gives a 16%

reduction in
energy consumption at idle and allows you to turn the card's fan off.

For
this, the computer has to be configured so that it switches the screen

off
after a given period of time. As soon as the screen goes on standby, the
card is almost entirely switched off and only consumes 3 watts of power,
bringing the overall consumption of our test computer down to 74 watts."

On my system, the nVidia 8800 Ultra is consuming 160 watts *just for the
power card* and when the system is in idle state. 3 watts versus 160
watts is a huge difference?


The Xbitlabs numbers for an 8800 non-Ultra were 70W idle and 131W busy.

In these kinds of articles, as far as I know, the "Idle" power is with
desktop still visible and the user has stopped pushing the mouse around.
These are not system power numbers, these are video card only, measured

with
current shunt in 3.3V_slot, 12V_slot, 12V_PCIE#1 and 12V_PCIE#2 (if they

exist).
Xbitlabs have stopped doing it this way, because it looks like they got
another motherboard, and aren't interested in fitting the shunts.


http://www.xbitlabs.com/articles/gra...0_3.html#sect0

The idle power of the card varies with the card's processing power in

those
charts. The HD 5970 for example, is still 44.4W for the card. A low
end card like the HD 5450 is 3.2W idle.


If I believe the AMD web site, the Radeon ZeroCore Power technology will
put the video card into a sleep state that uses less than 4 watts while the
system is running with the screen off. Good or bad, my computer will act
as a server and the system will not sleep. But the screen will be resting
99% of the time, and during that rest time I want to minimize the power draw.

What is the most powerful AMD video card that fully implements the ZeroCore
technology today?


> Turning off the screen is good for servers, but for a desktop
> isn't the best choice. Mainly because a desktop is more
> interactive, and if you aren't using it, chances are you've
> used S3 sleep or S4 Hibernate.


It is not that unusual for a computer to act like a server and run virtual
machines in the background. In my case they run a home Active Directory
and some other administrative servers.
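
Since ZeroCore only engages once the display goes into standby, the
screen-off timeout has to be set on the host even though the box never
sleeps. A hedged sketch of doing that on Windows XP through powercfg (the
power scheme name varies per machine, and the exact switches should be
checked with "powercfg /?"):

    # Sketch: turn the monitor off after 10 minutes on Windows XP so a
    # ZeroCore-capable card can drop into its deep idle state.
    # "Home/Office Desk" is the default XP scheme name; verify it on the
    # target machine, since schemes can be renamed.
    import subprocess

    subprocess.run(
        ["powercfg", "/change", "Home/Office Desk", "/monitor-timeout-ac", "10"],
        check=True,
    )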

--
W


#8 - November 26th 12, 05:12 PM - Homer Jay Simpson

"W" wrote in message
...
"Paul" wrote in message
...
W wrote:
"Paul" wrote in message
...
W wrote:
I have an older XP computer in a living room on which I installed an
nVidia
GEForce 8800 Ultra. The card performs well, but to my disbelief with
the
monitor turned off and the computer doing nothing but displaying an
inactive
Windows desktop, the nVidia card is consuming about 160 watts of

energy
continuously. Since the system is only used to run a few virtual
machines
about 99% of the time, that is a lot of wasted energy. I want a

card
that can stop burning watts when it is in a low use mode.

Does anyone make a top tier video card that can power itself to a
minimum
power utilization mode when the card is not being used heavily? I

read
somewhere that some newer version of AMD Eyefinity could get power
utilization in an unused mode down under 20 watts. What are details

on
that?

This is true of newer cards from either company.

The ratio of 3D_max to Idle is improving. Your card could be 70W
at idle (measured at the card), and newer cards have actually
improved on that.

Xbitlabs.com used to do per-rail power measurement, using
a specially modified motherboard, but they've stopped doing
that, and so we no longer have those measurements available
for newer cards. All they do now is system power measurements,
which are useless for determining the exact 3D_max to Idle ratio.
(If they had a "system power with no video present" measurement,
then, their measurements would have some value.)

All I can tell you, is a newer card will *likely* be lower
at idle. The 8800 is still back in the "bad" days.

This is another one of those sites that only does system power.
HD 7970 "system idle" 113W, "system 3D Max" 391W. So the idle
is better there. Your card is around 70W idle, 131W max, which
means ratio-wise, it doesn't do that well at idle.

http://www.anandtech.com/show/5261/a...7970-review/27

There was an era, when silicon gates were relatively leaky.
Intel Prescott was an example of that, where 25% of DC power
was just wasted as heat, and did nothing for you. While chips
still leak, more development work has gone into making
structures for gates, which don't leak quite as bad as that.
(The geometry of the gates shrunk, and the gates and silicon
structures had to be redesigned to prevent leakage from them
rising worse than the Prescott era.) The other improvement
comes from clock gating - where desktop cards are now closer
to how mobile graphics work, in terms of clock gating.

There's a good chance, that no matter what card you buy,
it'll do better than your 70W idle 8800 family card.

It's not clear what the data you report means if it is a measurement of
total power used by the system. You would have to subtract out the

system
power use when no video card is installed to get any kind of proxy for

power
used by the video card alone?

This article:



http://us.digitalversus.com/graphics...4621/test.html

in the section named "Power Use" is suggesting that the ATI 7850 can go

into
an idle mode that uses 3 watts. Effectively the card turns itself
off:

"Better still, the excellent ZeroCore Power feature gives a 16%

reduction in
energy consumption at idle and allows you to turn the card's fan off.

For
this, the computer has to be configured so that it switches the screen

off
after a given period of time. As soon as the screen goes on standby,
the
card is almost entirely switched off and only consumes 3 watts of
power,
bringing the overall consumption of our test computer down to 74
watts."

On my system, the nVidia 8800 Ultra is consuming 160 watts *just for
the
power card* and when the system is in idle state. 3 watts versus 160
watts is a huge difference?


The Xbitlabs numbers for an 8800 non-Ultra were 70W idle and 131W busy.

In these kinds of articles, as far as I know, the "Idle" power is with
desktop still visible and the user has stopped pushing the mouse around.
These are not system power numbers, these are video card only, measured

with
current shunt in 3.3V_slot, 12V_slot, 12V_PCIE#1 and 12V_PCIE#2 (if they

exist).
Xbitlabs have stopped doing it this way, because it looks like they got
another motherboard, and aren't interested in fitting the shunts.


http://www.xbitlabs.com/articles/gra...0_3.html#sect0

The idle power of the card varies with the card's processing power in

those
charts. The HD 5970 for example, is still 44.4W for the card. A low
end card like the HD 5450 is 3.2W idle.


If I believe the AMD web site, the RADEON ZeroCore power technology will
put the video card into a sleep state that uses less than 4 watts while
the
system is running with the screen off. Good or bad, my computer will act
as a server and the system will not sleep. But the screen will be resting
99% of the time and during that rest time I want to minimize the power
draw.

What is the most powerful AMD video card that fully implements the
ZeroCore
technology today?


Turning off the screen is good for servers, but for a desktop
isn't the best choice. Mainly because a desktop is more
interactive, and if you aren't using it, chances are you've
used S3 sleep or S4 Hibernate.


It is not that unusual for a computer to act like a server and run virtual
machines in the background. In my case those run a home active directory
and some other administrative servers.

--
W


On AMD's web page for the AMD Radeon HD 7970 GHz Edition:

http://www.amd.com/us/products/deskt...7970GHz.aspx#3

AMD ZeroCore Power technology*
- Ultra-low idle power when the system's display is off
- Efficient low power mode for desktop work
- Secondary GPUs in an AMD CrossFire™ technology configuration power down
  when unneeded

* AMD PowerPlay™, AMD PowerTune and AMD ZeroCore Power are technologies
offered by certain AMD Radeon™ products, which are designed to intelligently
manage GPU power consumption in response to certain GPU load conditions.
Not all products feature all technologies - check with your component or
system manufacturer for specific model capabilities.

It seems to be up to the add-in board partner whether or not they want to
implement the feature.



#9 - November 28th 12, 05:23 PM - PW

On Sun, 25 Nov 2012 03:51:09 -0800, "W"
wrote:

[quoted text snipped]


I believe my eVGA GTX680 SuperClocked is a low-power card. I am not
sure how to check how many watts it uses.
#10 - November 28th 12, 06:45 PM - Homer Jay Simpson

"PW" wrote in message
...
On Sun, 25 Nov 2012 03:51:09 -0800, "W"
wrote:

I have an older XP computer in a living room on which I installed an
nVidia
GEForce 8800 Ultra. The card performs well, but to my disbelief with the
monitor turned off and the computer doing nothing but displaying an
inactive
Windows desktop, the nVidia card is consuming about 160 watts of energy
continuously. Since the system is only used to run a few virtual
machines
about 99% of the time, that is a lot of wasted energy. I want a card
that can stop burning watts when it is in a low use mode.

Does anyone make a top tier video card that can power itself to a minimum
power utilization mode when the card is not being used heavily? I read
somewhere that some newer version of AMD Eyefinity could get power
utilization in an unused mode down under 20 watts. What are details on
that?


> I believe my eVGA GTX680 SuperClocked is a low power card. I am not
> sure how to check how many watts it uses.

Your specific card model's idle power draw is 15.5 Watts.
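
If you want to check it yourself, one way is to ask the driver. A sketch
using the nvidia-smi power query is below, with the caveat that power.draw
reporting is not exposed on every GeForce board and driver combination; if
the query returns "[Not Supported]", a wall-socket meter reading minus a
baseline taken with the machine otherwise idle is the fallback.

    # Sketch: read the board's reported power draw through nvidia-smi.
    # power.draw may be unsupported on consumer GeForce cards; see caveat above.
    import subprocess

    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,power.draw", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)   # e.g. "GeForce GTX 680, 15.50 W" or "[Not Supported]"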



 



