A computer components & hardware forum. HardwareBanter



NVidia 6800 Ultra power requirements



 
 
  #1  
Old April 14th 04, 09:23 PM
John Lewis
external usenet poster
 
Posts: n/a
Default NVidia 6800 Ultra power requirements

Nvidia 6800 Ultra: 110 watts maximum

Power Supply:-

480 watts minimum !!!

Preferably 4 "disk-drive" cables available from power-supply.
Two must be available EXCLUSIVELY for the 6800 Ultra.
( minor exception --- auxiliary fans are allowed to co-use
these cable feeds )

Power-supplies with only 3 cables will need some $2 splitters
on the third cable for the various disk drives --- not a real
problem, since their power consumption is very low.

Also, from the Tom's Hardware Guide review :-
----------------------------------------------------------------------------------------
"..................................
We can also extrapolate the power requirements of the remaining cards
( FX5950, Ati 9800XT ) from these numbers. Assuming that NVIDIA's
quoted maximum power draw of 110 Watts for the 6800 Ultra is correct.
Let's also assume that we reached that worst-case scenario during our
tests. That would mean that the Radeon 9800XT has a maximum power
requirement of about 91,5 Watts, while the FX 5950 needs 93,5 Watts"
-----------------------------------------------------------
..... ever wondered why your 5950 and 9800 got so hot..... ? About
the same as the P4 3.4GHz CPU. However, the heat is spread over the
video board, since the above power-consumption includes that of
memory. My guess at the 5950/9800XT GPUs is around 70 watts...

Since GDDR3 memory takes less power than DDR1, and assuming 256 MB
will be the 6800 Ultra default, the actual power in the NV40
(16-pipe ) is probably around 95 watts, around 25-watts more than the
existing GPUs, so the nice big 2-slot cooler on 6800 Ultra should be
more than adequate for the new GPU.
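The subtraction above can be written out as a quick back-of-the-envelope sketch. Only the 110-watt board maximum comes from nVidia's spec; the 15-watt memory figure and the 70-watt guess for the older GPUs are assumptions chosen to make the arithmetic explicit:

```python
# Back-of-the-envelope sketch of the power figures discussed above.
# The 15 W figure for 256 MB of GDDR3 is an assumption for illustration;
# only the 110 W board maximum comes from nVidia's spec.
BOARD_MAX_W = 110.0       # nVidia's quoted 6800 Ultra board maximum
MEM_EST_W = 15.0          # assumed draw of the 256 MB GDDR3 (hypothetical)
OLD_GPU_EST_W = 70.0      # guessed 5950/9800XT GPU-only dissipation

nv40_core_est_w = BOARD_MAX_W - MEM_EST_W        # ~95 W for the NV40 itself
extra_heat_w = nv40_core_est_w - OLD_GPU_EST_W   # ~25 W over NV35/R350

print(nv40_core_est_w, extra_heat_w)
```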

BTW, the 6800 non-Ultra (12-pipe) will have only 1 power-connector (!)
and a single-slot cooler, so those thinking of cheesily upgrading the
6800 non-Ultra 12-pipe to the 16-pipe by a BIOS/hardware hack will
probably have to think again. Either the NV40 12-pipe is a totally
different mask, or (more likely) the power supply to the core is
divided into groups of four pipes, and the 12-pipe parts are those
that fail (silicon blemishes) to get all 16 pipes working to spec,
with the failures concentrated in one group; power to the good groups
is then either hardware-enabled external to the NV40 or bonded
internally so as to avoid the bad group. If externally
hardware-enabled, any attempt to modify the power-enabling
arrangement on the 12-pipe NV40 to enable all 16 pipes is highly
likely to burn out the power regulators or sag the voltage too much,
since regulator power-sharing is obviously not available on the
single-connector board.

John Lewis

  #2  
Old April 14th 04, 09:58 PM
John Lewis

On Wed, 14 Apr 2004 20:23:48 GMT, (John Lewis)
wrote:

Nvidia 6800 Ultra: 110 watts maximum

Power Supply:-

480 watts minimum !!!


For those wondering about the above figures, Nvidia have chosen
to specify power-supply watts, not the exact current required for
the 6800U from the +3.3 V, +5 V and +12 V rails. I have no idea why they
have chosen to do it this way, except that most non-technical
folk have no idea how to relate watts to current from the
point-of-view of matching a power-supply to the power-demands
of a card. Specifying 480 watts is likely to give ample headroom
for worst-case variation in the currents available from that class of
power-supply. I suspect that the amperage demand of the 6800
Ultra on the +12 V supply is the culprit: on-board
switching regulators are most efficient with the highest input
voltage. Most, if not all, 480 watt power supplies can deliver at
least 18 amps on +12V (216 watts max) which should be just
about enough to drive the 6800 Ultra, the CPU, plus all disk
drives and other peripherals also requiring +12V.

Anyway, the difference in price between a 350 watt and a
480 watt power-supply is tiny, and most enthusiasts have
already upgraded their power-supplies to something in the
suggested range.

For anyone wondering whether their existing power-supply is
sufficient, I would suggest checking the power-supply label:
if it shows AT LEAST 18 amps at 12 volts AND at least
350 watts, they should probably give the 6800 Ultra a whirl
before rushing out to buy a new supply.
Anything less than 18 amps from 12 volts -- budget for a
new power-supply, especially if you are running a
top-end CPU.
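That label check is a rule of thumb that fits in a few lines; the function and the sample label readings below are illustrative sketches, not nVidia's official qualification criteria:

```python
def psu_probably_ok(amps_12v: float, rated_watts: float) -> bool:
    """Rule of thumb from above: at least 18 A on +12 V AND 350 W total."""
    return amps_12v >= 18.0 and rated_watts >= 350.0

# Hypothetical power-supply label readings:
print(psu_probably_ok(18.0, 350.0))   # worth a whirl before buying new
print(psu_probably_ok(15.0, 480.0))   # 12 V rail too weak despite the wattage
print(psu_probably_ok(34.0, 480.0))   # comfortably within the suggestion
```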

John Lewis






  #3  
Old April 14th 04, 09:59 PM
duralisis

John Lewis wrote:
<snip>

Keep in mind that the reviewer samples are only an A1 revision
engineering reference board, straight out of the bin. It's likely that
come production time, each manufacturer will choose things such as a
cooler, the mosfets, caps, vregs, etc.

So by the time you get your hands on one (a GF6-6800U), most card
manufacturers will likely go with a _strict_ reference design (including
the HSF and ramsinks), but you might find some that just barely fit in a
single slot with a different cooler.

Also, the voltage requirements right now are a "clean 12v", so depending
on what components are certified by NVidia, some manufacturers /JUST
MIGHT/ be able to get away with a single plug from a good PSU (like a
high rated 350w). The Q.C. program NVidia has enacted since the FX
series launch is likely going to be the limiting factor in this.

What I'm really excited about is the next card revision. In the same way
the NV35 completed the promise of the NV30, The NV45 (or whatever core
revision gets stuck on the PCB in 6 mo.'s), will likely be a much more
efficient and slimmed-down fighter: most likely even better memory
bandwidth, some tweaked RGMSAA modes, probably a few Doom III tricks,
and hopefully much more efficient power usage.

As for now, it looks like brute force is doing pretty well.
  #4  
Old April 14th 04, 10:25 PM
John Lewis

On Wed, 14 Apr 2004 15:59:38 -0500, duralisis
wrote:




Keep in mind that the reviewer samples are only an A1 revision
engineering reference board, straight out of the bin. It's likely that
come production time, each manufacturer will choose things such as a
cooler, the mosfets, caps, vregs, etc.


The 110 watts is nVidia's max. spec.....according to the reviews.

So by the time you get your hands on one (a GF6-6800U), most card
manufacturers will likely go with a _strict_ reference design (including
the HSF and ramsinks), but you might find some that just barely fit in a
single slot with a different cooler.

Also, the voltage requirements right now are a "clean 12v", so depending
on what components are certified by NVidia, some manufacturers /JUST
MIGHT/ be able to get away with a single plug from a good PSU (like a
high rated 350w). The Q.C. program NVidia has enacted since the FX
series launch is likely going to be the limiting factor in this.


I agree. But I will go with the 2-plug version if it is available. The
lower the PS impedance at the board, the less noisy the power
on the board.

What I'm really excited about is the next card revision. In the same way
the NV35 completed the promise of the NV30, The NV45 (or whatever core
revision gets stuck on the PCB in 6 mo.'s), will likely be a much more
efficient and slimmed fighter.


Not likely for quite a while. Tweaking within the current 0.13u
process will yield very little power saving without cutting
functionality. And the masking cost is horrendous: probably
at least $1 million for a chip this size, assuming a first pass
with no errors...

I would expect the next real iteration to be either a 0.09u or
0.065u shrink. IBM is working with AMD on 0.065u, and that
sure won't happen in six months.

The NV30 to NV35 iteration was a significant DESIGN
improvement on the SAME process. The design of the NV40
seems near-perfect for the current graphics state-of-the-art.
The only benefit would be a mask-shrink, which would
potentially improve yield (assuming a stable process)
and significantly raise the number of die per wafer,
thus doubly reducing production costs.
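The die-per-wafer half of that claim can be illustrated with a crude gross-die count. The ~290 mm^2 die size at 0.13u is a hypothetical figure, and the model ignores edge loss and scribe lines; it only shows that a linear shrink scales area, and hence die count, quadratically:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> float:
    """Crude gross-die estimate: wafer area / die area (ignores edge loss)."""
    wafer_area = math.pi * (wafer_diameter_mm / 2.0) ** 2
    return wafer_area / die_area_mm2

area_130nm = 290.0                           # hypothetical NV40-class die, mm^2
area_90nm = area_130nm * (0.09 / 0.13) ** 2  # linear shrink -> quadratic area cut

gain = dies_per_wafer(area_90nm) / dies_per_wafer(area_130nm)
print(round(gain, 2))   # roughly 2x the die per wafer from the shrink alone
```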

John Lewis




  #5  
Old April 15th 04, 04:43 AM
DaveL

Blow it out the back of the case with a case fan. But I don't blame you for
not thinking of the obvious. There are a lot of novices on this NG.

DaveL


wrote in message
...

Plus were are you going to put all the HEAT that this Crap card generate, as
it blows into the Case..

Bad Bad Move..


  #6  
Old April 15th 04, 05:20 AM
John Lewis

On Thu, 15 Apr 2004 15:12:17 +1200, wrote:




Plus were are you going to put all the HEAT that this Crap card generate, as
it blows into the Case..


Please read my original posting again.

According to Tom's hardware the 5950/9800XT are about 90 watts max.
for the board.

According to nVidia, 110 watts max for the 6800 board. The NV40 chip
dissipates about 25 watts more than NV35/R350, the GDDR3 memory a
little less than DDR1 (256 MByte). Also, according to the Xbit Labs
review, the 6800 fan never got to full speed in their test setup.
Anyway, you only need to get rid of 25 watts more heat. The power
supply requirement comes from the need for a very low power-line
impedance at the card, combined with the fact that there is a wimpy
12v supply on many sub-400watt power-supplies. Remember that the
fastest CPUs consume about 8 amps already from the 12volt supply.


Bad Bad Move..


Huh ? Do you want to wait for a 90nm or 65nm process
to be mature before getting this sort of performance ? That
is the only way to cut the power without cutting the
performance.



er.... I think that you will find that the max. dissipation of the
R420 will be very similar, if Ati volunteer to tell you at all........
No doubt Tom or Anandtech will measure it anyway...........


John Lewis






  #7  
Old April 15th 04, 07:05 AM
John Lewis
external usenet poster
 
Posts: n/a
Default

On Thu, 15 Apr 2004 16:29:41 +1200, wrote:




I am referring to the Heat,


25 watts is 25 watts of heat.........
Remember the heat from a 25 watt light bulb ? That is the extra heat
you will need to get rid of if you replace a 9800XT with the 6800
Ultra. Hopefully, that bulb will now turn on for you.

For comparison, a Prescott 3.4 GHz processor dissipates 103 watts max,
just 7 watts less than the 6800 Ultra's maximum. A Northwood 3.4 GHz
is 89 watts.

Now maybe you can see why a 480 watt power-supply is not
entirely out of line when you add in the other peripherals.

any way the ATI 9800xt cards do not seem to have a
extra power connector..?

Unless I am blind..


Neither does the non-Ultra version of the 6800, which runs with
12 pipes, a 1-slot-high heat-sink and probably has very similar total
power consumption to the 9800XT, since the GDDR3 memory
runs cooler than DDR1. BTW, for neither the 9800XT nor the 6800
non-Ultra would I ever recommend putting a card in the adjacent
PCI slot.

The power-connectors on the 6800Ultra are marked Primary
and Secondary. nVidia is probably paranoid about having very
low line-impedances to the card to avoid digital glitches, since
the peak-current surges to the GPU will be very spiky.
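The two-connector reasoning is just Ohm's law: two feeds put the cable resistances in parallel, halving the impedance those current spikes see at the card. The resistance and surge values below are illustrative assumptions, not measurements:

```python
# V = I * R: voltage droop at the card during a current spike.
# Both numbers are assumptions for illustration, not measured values.
R_CABLE_OHM = 0.020    # one drive cable + connector, assumed
I_SPIKE_A = 9.0        # momentary surge on the +12 V feed, assumed

droop_one_cable = I_SPIKE_A * R_CABLE_OHM           # single connector
droop_two_cables = I_SPIKE_A * (R_CABLE_OHM / 2.0)  # parallel feeds share current

print(round(droop_one_cable, 3), round(droop_two_cables, 3))
```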

John Lewis






  #9  
Old April 15th 04, 08:30 PM
John Lewis

On 15 Apr 2004 02:26:53 -0700, (Nada) wrote:

(John Lewis) wrote:
<snip>


John, you wouldn't happen to have a link to any Intel Pentium IV
Prescott power-consumption test sites? I heard those models produce
more watts than HAL 2000. This is going to create mass hysteria with
upgrading the power supplies again.


Yes, indeed and it should be a permanent bookmark for anybody
building Intel-based PCs or trying to find a particular mask rev of
a processor.

http://processorfinder.intel.com/scripts/default.asp

Pick your processor, click on the appropriate SL code and bingo...
all the important short-form information on mask version, wattage, max
operating core temp, etc. For example, note that the P4EE is spec'd
for a max core temperature of only 64 degrees C. That doesn't mean
it will stop working at 65, but it might slow down a little....

Prescott 3.4 is 103 watts max
Northwood 3.4 is 89 watts max.

For reference, the whole nVidia 6800 Ultra board has a maximum
dissipation of only 110 watts. Compared to the block heatsinks of
the processors, the nVidia reference thermal solution is huge (and
apparently very quiet). I'll take huge and quiet any day...

BTW, the weight of the 6800 Ultra is no greater than the 5950U, since
the heatsinking is all aluminum. I suspect that some of the
vendors of Nvidia cards (MSI and Leadtek come to mind) will
replace the aluminum with copper to get the Ultra height down.
Beware of the weight of a copper thermal solution.... No matter what
solution is adopted, the PCI slot next to the 6800 or 6800 Ultra
should always be left completely free for adequate ventilation, unless
the alternate solution vents directly through the rear -- and with the
two large DVI connectors that will be very difficult to do effectively
in a 1-slot space.

John Lewis

 








Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2022, Jelsoft Enterprises Ltd.
Copyright 2004-2022 HardwareBanter.
The comments are property of their posters.