A computer components & hardware forum. HardwareBanter

NVidia 6800 Ultra power requirements



 
 
  #1  
Old April 14th 04, 09:23 PM
John Lewis
external usenet poster
 
Posts: n/a
Default NVidia 6800 Ultra power requirements

Nvidia 6800 Ultra: 110 watts maximum

Power Supply:-

480 watts minimum !!!

Preferably 4 "disk-drive" cables available from power-supply.
Two must be available EXCLUSIVELY for the 6800 Ultra.
( minor exception --- auxiliary fans are allowed to co-use
these cable feeds )

Power-supplies with only 3 cables will need some $2 splitters
on the third cable for the various disk drives --- not a real
problem, since their power consumption is very low.

Also, from the Tom's Hardware Guide review :-
----------------------------------------------------------------------------------------
"..................................
We can also extrapolate the power requirements of the remaining cards
( FX5950, Ati 9800XT ) from these numbers. Assuming that NVIDIA's
quoted maximum power draw of 110 Watts for the 6800 Ultra is correct.
Let's also assume that we reached that worst-case scenario during our
tests. That would mean that the Radeon 9800XT has a maximum power
requirement of about 91,5 Watts, while the FX 5950 needs 93,5 Watts"
-----------------------------------------------------------
..... ever wondered why your 5950 and 9800 got so hot? About
the same as a P4 3.4GHz CPU. However, the heat is spread over the
whole video board, since the above power consumption includes that
of the memory. My guess for the 5950/9800XT GPUs themselves is
around 70 watts...

Since DDR3 memory takes less power than DDR1, and assuming 256MB
will be the 6800 Ultra default, the actual power draw of the NV40
(16-pipe) is probably around 95 watts, roughly 25 watts more than the
existing GPUs, so the nice big 2-slot cooler on the 6800 Ultra should
be more than adequate for the new GPU.
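The back-of-envelope arithmetic above can be sketched out explicitly. The 110-watt board figure is nVidia's quoted maximum; the memory-power numbers are my own guesses, as noted, used purely for illustration:

```python
# Rough power-budget sketch for the figures discussed above.
# The 110 W board maximum is nVidia's quoted spec; the memory
# estimates are guesses, not measured values.

BOARD_MAX_W = 110        # 6800 Ultra board maximum, per nVidia
OLD_GPU_EST_W = 70       # guessed core power of the 5950/9800XT

# Guess: DDR3 trims the memory share of the board budget to ~15 W.
new_mem_est_w = 15

nv40_core_est_w = BOARD_MAX_W - new_mem_est_w
print(f"Estimated NV40 core power: ~{nv40_core_est_w} W")
print(f"Delta vs. the older GPUs: ~{nv40_core_est_w - OLD_GPU_EST_W} W")
```

Subtracting the assumed memory share from the 110 W board figure gives the ~95 W core estimate, about 25 W above the older parts.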

BTW, the 6800 non-Ultra (12-pipe) will have only 1 power connector (!)
and a single-slot cooler, so those thinking of cheesily upgrading the
12-pipe 6800 non-Ultra to 16 pipes via a BIOS/hardware hack will
probably have to think again. Either the 12-pipe NV40 is a totally
different mask, or, more likely, the power supply to the core is
divided into groups of 4 pipes, and the 12-pipe parts are dies that
fail (silicon blemishes) to get all 16 pipes working to spec, with the
failures concentrated in one group. Power to that group is then either
disabled in hardware external to the NV40 or bonded off internally.
If it is externally disabled, any attempt to modify the power-enabling
arrangement on the 12-pipe NV40 to enable all 16 pipes is highly
likely to burn out the power regulators or sag the voltage too much,
since the two-connector board's regulator power-sharing is obviously
not available on the single-connector board.

John Lewis

  #2  
Old April 14th 04, 09:58 PM
John Lewis
external usenet poster
 
Posts: n/a
Default

On Wed, 14 Apr 2004 20:23:48 GMT, (John Lewis)
wrote:

Nvidia 6800 Ultra: 110 watts maximum

Power Supply:-

480 watts minimum !!!


For those wondering about the above figures: nVidia have chosen
to specify power-supply watts, not the exact current required by
the 6800U from the +3.3V, +5V and +12V rails. I have no idea why
they have done it this way, except that most non-technical folk
have no idea how to relate watts to current when matching a
power supply to the power demands of a card. Specifying 480 watts
is likely to give ample headroom for worst-case variation in the
currents available from that class of power supply. I suspect that
the 6800 Ultra's amperage demand on the +12V rail is the culprit;
on-board switching regulators are most efficient with the
highest-value input voltage. Most, if not all, 480-watt power
supplies can deliver at least 18 amps on +12V (216 watts max),
which should be just about enough to drive the 6800 Ultra, the
CPU, plus all disk drives and other peripherals also requiring
+12V.

Anyway, the difference in price between a 350-watt and a
480-watt power supply is tiny, and most enthusiasts have
already upgraded their power supplies to something in the
suggested range.

For anyone wondering whether their existing power supply is
sufficient, I would suggest checking the power-supply label.
If it shows AT LEAST 18 amps at 12 volts AND at least 350
watts, they should probably give the 6800 Ultra a whirl before
rushing out to buy a new supply. Anything less than 18 amps
on the 12-volt rail: budget for a new power supply, especially
if you are running a top-end CPU.
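That rule of thumb can be written out as a quick check. This is only a sketch: the 18 A / 350 W thresholds are the ones suggested above, and a real label may complicate matters (e.g. the 12 V rating split across rails):

```python
# Quick sketch of the PSU rule of thumb above. Thresholds are
# the suggested minimums for the 6800 Ultra; real adequacy also
# depends on the rest of the system's +12V load.

def rail_watts(volts, amps):
    """Watts available on a rail is simply volts times amps."""
    return volts * amps

def psu_probably_ok(amps_12v, total_watts):
    """True if the label meets the suggested minimums."""
    return amps_12v >= 18 and total_watts >= 350

print(rail_watts(12, 18))        # 216 W available on +12V
print(psu_probably_ok(18, 350))  # worth a whirl
print(psu_probably_ok(15, 480))  # high wattage, weak 12V rail: new PSU
```

Note the third case: total wattage alone is not enough; it is the 12-volt amperage that decides it.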

John Lewis






  #4  
Old April 14th 04, 09:59 PM
duralisis
external usenet poster
 
Posts: n/a
Default

John Lewis wrote:
[snip]


Keep in mind that the reviewer samples are only A1-revision
engineering reference boards, straight out of the bin. It's likely
that come production time, each manufacturer will choose its own
cooler, MOSFETs, caps, vregs, etc.

So by the time you get your hands on one (a GF6-6800U), most card
manufacturers will likely go with a _strict_ reference design (including
the HSF and ramsinks), but you might find some that just barely fit in a
single slot with a different cooler.

Also, the voltage requirement right now is a "clean 12v", so depending
on what components are certified by NVidia, some manufacturers /JUST
MIGHT/ be able to get away with a single plug from a good PSU (like a
highly rated 350w). The Q.C. program NVidia has enacted since the FX
series launch is likely going to be the limiting factor here.

What I'm really excited about is the next card revision. In the same
way the NV35 completed the promise of the NV30, the NV45 (or whatever
core revision gets stuck on the PCB in 6 months) will likely be a much
more efficient and slimmed-down fighter. Most likely even better
memory bandwidth, some tweaked RGMSAA modes, probably a few Doom III
tricks, and hopefully much more efficient power usage.

As for now, it looks like brute force is doing pretty well.
  #5  
Old April 14th 04, 10:25 PM
John Lewis
external usenet poster
 
Posts: n/a
Default

On Wed, 14 Apr 2004 15:59:38 -0500, duralisis
wrote:




Keep in mind that the reviewer samples are only an A1 revision
engineering reference board, straight out of the bin. It's likely that
come production time, each manufacturer will choose things such as a
cooler, the mosfets, caps, vregs, etc.


The 110 watts is nVidia's max. spec.....according to the reviews.

So by the time you get your hands on one (a GF6-6800U), most card
manufacturers will likely go with a _strict_ reference design (including
the HSF and ramsinks), but you might find some that just barely fit in a
single slot with a different cooler.

Also, the voltage requirements right now are a "clean 12v", so depending
on what components are certified by NVidia, some manufacturers /JUST
MIGHT/ be able to get away with a single plug from a good PSU (like a
high rated 350w). NVidia's Q.C. program they've inacted since the FX
series launch is likely going to be the limiting factor in this.


I agree. But I will go with the 2-plug version if it is available. The
lower the PS impedance at the board, the less noisy the power
on the board.

What I'm really excited about is the next card revision. In the same way
the NV35 completed the promise of the NV30, The NV45 (or whatever core
revision gets stuck on the PCB in 6 mo.'s), will likely be a much more
efficient and slimmed fighter.


Not likely for quite a while. Manipulation within the current 0.13u
process will yield very little power saving without cutting
functionality. And the masking cost is horrendous: probably
at least $1 million for a chip this size, assuming a first pass
with no errors...

I would expect the next real iteration to be either a 0.09u or
0.065u shrink. IBM is working with AMD on 0.065u. And that
sure won't happen in six months.

The NV30 to NV35 iteration was a significant DESIGN
improvement on the SAME process. The design of the NV40
seems near-perfect for the current graphics state of the art.
The only benefit would come from a mask shrink, which would
potentially improve yield (assuming a stable process)
and significantly raise the number of die per wafer,
thus doubly reducing production costs.
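The die-per-wafer side of that argument is easy to put numbers on. A sketch under crude assumptions: die count taken as proportional to wafer area over die area (ignoring edge loss), and the node names assumed to scale linear feature size directly:

```python
# Rough illustration of why a mask shrink cuts cost: die area
# scales with the SQUARE of the linear shrink, so die-per-wafer
# roughly doubles going 0.13u -> 0.09u. Edge loss and yield are
# ignored; the 100-die baseline is arbitrary.

OLD_NODE_UM = 0.13
NEW_NODE_UM = 0.09

linear_shrink = NEW_NODE_UM / OLD_NODE_UM   # ~0.69
area_ratio = linear_shrink ** 2             # ~0.48

old_die_per_wafer = 100                     # arbitrary baseline
new_die_per_wafer = old_die_per_wafer / area_ratio

print(f"Die area shrinks to ~{area_ratio:.0%} of the original")
print(f"~{new_die_per_wafer:.0f} die per wafer vs. {old_die_per_wafer}")
```

With the die area roughly halved, the same wafer yields about twice the candidate dies; combine that with better yield per die on a stable process and you get the "doubly reducing" effect.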

John Lewis




 






