duralisis
April 14th 04, 09:59 PM
John Lewis wrote:
Nvidia 6800 Ultra: 110 watts maximum

Power Supply:-

480 watts minimum !!!

Preferably 4 "disk-drive" cables available from power-supply.
Two must be available EXCLUSIVELY for the 6800 Ultra.
( minor exception --- auxiliary fans are allowed to co-use
these cable feeds )

Power-supplies with only 3 cables will need some $2 splitters
on the third cable for the various disk drives --- not a real
problem, since their power consumption is very low.

Also, from the Tom's Hardware Guide review :-
----------------------------------------------------------------------------------------
"We can also extrapolate the power requirements of the remaining cards
( FX5950, Ati 9800XT ) from these numbers, assuming that NVIDIA's
quoted maximum power draw of 110 Watts for the 6800 Ultra is correct.
Let's also assume that we reached that worst-case scenario during our
tests. That would mean that the Radeon 9800XT has a maximum power
requirement of about 91.5 Watts, while the FX 5950 needs 93.5 Watts."
-----------------------------------------------------------
.... ever wondered why your 5950 and 9800 got so hot? About
the same as a P4 3.4GHz CPU. However, the heat is spread over the
video board, since the above power consumption includes that of the
memory. My guess at the 5950/9800XT GPUs is around 70 watts...
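The extrapolation Tom's Hardware describes is simple arithmetic: measure whole-system draw with each card, take the difference against the 6800 Ultra system, and offset it from the quoted 110 W. A quick sketch (the whole-system readings below are hypothetical, chosen only so the results match the quoted 91.5 W and 93.5 W figures; the 110 W maximum is NVIDIA's):

```python
ULTRA_MAX_W = 110.0  # NVIDIA's quoted maximum for the 6800 Ultra

def card_power(system_w, ultra_system_w, ultra_card_w=ULTRA_MAX_W):
    """Extrapolate a card's draw from whole-system wall measurements.

    system_w       -- total system draw with the card under test
    ultra_system_w -- total system draw with the 6800 Ultra installed
    """
    # Only the graphics card differs between the two setups, so the
    # system delta equals the card delta.
    return ultra_card_w + (system_w - ultra_system_w)

ultra_system = 300.0                     # hypothetical reference reading
print(card_power(281.5, ultra_system))   # 9800XT  -> 91.5
print(card_power(283.5, ultra_system))   # FX 5950 -> 93.5
```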

Since GDDR3 memory takes less power than DDR1, and assuming 256MB
will be the 6800 Ultra default, the actual power draw of the NV40
(16-pipe) is probably around 95 watts, around 25 watts more than the
existing GPUs, so the nice big 2-slot cooler on the 6800 Ultra should
be more than adequate for the new GPU.
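John's back-of-envelope split works out like this (the ~15 W memory figure is an assumption implied by his numbers, not a spec; everything here is estimate, not measurement):

```python
board_w      = 110   # NVIDIA's quoted board maximum (watts)
memory_w_est = 15    # assumed GDDR3 draw for 256MB, lower than DDR1
older_core_w = 70    # his guess for the 5950/9800XT GPUs alone

nv40_core_w = board_w - memory_w_est       # core estimate: ~95 W
delta_w = nv40_core_w - older_core_w       # vs existing GPUs: ~25 W more

print(nv40_core_w)   # 95
print(delta_w)       # 25
```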

BTW, the 6800 non-Ultra (12-pipe) will have only 1 power connector (!)
and a single-slot cooler, so those thinking of cheesily upgrading the
6800 non-Ultra 12-pipe to the 16-pipe via a BIOS/hardware hack will
probably have to think again. Either the 12-pipe NV40 is a totally
different mask, or (more likely) the power supply to the core is
divided into groups of 4 pipes, and the 12-pipe parts are dies that
fail (silicon blemishes) to get all 16 pipes working to spec, with the
failures concentrated in one group; power is then either
hardware-enabled external to the NV40, or bonded internally so as to
avoid that group. If externally hardware-enabled, any attempt to
modify the power-enabling arrangement of the 12-pipe NV40 to enable
all 16 pipes is highly likely to burn out the power regulators or sag
the voltage too much, since regulator power-sharing is obviously not
available on the single-connector board.
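John's binning theory (his speculation, not anything NVIDIA has confirmed) can be modeled in a few lines: 16 pipes in 4 power groups, and a die ships as a 12-pipe part only when every defect lands in a single group.

```python
PIPES = 16
PIPES_PER_GROUP = 4  # speculative group size from the post

def bin_die(defective_pipes):
    """Return the shippable pipe count under the group-disable scheme."""
    bad_groups = {p // PIPES_PER_GROUP for p in defective_pipes}
    if not bad_groups:
        return 16    # fully working die -> 6800 Ultra part
    if len(bad_groups) == 1:
        return 12    # one group fused/disabled -> 6800 non-Ultra part
    return 0         # defects span multiple groups -> scrap

print(bin_die([]))      # 16
print(bin_die([5]))     # 12 (defect confined to group 1)
print(bin_die([1, 9]))  # 0  (defects in groups 0 and 2)
```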

John Lewis


Keep in mind that the reviewer samples are only A1-revision
engineering reference boards, straight out of the bin. It's likely
that come production time, each manufacturer will choose its own
cooler, MOSFETs, caps, vregs, etc.

So by the time you get your hands on one (a GF6-6800U), most card
manufacturers will likely go with a _strict_ reference design (including
the HSF and ramsinks), but you might find some that just barely fit in a
single slot with a different cooler.

Also, the voltage requirement right now is a "clean 12v", so depending
on what components are certified by NVidia, some manufacturers /JUST
MIGHT/ be able to get away with a single plug from a good PSU (like a
highly rated 350w). NVidia's Q.C. program, enacted since the FX
series launch, is likely going to be the limiting factor here.
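Whether one plug from a good 350w unit could suffice comes down to amperage on the 12 V side. A rough check (only the 110 W figure is NVIDIA's; the rest is illustrative):

```python
card_w = 110.0          # worst-case 6800 Ultra board draw (watts)
rail_v = 12.0           # the "clean 12v" the card wants

amps = card_w / rail_v  # current the 12 V side must deliver for the card alone
print(round(amps, 1))   # ~9.2 A, before the CPU, fans and drives are counted
```

Around 9 A through a single Molex plug and its wiring is why NVIDIA asks for two exclusive cable feeds in the first place.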

What I'm really excited about is the next card revision. In the same
way the NV35 completed the promise of the NV30, the NV45 (or whatever
core revision gets stuck on the PCB in 6 months) will likely be a much
more efficient and slimmed-down fighter: most likely even better
memory bandwidth, some tweaked RGMSAA modes, probably a few Doom III
tricks, and hopefully much more efficient power usage.

As for now, it looks like brute force is doing pretty well.