Old March 11th 06, 07:10 AM posted to alt.comp.periphs.videocards.nvidia
Default SLI 7800GT or 1x 7900GT?

On 10 Mar 2006 10:35:33 -0800, "johns" wrote:

> If you google .. there's a 1Kwatt psupply out there just
> for the nVidia cards. I think it is only $600, and comes
> with a dozen eggs.
>
> johns


Warning: ATi troll.

(For the OP, please see:-

http://www.anandtech.com/video/showdoc.aspx?i=2717 )

For the troll, the following power-related information is extracted
from the above article and other sources:-

(OT mode on)

Even dual 7900GTX(512) cards will do quite nicely on a 600 watt
SLI-rated supply (~$140 max.). You only need a PC Power & Cooling
850 watt (~$450) for dual ATi X1900XTX(512), or for dual
configurations of the now-obsolete 7800GTX512.

Read on....

The sharply lower power consumption on the nVidia side is not
surprising, since the ATi X1900XT(X) GPU (R580) is 353 mm² and the
7900GT(X) GPU (G71) is 196 mm², which according to my math makes the
R580 about 1.8 times the area of the G71, both on the 90nm process.
Maximum power consumption of the G71 GPU at GTX clock speed is about
50% that of the R580 GPU at XTX clock speed; since the rest of the
board (memory, voltage regulation, fan) draws similar power on both
cards, that works out to a power ratio at board level of about 75%.
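The ratios above can be checked with a few lines of Python. The die
areas and the 50% GPU-power figure are the estimates quoted above; the
60 W GPU draw and 60 W board overhead are purely illustrative
assumptions of mine, chosen only to show how a 50% GPU ratio becomes a
~75% board ratio:

```python
# Back-of-envelope check of the die-area and board-power ratios.
r580_area = 353.0  # mm^2, ATi R580 (X1900XT/XTX)
g71_area = 196.0   # mm^2, nVidia G71 (7900GT/GTX)

area_ratio = r580_area / g71_area
print(f"R580/G71 die-area ratio: {area_ratio:.2f}x")  # 1.80x

# GPU-level power: G71 at GTX clocks draws ~50% of the R580 at XTX
# clocks. The absolute wattages below are illustrative assumptions,
# not measured figures.
r580_gpu_w = 60.0        # assumed R580 GPU draw
board_overhead_w = 60.0  # assumed memory + VRM + fan draw, same on both
g71_gpu_w = 0.5 * r580_gpu_w

board_ratio = (g71_gpu_w + board_overhead_w) / (r580_gpu_w + board_overhead_w)
print(f"Board-level power ratio: {board_ratio:.0%}")  # 75%
```

The point of the second calculation is that fixed board overheads
dilute the GPU-level difference: a 2:1 GPU gap shows up as only a
~4:3 gap at the wall.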

The 7900GTX512 has almost exactly the same maximum power consumption
as a current 7800GTX(256) at default clock speed, and a dual-SLI
configuration of the latter boards needs no more than a 600 watt
SLI-rated ATX 2.x supply (Enermax 701 or similar).

ATi's R580 design is very wasteful of silicon area and power, and
results in a die cost at least 2x that of the G71 (1.8x the area, plus
a yield-loss factor for the much larger die). Even after factoring in
the packaging and test overheads, this still leaves ATi's packaged and
tested GPUs substantially more expensive than nVidia's. Not good news
for ATi at all, especially since the power- and cost-efficiency
differences will also be mirrored in the Xbox360 and PS3 GPU designs.
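The yield-loss factor can be sketched with a standard Poisson die-yield
model (yield = exp(-defect_density × area)). The defect density below
is my own assumption for illustration only, not a TSMC figure; the
qualitative point is that larger dies lose yield faster than linearly:

```python
import math

# Relative cost per good die under a simple Poisson yield model.
# The defect density is an illustrative assumption.
defects_per_mm2 = 0.25 / 100.0  # assumed 0.25 defects/cm^2

def cost_per_good_die(area_mm2: float) -> float:
    """Relative cost: wafer area consumed divided by fraction of good dies."""
    yield_frac = math.exp(-defects_per_mm2 * area_mm2)
    return area_mm2 / yield_frac

ratio = cost_per_good_die(353.0) / cost_per_good_die(196.0)
print(f"R580/G71 cost-per-good-die ratio: {ratio:.2f}x")  # 2.67x
```

So a 1.8x area gap compounds into well over 2x at the die level under
this assumed defect density, before packaging and test costs partially
dilute the gap again.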

The G71 has ~40 million fewer transistors than the G70 (7800GTX), with
improved functionality too. nVidia has tuned the G71 and G73 with the
deliberate intent of drastically undercutting the competition on price
while simultaneously offering much higher clock speeds, lower power and
no loss of functionality relative to the G70/G72. Also, a further
optical die shrink of the G71 and G73 onto the TSMC 80nm 'half-node'
process can happen at any time with zero design changes, once the
manufacturing yields are judged satisfactory -- the design rules do
not change.
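The payoff of such a 'half-node' optical shrink is easy to estimate:
feature dimensions scale linearly with the node, so die area scales
with the square. (In practice pad rings and analog blocks don't shrink
fully, so this is an upper bound on the saving.)

```python
# Nominal area scaling for an optical shrink of the same design
# from 90nm to 80nm: linear dimensions scale by 80/90, area by its square.
linear_scale = 80.0 / 90.0
area_scale = linear_scale ** 2
print(f"Relative die area after shrink: {area_scale:.2f}")  # 0.79
```

That is roughly a 21% reduction in die area, and hence more dies per
wafer, for zero redesign effort.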

(OT mode off)

John Lewis