Old July 18th 03, 06:50 AM
John Lewis

On Thu, 17 Jul 2003 23:26:49 GMT, "Chimera" wrote:

snip

Unlike the CPU chip-temp monitoring on modern motherboards, most
video cards have no GPU chip-temperature monitors, although the
design and thermal-stress rules are exactly the same as for CPUs. The
latest video cards (ATi 9700, 9800, FX5800, FX5900) cost more than most
CPUs and dissipate as much power as a 2.6GHz P4. Out of these, only
the FX5900 has built-in user-accessible chip-temperature monitoring.

Seems to me the FX5900 needs it! The FX chips are the AMD T-Birds of the
GPU industry!



And the DATA to back this statement up?

You may be confusing the very poor cooling solution on the FX5800
with the actual chip temperature of the GPU.

Neither nVidia NOR ATi publicly publishes any dissipation figures
for their GPUs.

The process, transistor count and clock speed are the key determining
issues. There is no silver bullet. The FX5900's 15% higher transistor
count and slightly higher clock speed, offset by its 0.13u process
(and lower core Vdd) versus the 9800's 0.15u, result in about the same
power dissipation.
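To make that argument concrete: dynamic CMOS power scales roughly as P ~ N * C * Vdd^2 * f (transistor count, per-transistor switched capacitance, core voltage squared, clock speed). A quick back-of-envelope sketch in Python shows how the FX5900's extra transistors and clock can be cancelled out by the process shrink and lower Vdd. The specific voltage and clock ratios below are assumed for illustration only; neither vendor publishes the real figures, as noted above.

```python
# Back-of-envelope dynamic-power comparison (illustrative only).
# Dynamic CMOS power scales roughly as P ~ N * C * Vdd^2 * f:
#   N   = transistor count
#   C   = per-transistor switched capacitance (scales with process geometry)
#   Vdd = core supply voltage
#   f   = clock speed

def relative_power(n_ratio, c_ratio, vdd_ratio, f_ratio):
    """Power of chip B relative to chip A, all inputs given as B/A ratios."""
    return n_ratio * c_ratio * vdd_ratio**2 * f_ratio

# FX5900 vs Radeon 9800 -- HYPOTHETICAL numbers, chosen to match the
# qualitative claims in the post (15% more transistors, 0.13u vs 0.15u
# process, lower core Vdd, slightly higher clock):
ratio = relative_power(
    n_ratio=1.15,          # ~15% higher transistor count (from the post)
    c_ratio=0.13 / 0.15,   # capacitance scaled by process geometry
    vdd_ratio=1.4 / 1.5,   # assumed core voltages, not published specs
    f_ratio=1.10,          # assumed ~10% higher clock
)
print(f"FX5900 power relative to 9800: ~{ratio:.2f}x")
```

With these assumed ratios the result lands near 1.0x, i.e. roughly the same dissipation despite the bigger, faster chip, which is the point being made.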

Come back when you have real data to back up your argument.

Go install a temp-monitor on the 9800 GPU. You may get
a very interesting surprise/shock.........

As it is you may be blowing hot-air..............

John Lewis