Stephen Smith, July 11th 03, 11:04 PM

"Dave" wrote in message
...
I'm thinking Nvidia needs to step in and put a stop to all this confusion.
It's giving their 5200 chip a bad name. If properly configured, it does
have the ability to best the GF3.
Dave


Dave, I'm interested -- HOW exactly do you "configure" it? Especially up to
GeForce 3 standard?!

I tried overclocking the 5200 in the system I was using, but it made no
/noticeable/ difference in Max Payne and BloodRayne, the guinea pigs. I even
managed to overclock it too far, which led to desktop corruption, BloodRayne
refusing to load, and so on.
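
(In case anyone wants to reproduce the overclock: the usual trick for getting
the clock sliders to show up in the Detonator control panel is the CoolBits
registry value. The key path and value below are from memory and may differ
between driver releases, so treat this as a rough sketch rather than gospel --
written out here as a little Python script instead of the usual .reg file.)

    # Rough sketch: expose the "Clock Frequencies" page in the Detonator
    # control panel by setting the CoolBits DWORD. The key path is from
    # memory and can vary by driver release -- back up the registry first.
    import winreg  # on the Python 2.x of the Win98 era this module is _winreg

    NVTWEAK = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"

    key = winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, NVTWEAK)
    winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 3)  # 3 = clock page
    winreg.CloseKey(key)
    print("CoolBits set; reopen the NVIDIA control panel to see the sliders.")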

Now I'm confused...

http://www.msicomputer.com/product/v...l=FX5200_TD128

The image clearly shows a DVI connector and a fan.

How come the MSI 5200 I was using DIDN'T have a DVI connector _or_ fan on it?
(It had just a heatsink.) It definitely had 128MB, so it wasn't the 64MB
model.

Do they make another version that isn't listed on their website, perchance?

Could it have been the PC causing the duff performance?

Gigabyte GA-7VTXE motherboard
AMD Athlon 1800+ (running at 1533MHz, it claimed)
256MB RAM. [low for BloodRayne, I know]
Windows 98SE, VIA 4-in-1 drivers installed, etc.

Stephen.