HardwareBanter (http://www.hardwarebanter.com/index.php)
-   Nvidia Videocards (http://www.hardwarebanter.com/forumdisplay.php?f=16)
-   -   Serious waste of chips and PCB with fx5200 (http://www.hardwarebanter.com/showthread.php?t=49654)

Bratboy July 10th 03 11:57 PM

Serious waste of chips and PCB with fx5200
 
Buyer beware: apparently the only Gainward fx5200 that uses the 128-bit memory path is the Golden Sample version; all the others in the 5200 line use a crappy 64-bit pathway, which absolutely kills the card's performance. A company sent me a Gainward fx5200 (Lite version) to replace a failing GF3, and it royally sucks. Apparently "Lite" means light on performance. Nowhere on the box do they even dare mention that it only uses a 64-bit bus; you have to go look at their site to learn that. Anyway, here are some tests using both cards:

3DMark01 SE
1280x1024, 32 bit, 2xAA and 2xAF

Old GF3 - 2698, 2695, 2699
FX5200 - 1058, 1050, 1055

All in all, it's so **** poor a performer I wouldn't even wish it on my worst enemy. Really glad the company sent it to me free; if I'd paid even $50 for it I'd be way more ****ed. As it is, I wrote to Gainward and am hoping they have some sort of step-up program to deal with this, but I know better than to hold my breath that they'll do anything.

Just Jess
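
A rough bandwidth comparison backs those scores up. This is only a back-of-the-envelope sketch; the clocks are assumptions based on typical stock speeds (a GeForce3 at ~460 MHz effective DDR, an FX5200 at ~400 MHz), not readings from these particular cards:

def bandwidth_gbs(bus_bits, eff_mclk_mhz):
    # Peak memory bandwidth: bus width in bytes times effective memory clock.
    return bus_bits / 8 * eff_mclk_mhz / 1000  # GB/s

cards = {
    "GeForce3 (128-bit, 460 MHz effective)":   (128, 460),
    "FX5200 (128-bit, 400 MHz effective)":     (128, 400),
    "FX5200 Lite (64-bit, 400 MHz effective)": (64, 400),
}
for name, (bits, mclk) in cards.items():
    print(f"{name}: {bandwidth_gbs(bits, mclk):.1f} GB/s")

That works out to about 7.4 GB/s for the GF3 against 3.2 GB/s for a 64-bit FX5200, a roughly 2.3x deficit, which is in the same ballpark as the ~2.5x gap in the 3DMark01 scores above at those bandwidth-hungry AA/AF settings.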



Chimera July 11th 03 07:11 AM

Bratboy wrote:
Buyer beware: apparently the only Gainward fx5200 that uses the 128-bit memory path is the Golden Sample version; all the others in the 5200 line use a crappy 64-bit pathway, which absolutely kills the card's performance. A company sent me a Gainward fx5200 (Lite version) to replace a failing GF3, and it royally sucks. Apparently "Lite" means light on performance. Nowhere on the box do they even dare mention that it only uses a 64-bit bus; you have to go look at their site to learn that. Anyway, here are some tests using both cards:

3DMark01 SE
1280x1024, 32 bit, 2xAA and 2xAF

Old GF3 - 2698, 2695, 2699
FX5200 - 1058, 1050, 1055

All in all, it's so **** poor a performer I wouldn't even wish it on my worst enemy. Really glad the company sent it to me free; if I'd paid even $50 for it I'd be way more ****ed. As it is, I wrote to Gainward and am hoping they have some sort of step-up program to deal with this, but I know better than to hold my breath that they'll do anything.

Just Jess


Man, I'd be on the phone giving them a serious WTF! They're either stingy or stupid, and there's no way an entry-level GFFX 5200 is equivalent to a GF3.



John July 11th 03 09:15 AM

Yeah, I know the feeling. I was forced to replace my Gainward Golden Sample GF3, which ran at 230/570, with an fx5200 128 MB. Believe me, it doesn't matter much whether it's the 128-bit or the 64-bit version; both of them suck!
Even when I overclocked the thing to 300 core and 600 memory, it was a lot slower than my GF3 not overclocked... about 3/4 of the performance.
What's the point of getting all the DX9 stuff if the game runs at 3 fps??? Anyway, I got a 9700 Pro yesterday, and I am happy as hell!
John
"Chimera" wrote in message
...
Bratboy wrote:
Buyer beware: apparently the only Gainward fx5200 that uses the 128-bit memory path is the Golden Sample version; all the others in the 5200 line use a crappy 64-bit pathway, which absolutely kills the card's performance. A company sent me a Gainward fx5200 (Lite version) to replace a failing GF3, and it royally sucks. Apparently "Lite" means light on performance. Nowhere on the box do they even dare mention that it only uses a 64-bit bus; you have to go look at their site to learn that. Anyway, here are some tests using both cards:

3DMark01 SE
1280x1024, 32 bit, 2xAA and 2xAF

Old GF3 - 2698, 2695, 2699
FX5200 - 1058, 1050, 1055

All in all, it's so **** poor a performer I wouldn't even wish it on my worst enemy. Really glad the company sent it to me free; if I'd paid even $50 for it I'd be way more ****ed. As it is, I wrote to Gainward and am hoping they have some sort of step-up program to deal with this, but I know better than to hold my breath that they'll do anything.

Just Jess


Man, I'd be on the phone giving them a serious WTF! They're either stingy or stupid, and there's no way an entry-level GFFX 5200 is equivalent to a GF3.





Bratboy July 11th 03 01:08 PM

As much as I've always sworn never to go ATI, I wish I could afford one at this point, as I'm so mad. Unfortunately, when you live on a measly $550 a month disability like I do, it's hard to get two pennies to rub together, let alone the cost of a card, hehe. My GF3 was my last purchase before I lost all my income, so I was desperately trying to make do, but the 5200 just WON'T do, hehe. Think I saw it at one point doing a whopping 3-4 fps in some of the old 3DMark01 high-quality tests, and it sure didn't handle the Nature test even half as well as my old GF3 (for a real laugh, try the scene with the fish; it ran like a dog on the fx5200). Sigh... sucks to be poor and too old for presents, hehe.

Still, no matter which way you slice it, the fx5200 is a sorry example of Nvidia's capabilities, and Gainward, as well as any other company making them, should be ashamed of even producing a 128 MB/64-bit card to begin with. Everything I could find on Nvidia's site said the spec for 128 MB should be a 128-bit bus, not cut the card's legs off with 64-bit. Wonder if ATI would trade for the fx5200 if I promised to write a glowing "I've Converted" email for it, wink.
Just Jess
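
On the 128 MB / 128-bit point: capacity and bus width are actually independent. The bus is as wide as the number of memory chips times each chip's I/O width, so a vendor can reach 128 MB with fewer, denser chips and land on a 64-bit bus. A hypothetical sketch (the chip organizations here are illustrative, not Gainward's actual parts):

def board(chips, chip_mbytes, chip_io_bits):
    # Capacity and bus width both scale with chip count, but denser
    # chips add capacity without adding bus width.
    return chips * chip_mbytes, chips * chip_io_bits  # (MB, bus bits)

full = board(chips=8, chip_mbytes=16, chip_io_bits=16)  # eight 128-Mbit x16 parts
lite = board(chips=4, chip_mbytes=32, chip_io_bits=16)  # four 256-Mbit x16 parts
print(full)  # (128, 128): 128 MB on a 128-bit bus
print(lite)  # (128, 64):  the same 128 MB on only a 64-bit bus

Which is presumably how the Lite boards get away with it: the same capacity on the box, half the bus on the PCB.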

"John" wrote in message
...
Yeah, I know the feeling. I was forced to replace my Gainward Golden Sample GF3, which ran at 230/570, with an fx5200 128 MB. Believe me, it doesn't matter much whether it's the 128-bit or the 64-bit version; both of them suck!
Even when I overclocked the thing to 300 core and 600 memory, it was a lot slower than my GF3 not overclocked... about 3/4 of the performance.
What's the point of getting all the DX9 stuff if the game runs at 3 fps??? Anyway, I got a 9700 Pro yesterday, and I am happy as hell!

John
"Chimera" wrote in message
...
Bratboy wrote:
Buyer beware: apparently the only Gainward fx5200 that uses the 128-bit memory path is the Golden Sample version; all the others in the 5200 line use a crappy 64-bit pathway, which absolutely kills the card's performance. A company sent me a Gainward fx5200 (Lite version) to replace a failing GF3, and it royally sucks. Apparently "Lite" means light on performance. Nowhere on the box do they even dare mention that it only uses a 64-bit bus; you have to go look at their site to learn that. Anyway, here are some tests using both cards:

3DMark01 SE
1280x1024, 32 bit, 2xAA and 2xAF

Old GF3 - 2698, 2695, 2699
FX5200 - 1058, 1050, 1055

All in all, it's so **** poor a performer I wouldn't even wish it on my worst enemy. Really glad the company sent it to me free; if I'd paid even $50 for it I'd be way more ****ed. As it is, I wrote to Gainward and am hoping they have some sort of step-up program to deal with this, but I know better than to hold my breath that they'll do anything.

Just Jess


Man, I'd be on the phone giving them a serious WTF! They're either stingy or stupid, and there's no way an entry-level GFFX 5200 is equivalent to a GF3.







Stephen Smith July 11th 03 02:22 PM

"Bratboy" wrote in message ...

Buyer beware: apparently the only Gainward fx5200 that uses the 128-bit memory path is the Golden Sample version; all the others in the 5200 line use a crappy 64-bit pathway, which absolutely kills the card's performance. A company sent me a Gainward fx5200 (Lite version) to replace a failing GF3, and it royally sucks. Apparently "Lite" means light on performance. Nowhere on the box do they even dare mention that it only uses a 64-bit bus; you have to go look at their site to learn that. Anyway, here are some tests using both cards:

3DMark01 SE
1280x1024, 32 bit, 2xAA and 2xAF

Old GF3 - 2698, 2695, 2699
FX5200 - 1058, 1050, 1055

All in all, it's so **** poor a performer I wouldn't even wish it on my worst enemy. Really glad the company sent it to me free; if I'd paid even $50 for it I'd be way more ****ed. As it is, I wrote to Gainward and am hoping they have some sort of step-up program to deal with this, but I know better than to hold my breath that they'll do anything.

Just Jess


Similar situation here. The other day I built up a system for a friend; some of the bits and pieces he'd bought himself, while the other parts were donated by his brother-in-law.

One item, an MSI GeForce 5200 FX, caught my eye.

"Ooh an FX, how cool, wonder how they perform compared to my 'slow' GeForce
3?" I thought to myself.

Now, I haven't particularly researched the FX cards but I imagined that the
5200 - being the budget model - would probably perform as well as a
mid-range GeForce 4, so I was looking forward to checking it out.

How wrong I was..... :(

The game at hand was "Max Payne", and my word, was it a "payne" ;-) to play!
Even in 640x480 with a combination of low and medium settings, it felt
_very_ sluggish and generally not a nice experience. A darn-fine game ruined
by the card.

On the other hand, my GeForce 3 handles it beautifully. 1024x768,
medium-high settings (predominantly high) is very responsive and a joy to
play.

The next game I tried (BloodRayne) played slightly better than Max Payne, although occasionally it felt unresponsive, and the overall animation didn't appear as visually smooth.

So now I know to avoid the 5200 like the plague!

Stephen, who'll stick with his trusty GF3, thank-you-very-much. :)



Bratboy July 11th 03 02:54 PM

"Stephen Smith" wrote in message
...
So now I know to avoid the 5200 like the plague!

Stephen, who'll stick with his trusty GF3, thank-you-very-much. :)


Sadly, I'd have been more than happy to simply replace my old GF3 with a new one of the exact same model, but OCZ doesn't make them anymore, so there was no way to get a straight swap. At this point I almost wish I'd said send me the GF4 MX over this 5200. Anyway, I've contacted both Gainward and Nvidia about it; we'll see what happens. Seems to me Gainward should be FORCED to state clearly on the box which memory path the enclosed card uses. I went through every bit of documentation that was included, as well as everything written on the box, and absolutely nowhere is the bus width mentioned, which to me is deceptive in and of itself.
Just Jess



Dave July 11th 03 06:17 PM

I'm thinking Nvidia needs to step in and put a stop to all this confusion.
It's giving their 5200 chip a bad name. If properly configured, it does
have the ability to best the GF3.
Dave

"Bratboy" wrote in message ...
"Stephen Smith" wrote in message
...
So now I know to avoid the 5200 like the plague!

Stephen, who'll stick with his trusty GF3, thank-you-very-much. :)


Sadly, I'd have been more than happy to simply replace my old GF3 with a new one of the exact same model, but OCZ doesn't make them anymore, so there was no way to get a straight swap. At this point I almost wish I'd said send me the GF4 MX over this 5200. Anyway, I've contacted both Gainward and Nvidia about it; we'll see what happens. Seems to me Gainward should be FORCED to state clearly on the box which memory path the enclosed card uses. I went through every bit of documentation that was included, as well as everything written on the box, and absolutely nowhere is the bus width mentioned, which to me is deceptive in and of itself.
Just Jess





Shep© July 11th 03 08:35 PM

On Fri, 11 Jul 2003 07:54:48 -0600, In this world we created "Bratboy"
wrote :

"Stephen Smith" wrote in message
...
So now I know to avoid the 5200 like the plague!

Stephen, who'll stick with his trusty GF3, thank-you-very-much. :)


Sadly, I'd have been more than happy to simply replace my old GF3 with a new one of the exact same model, but OCZ doesn't make them anymore, so there was no way to get a straight swap. At this point I almost wish I'd said send me the GF4 MX over this 5200. Anyway, I've contacted both Gainward and Nvidia about it; we'll see what happens. Seems to me Gainward should be FORCED to state clearly on the box which memory path the enclosed card uses. I went through every bit of documentation that was included, as well as everything written on the box, and absolutely nowhere is the bus width mentioned, which to me is deceptive in and of itself.
Just Jess


It appears to me that Nvidia marketing has dropped the MX and incorporated it into some of their newer cards ;-)
Think I'll stick with me trusty old GF3 Ti 200 O/C :O)



--
Free Windows/PC help,
http://www.geocities.com/sheppola/trouble.html
Free songs download,
http://artists.mp3s.com/artists/17/sheppard.html

Stephen Smith July 11th 03 11:04 PM

"Dave" wrote in message
...
I'm thinking Nvidia needs to step in and put a stop to all this confusion.
It's giving their 5200 chip a bad name. If properly configured, it does
have the ability to best the GF3.
Dave


Dave, I'm interested -- HOW do you "configure" it exactly?! Especially to GeForce 3 standard?!

I tried overclocking the 5200 in the system I was using, but it made no /noticeable/ difference with Max Payne and BloodRayne, the guinea pigs. I even managed to O/C it too far, which led to desktop corruption, BloodRayne refusing to load, and so on.

Now I'm confused...

http://www.msicomputer.com/product/v...l=FX5200_TD128

The image clearly shows a DVI connector and a fan.

How come the MSI 5200 I was using DIDN'T have a DVI connector _or_ fan on it? (It had just a heatsink.) It definitely had 128 MB, so it wasn't the 64 MB model.

Do they make another version that isn't listed on their website, perchance?

Could it have been the PC causing the duff performance?

Gigabyte GA-7VTXE motherboard
AMD Athlon 1800+ (running at 1533MHz, it claimed)
256MB RAM. [low for BloodRayne, I know]
Windows 98SE, VIA 4-in-1 drivers installed, etc.

Stephen.



Dave July 12th 03 01:27 AM

I meant configured by the manufacturer with a high core clock and fast memory, also clocked high. The Prolink 5200, clocked at 275/500, gets into the 9400 range in 3DMark2001; the GF3 Ti200 only gets about 8400. This is with a high-end system like a 3 GHz P4. Most of the 128-bit 5200 cards are clocked at 250/400 and are about equal to the Ti200. None of them can touch a Ti500, which is up in the 10200 range.
Dave
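
Running those clocks through the standard specs roughly bears this out. This is a sketch; the pipeline/TMU counts and the GF3 clocks are assumptions from the published specs (FX5200: 4 pipes x 1 TMU; GF3 Ti200: 4 x 2 at 175 MHz core, 400 MHz effective memory; Ti500: 240 MHz core, 500 MHz effective memory), not anything measured here:

def theoretical(core_mhz, pipes, tmus, bus_bits, eff_mclk_mhz):
    pixel_fill = core_mhz * pipes                    # Mpixels/s
    texel_fill = core_mhz * pipes * tmus             # Mtexels/s
    bandwidth = bus_bits / 8 * eff_mclk_mhz / 1000   # GB/s
    return pixel_fill, texel_fill, bandwidth

specs = {
    "Prolink 5200 (275/500)": (275, 4, 1, 128, 500),
    "typical 5200 (250/400)": (250, 4, 1, 128, 400),
    "GF3 Ti200 (175/400)":    (175, 4, 2, 128, 400),
    "GF3 Ti500 (240/500)":    (240, 4, 2, 128, 500),
}
for name, s in specs.items():
    fill, tex, bw = theoretical(*s)
    print(f"{name}: {fill} Mpix/s, {tex} Mtex/s, {bw:.1f} GB/s")

On these numbers the hot-clocked Prolink leads the Ti200 in pixel fill (1100 vs 700 Mpix/s) and bandwidth (8.0 vs 6.4 GB/s), consistent with the 9400-vs-8400 scores, while the Ti500's texel fill (1920 vs 1100 Mtex/s) suggests why no 5200 catches it.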

"Stephen Smith" wrote in message
...
"Dave" wrote in message
...
I'm thinking Nvidia needs to step in and put a stop to all this confusion. It's giving their 5200 chip a bad name. If properly configured, it does have the ability to best the GF3.
Dave


Dave, I'm interested -- HOW do you "configure" it exactly?! Especially to GeForce 3 standard?!

I tried overclocking the 5200 in the system I was using, but it made no /noticeable/ difference with Max Payne and BloodRayne, the guinea pigs. I even managed to O/C it too far, which led to desktop corruption, BloodRayne refusing to load, and so on.

Now I'm confused...

http://www.msicomputer.com/product/v...l=FX5200_TD128

The image clearly shows a DVI connector and a fan.

How come the MSI 5200 I was using DIDN'T have a DVI connector _or_ fan on it? (It had just a heatsink.) It definitely had 128 MB, so it wasn't the 64 MB model.

Do they make another version that isn't listed on their website, perchance?

Could it have been the PC causing the duff performance?

Gigabyte GA-7VTXE motherboard
AMD Athlon 1800+ (running at 1533MHz, it claimed)
256MB RAM. [low for BloodRayne, I know]
Windows 98SE, VIA 4-in-1 drivers installed, etc.

Stephen.





