A computer components & hardware forum. HardwareBanter


Serious waste of chips and PCB with fx5200



 
 
  #1  
Old July 10th 03, 11:57 PM
Bratboy
external usenet poster
 
Posts: n/a
Default Serious waste of chips and PCB with fx5200

Buyer beware: apparently the only FX5200 from Gainward that uses the 128-bit
memory path is the Golden Sample version; all the others in the 5200 line use a
crappy 64-bit pathway, which absolutely kills any
performance the card has. A company sent me a Gainward FX5200 (Lite version) to
replace a failing GF3, and it royally sucks. Apparently "Lite" means light on
performance. Nowhere on the box do they even dare mention that it only uses
a 64-bit bus; you have to go look at their site to learn that. Anyway, here are
some tests using both cards:

3DMark01 SE
1280x1024, 32 bit, 2xAA and 2xAF

Old GF3 - 2698, 2695, 2699
FX5200 - 1058, 1050, 1055

All in all, so **** poor a performer I wouldn't even wish it on my worst
enemy. Really glad the company sent it to me free, as if I'd paid even $50
for it I'd be way more ****ed. As it is, I wrote to Gainward and am hoping they
have some sort of step-up p[rogram to deal with this, but I know better than to
hold my breath that they will do anything.

Just Jess
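The gap Jess describes comes down to memory bandwidth: peak bandwidth is bus width (in bytes) times the effective memory clock, so halving the bus from 128-bit to 64-bit at the same clock halves the bandwidth. A rough sketch (the 400 MHz and 460 MHz effective clocks below are typical figures for these cards, assumed for illustration, not taken from this thread):

```python
def peak_bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer * transfers per second."""
    return (bus_bits / 8) * (effective_mhz * 1e6) / 1e9

# Illustrative clocks: ~400 MHz effective DDR for the FX5200,
# ~460 MHz effective for a stock GeForce3.
for name, bits, mhz in [
    ("FX5200 (64-bit bus)", 64, 400),
    ("FX5200 (128-bit bus)", 128, 400),
    ("GeForce3 (128-bit bus)", 128, 460),
]:
    print(f"{name}: {peak_bandwidth_gb_s(bits, mhz):.1f} GB/s")
```

With 2xAA and 2xAF at 1280x1024 the workload is largely bandwidth-bound, which is roughly in line with the 64-bit card scoring well under half of the GF3's 3DMark result.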


  #2  
Old July 11th 03, 07:11 AM
Chimera

Bratboy wrote:
[quoted text snipped]


Man, I'd be on the phone giving them a serious WTF! They are either stingy
or stupid, and there is no way that an entry-level GFFX 5200 is
equivalent to a GF3.


  #3  
Old July 11th 03, 09:15 AM
John

Yeah, I know the feeling. I was forced to replace my
Gainward Golden Sample GF3, which ran at 230/570, with an
FX5200 128 MB. Believe me, it doesn't matter much whether it's the
128-bit or the 64-bit version; both of them suck!
Even when I overclocked the thing to 300 core and 600 memory, it was a lot
slower than my GF3 not overclocked, about 3/4 of its performance.
What's the point of getting all the DX9 stuff if the game runs
at 3 fps? Anyway, I got a 9700 Pro yesterday, and I am happy as hell!

John
"Chimera" wrote in message
news
Bratboy wrote:
Buyer beware, apparently the only fx5200 by gainward that uses the 128

bit
mem path is the Golden Sample versions all others in the 5200's are

using
a
crappy 64 bit pathway which absolutly kills the card as far as ANY
performance at all. A company sent me a Gainward fx5200 (lite version)

to
replace a failing gf3 and it royally sux. Apparently Lite means lite on
performance. No where on the box do they even dare mention that it only

uses
a 64 bit bus. You have to go look at their site to learn that. Anyway an
some tests using both cards:

3DMark01 SE
1280x1024, 32 bit, 2xAA and 2xAF

Old GF3 - 2698, 2695, 2699
FX5200 - 1058, 1050, 1055

All in all so **** poor a performer I wouldnt even wish it on my worst
enemy. Really glad the company sent it to me free as if I'd paid even

$50
for it I'd be way more ****ed. As is I wrote to Gainward and am hopeing

they
have some sort of step up p[rogram to deal with this but know better

than
to
hold my breath that they will do anything.

Just Jess


Man, Id be on the phone giving them a serious WTF! They are either stingy
or stupid, and there is no way that an entry level GFFX 5200 is an
equivalent to a GF3.




  #4  
Old July 11th 03, 01:08 PM
Bratboy

As much as I've always sworn to never go ATI, I wish I could afford one at
this point, as I'm so mad. Unfortunately, though, when you live on a measly $550 a
month disability like I do, it's hard to get two pennies to rub together, let
alone the cost of a card, hehe. My GF3 was my last purchase before I lost all
my income, so I was desperately trying to make do, but the 5200 just WON'T do, hehe.
I think I saw it at one point doing a whopping 3-4 fps in some of the old
3DMark01 high-quality tests, and it sure didn't handle the Nature test even half as
well as my old GF3 (for a real laugh, try the scene with the fish; it ran like a
dog on the FX5200). Sigh... sucks to be poor and too old for presents, hehe.
Still, no matter which way you slice it, the FX5200 is a sorry example of
Nvidia's capabilities, and Gainward, as well as any other company making them,
should be ashamed of even producing a 128 MB/64-bit card to begin with.
Everything I could find on Nvidia's site said the spec for a
128 MB card should use a 128-bit bus, not cut its legs off with 64-bit. Wonder if ATI
would trade for the FX5200 if I promised to write a glowing "I've Converted"
email for it *wink*

Just Jess

"John" wrote in message
...
Yeah, I know the feeling. I was forced to replace my
gainward golden sample gf3 that ran at 230/570, with a
fx5200,128 mb...believe me, it doesnt matter much if its a
128 or a 64 bit version, both of them suck!
Even when I overclocked the thing to 300 core and 600 mem, it was a lot
slower that my gf3 not overclocked....about 3/4 in preformance.
What the point of getting all the dx9 stuff, if the game runs
at 3 fps??? Anyway, I got a 9700 pro yesterday,and I am happy as hell!

John
"Chimera" wrote in message
news
Bratboy wrote:
Buyer beware, apparently the only fx5200 by gainward that uses the 128

bit
mem path is the Golden Sample versions all others in the 5200's are

using
a
crappy 64 bit pathway which absolutly kills the card as far as ANY
performance at all. A company sent me a Gainward fx5200 (lite version)

to
replace a failing gf3 and it royally sux. Apparently Lite means lite

on
performance. No where on the box do they even dare mention that it

only
uses
a 64 bit bus. You have to go look at their site to learn that. Anyway

an
some tests using both cards:

3DMark01 SE
1280x1024, 32 bit, 2xAA and 2xAF

Old GF3 - 2698, 2695, 2699
FX5200 - 1058, 1050, 1055

All in all so **** poor a performer I wouldnt even wish it on my worst
enemy. Really glad the company sent it to me free as if I'd paid even

$50
for it I'd be way more ****ed. As is I wrote to Gainward and am

hopeing
they
have some sort of step up p[rogram to deal with this but know better

than
to
hold my breath that they will do anything.

Just Jess


Man, Id be on the phone giving them a serious WTF! They are either

stingy
or stupid, and there is no way that an entry level GFFX 5200 is an
equivalent to a GF3.






  #5  
Old July 12th 03, 05:13 AM
Larry Roberts

I'm in the same "boat" as you. I do work on ppls computers on
the side for extra cash which pays for upgrades. Most of them ask why
I don't go into business doing this, but what they fail to notice is
how long it takes me to do something as simple as pulling a videocard
out, and replacing it. Most can do this within 20min, or so. It takes
me an hour, or more from the time I get the case off till I finish
testing the card. Then I still have to get the case cover back on.
Besides.. These people know nothing about how a PC works. They can
barly turn on the thing. To them, I'm a guru. Most 13yr olds know what
I know, it's just that there usally to lazy to work on other ppls
stuff. Another thing is that no computer repair business in the area
has stayed open for more than a year, or two, except for one, but they
do alot of contracts with local industry. Their private customer sales
are almost none.
I "fixed" my cousin's computer a few weeks ago, and scored a
21" monitor. All I did was format, reinstalled OS, and sottware, and
in retrurn I got the monitor. His parents are retired, and clean
offices for extra income. The office was upgrading the old PII
workstations. My cousin got a full Intergraph Worstation PC with dual
PII 300Mhz CPU with a 19" monitor for free. They where gona throw it
out. His dad asked if they had another monitor, and they gave him the
21" as well.
I was thinking of getting a Prolink FX5200 which is clocked
higher than the other non-Ultra cards out, but I'm thinking maybe
I'll save a while longer till I can afford an FX 5600. It'll take me
another 3 months I guess, but it will feel more of a worthwhile
purchase since I already have a GF3 Ti200.

On Fri, 11 Jul 2003 06:08:08 -0600, "Bratboy" wrote:
[quoted text snipped]







  #6  
Old July 11th 03, 02:22 PM
Stephen Smith

"Bratboy" wrote in message ...

Buyer beware, apparently the only fx5200 by gainward that uses the 128 bit
mem path is the Golden Sample versions all others in the 5200's are using
a crappy 64 bit pathway which absolutly kills the card as far as ANY
performance at all. A company sent me a Gainward fx5200 (lite version) to
replace a failing gf3 and it royally sux. Apparently Lite means lite on
performance. No where on the box do they even dare mention that it only
uses a 64 bit bus. You have to go look at their site to learn that. Anyway
and some tests using both cards:

3DMark01 SE
1280x1024, 32 bit, 2xAA and 2xAF

Old GF3 - 2698, 2695, 2699
FX5200 - 1058, 1050, 1055

All in all so **** poor a performer I wouldnt even wish it on my worst
enemy. Really glad the company sent it to me free as if I'd paid even $50
for it I'd be way more ****ed. As is I wrote to Gainward and am hopeing
they have some sort of step up p[rogram to deal with this but know better
than to hold my breath that they will do anything.

Just Jess


Similar situation here. The other day I built up a system for a friend; some
of the bits and pieces he'd bought himself, while the other parts were
donated by his brother-in-law.

One item, an MSI GeForce 5200 FX, caught my eye.

"Ooh an FX, how cool, wonder how they perform compared to my 'slow' GeForce
3?" I thought to myself.

Now, I haven't particularly researched the FX cards but I imagined that the
5200 - being the budget model - would probably perform as well as a
mid-range GeForce 4, so I was looking forward to checking it out.

How wrong I was.....

The game at hand was "Max Payne", and my word, was it a "payne" ;-) to play!
Even in 640x480 with a combination of low and medium settings, it felt
_very_ sluggish and generally not a nice experience. A darn-fine game ruined
by the card.

On the other hand, my GeForce 3 handles it beautifully. 1024x768,
medium-high settings (predominantly high) is very responsive and a joy to
play.

The next game I tried (BloodRayne) played slightly better than Max Payne,
although occasionally it felt unresponsive and the
overall animation didn't appear as visually smooth.

So now I know to avoid the 5200 like the plague!

Stephen, who'll stick with his trusty GF3, thank-you-very-much.


  #7  
Old July 11th 03, 02:54 PM
Bratboy

"Stephen Smith" wrote in message
...
So now I know to avoid the 5200 like the plague!

Stephen, who'll stick with his trusty GF3, thank-you-very-much.


Sadly, I'd have been more than happy to simply replace my old GF3 with a new one of
the exact same model, but OCZ doesn't make them anymore, so there was no way to get a straight
swap. At this point I almost wish I'd said send me the GF4 MX over this 5200.
Anyway, I contacted both Gainward and Nvidia about it; we'll see what happens.
Seems to me they (Gainward) should be FORCED to post clearly on the box
which memory path the enclosed card uses. I went through every bit of
documentation that was included, as well as everything written on the box, and
absolutely nowhere is the bus width mentioned, which to me is deceptive in and of
itself.

Just Jess


  #8  
Old July 11th 03, 06:17 PM
Dave

I'm thinking Nvidia needs to step in and put a stop to all this confusion.
It's giving their 5200 chip a bad name. If properly configured, it does
have the ability to best the GF3.
Dave

"Bratboy" wrote in message ...
"Stephen Smith" wrote in message
...
So now I know to avoid the 5200 like the plague!

Stephen, who'll stick with his trusty GF3, thank-you-very-much.


Sadly I'd be more than happy to of simply replaced my old gf3 with a new

of
exact same but OCZ doesnt make em anymore so was no way to get a straight
swap. At this point almost wish Id said send me the gf4MX over this 5200.
Anyway contacted both Gainward and Nvidia about it will see what happens.
Seems to me they, gainward, should be FORCED to post clearly on the box as
to which mem path the enclosed card uses. I went thru every bit of
documentation that was included as well as anything written on the box and
absolutly no where is the speed mentioned which to me is decptive in and

of
itself.
Just Jess




  #9  
Old July 11th 03, 11:04 PM
Stephen Smith

"Dave" wrote in message
...
I'm thinking Nvidia needs to step in and put a stop to all this confusion.
It's giving their 5200 chip a bad name. If properly configured, it does
have the ability to best the GF3.
Dave


Dave, I'm interested -- HOW do you "configure" it exactly?! Especially to
GeForce 3 standard?!

I tried overclocking the 5200 in the system I was using, but it made no
/noticeable/ difference with Max Payne and BloodRayne, the guinea pigs. I
even managed to O/C it too far, which led to desktop corruption, BloodRayne
wouldn't load, etc.

Now I'm confused...

http://www.msicomputer.com/product/v...l=FX5200_TD128

The image clearly shows a DVI connector and a fan.

How come the MSI 5200 I was using DIDN'T have a DVI connector _or_ fan on
it? (It had just a heatsink.) It definitely had 128 MB, so it wasn't the 64 MB
model.

Do they make another version that isn't listed on their website, perchance?

Could it have been the PC causing the duff performance?

Gigabyte GA-7VTXE motherboard
AMD Athlon 1800+ (running at 1533 MHz, it claimed)
256 MB RAM [low for BloodRayne, I know]
Windows 98SE, VIA 4-in-1 drivers installed, etc.

Stephen.


  #10  
Old July 12th 03, 01:27 AM
Dave

I meant configured by the manufacturer with a high core clock and fast
memory also clocked high. The Prolink 5200, clocked at 275/500, gets into the
9400 range in 3DMark2001; the GF3 Ti200 only gets about 8400. This is with
a high-end system like a 3 GHz P4. Most of the 128-bit 5200 cards are
clocked at 250/400 and are about equal to the Ti200. None of them can touch
a Ti500, which is up in the 10200 range.

Dave
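Dave's pairing of a 250/400 5200 with the Ti200 lines up with a simple bandwidth comparison: all four cards here have a 128-bit bus, so peak bandwidth tracks the effective memory clock alone. A sketch (the 5200 clocks are from Dave's post; the Ti200/Ti500 memory clocks of 400/500 MHz effective are assumed stock values, not stated in the thread):

```python
def gb_s(bus_bits: int, effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) * MHz / 1000."""
    return bus_bits / 8 * effective_mhz / 1000

# Effective memory clocks; Ti200/Ti500 values are assumed stock figures.
cards = {
    "FX5200 250/400": 400,
    "Prolink FX5200 275/500": 500,
    "GF3 Ti200 (assumed 400)": 400,
    "GF3 Ti500 (assumed 500)": 500,
}
for name, mem_mhz in cards.items():
    print(f"{name}: {gb_s(128, mem_mhz):.1f} GB/s")
```

The stock 5200 and the Ti200 land on the same 6.4 GB/s, which is consistent with their near-identical 3DMark scores; the Prolink and the Ti500 both sit at 8.0 GB/s, with the Ti500's higher score coming from its faster core.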

"Stephen Smith" wrote in message
...
"Dave" wrote in message
...
I'm thinking Nvidia needs to step in and put a stop to all this

confusion.
It's giving their 5200 chip a bad name. If properly configured, it does
have the ability to best the GF3.
Dave


Dave, I'm interested -- HOW do you "configure" it exactly?!! Especially to
GeForce 3 standard?!

I tried overclocking the 5200 in the system I was using but it made no
/noticable/ difference with Max Payne and BloodRayne, the guinea-pigs. I
even managed to O/C it too far, led to desktop corruption, BloodRayne
wouldn't load, etc, etc.

Now I'm confused........

http://www.msicomputer.com/product/v...l=FX5200_TD128

The image clearly shows a DVI connector and a fan.

How come the MSI 5200 I was using DIDN'T have a DVI connector _or_ fan on
it? (it had just a heatsink) It definately had 128MB, so it wasn't the

64MB
model.

Do they make another version that isn't listed on their website, per

chance?

Could it have been the PC causing the duff performance?

Gigabyte GA-7VTXE motherboard
AMD Athlon 1800+ (running at 1533MHz, it claimed)
256MB RAM. [low for BloodRayne, I know]
Windows 98SE, VIA 4-in-1 drivers installed, etc.

Stephen.




 






Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 HardwareBanter.
The comments are property of their posters.