A computer components & hardware forum. HardwareBanter

Why are newer cards worse than old ones?



 
 
  #1  
Old August 3rd 03, 10:46 PM
-=Matt=-
Why are newer cards worse than old ones?

In the 'old' days, when a new card came out you knew it would be better than
the old one, e.g. the Voodoo2 was better than the Voodoo1, the GeForce 2
better than the GeForce 1, and so on.

I don't understand how people on this newsgroup can say that some versions
of the GF3 are faster than some FXs. Is it really true?!? I have also heard
that the GF3 is better than the GF4mx, which is why I got the GF4mx on ebay
- it was so cheap! I've not received it yet, but since the GF3 is two
generations older than the FX, am I going to hear that there is a faster
version of a GF2 out there than my new GF4mx???

Why release these cards if they are inferior to existing models?

--
-=Matt=-

-------------------------------------------------------------
Matts Portrait and Star Wars Page
www.jed1kn1ght.fsnet.co.uk
-------------------------------------------------------------


  #2  
Old August 3rd 03, 11:31 PM
Jibby

Why don't you use the internet and your brain and do some research on the
different products available instead of bitching in a newsgroup? Not all 2l
engines are the same, and just because it sounds faster doesn't mean it is.
The different naming refers to the technology used, not the speed of the card.
"-=Matt=-" wrote in message
...
In the 'old' days, when a new card came out you knew it would be better than
the old one. Eg. Voodoo2 was better than V1, Geforce 2 better than Geforce1
etc.

I don't understand how people can say, as I've noticed they do on this
newsgroup, that some versions of the GF3 are faster than some FX's? Is it
indeed true?!? I have also heard that the GF3 is better than the GF4mx,
which is why I got the GF4mx on ebay - it was so cheap! I've not received it
yet, but as the GF3 is 2 generations lower than the FX; am I going to hear
that there is a faster version of a GF2 out there than my new GF4mx???

Why release these cards if they are inferior to existing models?





  #3  
Old August 4th 03, 12:45 AM
Derek Wildstar


"-=Matt=-" wrote in message
...


Why release these cards if they are inferior to existing models?



The bottom line is money. They need it, we have it, and any product, however
incrementally better, gets the OEM contract and everyone makes out, except
the enthusiast gamer.

While the enthusiasts drive innovation and dictate mindshare, they do little
for the bottom line of a company except jeopardize it. Also, you are painting
with an overly broad brush: there are distinct advantages to the newer
cards, and raw speed in all instances isn't exactly one of them [in the
mainstream market].

Look at it this way: it's not that the new cards are so bad, it's that the
older ones are so good.





  #4  
Old August 4th 03, 01:49 AM
Dave


"Derek Wildstar" wrote in message
news:[email protected]

"-=Matt=-" wrote in message
...


Why release these cards if they are inferior to existing models?


What? Someone here doesn't remember the Voodoo Rush? Or the TNT2 M64?

The bottom line is money. they need it, we have it, any product however
incrementally better gets the OEM contract and everyone makes out, except
the enthusiast gamer.


It's about maintaining the market niches using the big boys' bin rejects. The
best ones go into the flagships, the not-so-good ones into the midrange, and
the others that fail more than a pipeline here and there likely get made into
MXes. Not like ATI doesn't do the same thing. ;-) The difference is, with
certain ATI cards, you might get lucky and enable the other four pipelines
without issue...
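The binning idea above can be sketched as a toy sorting function. The tier
names and pass/fail thresholds here are invented for illustration; actual
vendor binning criteria are not public.

```python
# Hypothetical die-binning sketch: thresholds and tier names are made up
# for illustration, not actual NVIDIA/ATI test criteria.
def bin_die(working_pipelines, max_stable_mhz):
    """Assign a tested die to a product tier."""
    if working_pipelines == 8 and max_stable_mhz >= 400:
        return "flagship"           # all pipelines pass at full clock
    if working_pipelines == 8:
        return "midrange"           # fully functional, but clocks lower
    if working_pipelines >= 4:
        return "budget (MX-style)"  # bad pipelines fused off, sold cheaper
    return "reject"

print(bin_die(8, 450))  # flagship
print(bin_die(8, 350))  # midrange
print(bin_die(4, 400))  # budget (MX-style)
```

The "lucky overclock" case in the post is the gap this model leaves open: a
die binned down for yield reasons may still have all its pipelines working.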

While the enthusiasts drive innovation and dictate mindshare, they do little
to the bottom line of a company except jeopardize it.


I beg to differ. They are the ones who help push the envelope, as you say.
This is Good for business. Helps drive the gaming industry too,
hand-in-hand. We'll see when Doom 3 comes out which cards leave the shelves
fastest (most of us I'd imagine are already all set. Last I looked, I
haven't jeopardized ANYONE's bottom line! :^) ). Any jeopardy might come in
the guise of a generous RMA policy, for those who cannot accept personal
responsibility as part of a choice to tweak things...

Also, you are painting with an overly broad brush, there are distinct
advantages to the newer cards, and raw speed in all instances isn't exactly
one of them [in the mainstream market]


Certainly not if you take the FX5200 series into consideration. The only
things it really brings to the table are memory bandwidth and DX9... oh, and
a swank PCI version ;P

Look at it this way, it's not that the new cards are so bad, it's that the
older ones are so good.


I think it's that games are coded with these older cards in mind. The GF4
4x00s are still kicking strong. Given more high-level shader language and
per-pixel lighting, these older cards might not look so good, eh? Literally.
Plus, your performance depends a lot on the rest of your system. That opens
up a whole 'nother barrel o' monkeys. It's not really the average framerate
that matters so much as the minimum in actual gameplay. We're getting to the
point where the video card is the bottleneck again... certainly not much
scaling above 1280x1024 with faster CPUs and the cream-of-the-crop video
card. Somebody release the next generation already! Enough of this milking
of the existing product cycle!


  #5  
Old August 4th 03, 04:20 AM
A.C.

"Mario Kadastik" wrote in message
...
So most of the crap is going on in the midrange market. I've been mostly
using the budget versions for budget reasons. I've got a friend who
always goes for the high-end cards and he's satisfied as well, but
what's going on in the midfield, that's something I don't know...


My suspicion is that nVidia and ATI (who is also doing the same thing with
the 9500 and 9600) are doing it out of spite. Not sure why.

Why release these cards if they are inferior to existing models?

More money??? What did you think? People still need cards that can fit
in the midrange market, as the high end only makes up around 10% of the
whole market... But why didn't they make the FX5600 faster? That's a
good question...


I'm planning on getting an FX5600, despite it being occasionally
outperformed by the much cheaper Ti4200. But I'm gambling that when DX9
games come out my choice will be vindicated. But it probably won't. I know
that logically I should go for a Ti4200, especially since I don't really
care about AA, but sometimes the marketing gets you even when you know
better...


  #6  
Old August 4th 03, 04:54 AM
Chimera

I'm planning on getting an FX5600, despite it being occasionally
outperformed by the much cheaper Ti4200. But I'm gambling that when DX9
games come out my choice will be vindicated. But it probably won't. I know
that logically I should go for a Ti4200, especially since I don't really
care about AA, but sometimes the marketing gets you even when you know
better...



By the time DX9 games come along, you'll probably have good memories of your
FX5600 as you frag away with your new DX10 card.


  #7  
Old August 4th 03, 05:42 AM
Derek Wildstar


"Dave" wrote in message
news:[email protected]

I beg to differ. They are the ones who help push the envelope, as you say.
This is Good for business. Helps drive the gaming industry too,
hand-in-hand. We'll see when Doom 3 comes out which cards leave the shelves
fastest (most of us I'd imagine are already all set. Last I looked, I
haven't jeopardized ANYONE's bottom line! :^) ). Any jeopardy might come in
the guise of a generous RMA policy, for those who cannot accept personal
responsibility as part of a choice to tweak things...

I'm going to reinforce my point that enthusiasts aren't much for the
bottom line. Firstly, look at the percentage of sales that the top-of-the-line
cards have versus the rest of the market: it's minuscule, and it's not
very profitable. In fact, the recent FX adventure was a very unprofitable
venture. Huge losses for nvidia due to an aggressive push for the untested
smaller fab process. How do I know? Online press and company
communications. Only the enthusiasts cared about the FX5800, and only they
will even know it existed; the mainstream will never think of it again after
this post.

Now, what about the lesser cards? According to Dr. Watson and DirectX
diagnostic reports submitted to Microsoft, plus user-submitted specs, the
actual percentage of PCs with top-tier video cards is in the low single
digits. Only when you include the Ti4200s does that percentage increase into
the teens. Not a whole lot, when about 1 in 8 have a Ti4200 or better. And
that's the *savvy* user, who is considered to have better hardware than
average, so the actual numbers are likely less.

Nvidia is a behemoth compared to ATI, and they make the bulk of their money
from OEM sales of average-performing current hardware, not titillating the
hard core. Sad but true. I'm not trying to play 'I have a corporate secret'
either; most of this info has made its way into the public domain, as it
should, but for the most timely info, try some marketing sites like
npdtechworld.com. While it is a pay service, it certainly has value if you
have to make business decisions based on what's selling when.



  #8  
Old August 4th 03, 05:54 AM
Chimera

Not to slug your argument or anything, but my experience is that a very low
percentage of Dr. Watson-style dumps ever get sent to MS. It may be that the
demographic is still quite even, but it may also be that the 'savvy' users,
when confronted with a crash & core dump, simply choose to ignore it and put
it down to the 'Windows experience'.


  #9  
Old August 4th 03, 06:49 AM
Mark Leuck


"A.C." wrote in message
...
"Mario Kadastik" wrote in message
...
So most of the crap is going on in the midrange market. I've been mostly
using the budget versions for budget reasons , I've got a friend who
always goes for the high-end cards and he's satisfied as well, but
what's going on in the midfield, that's something I don't know...


My suspicion is that nVidia and ATI (who is also doing the same thing with
the 9500 and 9600) are doing it out of spite. Not sure why.


The 9600 was a refresh to add DX9; it also has far fewer transistors than
the 9500, making it much cheaper to build, which is why they came out with it.

As far as Nvidia goes, if I recall, the original GF1 wasn't much faster than
the TNT2 Ultra, and the GF2 was slightly slower in the beginning than the GF1.

I'm planning on getting an FX5600, despite it being occasionally
outperformed by the much cheaper Ti4200. But I'm gambling that when DX9
games come out my choice will be vindicated. But it probably won't. I know
that logically I should go for a Ti4200, especially since I don't really
care about AA, but sometimes the marketing gets you even when you know
better...


At least get the Ultra.


  #10  
Old August 4th 03, 07:06 AM
Chimera

The 9600 was a refresh to add DX-9, it also has far fewer transistors than
the 9500 making it much cheaper to build which is why they came out with it

I've seen a lot more read into it than just that. For the record, the 9500
was a DX9-generation card. The other real difference between the 9500 and
9600 seems to be that ATI 'repositioned' their products just slightly in
relation to each other, and made the gap between the 9600 and 9800 more of a
step up. By giving the 9600 series 4 pixel pipes and a 128-bit bus, and the
9800 series 8 pixel pipes and a 256-bit bus, they ensure that a lesser-tier
card will struggle to beat its more expensive seniors, even with aggressive
overclocking.
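The pipes-and-bus gap above can be put into rough numbers. The clock figures
below are assumed, approximate retail-era values used purely for
illustration, and the fillrate model (one pixel per pipe per clock) is a
deliberate simplification.

```python
# Back-of-the-envelope pixel fillrate and memory bandwidth.
def fillrate_mpix_s(pipes, core_mhz):
    # Simplified: one pixel per pipe per clock.
    return pipes * core_mhz                      # megapixels/s

def bandwidth_gb_s(bus_bits, effective_mem_mhz):
    # bus width in bits -> bytes per transfer, times effective MHz.
    return bus_bits / 8 * effective_mem_mhz / 1000   # GB/s

# Assumed clocks: 9600 Pro ~400 MHz core / 600 MHz effective memory,
# 9800 Pro ~380 MHz core / 680 MHz effective memory.
print(fillrate_mpix_s(4, 400), bandwidth_gb_s(128, 600))  # 1600 9.6
print(fillrate_mpix_s(8, 380), bandwidth_gb_s(256, 680))  # 3040 21.76
```

Even under these rough assumptions, the 4-pipe/128-bit part would need to
more than double its memory clock just to match the 256-bit card's
bandwidth, which is exactly why overclocking can't close the gap.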


 




Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2022, Jelsoft Enterprises Ltd.
Copyright ©2004-2022 HardwareBanter.
The comments are property of their posters.