Why are newer cards worse than old ones?



 
 
#1: -=Matt=-, August 3rd 03, 10:46 PM

In the 'old' days, when a new card came out you knew it would be better
than the old one. E.g. the Voodoo2 was better than the Voodoo1, the
GeForce2 better than the GeForce1, etc.

I don't understand how people can say, as I've noticed they do on this
newsgroup, that some versions of the GF3 are faster than some FXs. Is it
really true?!? I've also heard that the GF3 is better than the GF4 MX,
which is why I got the GF4 MX on eBay - it was so cheap! I haven't
received it yet, but as the GF3 is two generations older than the FX, am
I going to hear that there is a faster version of a GF2 out there than my
new GF4 MX???

Why release these cards if they are inferior to existing models?

--
-=Matt=-

-------------------------------------------------------------
Matts Portrait and Star Wars Page
www.jed1kn1ght.fsnet.co.uk
-------------------------------------------------------------


#2: Jibby, August 3rd 03, 11:31 PM

Why don't you use the internet and your brain and do some research on the
different products available, instead of bitching in a newsgroup? Not all
2-litre engines are the same, and just because it sounds faster doesn't
mean it is. The different naming refers to the technology used, not the
speed of the card.
"-=Matt=-" wrote in message
...
In the 'old' days, when a new card came out you knew it would be better

than
the old one. Eg. Voodoo2 was better than V1, Geforce 2 better than

Geforce1
etc.

I don't understand how people can say, as I've noticed they do on this
newsgroup, that some versions of the GF3 are faster than some FX's? Is it
indeed true?!? I have also heard that the GF3 is better than the GF4mx,
which is why I got the GF4mx on ebay - it was so cheap! I've not recieved

it
yet, but as the GF3 is 2 generations lower than the FX; am I going to hear
that there is a faster version of a GF2 out there than my new GF4mx???

Why release these cards if they are inferior to existing models?

--
-=Matt=-

-------------------------------------------------------------
Matts Portrait and Star Wars Page
www.jed1kn1ght.fsnet.co.uk
-------------------------------------------------------------




#3: Derek Wildstar, August 4th 03, 12:45 AM


"-=Matt=-" wrote in message
...


Why release these cards if they are inferior to existing models?



The bottom line is money. They need it, we have it, and any product,
however incrementally better, gets the OEM contract and everyone makes
out, except the enthusiast gamer.

While the enthusiasts drive innovation and dictate mindshare, they do
little for the bottom line of a company except jeopardize it. Also, you
are painting with an overly broad brush; there are distinct advantages to
the newer cards, and raw speed in all instances isn't exactly one of them
[in the mainstream market].

Look at it this way: it's not that the new cards are so bad, it's that
the older ones are so good.





#4: Dave, August 4th 03, 01:49 AM


"Derek Wildstar" wrote in message
news:e8hXa.57305$uu5.5615@sccrnsc04...

"-=Matt=-" wrote in message
...


Why release these cards if they are inferior to existing models?


What? Someone here doesn't remember the Voodoo Rush? Or the TNT2 M64?

The bottom line is money. they need it, we have it, any product however
incrementally better gets the OEM contract and everyone makes out, except
the enthusiast gamer.


It's about maintaining the market niches using the big boy bin rejects. The
best ones go into the flagships, the not-so-good ones into the midrange, the
others that fail more than a pipeline here and there likely get made into
MXes. Not like ATI doesn't do the same thing. ;-) Difference is, with
certain ATI cards, you might get lucky and enable the other four pipelines
without issue...
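
Roughly how I picture the sort, as a toy Python sketch (the tier names,
pipeline counts and clock cut-offs are invented, not anyone's real
binning criteria):

# Illustrative die binning: parts from one wafer get sorted into product
# tiers by how many pixel pipelines survive and how fast they clock.
# All thresholds and tier names are hypothetical.

def bin_chip(working_pipelines: int, max_stable_mhz: int) -> str:
    """Assign a tested die to a product tier."""
    if working_pipelines >= 8 and max_stable_mhz >= 450:
        return "flagship"       # fully functional, fastest bin
    if working_pipelines >= 8:
        return "midrange"       # everything works, but clocks lower
    if working_pipelines >= 4:
        return "budget / MX"    # weak pipelines fused off
    return "scrap"              # too broken to sell

tested_dies = [(8, 475), (8, 400), (6, 430), (4, 390), (2, 300)]
for pipes, mhz in tested_dies:
    print(f"{pipes} pipes @ {mhz} MHz -> {bin_chip(pipes, mhz)}")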

> While the enthusiasts drive innovation and dictate mindshare, they do
> little for the bottom line of a company except jeopardize it.


I beg to differ. They are the ones who help push the envelope, as you say.
This is Good for business. Helps drive the gaming industry too,
hand-in-hand. We'll see when Doom 3 comes out which cards leave the shelves
fastest (most of us I'd imagine are already all set. Last I looked, I
haven't jeopardized ANYONE's bottom line! :^) ). Any jeopardy might come in
the guise of a generous RMA policy, for those who cannot accept personal
responsibility as part of a choice to tweak things...

> Also, you are painting with an overly broad brush; there are distinct
> advantages to the newer cards, and raw speed in all instances isn't
> exactly one of them [in the mainstream market].

Certainly not if you take the FX5200 series into consideration. The only
things it really brings to the table are memory bandwidth and DX9... oh,
and a swank PCI version ;P

> Look at it this way: it's not that the new cards are so bad, it's that
> the older ones are so good.


I think it's that games are coded with these older cards in mind. The GF4
4x00s are still kicking strong. Given more use of high-level shader
languages and per-pixel lighting, these older cards might not look so
good, eh? Literally. Plus, your performance depends a lot on the rest of
your system, and that opens up a whole 'nother barrel o' monkeys. It's
not really the average framerate that matters so much as the minimum in
actual gameplay. We're getting to the point where the video card is the
bottleneck again... certainly not much scaling above 1280x1024 with
faster CPUs and the cream-of-the-crop video card. Somebody release the
next generation already! Enough of this milking of the existing product
cycle!
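
To put the average-versus-minimum point in concrete terms, here's a toy
Python sketch (the frame times are invented, not measured from any real
card): a run can post a healthy average while the worst moments drop
well below it.

# Average FPS can look healthy while the minimum during a firefight is not.
# Frame times in milliseconds; the values are invented for illustration.
frame_times_ms = [10, 11, 10, 12, 11, 10, 45, 50, 11, 10, 12, 11]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000 / max(frame_times_ms)

print(f"average: {avg_fps:.0f} fps")  # ~59 fps, looks fine on a chart
print(f"minimum: {min_fps:.0f} fps")  # ~20 fps, what you feel in-game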


#5: Derek Wildstar, August 4th 03, 05:42 AM


"Dave" wrote in message
news:S4iXa.41432$Oz4.11794@rwcrnsc54...

I beg to differ. They are the ones who help push the envelope, as you say.
This is Good for business. Helps drive the gaming industry too,
hand-in-hand. We'll see when Doom 3 comes out which cards leave the

shelves
fastest (most of us I'd imagine are already all set. Last I looked, I
haven't jeopardized ANYONE's bottom line! :^) ). Any jeopardy might come

in
the guise of a generous RMA policy, for those who cannot accept personal
responsibility as part of a choice to tweak things...


I'm going to reinforce my point: enthusiasts aren't much for the bottom
line. Firstly, look at the percentage of sales that the top-of-the-line
cards have versus the rest of the market; it's minuscule and it's not
very profitable. In fact, the recent FX adventure was a very unprofitable
venture: huge losses for Nvidia due to an aggressive push for the
untested smaller fab process. How do I know? Online press and company
communications. Only the enthusiasts cared about the FX5800, and only
they will even know it existed; the mainstream will never think of it
again after this post.

Now, what about the lesser cards? According to Dr. Watson and DirectX
diagnostic reports submitted to Microsoft, and user-submitted specs, the
actual percentage of PCs with top-tier video cards is in the low single
digits. Only when you include the Ti4200s does that percentage increase
into the teens. Not a whole lot, when only about 1 in 8 have a Ti4200 or
better. And that's the *savvy* user, who is considered to have better
hardware than average, so the actual numbers are likely less.
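
A back-of-the-envelope illustration of that last point, with completely
made-up numbers: if the savvy crowd is more likely to submit a report in
the first place, the top tier looks bigger in the submitted data than it
really is across installed PCs.

# Selection bias in self-submitted hardware reports. Every figure below
# is invented purely to illustrate the effect, not real survey data.
installed   = {"top_tier": 0.05, "midrange": 0.35, "budget_oem": 0.60}
submit_rate = {"top_tier": 0.30, "midrange": 0.15, "budget_oem": 0.08}

reports = {tier: installed[tier] * submit_rate[tier] for tier in installed}
total = sum(reports.values())

for tier in installed:
    print(f"{tier}: true share {installed[tier]:.0%}, "
          f"share of submitted reports {reports[tier] / total:.0%}")
# A true 5% top tier shows up as roughly 13% of the reports.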

Nvidia is a behemoth compared to ATI, and they make the bulk of their
money from OEM sales of average-performing current hardware, not from
titillating the hard core. Sad but true. I'm not trying to play 'I have a
corporate secret' either; most of this info has made its way into the
public domain, as it should. But for the most timely info, try some
marketing sites like npdtechworld.com. While it is a pay service, it
certainly has value if you have to make business decisions based on
what's selling when.



#6: Chimera, August 4th 03, 05:54 AM

Not to slug your argument or anything, but my experience is that a very
low percentage of Dr. Watson-style dumps ever gets sent to MS. It may be
that the demographic is still quite even, but it may also be that the
'savvy' users, when confronted with a crash and core dump, simply choose
to ignore it and put it down to the 'Windows experience'.


#7: Dave, August 4th 03, 08:16 PM


"Derek Wildstar" wrote in message
news:cvlXa.59006$uu5.6346@sccrnsc04...

"Dave" wrote in message
news:S4iXa.41432$Oz4.11794@rwcrnsc54...

I beg to differ. They are the ones who help push the envelope, as you

say.
This is Good for business. Helps drive the gaming industry too,
hand-in-hand. We'll see when Doom 3 comes out which cards leave the

shelves
fastest (most of us I'd imagine are already all set. Last I looked, I
haven't jeopardized ANYONE's bottom line! :^) ). Any jeopardy might come

in
the guise of a generous RMA policy, for those who cannot accept personal
responsibility as part of a choice to tweak things...


I'm going to reinforce my point, that enthusiasts' aren't much for the
bottom line.


Hmm, originally you stated that they were a *jeopardy* to the bottom line.
THIS is what sounds fishy. That they *aren't much* for the bottom line is a
given. Given the minuscule ratio of enthusiasts to average joes, I would say
it's a write-off...

> Firstly, look at the percentage of sales that the top-of-the-line cards
> have versus the rest of the market; it's minuscule and it's not very
> profitable.


At current pricing structures I have a tough time believing this ;-).
Maybe if the inventory is rotting on the shelves... or being passed up in
favor of a currently better solution (ATI, perhaps?)... sure, I agree
with you completely percentage-wise...

> In fact, the recent FX adventure was a very unprofitable venture.


And the enthusiasts' market drove this to the brink? I'd say this can
pretty much be put down as an Nvidia f$ckup. Poor execution.

> Huge losses for Nvidia due to an aggressive push for the untested
> smaller fab process. How do I know? Online press and company
> communications.


Pretty much old news by now, I'd imagine...some of us knew it well in
advance of release. I certainly knew what to expect, and was scarcely
disappointed (I had already dumped my Nvidia stock...made a bit too...). It
made for wonderful satire while it lasted...

> Only the enthusiasts cared about the FX5800, and only they will even
> know it existed; the mainstream will never think of it again after this
> post.


Thank God! It deserves to get buried, swept under the rug with the dust
bunnies caught in its central HVAC-sized plenum, or left in the shed with
the rest of the gardening appliances. They gambled, they lost. The way
Nvidia handled the whole affair did the job for them on their own bottom
line...and rightfully so. The FX5800 was a joke. No wonder they lost on it.
They drove thee ol' wagon to market, señor, even if she were sheddin' parts
all thee way down thee cobblestone pike an' thee ol' donkey she died when
she got there. Now tell me, who forced Pedro into the wagon? Of course it
was solely Nvidia's decision to continue developing, promoting and retailing
this faux pas. Their loss was by their own hands. Think it would have been
much different if the donkey was stillborn and they skipped an iteration of
current tech, waiting until refinements to recycle it for another $400+
stretch? That they would even allow such a prospect to intrude upon their
bottom line thusly is likely the subject of several after-hours round-table
sessions among the shareholders committee. This is where this half-baked
theory of "Voodoo Economics" signs me right off. They almost pulled a 3dfx
on that one. Certain other decisions they've made have hurt their bottom
line as well. To say that the lunatic fringe market has precipitated this
state of affairs is a little like putting Descartes before the horse...it
conveniently factors any decision Nvidia made about how to *effectively*
cater to this minority market right out of the picture.

> Now, what about the lesser cards? According to Dr. Watson and DirectX
> diagnostic reports submitted to Microsoft, and user-submitted specs, the
> actual percentage of PCs with top-tier video cards is in the low single
> digits.


I'll accept that demographic, although as someone pointed out it's a little
tough to tell because the savvy enthusiast might not even bother with
submitting reports to Microsoft. Sure. Nothing new here. Most of the average
computer shoppers will get a fair-to-middlin' OEM card, absolutely. I even
get to replace a few of them in my travels...

> Only when you include the Ti4200s does that percentage increase into
> the teens. Not a whole lot, when only about 1 in 8 have a Ti4200 or
> better. And that's the *savvy* user, who is considered to have better
> hardware than average, so the actual numbers are likely less.


Tough to tell...not really enough information...all this really says is how
many people with top-tier cards bother submitting reports and specs. If you
can extrapolate from this the layout of the entire market, more power to ya!
Myself, I'd wanna see cumulative averages of retail sales figures since gold
date (maybe even peaks around certain game releases), inventory manifests,
etc...I can tell you one thing: the majority of card installs I've done are
certainly midrange hardware (that's where things could be a little confusing
for some folks right now...). It's not the relative percentages that are in
question here, Derek. It's the assertion that (catering to) the enthusiasts'
market is BAD for the bottom line, lest we lose sight of the underlying
issue in this flurry of statistics ;^). Yesterday's top shelf becomes
tomorrow's bottom line practically annually, and it's a little hard to
imagine, even with the attendant price drops into the range of the
reasonable, that anyone's really eatin' it here. Am I missing something?

> Nvidia is a behemoth compared to ATI, and they make the bulk of their
> money from OEM sales of average-performing current hardware, not from
> titillating the hard core. Sad but true.


Not sad. Merely the way things are and have always been. But from here, the
suggestion that covering the hardcore market is a detriment to the bottom
line is a stretch. It's all in the execution. ATI won out on this round.
Nvidia's "woes" we can lay right at their own front door...

> I'm not trying to play 'I have a corporate secret' either; most of this
> info has made its way into the public domain, as it should. But for the
> most timely info, try some marketing sites like npdtechworld.com. While
> it is a pay service, it certainly has value if you have to make business
> decisions based on what's selling when.


Thank you for your kind suggestions...;-P. You know I'll rip you a new one
if you insist on further patronization ;-)...nothing personal of course, I
don't dislike anyone here at all, certainly not you at any rate, it's all
in good fun!




#8: Derek Wildstar, August 5th 03, 12:30 AM


"Dave" wrote in message
news:wiyXa.65920$YN5.49523@sccrnsc01...


Hmm, originally you stated that they were a *jeopardy* to the bottom line.
THIS is what sounds fishy. That they *aren't much* for the bottom line is

a
given. Given the miniscule ratio of enthusiasts to average joes, I would

say
it's a write-off...


In order to further the discussion for the folks at home:

I'm not going to back-pedal from the jeopardy comment; I still believe
that catering to the enthusiast is a bad financial idea in theory and in
practice. However, while there have been notable self-destructs in the
vidcard world - the Voodoo 6000, the FX5800 - it's arguable that external
forces were more detrimental than the internal decisions to pursue the
top tier, perhaps unwisely. As much as I respect the 3dfx engineers,
their input into the V6 and the FX was, in fact, jeopardizing the bottom
line, by pushing a product that required an advanced fab process to be a
marketing and performance success. Compare it to the past practice of
Nvidia and their incremental upgrade process: three design teams working
off the advances of the others, *waiting* until tech caught up to them,
rather than trying to drive tech forward. The GeForces were predictable
and robust performers, all based on mainstream concepts.

As far as patronization goes, I wouldn't dream of it. Only those who are
used to that sort of thing infer that tone from a post.






#9: -=Matt=-, August 4th 03, 12:30 PM




"Derek Wildstar" wrote in message
news:e8hXa.57305$uu5.5615@sccrnsc04...

"-=Matt=-" wrote in message
...


Why release these cards if they are inferior to existing models?



The bottom line is money. they need it, we have it, any product however
incrementally better gets the OEM contract and everyone makes out, except
the enthusiast gamer.

While the enthusiasts drive innovation and dictate mindshare, they do

little
to the bototm line of a company except jeopardize it. Also, you are

painting
with an overly broad brush, there are distinct advanatages to the newer
cards, and raw speed in all instances isn't exactly one of them [in the
mainstream market].


I think that's it. Or at least that's what I'm going to believe! I guess
each new generation (GF3, GF4, FX etc.) brings with it new cutting-edge
features like bump mapping, anti-aliasing, DX9 or whatever, which the
older cards don't have, but to make it a budget card they cut back on
memory and transistors etc. So I guess if you're playing a game which
doesn't support these new features, then a top-of-the-range old-gen card
will probably beat a new-gen one!

My TNT2 Ultra boasts detail textures and 32-bit colour on my PIII 500,
better than the Voodoo3, but I never enable them, as it runs too slowly,
despite looking better!
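
Rough numbers on why I leave it off (a back-of-the-envelope Python sketch
only; it ignores texture reads, Z-buffer traffic and overdraw, and the
resolution and refresh figures are just examples):

# Colour-write bandwidth for 16-bit vs 32-bit colour, illustrative only.
width, height, fps = 1024, 768, 60
for bits in (16, 32):
    bytes_per_frame = width * height * bits // 8
    mb_per_s = bytes_per_frame * fps / 1_000_000
    print(f"{bits}-bit colour: ~{mb_per_s:.0f} MB/s just for colour writes")
# 32-bit roughly doubles the traffic, which is why it crawls on older cards.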

-=Matt=-












#10: John Russell, August 4th 03, 04:50 PM


"-=Matt=-" wrote in message
...
In the 'old' days, when a new card came out you knew it would be better

than
the old one. Eg. Voodoo2 was better than V1, Geforce 2 better than

Geforce1
etc.

I don't understand how people can say, as I've noticed they do on this
newsgroup, that some versions of the GF3 are faster than some FX's? Is it
indeed true?!? I have also heard that the GF3 is better than the GF4mx,
which is why I got the GF4mx on ebay - it was so cheap! I've not recieved

it
yet, but as the GF3 is 2 generations lower than the FX; am I going to hear
that there is a faster version of a GF2 out there than my new GF4mx???

Why release these cards if they are inferior to existing models?

--
-=Matt=-

-------------------------------------------------------------
Matts Portrait and Star Wars Page
www.jed1kn1ght.fsnet.co.uk
-------------------------------------------------------------



I think the problem is that they don't design the cheapest product in a
new family first. They produce the new flagship model, and everyone
agrees it's faster than the best card in the old family. They then set
about producing cheaper versions to create a new family of products.
This usually means halving pipelines etc. They cannot guarantee that this
process will make the cheaper cards faster than some of the old ones.
The simple answer is to not keep upgrading by buying the cheapest card in
every family. It's better to stick to a restricted budget by waiting
until the flagship model comes down in price. Being 6-12 months behind is
no great loss, since little software will be available initially to
exploit any new card.
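
A toy calculation of why that halving can backfire (the card specs below
are invented round numbers, not real products): theoretical fill rate is
roughly pipelines times core clock, so a cut-down newer card needs a much
higher clock just to match the old flagship on raw speed.

# Theoretical pixel fill rate ~ pixel pipelines x core clock.
# Names and figures are invented round numbers for illustration only.

def fill_rate_mpixels(pipelines: int, core_mhz: int) -> int:
    return pipelines * core_mhz  # theoretical peak, megapixels per second

cards = [
    {"name": "old-gen flagship", "pipes": 4, "mhz": 300},
    {"name": "new-gen budget",   "pipes": 2, "mhz": 350},
]
for card in cards:
    rate = fill_rate_mpixels(card["pipes"], card["mhz"])
    print(f"{card['name']}: {rate} Mpixel/s theoretical")
# The cheaper new part loses on raw fill rate despite its newer feature set.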


 



