A computer components & hardware forum. HardwareBanter


Why are newer cards worse than old ones?



 
 
  #11  
August 4th 03, 12:30 PM
-=Matt=-




"Derek Wildstar" wrote in message
news:e8hXa.57305$uu5.5615@sccrnsc04...

"-=Matt=-" wrote in message
...


Why release these cards if they are inferior to existing models?



The bottom line is money. They need it, we have it; any product, however incrementally better, gets the OEM contract and everyone makes out, except the enthusiast gamer.

While the enthusiasts drive innovation and dictate mindshare, they do little to the bottom line of a company except jeopardize it. Also, you are painting with an overly broad brush; there are distinct advantages to the newer cards, and raw speed in all instances isn't exactly one of them [in the mainstream market].


I think that's it. Or at least that's what I'm going to believe! I guess each new generation (GF3, GF4, FX etc.) brings with it new cutting-edge features like bump mapping, anti-arse, DX9 or whatever, which the older cards don't have, but to make it a budget card, they cut back on memory and transistors etc. So I guess if you're playing a game which doesn't support these new features, then a top-of-the-range old gen will probably beat a new gen!

My TNT2u boasts detail textures and 32-bit color on my PIII 500, better than the Voodoo3, but I never enable them, as it runs too slow, despite looking better!

-=Matt=-





Look at it this way, it's not that the new cards are so bad, it's that the
older ones are so good.







  #12  
August 4th 03, 03:12 PM
A.C.

"Mark Leuck" wrote in message news:utmXa.60452$YN5.47334@sccrnsc01...
"A.C." wrote in message
My suspicion is that nVidia and ATI (who is also doing the same thing with
the 9500 and 9600) are doing it out of spite. Not sure why.


The 9600 was a refresh to add DX9; it also has far fewer transistors than the 9500, making it much cheaper to build, which is why they came out with it.


I think both companies are using the DX9 angle to justify making cards that are cheaper for them to manufacture. The problem is when both of the companies do it, we suffer. I mean, FEWER transistors? That's almost insulting for some reason.

As far as Nvidia goes, if I recall, the original GF1 wasn't much faster than the TNT Ultra, and the GF2 was slightly slower in the beginning than the GF1.

I'm planning on getting an FX5600, despite it being occasionally outperformed by the much cheaper Ti4200. But I'm gambling that when DX9 games come out my choice will be vindicated. But it probably won't. I know that logically I should go for a Ti4200, especially since I don't really care about AA, but sometimes the marketing gets you even when you know better...


At least get the ultra


Well, I figure the non-ultra is about $65-$100 cheaper, and it will
play the games I'm interested in (I'm not a hard-core FPSer, and I'm
more interested in visual effects than pure screaming speed).
  #13  
August 4th 03, 04:50 PM
John Russell


"-=Matt=-" wrote in message
...
In the 'old' days, when a new card came out you knew it would be better than the old one. E.g. Voodoo2 was better than V1, GeForce 2 better than GeForce 1, etc.

I don't understand how people can say, as I've noticed they do on this newsgroup, that some versions of the GF3 are faster than some FX's? Is it indeed true?!? I have also heard that the GF3 is better than the GF4mx, which is why I got the GF4mx on eBay - it was so cheap! I've not received it yet, but as the GF3 is 2 generations lower than the FX, am I going to hear that there is a faster version of a GF2 out there than my new GF4mx???

Why release these cards if they are inferior to existing models?

--
-=Matt=-

-------------------------------------------------------------
Matts Portrait and Star Wars Page
www.jed1kn1ght.fsnet.co.uk
-------------------------------------------------------------



I think the problem is that they don't design the cheapest product in a new family first. They produce the new flagship model, and everyone agrees it's faster than the best card in the old family. They then set about producing cheaper versions to create a new family of products. This usually means halving pipelines etc. They cannot guarantee that this process will make the cheaper cards faster than some of the old ones.
The simple answer is not to keep upgrading by buying the cheapest card in every family. It's better to stick to a restricted budget by waiting until the flagship model comes down in price. Being 6-12 months behind is no great loss, since little software will be available initially to exploit any new card.


  #14  
August 4th 03, 08:16 PM
Dave


"Derek Wildstar" wrote in message
news:cvlXa.59006$uu5.6346@sccrnsc04...

"Dave" wrote in message
news:S4iXa.41432$Oz4.11794@rwcrnsc54...

I beg to differ. They are the ones who help push the envelope, as you say. This is Good for business. Helps drive the gaming industry too, hand-in-hand. We'll see when Doom 3 comes out which cards leave the shelves fastest (most of us I'd imagine are already all set. Last I looked, I haven't jeopardized ANYONE's bottom line! :^) ). Any jeopardy might come in the guise of a generous RMA policy, for those who cannot accept personal responsibility as part of a choice to tweak things...


I'm going to reinforce my point, that enthusiasts aren't much for the bottom line.


Hmm, originally you stated that they were a *jeopardy* to the bottom line. THIS is what sounds fishy. That they *aren't much* for the bottom line is a given. Given the minuscule ratio of enthusiasts to average joes, I would say it's a write-off...

Firstly, look at the percentage of sales that the top-of-the-line cards have versus the rest of the market; it's minuscule and it's not very profitable.


At current pricing structures I have a tough time believing this ;-).
Maybe if the inventory is rotting on the shelves, perhaps...or being passed
up in favor of a currently better solution (ATI, perhaps?)...sure, I agree
with you completely percentage-wise...

In fact, the recent FX adventure was a very unprofitable
venture.


And the enthusiasts' market drove this to the brink? I'd say this can pretty
much be put down as a Nvidia f$ckup. Poor execution.

Huge losses for Nvidia due to an aggressive push for the untested smaller fab process. How do I know? Online press and company communications.


Pretty much old news by now, I'd imagine...some of us knew it well in
advance of release. I certainly knew what to expect, and was scarcely
disappointed (I had already dumped my Nvidia stock...made a bit too...). It
made for wonderful satire while it lasted...

Only the enthusiasts cared about the FX5800, and only they will even know it existed; the mainstream will never think of it again after this post.


Thank God! It deserves to get buried, swept under the rug with the dust
bunnies caught in its central HVAC-sized plenum, or left in the shed with
the rest of the gardening appliances. They gambled, they lost. The way
Nvidia handled the whole affair did the job for them on their own bottom
line...and rightfully so. The FX5800 was a joke. No wonder they lost on it.
They drove thee ol' wagon to market, señor, even if she were sheddin' parts
all thee way down thee cobblestone pike an' thee ol' donkey she died when
she got there. Now tell me, who forced Pedro into the wagon? Of course it
was solely Nvidia's decision to continue developing, promoting and retailing
this faux pas. Their loss was by their own hands. Think it would have been
much different if the donkey was stillborn and they skipped an iteration of
current tech, waiting until refinements to recycle it for another $400+
stretch? That they would even allow such a prospect to intrude upon their
bottom line thusly is likely the subject of several after-hours round-table
sessions among the shareholders committee. This is where this half-baked
theory of "Voodoo Economics" signs me right off. They almost pulled a 3dfx
on that one. Certain other decisions they've made have hurt their bottom
line as well. To say that the lunatic fringe market has precipitated this
state of affairs is a little like putting Descartes before the horse...it
conveniently factors any decision Nvidia made about how to *effectively*
cater to this minority market right out of the picture.

Now, what about the lesser cards? According to Dr. Watson and DirectX diagnostic reports submitted to Microsoft, and user-submitted specs, the actual percentage of PCs with top-tier video cards is in the low single digits.


I'll accept that demographic, although as someone pointed out it's a little
tough to tell because the savvy enthusiast might not even bother with
submitting reports to Microsoft. Sure. Nothing new here. Most of the average
computer shoppers will get a fair-to-middlin' OEM card, absolutely. I even
get to replace a few of them in my travels...

Only when you include the Ti4200's does that percentage increase into the teens. Not a whole lot, when it's about 1 in 8 have a Ti4200 or better. And that's the *savvy* user who is considered to have better hardware than average, so the actual numbers are likely less.


Tough to tell...not really enough information...all this really says is how
many people with top-tier cards bother submitting reports and specs. If you
can extrapolate from this the layout of the entire market, more power to ya!
Myself, I'd wanna see cumulative averages of retail sales figures since gold
date (maybe even peaks around certain game releases), inventory manifests,
etc...I can tell you one thing: the majority of card installs I've done are
certainly midrange hardware (that's where things could be a little confusing
for some folks right now...). It's not the relative percentages that are in
question here, Derek. It's the assertion that (catering to) the enthusiasts'
market is BAD for the bottom line, lest we lose sight of the underlying
issue in this flurry of statistics ;^). Yesterday's top shelf becomes
tomorrow's bottom line practically annually, and it's a little hard to
imagine that even with the attendant price drops into the range of the
reasonable that anyone's really eatin' it here. Am I missing something?

Nvidia is a behemoth compared to ATI, and they make the bulk of their money from OEM sales of average-performing current hardware, not titillating the hard core. Sad but true.


Not sad. Merely the way things are and have always been. But from here, the
suggestion that covering the hardcore market is a detriment to the bottom
line is a stretch. It's all in the execution. ATI won out on this round.
Nvidia's "woes" we can lay right at their own front door...

I'm not trying to play 'I have a corporate secret' either; most of this info has made its way to the public domain, as it should, but for the most timely info, try some marketing sites like npdtechworld.com. While it is a pay service, it certainly has value if you have to make business decisions based on what's selling when.


Thank you for your kind suggestions...;-P. You know I'll rip you a new one
if you insist on further patronization ;-)...nothing personal of course, I
don't dislike anyone here at all, certainly not you at any rate, it's all
in good fun!




  #15  
August 5th 03, 12:30 AM
Derek Wildstar


"Dave" wrote in message
news:wiyXa.65920$YN5.49523@sccrnsc01...


Hmm, originally you stated that they were a *jeopardy* to the bottom line. THIS is what sounds fishy. That they *aren't much* for the bottom line is a given. Given the minuscule ratio of enthusiasts to average joes, I would say it's a write-off...


In order to further the discussion for the folks at home:

I'm not going to back-pedal from the jeopardy comment; I still believe that catering to the enthusiast is a bad financial idea in theory and in practice. However, while there have been notable self-destructs in the vidcard world (Voodoo 6000, FX5800), it's arguable that external forces were more detrimental than the internal decisions to pursue the top tier perhaps unwisely. As much as I respect the 3dfx engineers, their input into the V6 and the FX was in fact jeopardizing the bottom line, by pushing a product that required an advanced fab process for it to be a marketing and performing success. Compare it to the past practice of Nvidia and their incremental upgrade process: three design teams working off the advances of the others, *waiting* until tech caught up to them, rather than trying to drive tech forward. The GeForces were predictable and robust performers, all based on mainstream concepts.

As far as patronization goes, I wouldn't dream of it. Only those who are
used to that sort of thing infer that tone from a post.






  #16  
August 5th 03, 02:24 AM
Dave


"Derek Wildstar" wrote in message
news:20CXa.45514$Oz4.12441@rwcrnsc54...

"Dave" wrote in message
news:wiyXa.65920$YN5.49523@sccrnsc01...


Hmm, originally you stated that they were a *jeopardy* to the bottom line. THIS is what sounds fishy. That they *aren't much* for the bottom line is a given. Given the minuscule ratio of enthusiasts to average joes, I would say it's a write-off...


In order to further the discussion for the folks at home:

I'm not going to back-pedal from the jeopardy comment, I still believe that catering to the enthusiast is a bad financial idea in theory and in practice.


Doesn't nececelery have to be. I feel that covering all strata of the market
can still be done in profitable fashion without dumping or a loss leader.
YMMV. This is what binning is all about. Of course, this is why we have so
much of a mess in the midrange niche: today's mid-binned chips are not so
much more powerful than yesterday's top dogs in current games, and the
pricing structure does little to encourage upgrading. Consider the 5600
series vs. the 4x00's and you'll see what I mean.
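The binning idea above can be sketched as a toy model: chips from the same wafer vary in the highest clock they validate at, and the vendor sorts them into product tiers accordingly. The tier names and cutoff speeds below are invented purely for illustration, not actual Nvidia or ATI specs.

```python
# Toy model of chip binning: parts from one wafer are sorted into
# product tiers by the highest clock (MHz) they pass validation at.
# Tier names and cutoffs are made up for illustration only.

TIERS = [
    ("flagship", 500),  # e.g. a 5800 Ultra-class part
    ("midrange", 400),  # e.g. a 5600-class part
    ("budget",   300),  # e.g. a 5200-class part
]

def bin_chip(max_stable_mhz):
    """Return the highest tier whose cutoff the chip meets, or None (scrap)."""
    for tier, cutoff in TIERS:
        if max_stable_mhz >= cutoff:
            return tier
    return None

# A batch of validated speeds from one hypothetical wafer:
wafer = [520, 460, 410, 390, 350, 280]
print([bin_chip(mhz) for mhz in wafer])
```

Since most parts land in the middle of the distribution, the midrange bins end up carrying most of the volume, which is why the pricing of the midrange tier matters so much more to the bottom line than the halo card does.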

However, while there have been notable self-destructs in the vidcard world (Voodoo 6000, FX5800), it's arguable that external forces were more detrimental than the internal decisions to pursue the top tier perhaps unwisely.


Very much so...and the responsibility for that choice, the implementation
thereof, and the repercussions both good and bad still fall upon who? 'Nuff
said?

As much as I respect the 3dfx engineers, their input into the V6


I think that it might have been putting Rampage on the back burner that
drove the biggest nail into the coffin of 3dfx, that and
mismanagement...perhaps "lack of timely input" is more to the point. But
that is hardly their "fault" per se: somehow I suspect that it's only
humanly possible to do just so much within a release timetable with
available resources and interrupted focus. The magic hats were shipped
without rabbits...

and the FX was in fact jeopardizing the bottom line, by pushing a product that required an advanced fab process for it to be a marketing and performing success.


And not necessarily the enthusiasts, n'est-ce pas? Well, I suppose there's no substitute for design execution. That speaks for itself. Of course trying to spooge the adoring public with that Rush-job of a leafblower didn't win them ANY brownie points. But all these were hardly decisions of the engineering dept. so much as afterthoughts; these guys still have to answer to Management (sadly enough, and sometimes it seems ne'er the twain shall meet...) and deadlines ("Well, MAKE IT WORK, dammit! On schedule! Never mind how! That's YOUR job, bucko!"). I suppose the 5800U could be deemed the "Daikatana" of video cards...

Compare it to the past practice of Nvidia and their incremental upgrade process: three design teams working off the advances of the others, *waiting* until tech caught up to them, rather than trying to drive tech forward. The GeForces were predictable and robust performers, all based on mainstream concepts.


And there was not as much drive to innovate. Those days the competition
didn't really have a leg-up...wait, there WAS no competition. I think Nvidia
got too comfortable at the top and slipped in their game a little...now they
want their trophy back by hook or by crook. Those 44.03 drivers are
certainly an example of the latter...hardly the only one at that...now maybe
being pilloried a little here and there will force Nvidia to clean up their
act a tad? I'm gonna go out on a little limb here and suggest that the
consumer could be the best QA of all. A product that stands head and
shoulders above the rest (such as Nvidia's contenders in the Socket A
chipset market) could practically sell itself on its own merits. The Voodoo
1 certainly did; so did the V2 for awhile. As did later GeForces.
Unfortunately, that is not the case here. If Nvidia has to "go to the
mattresses" and rely on OEM accounts to maintain profitability in the face
of loss on the high end, they did it to themselves with a sabot slug to the
metatarsals, and they get zero sympathy here...as for the incremental
upgrade process, I think we can see that pretty clearly in several midrange
and low-end examples compared to last-gen tech in the same price range...

We won't get into Nvidia's erstwhile relationship with M$, nor ATI's recent
hand in M$' API development. That gets saved for a rainier day...

As far as patronization goes, I wouldn't dream of it. Only those who are
used to that sort of thing infer that tone from a post.


Touché! Ah, a wit I can appreciate...;-)


  #17  
August 5th 03, 04:07 AM
mmartins

I don't understand how people can say, as I've noticed they do on this
newsgroup, that some versions of the GF3 are faster than some FX's? Is it
indeed true?!? I have also heard that the GF3 is better than the GF4mx,


I'm amazed by the high quality and low cost of the GF3 card.

8-player WC3 games, 64-player BF1942 sessions: with every game I try, performance is always up near 100fps.

And it only costs $40 on eBay...




  #18  
August 5th 03, 07:47 AM
Dave


"Derek Wildstar" wrote in message
news:18FXa.46704$cF.17654@rwcrnsc53...

"Dave" wrote in message


And not necessarily the enthusiasts, n'est-ce pas? Well, I suppose there's no substitute for design execution. That speaks for itself. Of course trying to spooge the adoring public with that Rush-job of a leafblower didn't win them ANY brownie points. But all these were hardly decisions of the engineering dept. so much as afterthoughts, these guys still have to answer to Management (sadly enough, and sometimes it seems ne'er the twain shall meet...) and deadlines ("Well, MAKE IT WORK, dammit! On schedule! Never mind how! That's YOUR job, bucko!"). I suppose the 5800U could be deemed the "Daikatana" of video cards...



I just wanted to make sure I wasn't interpreting this paragraph incorrectly; believe me, internet communications are as unreliable a product as is capacitor electrolyte brewed in the small but growing Chinese village of phukt-up. But are you trying to tell me that you think that thermal management was an 'afterthought' of chip design? Or that the engineers have less than a penultimate say in design?


Within limits, of course. I think the 5800U cooling solution was a rush-job
at any rate. I think after a brief glimpse at the thermal output vs. scaling
curve of the new design on said immature process, the local pharmacy might
have run short of Tums in no time...you trying to tell me they PLANNED it
like this from the beginning? I think they were working with a series of
uncertainties and Murphy (the veritable patron saint of all engineers,
technicians, spouses, and parents) sprang up, said "Gotcha!", and bit 'em
right in the ass! From there, it went to "Oboy, better strap on the
leafblower and blow this thing out the door with as much hype as we can
muster so we don't look like we're slipping on product cycle" instead of
"Let's clock this thing lower and price it accordingly so we don't look like
a bunch of jabronis". This is where the afterthought came in. I hope this
helps the pH of that electrolyte a little. I think the engineers may have
been involved in such a capacity as answering the question: "Can we get away
with it?" (of course it didn't work right when it got out the door. Among
the horror stories, there was the one about the fan not cycling up during 3d
screensavers and burning up the chip...oops!). As we can see, with a bit of
process refinement, this is no longer a necessity and the reference heatsink
is only half a flaming abortion even if still a heavy, slot-hogging
paperweight. As things ideally should have been from the beginning.

I'm not debating the hierarchy of who gets to say what to whom, but let's try these two scenarios, and you pick for me the one that works best, ultimately, for the continued success of the company.



1) Let's go match our new, untested design by our recently acquired engineers we perviosuly bankrupted on a new, untested, unfinished fab process with a new underperforming thermal management system in order to capture the 'speed crown', at great expense and tremendous tangible risk.


And this is precisely what happened. "Why" is secondary to the fact it happened at all. But I just won't buy into "The gamers made us do it!". I'll take that "perviosuly" as a Freudian slip, not a typo. Certainly lends an accurate twist to the whole sordid affair. ;-)

Or

2) Let's take our established product line with our proven design teams, continue in our corporate theme of introduction, speed refresh *as tech allows*, and while not being the 'fastest', make sure that our kind of fast is *fast enough*.


Would have been a surer approach. So might have been waiting until process
was more refined to release their flagship line, and riding out their
existing market share (namely OEMs and the sub-$150 cards that comprise the
majority of retail sales) in the meantime. Or positioning lower-clocked,
full-featured FXes against ATI midrange cards, pricing the Ti 4x00's to
destroy everything under ATI's 9500 series while tweaking process and
ramping up clock speeds. Their midrange product line is 0wn3d by ATI
performance-wise if not in overall sales (be interesting to compare %
shipped to % sold sometime). I just couldn't see any compelling reason to
pick a FX 5600 over a 9500 NP for the same price (especially using the 44.03
drivers with their "disappearing 8x AA on UT" trick that are now offered as
a Windows Update-ha!), or the 5600U when the 9700 NP is just $40 more.
Others may prioritize brand loyalty over value, I just want what works best
within its price range, whatever my budget may be. And I know I'm hardly
alone.

But we can't really say what works best, for the sake of revisionism. What
happened happened. And a number of other factors that made it happen we are
not entirely privy to and can only speculate with varying degrees of
education...as far as affecting Nvidia's continued success, it may have been
a speedbump in the road, but it hardly put them out of the running. Any way
it would have happened, they were up to their ears in aforementioned
electrolyte---instead of getting a bigger shovel and methodically digging
themselves out, they've decided to fling it around a little bit...

Of course, it's a trick question, your viewpoint isn't listed!


Which is...? (not likely able to be summed up in a few sentences, that's for
sure!)

And, do not forget the SEC investigation which came *this* close to
destroying the entire business, which implicated more than one senior
engineer.


And a couple of management people who were caught insider trading before
this...

Can't work as well when you're thinking of losing everything and going to jail, or you lose your job when all senior management gets indicted. That's why they chose #1 out of desperation.


I'll buy that for a dollar. What amazes me is that they actually had the
cojones to release such a graceless kludge, even in such limited quantities.
I guess they singed their reputation a little in order to try and keep the
stockholders happy, eh? Nah, it was really those g4m3r d00dz and rabid
Nv1d10tz they had to hold onto. Yeah, that's the ticket. Mighty white of
'em...

I'm hoping they get back to
doing #2. And ignoring the screaming megahertz meemies.


I think that development and refinement of high-end chips is fairly
important to market longevity, don't you? Especially with stiff competition
and at today's turnover rates. Almost like the "publish-or-perish" status
surrounding academia. All that really happens is the ones that don't pass
muster for high-end cards get made into lower-end cards, and the majority
gets 'em. I'd be willing to bet one could take any given manufacturer's
entire FX product line and find chips from the same wafer assorted
thereabout, if that were possible to determine...



 






Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 HardwareBanter.
The comments are property of their posters.