What recourse do NVidia owners have?



 
 
#11 - September 18th 03, 01:44 AM
Dark Avenger

"magnulus" wrote in message ...
OK, I can't help but feel angry about the whole DX9 crap with NVidia.
But what does the law say on this issue? You know, if the GeForce FX 5900
were just a LITTLE slower than ATI, I could live with that. I like NVidia:
they are an American technology company, they make good drivers (at least
they did in the past), and they have helped push gaming forward. But their
present behavior is nothing short of devious, sneaky, and dishonest.

Facts:

1) NVidia knew the DX 9 specification for quite a while, yet they turned out
the GeForce FX cards, for which so far... underperforming is too kind a word.
They cannot run DX 9 games acceptably. 30 FPS average (and it could fall
lower in many places) in Half-Life 2 on a hotrod system is simply
unacceptable. By no stretch of the imagination can I see this problem ever
being fixed. The FX cards simply have a broken architecture. Where did those
millions of transistors go? Are they just spinning their wheels in DX9?

2) NVidia presumably worked with Valve for a long time, and they knew the
GeForce FX was going to have problems with it. Yet they kept the industry
in the dark about these problems and hid the fact that the FX is a DX 9
product only in marketing spin.

3) Smaller developers will not be able to write NVidia-specific code and
still hold down development time and costs, yet there are now thousands of
FX cards they have to support. Smaller developers are the lifeblood of PC
gaming; if they can't get their foot in the door, PC gaming is doomed. And
I cannot see NVidia simply helping smaller developers out of charity.

4) DX 9 benchmarks across the board are showing the FX cards
underperforming, so much so that a card that costs half as much beats them
consistently. This is not a game-specific problem; it's much deeper than
anything a patch will fix.

So what can NVidia owners do? Are there grounds for a
class-action lawsuit?


I'd rather see the money they earned go into making a REAL DX9 card than
into paying off angry customers. Of course you guys got ripped off; of
course the card promised DX9, yet its performance is badly lacking.

Too bad for you, though, that... nvidia actually is giving you DX9... only
extremely slowly! Who said a DX9 card actually needed to run DX9 fast?

Too bad....

Nvidia is covered.... even though, of course, they are cheating, and they
are forcing benchmarkers to keep working on nvidia cheat detectors.
Of course this forces the software houses to add FX support,
because DX9 is so damn slow on FX cards.

Sorry kiddo... no hard case here.....

Of course... this... problem around the FX has already caused trouble
for nvidia.... Now children, let's just wait and see what the nv40 has to
offer us; 2x8 shaders, they claim... yummy!
#12 - September 18th 03, 01:47 AM
ho alexandre

Dam6 wrote:
If Nvidia knew that their cards would not function properly within the DX9
architecture, they should have done more.


Well, nVidia say they are doing more at the moment. Isn't that fantastic?



OpenGL, umm? Does anyone use it?


Well, nearly all 3D games except Morrowind, HL2, Eidos games (Tomb
Raider & Hitman) and MS games (Halo & Splinter Cell)?
That is: all games on the Quake2/3 engines (Half-Life 1, Soldier of
Fortune 2, Medal of Honor), plus Serious Sam, UT2003, Doom3, and Tribes3,
are just recent examples.



--
XandreX
/I'm that kind of people your parents warned you about/

#13 - September 18th 03, 01:48 AM
ho alexandre

Passion_Pilot wrote:
Where did those millions of transistors go?
Are they just spinning their wheels in DX9?


Do you know what a transistor is? How about what it does? See above
question.


you're being mean



--
XandreX
/I'm that kind of people your parents warned you about/

#14 - September 18th 03, 01:49 AM
ho alexandre

tHatDudeUK wrote:
The law says goods must be suitable for their use and as described. Nvidia
may have described their card as DX9 compatible rather than as supporting
all DX9 features.


Indeed!!! No card is claimed to support all DX9 features!!!

--
XandreX
/I'm that kind of people your parents warned you about/

#15 - September 18th 03, 02:02 AM
ho alexandre

Dark Avenger wrote:
Of course this forces the software houses to add FX support,
because DX9 is so damn slow on FX cards.


Well, if that happens, nVidia wins it all, and nVidia owners too. Many
people say Valvestate wants their HL2 to perform well on both ATI and
nVidia. The case is almost closed for ATI, so nVidia remains. Two
possibilities:
- the new Detonators will improve drastically before they're released
(seems unlikely, from what I've understood after reading millions of
posts here).
- Valvestate codes so as to make the application run fast & nice on the
FX (possible, since Valve has much more to lose than nVidia if HL2
runs poorly on FXs).

--
XandreX
/I'm that kind of people your parents warned you about/

#16 - September 18th 03, 02:32 AM
RoadCzar

http://english.bonusweb.cz/interviews/carmackgfx.html



"No doubt you heard about GeForce FX fiasco in Half-Life 2. In your opinion,
are these results representative for future DX9 games (including Doom III)
or is it just a special case of HL2 code preferring ATI features, as NVIDIA
suggests?

Unfortunately, it will probably be representative of most DX9 games. Doom
has a custom back end that uses the lower precisions on the GF-FX, but when
you run it with standard fragment programs just like ATI, it is a lot
slower. The precision doesn't really matter to Doom, but that won't be a
reasonable option in future games designed around DX9 level hardware as a
minimum spec.


John Carmack"



"Mark Nichols" wrote in message
...
.....
Ummm, yeah, who uses OpenGL? Well, actually id does... Doom III will use
it, and it won't even touch D3D9...

------
Mark
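
What Carmack describes above, a custom back end that runs lower precision
on the GF-FX while other cards run the standard fragment-program path,
boils down to a render-path switch inside the engine. Here is a minimal
C++ sketch of the idea; the GPU-name check and the path names are
illustrative assumptions, not Doom's actual code:

    #include <string>

    // Which fragment-program path the renderer will use.
    enum class FragmentPath {
        StandardDX9,  // full-precision fragment programs, the path ATI runs
        NV30Custom    // GF-FX back end using lower precision where it can
    };

    // Hypothetical capability check; a real engine would query the driver
    // (vendor ID, extension strings) rather than match the device name.
    bool IsGeForceFX(const std::string& gpuName) {
        return gpuName.find("GeForce FX") != std::string::npos;
    }

    FragmentPath ChooseFragmentPath(const std::string& gpuName) {
        // On NV3x-class hardware the standard full-precision path is slow,
        // so a hand-tuned lower-precision back end is substituted.
        if (IsGeForceFX(gpuName))
            return FragmentPath::NV30Custom;
        return FragmentPath::StandardDX9;
    }

Per Carmack's last line, the catch is that games designed around DX9-level
hardware as a minimum spec won't be able to keep shipping a hand-tuned
low-precision path for one card family.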




#17 - September 18th 03, 02:42 AM
magnulus

I don't understand the whole deal about partial vs. full precision. I
thought that was part of the DX 9 spec. So why is Valve making a big deal
about it?

FWIW, I had a Radeon 9700 Pro way back. I liked it, but the drivers at the
time sucked, so I gave it away. From what I read on it, it was supposed to
have a partial-precision mode, but Valve claims they run it at full
precision. Maybe I am mistaken, though.

So what's the big deal with Valve? Surely not every effect in the game
needs full precision. It couldn't possibly be that hard to code some
effects at partial precision and some at full precision. And doesn't it
seem like they are going overboard with shaders too quickly? Whatever
happened to traditional effects like gloss maps, bump maps, etc.? Why does
everything have to be a shader? It seems like this whole thing is a
revolution rather than an evolution.

I think the whole industry needs to answer these questions, not just
NVidia.
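
For reference, the precision argument is about mantissa bits: DX9 "full
precision" means at least FP24 (what ATI's R3xx hardware runs, 16 mantissa
bits), while the FX's fast path is FP16 (10 mantissa bits) and its FP32
path (23 mantissa bits) is the slow one. A back-of-the-envelope C++ sketch
of how coarse each format is near 1.0; pure arithmetic, no vendor code
involved:

    #include <cmath>
    #include <cstdio>

    // Smallest representable step near 1.0 for a float format with the
    // given number of explicit mantissa bits (its machine epsilon).
    double stepNearOne(int mantissaBits) {
        return std::ldexp(1.0, -mantissaBits);  // 2^-mantissaBits
    }

    int main() {
        std::printf("FP16 (10 mantissa bits): %.9f\n", stepNearOne(10));  // ~0.000977
        std::printf("FP24 (16 mantissa bits): %.9f\n", stepNearOne(16));  // ~0.000015
        std::printf("FP32 (23 mantissa bits): %.9f\n", stepNearOne(23));  // ~0.000000119
        // One step of an 8-bit framebuffer is 1/256 (~0.0039), so FP16 is
        // plenty for a simple color blend; the trouble is long shader
        // chains and texture-coordinate math, where error accumulates.
        return 0;
    }

Mixing the two is exactly what the DX9 partial-precision hint exists for;
the argument is over how much work it is to audit every effect for where
FP16 is safe.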


#18 - September 18th 03, 04:15 AM
McGrandpa

"John Lewis" wrote in message

On Wed, 17 Sep 2003 14:03:19 -0400, "chainbreaker"
wrote:

magnulus wrote:

So what can NVidia owners do? Are there grounds for a
class-action lawsuit?


IMO, it's a "tough-****" deal. And I just bought an FX5900 myself.

Only thing I intend to do is exercise the old saw, "fool me once . . ."


I also bought an FX5900 (for $250), well aware of the DX9 discussion,
and am very happy with my purchase. It runs everything that I have tried
in my huge collection of DirectX games with nary a problem (45.23
driver). For example, worst-case Morrowind FPS has shot up from 18 to
30 with all features maxed.........


I got my 5900-128 two months ago. $399, and the 9800-128 was the same
price.

I LOVE the water in Morrowind now


I do not buy video cards for bragging rights. I buy them for
my professional work, mostly video editing, and for trouble-free
gaming when I have time. Fiddling with benchmarks, driver
settings and driver versions is not my idea of gaming fun
at all!! Life is too short..........

nVidia has been stellar in the backward (and forward)
compatibility of their release drivers (except for 44.03, which
did have some problems in WinXP).


Hm. I'm using the 44.03 drivers and XP Pro. What problems am I to be
looking for?


If necessary, I am quite happy to tone down water effects
and other peripheral gloss in a DX9 game like HL2.
The graphics of HL2 elsewhere are far, far better than HL1's,
but they are still not anywhere near realistic, so any such
toning-down will have a negligible effect on my
"immersive experience". FPS games on PCs with genuinely
realistic graphics and first-class AI are going to have to await
a few more generations of CPUs, GPUs and buses, and will
consume vast amounts of hard-disk space and RAM.

Anyway, HL2 will only be available November 19. Time enough
for another round in the video-card wars........... Also, HL2
uses Steam's on-line copy protection after any downloaded
bug fixes of the retail-purchased game, which is a total disaster for
single-play and local LAN.... see the relevant threads for the
details.

I personally have no intention of parting with my hard-earned
money for a retail product flawed by Valve's currently half-baked
pay-as-you-play distribution tool........... A decision which
will be reconsidered should Valve respond appropriately,
before HL2's release, to all the major customer concerns
about Steam.

John Lewis



I'll probably get taken to the cleaners again somewhere down the
road. But it won't be nVidia doing the taking.
--
chainbreaker

If you need to email, then chainbreaker (naturally) at comcast dot
net--that's "net" not "com"--should do it.




#19 - September 18th 03, 04:20 AM
i'm_tired

ho alexandre wrote:
Passion_Pilot wrote:
Where did those millions of transistors go?
Are they just spinning their wheels in DX9?


Do you know what a transistor is? How about what it does? See above
question.


you're being mean


Certainly not trying to be mean. When someone with the proper experience,
education, and first-hand knowledge begins asking such questions or making
such claims, I'll pay attention (and some respectable people in the industry
have indeed made some of the same commentary, for many reasons). But when a
fellow who read a few web pages and learned how to parrot 3 or 4 posters
here and on web forums makes claims he doesn't have the technical expertise
to even own his own opinion about, I must question it. And the same goes
for the bizarre claims about game programming that the OP made. Anyone with
even a semester of Java or VB or C++ knows about modules and understands
that certain packages of code can be created with the intention of re-use
(or to be called into use) under whatever circumstances require it (see
the sketch below).
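
To make the modules point concrete: the usual pattern is one interface per
effect with a separate implementation per hardware path, chosen once at
startup. A minimal C++ sketch of that idea (the class names are invented
for illustration, not taken from any shipping engine):

    #include <memory>

    // One interface the game code calls; each card family gets its own module.
    class WaterEffect {
    public:
        virtual ~WaterEffect() = default;
        virtual void render() = 0;
    };

    class StandardDX9Water : public WaterEffect {
    public:
        void render() override { /* full-precision DX9 path */ }
    };

    class GeForceFXWater : public WaterEffect {
    public:
        void render() override { /* lower-precision path tuned for NV3x */ }
    };

    // Chosen once at startup; the rest of the game just calls render().
    std::unique_ptr<WaterEffect> MakeWaterEffect(bool isGeForceFX) {
        if (isGeForceFX)
            return std::make_unique<GeForceFXWater>();
        return std::make_unique<StandardDX9Water>();
    }

Writing the second module is still real work, of course, which is the
earlier point in this thread about smaller developers.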

rant
nVidia has a long driver track record, and nearly all of it is good up until
the 3DMark fiasco. Should their driver development team be totally
discounted as inept just because the latest few sets of drivers are being
questioned by people with clear monetary stakes in the CG industry? Well,
maybe, but maybe not. Some people who have had critical opinions are legit,
but others might well be speaking out of greed (do you suppose Futuremark
was happy that nVidia wouldn't pay the hundreds of thousands of dollars to
participate in the Futuremark development consortium?).

I'm still wondering where the aniso and AA hype came from, LOL. 9 out of 10
people can't tell the difference between AA on and off in a screenshot of
most games (granted, there are many areas of many maps of many scenes of
many games where jaggies appear if one really looks for them... yadda yadda
yadda), let alone while playing a game. Sure thing: ATI seems to be doing a
really good job of making cards that are powerful enough to take little
performance hit from AA and aniso. Sure thing, I read the same 5 or 6 guys
over and over and over and over in this NG saying that they use AA and
aniso, that it is so much better, and that they would never even try to
play a game without it (and a bunch of other total BS). However, do you
really think that the FX-5900 Ultra (with whatever driver release) won't be
able to play HL2? Or Doom3? Even if it is slower with AA and anisotropic
filtering and any other goodie turned on, so what? Will it be unplayable?
Hardly.

No one cared about AA when the GeForce 2 was killing the first Radeons. No
one cared about AA when the GeForce 3 was outperforming the Radeon 7500 and
8500. No one cared about AA when the GeForce 4 was really smoking the
Radeon anything. Then the 9700 Pro came along and suddenly AA was
everything, even though most people don't use it (and beyond that, most
users out there don't know what it is, let alone how to turn it on and off).
What exactly is up with that?

ATI has always done a good job on image quality, both 2D and 3D. Maybe
nVidia could have done a better 2D job with a couple of cards, but other
than the recent questions about reduced image quality from a couple of
driver "optimizations", I've never heard anyone crying that they thought
they had inferior 3D with their nVidia products. Hell, if anyone really
wants to compare 2D image quality, all should bow down to Matrox. In fact,
even though it seems Matrox has all but abandoned the 3D market, ask anyone
who bought that 3xRAMDAC Parhelia what they think of their 3D image
quality. You'll find that even though they can't get massive frame rates,
they are all more than pleased with the image quality. I don't think it
really has to do with that in the end.

Kudos to ATI for making their first truly excellent products. But does
that mean that nVidia should be sued? Does that mean that web forums and
Usenet newsgroups should be disrupted by ATI fanboys and ATI employees with
all this trolling? When are the users of this particular NG going to start
using their kill filters on those who keep discussion away from the actual
topic here? If you looked for this NG because of a technical problem with
your nVidia product and found all of these threads where no actual
discussion of nVidia video cards is going on.......... well, what would you
do? It has become nearly impossible to get good info here.
/rant


#20 - September 18th 03, 04:30 AM
i'm_tired

magnulus wrote:
snip
I think the whole industry needs to answer these questions, not just
NVidia.


You said a true mouthful there. Amen.


 



