#6 - September 17th 03, 12:27 AM
magnulus

The only company that's been screwing us is NVidia. Sad, because I
really like NVidia, and I'd be reluctant to go back to an ATI card. NVidia
knew their hardware couldn't run full-precision 32-bit shaders at playable
speeds, but they implemented them anyway. ATI, on the other hand, runs at
24-bit and you can't see the difference; their implementation is still
better than NVidia's. If NVidia could run the shaders at 24-bit, it would
be a non-issue, IMO.
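
To put numbers on it: a 32-bit float carries a 23-bit mantissa, ATI's
24-bit format carries 16, and NVidia's 16-bit half only 10. Here's a rough
C sketch of why that matters (the truncation trick and the test value are
my own illustration, and it ignores the narrower exponent range of a real
half float):

    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>

    /* Mimic lower-precision shader math by chopping a 32-bit float's
     * 23-bit mantissa down to 'keep' bits. 16 bits approximates ATI's
     * 24-bit format; 10 bits approximates NVidia's 16-bit half. */
    static float truncate_mantissa(float x, int keep)
    {
        uint32_t bits;
        memcpy(&bits, &x, sizeof bits);             /* float -> raw bits */
        bits &= 0xFFFFFFFFu << (23 - keep);         /* drop low mantissa bits */
        memcpy(&x, &bits, sizeof x);                /* raw bits -> float */
        return x;
    }

    int main(void)
    {
        float v = 0.7071067f;   /* e.g. a normalized vector component */
        printf("full 32-bit : %.9f\n", v);
        printf("24-bit (16m): %.9f\n", truncate_mantissa(v, 16));
        printf("16-bit (10m): %.9f\n", truncate_mantissa(v, 10));
        return 0;
    }

At 16 mantissa bits the rounding error is roughly one part in 65,000,
below what an 8-bit-per-channel framebuffer can even display. At 10 bits
it's about one part in 1,000, which is where banding and muddy colors
start creeping in once you stack a few dependent shader operations.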

Just an example: try running NVidia's own Dusk Ultra or Last Chance Gas
demos and notice the low framerate. Now imagine making a game with 4-5
characters and a whole world that looks like that. Oh, and turn on
anisotropic filtering and watch the framerate absolutely die.

And Gabe Newell is right: Valve can afford to implement two separate
codepaths and create a separate set of optimized shaders. But many smaller
developers, the bread and butter of computer gaming, just aren't going to
have the time or money. Thank you, NVidia. Guess if a developer doesn't
have millions and doesn't participate in your PR campaign, they're chopped
liver.
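
For a sense of what "two codepaths" means in practice, here's a
hypothetical C sketch. The names and the vendor-string check are made up
(a real engine would query the Direct3D device caps, not a string), but
the shape is the point: every effect gets written, tested, and tuned
twice.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical illustration of per-vendor shader path selection. */
    enum shader_path { PATH_DX9_FULL, PATH_NV3X_PARTIAL };

    static enum shader_path pick_shader_path(const char *gpu_vendor)
    {
        /* Generic DX9 path: one set of full-precision shaders. */
        if (strcmp(gpu_vendor, "NVIDIA") != 0)
            return PATH_DX9_FULL;

        /* NV3x path: a second, hand-tuned shader set using partial
         * precision wherever the image quality loss is tolerable. */
        return PATH_NV3X_PARTIAL;
    }

    int main(void)
    {
        const char *vendors[] = { "ATI", "NVIDIA" };
        for (int i = 0; i < 2; i++)
            printf("%s -> %s\n", vendors[i],
                   pick_shader_path(vendors[i]) == PATH_DX9_FULL
                       ? "generic full-precision shaders"
                       : "separate hand-optimized partial-precision shaders");
        return 0;
    }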

NVidia has exactly 14 days until Halo comes out, and then they're ****ed
if they don't have a solution. I for one won't play Halo, Half-Life 2, or
Deus Ex in anything but their full dynamic range. I've seen the difference
in the screenshots, and it makes me a believer. The difference is greater
than going from DX 7 to DX 8; it's a whole new world of realism. Forget
dynamic shadows, bump mapping, and specular highlights: nothing beats a
clean, pure image free of the muddy colors so common to computer games.

I'm also tired of buying a new videocard every year and paying 300
bucks for it. Even if I sold off my GeForce FX, it probably wouldn't fetch
enough to buy a Radeon 9600.