Posted by Magnulus, July 24th 05, 09:27 AM
Subject: Image Quality, Optimizations, etc.

I've been noticing lately that in games there's a lot of moire on
textures, specifically around the areas where the mipmap transitions would be
visible with bilinear filtering, when using trilinear filtering plus
anisotropic filtering on GeForce 6600 cards (and also the GeForce 6800). It
happens even when I set the image quality to "high quality" and disable all
optimizations.

I did a search, and apparently a lot of other folks on forums are having
issues too, but curiously enough, none of the major review sites seem to be
paying any attention to this IQ (image quality) issue. Some ordinary
forumers speculate that there is junk left over from the GeForce FX
days. In the last year or two, both ATI and NVidia have become
increasingly aggressive with the use of optimizations in an attempt to
one-up their competitor. I can't help but wonder how much of the
"performance" of these newer cards is simply due to cheating and shortcuts.
Others think maybe NVidia is no longer using true anisotropic filtering at
all, but some other method chosen to gain speed, and that there are obvious
IQ issues they are ignoring.

Now, some folks and ATI/NVidia claim these optimizations have little or no
effect on IQ. Well, you'd have to be blind not to spot the moire in many
games when using anisotropic filtering. You can clearly see it on
repetitive patterns such as gratings, floor tiles, roads, and other textures
with a lot of fine, repetitive detail. Look at levels in UT
2004 like Asbestos or Oceanic; you can clearly see it on the floors. I can
also spot it in Grand Theft Auto: San Andreas and several other games (I
don't have many games installed on my PC currently, so it's a small sample).