  #4   Old July 25th 05, 12:08 AM
Magnulus
I think it might be application-specific, I don't know (UT 2004). I
have a fresh install of Windows XP 64 Pro. I downloaded a 64-bit compatible
version of RivaTuner and set the mipmap LOD bias to 0 in both cases. It
seemed to help a little, but the effect is still there. I also installed
Serious Sam, and while this game looks much better in terms of texture
quality, you can still see some "texture aliasing" on walls that have
horizontal or vertical features (relative to the texture, not the camera).
Increasing the mipmap LOD bias via RivaTuner defeats this, but it also
causes a little texture blurriness everywhere else (partially fixed by
anisotropic filtering), and it makes the text in the UT 2004 GUI go blurry.
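
For reference, this is roughly the knob RivaTuner is twiddling. A minimal
Direct3D 9 sketch of the trade-off (the function name and the use of sampler
stage 0 are my own illustration, not the game's actual code): a positive bias
picks blurrier mip levels and hides the shimmer, a negative one sharpens but
brings it back, and anisotropic filtering buys back some of the lost detail.

    #include <d3d9.h>

    // Illustrative only: push a mipmap LOD bias and anisotropic filtering
    // onto sampler stage 0.
    void SetTextureFiltering(IDirect3DDevice9* device, float lodBias, DWORD maxAniso)
    {
        // D3DSAMP_MIPMAPLODBIAS takes a float reinterpreted as a DWORD.
        device->SetSamplerState(0, D3DSAMP_MIPMAPLODBIAS,
                                *reinterpret_cast<DWORD*>(&lodBias));

        // Anisotropic minification recovers some sharpness lost to the bias.
        device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
        device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
        device->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, maxAniso);
    }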

Doing more reading/research, I came across an article on the "shortcuts"
both ATI and NVidia are using to eke out every last bit of speed. For
instance, in texture blending ATI uses only 5 bits per sample in Direct3D.
This is the Direct3D default rasterizer's recommended limit, but using more
bits (6) would improve the quality of blending operations, though of course
it would be a little slower. NVidia may do something similar; after all, in
the GeForce 6xxx series of cards they imitated ATI and went with
isotropic/"brilinear" filtering rather than mathematically precise trilinear
filtering. Check out this website to get a good idea of what I'm talking
about: http://www.3dcenter.org/artikel/2003..._b_english.php Banding
artifacts/moire are a good description of what I'm seeing.
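
To see why a bit or two of blend precision matters, here is a toy model (my
own illustration, not anything from either vendor): quantize the trilinear
blend weight between two mip levels to 5 or 6 bits and the smooth fade turns
into discrete steps, which is exactly the banding/moire you'd expect.

    #include <cstdio>

    // Quantize a blend weight in [0,1] to 'bits' of precision.
    float QuantizeBlendWeight(float w, int bits)
    {
        const int steps = (1 << bits) - 1;   // e.g. 31 steps at 5 bits
        return static_cast<float>(static_cast<int>(w * steps + 0.5f)) / steps;
    }

    int main()
    {
        // Blend two flat mip colors; with fewer bits the result moves in
        // coarser, visible steps instead of a smooth gradient.
        const float mip0 = 0.2f, mip1 = 0.8f;
        for (float w = 0.0f; w <= 1.0f; w += 0.05f) {
            float w5 = QuantizeBlendWeight(w, 5);
            float w6 = QuantizeBlendWeight(w, 6);
            std::printf("w=%.2f  5-bit=%.4f  6-bit=%.4f\n",
                        w, mip0 + w5 * (mip1 - mip0), mip0 + w6 * (mip1 - mip0));
        }
        return 0;
    }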

Another possibility is that this stuff is not visible at all on a regular
monitor; perhaps it is just too blurry to show. An LCD monitor has a fixed
pixel grid, no inherent moire, and so on. Perhaps this stuff has been
there all along and nobody has really paid attention to it. It's
definitely a subtle effect, and if you are busy fragging you probably won't
notice it.

It's interesting that ATI and NVidia are both pushing SLI/Crossfire cards
for their many image quality improvements. One of the improvements they
often cite is "texture quality", i.e. reducing crawling textures. Well,
rather than using a supersample anti-aliasing mode and two video cards, it
would make more sense to me to just "get it right": nip the problem in the
bud at the texture mapping and filtering stages rather than after the scene
has been rendered.
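
To make the comparison concrete, supersample AA attacks the symptom by
brute force: average several texture lookups per pixel after the fact. A toy
2x2 ordered-grid sketch (my own illustration; sampleTexture and the
footprint parameters are placeholders, not either vendor's pipeline), whereas
a correct trilinear/anisotropic filter would pick the right mip level in the
first place.

    #include <functional>

    // Average four sub-pixel texture samples; texelU/texelV approximate the
    // pixel's footprint in texture space.
    float SupersamplePixel(float u, float v, float texelU, float texelV,
                           const std::function<float(float, float)>& sampleTexture)
    {
        float sum = 0.0f;
        for (int sy = 0; sy < 2; ++sy)
            for (int sx = 0; sx < 2; ++sx)
                sum += sampleTexture(u + (sx - 0.5f) * 0.5f * texelU,
                                     v + (sy - 0.5f) * 0.5f * texelV);
        return sum * 0.25f;   // damps shimmer, but at 4x the sampling cost
    }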