Thread: Picture quality
Old November 29th 03, 04:30 AM
phobos

Aki Peltola wrote:
> Using Nvidia's 44.09 Detonators.
> Am I blind, or how come I can't see any differences in
> picture quality between the "High Performance" and "Quality"
> settings? I even took a few screen captures from UT2003
> using both settings and spent a moment comparing the
> pictures. After that I can't understand why those settings
> even exist. Supposedly, using "Quality" only slows down
> fps without any visually noticeable reason.
>
> The Omega drivers are perhaps a whole different story
> (haven't tried them yet).



The performance settings mostly matter for anisotropic filtering and
FSAA. In regular games with normal bi/trilinear filtering and no AA,
they don't make a big difference. They also affect shader features
under DirectX (in High Performance mode, reflections in DX8 look
duller).

They also force certain optimizations in games. UT2003 normally has
its own LOD bias and trilinear filtering settings, but you can
override them at the expense of a bit of aliasing at a distance.
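To see why forcing the LOD bias trades sharpness for aliasing, here's a rough sketch of how standard mipmap level selection works (this is a generic illustration, not Nvidia's actual driver code; the function name and clamping range are made up for the example):

```python
import math

def mip_level(texels_per_pixel, lod_bias=0.0, max_level=10.0):
    # Standard mipmap selection: LOD = log2(texel footprint) + bias,
    # clamped to the available mip range. A negative bias picks a
    # lower (sharper) mip than the footprint calls for.
    lod = math.log2(max(texels_per_pixel, 1.0)) + lod_bias
    return min(max(lod, 0.0), max_level)

# A distant surface covering 8 texels per pixel normally samples mip 3;
# with a -1.0 bias it samples mip 2 instead, which is sharper but
# undersampled -- hence the shimmering at a distance.
print(mip_level(8.0))        # 3.0
print(mip_level(8.0, -1.0))  # 2.0
```

The further away the surface (the bigger the texel footprint), the more texels a biased mip level skips per pixel, which is where the aliasing comes from.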

The Omega drivers are just various versions of the main driver files
lumped together: some come from this set, others from an earlier one,
and so on. They mainly mess with the LOD bias and texture-sharpening
options, which you can normally enable yourself in the registry or
with a tweaker program. I don't recommend them unless you're gullible.