A computer components & hardware forum. HardwareBanter



Image Quality, Optimizations, etc.



 
 
  #1  
Old July 24th 05, 09:27 AM
Magnulus
external usenet poster
 
Posts: n/a
Default Image Quality, Optimizations, etc.

I've been noticing lately that in games there's a lot of moire on
textures, specifically around the areas where the mipmap transitions would be
with bilinear filtering, when using trilinear filtering + anisotropic
filtering on GeForce 6600 cards (and also the GeForce 6800). This happens even
when I set the image quality to "high quality" and disable all optimizations.

I did a search, and apparently a lot of other folks on forums are having
issues too, but curiously enough, none of the major review sites seem to be
paying any attention to this IQ (image quality) issue. Some ordinary
forumers are speculating that there is junk left over from the GeForce FX
days. In the last year or two, both ATI and NVidia have become
increasingly aggressive with their use of optimizations in an attempt to
one-up each other. I can't help but wonder how much of the
"performance" of these newer cards is simply due to cheating and shortcuts.
Others think maybe NVidia is no longer using true anisotropic filtering at
all, but some other method, perhaps to gain speed; however, there are
obviously IQ issues they are ignoring.

Now, some folks and ATI/NVidia claim these optimizations have little or no
effect on IQ. Well, you'd have to be blind not to spot the moire in many
games when using anisotropic filtering. You can clearly see it on
repetitive patterns such as gratings, floor tiles, roads, and other
textures with a lot of repetitive, fine detail. Look at levels in UT
2004 like Asbestos or Oceanic; you can clearly see it on the floors. I can
also spot it in Grand Theft Auto: San Andreas and several other games (I
don't have many games installed on my PC currently, so it's a small sample).
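
If it helps anyone picture what the moire actually is, here's a toy Python
sketch (nothing to do with the drivers; the period and step numbers are made
up) of how a fine repeating stripe pattern, sampled too coarsely, collapses
into a slow beat pattern:

# Toy illustration: sampling a fine repeating stripe pattern at a coarser
# spacing than its period aliases it into a slow "beat" -- the moire bands
# seen on gratings and floor tiles.
import math

texture_period = 4.0     # texels per stripe in the texture (made up)
sample_spacing = 4.5     # texels stepped per screen pixel when minified (made up)

samples = []
for px in range(32):
    u = px * sample_spacing                               # texel hit by this pixel
    samples.append(round(math.sin(2 * math.pi * u / texture_period), 2))

# The samples vary slowly even though the texture varies quickly; that slow
# variation is the moire band.  Mipmapping is supposed to pre-blur the texture
# so this undersampling never happens -- a too-sharp LOD choice or an
# aggressive filtering optimization brings it back.
print(samples)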


  #2  
Old July 24th 05, 10:00 AM
Einstine
external usenet poster
 
Posts: n/a
Default

"Magnulus" wrote in message . ..
I've been noticing lately that in games there's a lot of moire on textures, specifically around the areas where the mipmap transitions would be
with bilinear filtering, when using trilinear filtering + anisotropic filtering on GeForce 6600 cards (and also the GeForce 6800). This happens even
when I set the image quality to "high quality" and disable all optimizations.

I did a search, and apparently a lot of other folks on forums are having issues too, but curiously enough, none of the major
review sites seem to be paying any attention to this IQ (image quality) issue. Some ordinary forumers are speculating that
there is junk left over from the GeForce FX days. In the last year or two, both ATI and NVidia have become increasingly
aggressive with their use of optimizations in an attempt to one-up each other. I can't help but wonder how much of the
"performance" of these newer cards is simply due to cheating and shortcuts. Others think maybe NVidia is no longer using true
anisotropic filtering at all, but some other method, perhaps to gain speed; however, there are obviously IQ issues they
are ignoring.

Now, some folks and ATI/NVidia claim these optimizations have little or no effect on IQ. Well, you'd have to be blind not to spot
the moire in many games when using anisotropic filtering. You can clearly see it on repetitive patterns such as gratings, floor
tiles, roads, and other textures with a lot of repetitive, fine detail. Look at levels in UT 2004 like Asbestos or
Oceanic; you can clearly see it on the floors. I can also spot it in Grand Theft Auto: San Andreas and several other games (I
don't have many games installed on my PC currently, so it's a small sample).



I don't notice much in the way of detail issues, except that I have been put off
buying Battlefield 2 because it seems my eyes can never focus. I cannot explain it
much better than that. There is a lot of great stuff going on and buildings
to hide behind, but it all seems to be a blur.

ATI 9800 Pro. 20/20 Vision.


  #3  
Old July 24th 05, 11:22 PM
deimos
external usenet poster
 
Posts: n/a
Default

Magnulus wrote:
I've been noticing lately that in games there's a lot of moire on
textures, specifically around the areas where the mipmap transitions would be
with bilinear filtering, when using trilinear filtering + anisotropic
filtering on GeForce 6600 cards (and also the GeForce 6800). This happens even
when I set the image quality to "high quality" and disable all optimizations.

I did a search, and apparently a lot of other folks on forums are having
issues too, but curiously enough, none of the major review sites seem to be
paying any attention to this IQ (image quality) issue. Some ordinary
forumers are speculating that there is junk left over from the GeForce FX
days. In the last year or two, both ATI and NVidia have become
increasingly aggressive with their use of optimizations in an attempt to
one-up each other. I can't help but wonder how much of the
"performance" of these newer cards is simply due to cheating and shortcuts.
Others think maybe NVidia is no longer using true anisotropic filtering at
all, but some other method, perhaps to gain speed; however, there are
obviously IQ issues they are ignoring.

Now, some folks and ATI/NVidia claim these optimizations have little or no
effect on IQ. Well, you'd have to be blind not to spot the moire in many
games when using anisotropic filtering. You can clearly see it on
repetitive patterns such as gratings, floor tiles, roads, and other
textures with a lot of repetitive, fine detail. Look at levels in UT
2004 like Asbestos or Oceanic; you can clearly see it on the floors. I can
also spot it in Grand Theft Auto: San Andreas and several other games (I
don't have many games installed on my PC currently, so it's a small sample).



Your registry settings might be whacked. Make certain you're using the
newest drivers (77.76), then with coolbits on, click Restore under
Performance and Settings. This should restore the defaults for all the
control panel settings.

Normally the registry settings should be cleared by the installer, but
if you upgrade over the top of another driver they get left behind, and
values change between versions.

Additionally, I've noticed that in certain games the driver seems
to force a specific level of anisotropic filtering and
optimizations (user mips, AF ops, stage ops). I think these are
application specific, as they seem to change and get a little better with
successive versions. Many people complained about "shimmering" (like
you're describing) on mipmaps in Painkiller and other games for the
longest time. I've seen it in World of Warcraft, Battlefield 1942
and Doom 3.

Even with optimizations off, some games exhibit almost imperceptible
banding from "brilinear" filtering (at least on my FX5700 card). I think
some of this is hard-coded. Take a look at the Doom 3 profile, for
example; it won't let you touch the AF settings.
  #4  
Old July 25th 05, 12:08 AM
Magnulus
external usenet poster
 
Posts: n/a
Default

I think it might be application specific; I don't know (UT 2004). I
have a fresh install of Windows XP 64 Pro. I downloaded a 64-bit compatible
version of RivaTuner and set the mipmap LOD bias to 0 in both cases. It
seemed to help a little, but the effect is still there. I also installed
Serious Sam, and while that game looks much better in terms of texture
quality, you can still see some "texture aliasing" on some of the walls that
have horizontal or vertical features (relative to the texture, not the
camera). Increasing the mipmap LOD bias via RivaTuner defeats this, but it
also causes a little texture blurriness everywhere else (partially fixed by
anisotropic filtering), and it also causes the text in the UT 2004 GUI to
go blurry.
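
As a rough illustration of the LOD bias trade-off (standard mip-selection
arithmetic only, nothing specific to RivaTuner or the driver; the footprint
numbers are invented):

# Sketch of how a positive mipmap LOD bias trades aliasing for blur, using the
# standard mip-selection formula (log2 of the pixel's texel footprint plus the
# bias).  The footprint values are invented for illustration.
import math

def mip_level(texels_per_pixel, lod_bias):
    # Higher level = smaller, blurrier mip.
    return max(0.0, math.log2(texels_per_pixel) + lod_bias)

for footprint in (1.0, 2.0, 4.0, 8.0):            # texels covered per pixel
    sharp  = mip_level(footprint, lod_bias=0.0)    # default: sharpest valid mip
    biased = mip_level(footprint, lod_bias=0.5)    # what a positive bias does
    print(footprint, round(sharp, 2), round(biased, 2))

# The bias pushes every surface half a level toward the blurrier mips: the
# repeating detail that was shimmering gets pre-filtered away, but GUI text
# and everything else goes slightly soft too -- exactly the trade-off above.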

Doing more reading and research, I came across an article on the "shortcuts"
both ATI and NVidia are using to eke out every last bit of speed. For
instance, in texture blending ATI uses only 5 bits per sample in Direct3D.
This is the Direct3D default rasterizer's recommended limit, but using more
bits (6) would improve the quality of blending operations, though of
course it would be a little slower. NVidia may do something similar; after
all, in the GeForce 6XXX series of cards they imitated ATI and went with
isotropic/brilinear filtering rather than mathematically precise trilinear
filtering. Check out this website to get a good idea of what I'm talking
about: http://www.3dcenter.org/artikel/2003..._b_english.php Banding
artifacts/moire are a good description of what I'm seeing.
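
To make the blending point concrete, here is a small Python sketch (the bit
counts and the "brilinear" window width are purely illustrative, not anybody's
actual hardware spec) of how quantizing or flattening the trilinear blend
weight turns a smooth mip transition into visible bands:

# Trilinear filtering blends two adjacent mip levels with a fractional weight.
# Quantizing that weight to a few bits, or flattening it around whole levels
# ("brilinear"), turns the smooth transition into visible steps.

def quantize(frac, bits):
    steps = (1 << bits) - 1
    return round(frac * steps) / steps       # e.g. 5 bits -> 31 discrete weights

def brilinear(frac, band=0.25):
    # Blend only inside a narrow window around the mip transition and snap to
    # plain bilinear elsewhere.  Cheaper, but the snap points can show up as
    # banding/moire on fine repeating textures.
    if frac < band:
        return 0.0
    if frac > 1.0 - band:
        return 1.0
    return (frac - band) / (1.0 - 2 * band)

for i in range(11):
    frac = i / 10.0                            # position between mip N and N+1
    print(frac,
          round(quantize(frac, 5), 3),         # coarse blend weight
          round(quantize(frac, 6), 3),         # one extra bit halves the step size
          round(brilinear(frac), 3))           # flattened "brilinear" weight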

Another possibility is that this stuff is simply not visible on a regular
CRT monitor, perhaps because CRTs are too blurry to show it. An LCD monitor
has a fixed pixel grid, no inherent moire of its own, and so on. Perhaps this
stuff has been there all along and nobody has really paid attention to it.
It's definitely a subtle effect, and if you are busy fragging you probably
won't notice it.

It's interesting that ATI and NVidia are both pushing SLI/Crossfire cards
for their many image quality improvements. One of the improvements they
often cite is "texture quality", i.e. reducing crawling textures. Well,
it would make more sense to me, rather than using a supersampling
anti-aliasing mode and two video cards, to just "get it right" and nip the
problem in the bud at the texture mapping and filtering stages rather than
when the scene is being rendered.
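
Here's a toy sketch of why supersampling tames the crawl (it's not an SSAA
implementation, and the pattern/step numbers are made up): averaging several
sub-pixel samples removes the high-frequency content that a proper texture
filter would have removed up front.

# Toy comparison, not an SSAA implementation: one sample per pixel on a fine
# stripe pattern versus the average of four jittered sub-pixel samples.  The
# average damps the high-frequency crawl; a correct texture filter would have
# removed that frequency before rasterization.
import math

def pattern(u):
    return math.sin(2 * math.pi * u / 3.0)     # fine 3-texel stripe (made up)

offsets = (0.125, 0.375, 0.625, 0.875)         # sub-pixel sample positions

for px in range(8):
    single = pattern(px * 3.4)                                   # 1 sample/pixel
    multi = sum(pattern((px + o) * 3.4) for o in offsets) / 4.0  # 4 samples averaged
    print(px, round(single, 2), round(multi, 2))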



  #5  
Old July 26th 05, 12:19 AM
Doug
external usenet poster
 
Posts: n/a
Default

Magnulus, what is "inherent moire"? It would seem to me the moire effect
would be visible on either an LCD or a CRT?

--
there is no .sig
"Magnulus" wrote in message
. ..
Another possibility is that this stuff is simply not visible on a regular
CRT monitor, perhaps because CRTs are too blurry to show it. An LCD monitor
has a fixed pixel grid, no inherent moire of its own, and so on. Perhaps this
stuff has been there all along and nobody has really paid attention to it.
It's definitely a subtle effect, and if you are busy fragging you probably
won't notice it.

It's interesting that ATI and NVidia are both pushing SLI/Crossfire cards
for their many image quality improvements. One of the improvements they
often cite is "texture quality", i.e. reducing crawling textures.
Well, it would make more sense to me, rather than using a supersampling
anti-aliasing mode and two video cards, to just "get it right" and nip the
problem in the bud at the texture mapping and filtering stages rather than
when the scene is being rendered.





  #6  
Old July 26th 05, 01:00 AM
Phil Weldon
external usenet poster
 
Posts: n/a
Default

| It would seem to me the moire effect would be
| visible on either an LCD or a CRT?
_____

Or a function of how a color NTSC signal is decoded into R, G, B, because of
interference between the color subcarrier and the luminance information. But
surely the original poster is not viewing NTSC (composite) output on a color
television!
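
For what it's worth, a tiny sketch of the beat effect described above (the
subcarrier figure is the standard NTSC number; the detail frequency and time
steps are just illustrative):

# The combined amplitude of two nearby frequencies swells and fades at their
# difference frequency.
import math

subcarrier = 3.579545e6        # NTSC color subcarrier, Hz
detail     = 3.60e6            # fine luminance detail close to it, Hz (made up)
beat       = abs(detail - subcarrier)   # ~20 kHz difference frequency

for t_us in range(0, 60, 10):
    t = t_us * 1e-6
    envelope = abs(2 * math.cos(math.pi * beat * t))   # amplitude of the sum
    print(t_us, round(envelope, 2))

# The envelope changes over tens of microseconds -- vastly slower than the
# 0.28 microsecond subcarrier cycle -- and the decoder renders that slow
# variation as visible crawling/patterning on screen.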

Phil Weldon

"Doug" wrote in message
. ..
Magnulus, what is "inherent moire"? It would seem to me the moire effect
would be visible on either an LCD or a CRT?

--
there is no .sig
"Magnulus" wrote in message
. ..
Another possibility is that this stuff is simply not visible on a
regular CRT monitor, perhaps because CRTs are too blurry to show it. An LCD
monitor has a fixed pixel grid, no inherent moire of its own, and so on.
Perhaps this stuff has been there all along and nobody has really paid
attention to it. It's definitely a subtle effect, and if you are busy
fragging you probably won't notice it.

It's interesting that ATI and NVidia are both pushing SLI/Crossfire
cards for their many image quality improvements. One of the improvements
they often cite is "texture quality", i.e. reducing crawling textures.
Well, it would make more sense to me, rather than using a supersampling
anti-aliasing mode and two video cards, to just "get it right" and nip the
problem in the bud at the texture mapping and filtering stages rather than
when the scene is being rendered.







  #7  
Old July 26th 05, 04:42 AM
Magnulus
external usenet poster
 
Posts: n/a
Default

You can get moire with CRTs, especially cheaper ones or ones that
are poorly adjusted or out of focus. With an LCD you get no moire,
especially with a digital signal.


  #8  
Old July 25th 05, 09:40 AM
de Moni
external usenet poster
 
Posts: n/a
Default

deimos wrote:
Your registry settings might be whacked. Make certain you're using the
newest drivers (77.76), then with coolbits on, click Restore under


AFAIK the newest official drivers are still the 77.72s. At least _I_ wouldn't
install any BETA drivers...
  #9  
Old July 25th 05, 05:29 PM
deimos
external usenet poster
 
Posts: n/a
Default

de Moni wrote:
deimos wrote:

Your registry settings might be whacked. Make certain you're using
the newest drivers (77.76), then with coolbits on, click Restore under



AFAIK the newest official drivers are still the 77.72s. At least _I_ wouldn't
install any BETA drivers...


To be perfectly honest, the 77.72 officials were a disaster. I've been
using every driver version since before the first Detonators (2.04) and
these were the most bugged in recent memory.

The 77.76 betas are mainly fixes (including a memory leak from .72!), and
usually the worst that comes from an nZone beta driver is WHQL
non-compliance.
  #10  
Old July 25th 05, 10:51 PM
de Moni
external usenet poster
 
Posts: n/a
Default

deimos wrote:
To be perfectly honest, the 77.72 officials were a disaster. I've been
using every driver version since before the first Detonators (2.04) and
these were the most bugged in recent memory.


Funny, because I haven't had even a single issue with the 77.72s... 6600GT.
 









