Brilinear filtering.....What!!



 
 
  #11  
Old November 2nd 03, 03:20 PM
Darkfalz

"somnambulist" wrote in message
...
Nic wrote:
I can tell the difference though. Which is why I've got a Radeon
9700 Pro


Me neither - none at all.


Which is why IYHO, YMMV and all that crap.


The only really horrible quality of NVIDIA is their 16 bit post filtering.
Coming from a Voodoo5, where the difference between 16 bit and 32 bit was
hardly noticeable (except in performance), I couldn't believe how ugly and
obvious the 16 bit dithering matrix was on my new GeForce FX. It renders 16
bit unusable on NVIDIA cards (I know, I know, we should be using 32 bit
anyway).
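
For anyone wondering what a "dithering matrix" actually does here, below is a
minimal, generic sketch of 4x4 ordered (Bayer) dithering that quantizes an
8-bit channel down to the 5 bits of a 16-bit RGB565 framebuffer. It's a
textbook illustration only; the matrix, the 5-bit target and the function name
are my assumptions, not how any particular card's hardware is wired.

#include <stdint.h>

/* Generic 4x4 Bayer ordered dither: quantize an 8-bit colour channel down
 * to 5 bits (as in the red/blue channels of a 16-bit RGB565 framebuffer).
 * A textbook sketch, not any particular card's dithering hardware. */

static const uint8_t bayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

uint8_t dither_to_5bit(uint8_t value, int x, int y)
{
    int step = 256 / 32;                            /* 8-bit range per 5-bit level */
    int bias = (bayer4[y & 3][x & 3] * step) / 16;  /* position-dependent 0..7 */
    int biased = value + bias;
    if (biased > 255)
        biased = 255;
    return (uint8_t)(biased >> 3);                  /* truncate to 5 bits (0..31) */
}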


  #12  
Old November 2nd 03, 05:39 PM
Lenny


The only really horrible quality of NVIDIA is their 16 bit post filtering.


....Which isn't so strange, because it doesn't HAVE any!

Coming from a Voodoo5, where the difference between 16 bit and 32 bit was
hardly noticeable (except in performance)


Well, it would be considerably more noticeable if you ran some modern
software on the thing that really uses lots of texture layers. Older
software was understandably more conservative in the way of such things,
since fillrate didn't exactly grow on trees back then.

I couldn't believe how ugly and
obvious the 16 bit dithering matrix was on my new Geforce FX.


Why are you running games in 16-bit anyway? It's not any faster, or at
least not more than marginally so.


  #13  
Old November 2nd 03, 06:29 PM
Derek Wildstar


"Jack" wrote in message
...
Hi there



BTW. written when drunk. So commend me.



The original article is an exercise in eye strain and brain strain. Damn
Teutonic translators. Only a German could be so discourteous to his verbs.

Oh, and btw, post-filter anisotropic filtering kicks ass... this 20th century
bi-linear and tri-linear crap is for Luddites.



  #14  
Old November 2nd 03, 06:46 PM
somnambulist

Lenny wrote:
I can tell the difference though. Which is why I've got a Radeon
9700 Pro


Me neither - none at all.


You probably don't know what to look for. The diff between bilinear
and trilinear is as night and day once it's been pointed out to you.


Just a little bit!!

--
somnambulist


  #15  
Old November 3rd 03, 12:04 AM
phobos

Jack wrote:
Hi there

I was amazed reading this.
http://www.3dcenter.org/artikel/2003..._a_english.php
Nvidia has planned bad picture quality all along.
It's just numbers with them, apparently.
Well, excuse me then for giving them a number for quality performance: a
zero.
And another one for cheating... a very high one.

I think new video card buyers, gamers especially, must look elsewhere (to
ATI) for their cards.

Way to go Nvidia.....make it even worse and see if you can get away with it.
They could try.

BTW. written when drunk. So commend me.

BYE

Jack



I read that, and I think a lot of these sites are WAY overemphasizing
filtering stages and confusing bilinear/trilinear filtering. They're
acting as if NVidia is just throwing away trilinear filtering.

What's happening (especially when these sites post shots showing colored
mip bands) is that you often don't need to filter every single texel. It's
an error level. For example, if a pixel is within, say, 0.03 of its nearest
filtered neighbor, you can safely skip filtering it without affecting the
end image quality. Or you can calculate every single sample and come out
with the exact same visual result.
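
As a rough sketch of the kind of error-level test being described (the 0.03
threshold comes from the post; the function, and the idea of comparing the two
mip samples before blending, are my illustration, not NVIDIA's actual driver
logic):

#include <math.h>

/* Sketch of an error-threshold shortcut: if the bilinear samples from the
 * two mip levels a trilinear blend would combine already agree to within
 * epsilon, skip the blend and use the finer level alone.  Names and the
 * structure are illustrative only. */

#define EPSILON 0.03f   /* hypothetical error level, as in the post */

float trilinear_or_skip(float fine, float coarse, float lod_frac)
{
    if (fabsf(fine - coarse) < EPSILON)
        return fine;                            /* levels agree: blending is wasted work */
    return fine + (coarse - fine) * lod_frac;   /* full trilinear blend */
}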

This kind of optimization is a good blend of mathematical reduction
along with some very unnoticeable visual degradation under extreme
circumstances (such as reviewers nitpicking mip banding shots). In real
game play it makes no difference. This is the kind of balanced decision
every video card manufacturer should NOT be afraid to make, but is, due
to the weasel word "cheating" being tossed around wherever video drivers
are concerned.

Another practical consideration behind this change in filtering is that NV
analyzed the vast majority of situations in which you can safely relax the
error bounds between mipmap levels without affecting image quality, while
significantly reducing the load of anisotropic filtering.

Anyone who has done a bit of raytracing with programs such as POV-Ray will
know this concept of an error level. You don't need to trace every single
pixel; if its color is within some tolerance (say 0.1) of its neighbor's,
you can skip it and speed up rendering. The same applies to fillrate and
video cards.
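
A sketch of that adaptive-sampling idea (the 0.1 tolerance follows the post;
shade() and supersample() are stand-ins I've invented for illustration):

#include <math.h>

#define AA_THRESHOLD 0.1f   /* tolerance from the post */

/* stand-in for tracing one primary ray at (x, y) */
static float shade(float x, float y) { return 0.5f * (sinf(x) + cosf(y)); }

/* stand-in for the expensive path: average a 2x2 grid of sub-samples */
static float supersample(float x, float y)
{
    return 0.25f * (shade(x - 0.25f, y - 0.25f) + shade(x + 0.25f, y - 0.25f) +
                    shade(x - 0.25f, y + 0.25f) + shade(x + 0.25f, y + 0.25f));
}

/* only pay for the extra samples where the coarse result differs visibly
 * from an already-computed neighbour */
float adaptive_pixel(float x, float y, float left_neighbour)
{
    float coarse = shade(x, y);
    if (fabsf(coarse - left_neighbour) <= AA_THRESHOLD)
        return coarse;              /* smooth region: one sample is enough */
    return supersample(x, y);       /* contrast detected: supersample */
}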


  #16  
Old November 3rd 03, 12:08 AM
phobos

Lenny wrote:

The only really horrible quality of NVIDIA is their 16 bit post filtering.



...Which isn't so strange, because it doesn't HAVE any!


Coming from a Voodoo5, where the difference between 16 bit and 32 bit was
hardly noticeable (except in performance)



Well, it would be considerably more noticeable if you ran some modern
software on the thing that really uses lots of texture layers. Older
software was understandably more conservative in the way of such things,
since fillrate didn't exactly grow on trees back then.


I couldn't believe how ugly and
obvious the 16 bit dithering matrix was on my new Geforce FX.



Why are you running games in 16-bit anyway? It's not any faster, or at
least not more than marginally so.



FYI for whoever cares -- the NVidia/3DFX acquisition was strictly over
IP, patents, and algorithms. None of the previous hardware features from
the Voodoo series were "ported over" to the GeForce FX.

No 22-bit post filter, no TMU design, nothing. The connection lies
solely in the underlying chip design, some culling methods (likely), and
possibly some MSAA algorithms integrated into new FSAA modes.


  #17  
Old November 3rd 03, 02:26 AM
Derek Wildstar


"phobos" wrote in message
...


What's happening (especially when these sites post shots showing colored
mip bands) is that you often don't need to filter every single texel. It's
an error level. For example, if a pixel is within, say, 0.03 of its nearest
filtered neighbor, you can safely skip filtering it without affecting the
end image quality. Or you can calculate every single sample and come out
with the exact same visual result.

This kind of optimization is a good blend of mathematical reduction
along with some very unnoticeable visual degradation under extreme
circumstances (such as reviewers nitpicking mip banding shots). In real
game play it makes no difference.


Regrettably, this isn't true in all applications. One of the more
immersion-destroying artifacts, and a direct result of this driver behavior,
is the pervasive texture shimmering (texture aliasing) in FS9. What happens
is exactly what the article's author describes as the 'bow wave' of texel
error (the one you describe): as the angle of incidence to the texel
increases, the chance increases that the texel will change color, then
revert, and go through this rapid oscillation until it is firmly within a
'band'.

All this occurs regardless of frame rate, fill rate, or bandwidth
availability. It's a bad mathematical function that is *not* representative
of the world we see. In most cases, lighting and other texture processing
can make the texel shifting moot, but in FS9 it can be a real deal breaker.
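
To make the 'band' picture concrete, here is a toy comparison of a full
trilinear blend weight against a reduced-band ("brilinear"-style) weight. The
0.15 half-band is invented for illustration; the point is only that a small
LOD change near the band edge produces a large change in the blend, which is
what shows up as shimmering when the LOD oscillates from frame to frame.

/* Toy comparison: full trilinear blends gently across the whole
 * fractional-LOD range, while a reduced band stays on one mip level for
 * most of the range and ramps steeply near the transition.  The band
 * width is invented for illustration. */

#define HALF_BAND 0.15f

float full_trilinear_weight(float lod_frac)          /* lod_frac in 0..1 */
{
    return lod_frac;                                 /* gentle blend everywhere */
}

float reduced_band_weight(float lod_frac)
{
    if (lod_frac < 0.5f - HALF_BAND) return 0.0f;    /* finer mip only */
    if (lod_frac > 0.5f + HALF_BAND) return 1.0f;    /* coarser mip only */
    /* steep ramp across the narrow band: a small LOD wobble here means a
     * big change in the blended colour, i.e. visible shimmer */
    return (lod_frac - (0.5f - HALF_BAND)) / (2.0f * HALF_BAND);
}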

There is a solution, and it's anisotropic filtering, but for most people I
suspect the hardware can't deal with all that sampling and maintain their
current levels of performance. Bad filtering = forced upgrade! That's the
co-conspirator in me talking.





  #18  
Old November 3rd 03, 10:39 AM
Falkentyne

On Sun, 2 Nov 2003 14:33:00 +0100, "Flow"
enlightened us by scribbling this gem of wisdom:

I always found Nvidia cards to have bad image quality.
Nevertheless I bought some of them, and they are fast.
And I will keep on buying them if they hold up to what they have been doing
already.


Now where exactly have I heard these words before......?




Gibs When you kill 6 people in Unreal Tournament
it is "MonsterKill", In Quake3 it is "Excellent",
in Counter-Strike it is "Kicked by console"
  #19  
Old November 4th 03, 04:32 AM
Falkentyne

On Mon, 03 Nov 2003 02:26:44 GMT, "Derek Wildstar"
enlightened us by scribbling this gem of wisdom:


Regrettably, this isn't true in all applications. One of the more
immersion-destroying artifacts, and a direct result of this driver behavior,
is the pervasive texture shimmering (texture aliasing) in FS9. What happens
is exactly what the article's author describes as the 'bow wave' of texel
error (the one you describe): as the angle of incidence to the texel
increases, the chance increases that the texel will change color, then
revert, and go through this rapid oscillation until it is firmly within a
'band'.

All this occurs regardless of frame rate, fill rate, or bandwidth
availability. It's a bad mathematical function that is *not* representative
of the world we see. In most cases, lighting and other texture processing
can make the texel shifting moot, but in FS9 it can be a real deal breaker.

There is a solution, and it's anisotropic filtering, but for most people I
suspect the hardware can't deal with all that sampling and maintain their
current levels of performance. Bad filtering = forced upgrade! That's the
co-conspirator in me talking.


You have a valid point there.
Indeed, anisotropic filtering (on the Ti series cards) is simply too
hard on the hardware in almost any game made from 2003 to the present.
It's great in older games, but even then, turn on 4x FSAA and 8x AF and
you will have trouble even in games going back as far as 1999,
depending on the resolution, of course.

Sure, the FX cards can do better, but in the newest games even this
isn't going to be possible (try playing at 1024x768 with 6x FSAA and 8x (or
16x?) AF in Half-Life 2 or Doom 3... good luck!). Though admittedly it
was the same when the GF4 hit the streets: 4x FSAA at 1024x768 and even
higher was great in every game on the market, until UT2003 came along.

Anyway, the point is, you should at the LEAST have an option to turn
on full image quality if you _WANT_ to, and not have lesser quality
forced upon you with the excuse that "the user doesn't need it / can't
tell the difference".

I hate to say it, but this is "3dfx" all over again.

Now yes, 3dfx's 16-bit filtering was the *BEST* of any video card
(even better than ATI's, which is better than any Nvidia card's -- which
have NONE), but you remember their line: "32-bit is too hard on the
hardware, and you can't easily tell the difference anyway -- gamers
don't need it right now, they need speed. Framerate is king"....

Hmm...

Of course, back then games were very forgiving as to the number of texture
layers, and that, combined with the TNT/GF's lack of 16-bit filtering,
made Unreal on a Voodoo2 at 16-bit, for example, look just as good as
on a TNT at 32-bit (if it even RAN...). But developers had their hands
tied in making 16-bit look good. Remember Carmack's article about
the precision errors of 32-bit rendering as compared to 128-bit?

Anyway, why don't the quality optimizations turn on full trilinear (and
allow it in RivaTuner too), like the older drivers did?

And the OTHER people saying "if they wanted image quality, they
would have gone with ATI" -- that's just TOTAL teenager bullsh*t.

Those SAME stupid kids were BASHING 3dfx for their 16-bit, for not having
the superior 32-bit quality (pre-V5), and now they're saying the highest
IQ isn't important, now that they don't have it and ATI does. Can
you say... FANBOY?

Some people just aren't very clever....


Gibs When you kill 6 people in Unreal Tournament
it is "MonsterKill", In Quake3 it is "Excellent",
in Counter-Strike it is "Kicked by console"
  #20  
Old November 4th 03, 01:42 PM
Granulated

On Sun, 02 Nov 2003 15:05:31 GMT "Lenny" meeped :


I can tell the difference though. Which is why I've got a Radeon 9700

Pro

Me neither - none at all.


You probably don't know what to look for. The diff between bilinear and
trilinear is as night and day once it's been pointed out to you.



especially with regard to MIP banding "effects"
 



