A computer components & hardware forum. HardwareBanter



need quality



 
 
  #1  
Old August 22nd 04, 10:51 PM
Mega Man
external usenet poster
 
Posts: n/a
Default need quality

how does nvidia stack up against ati as far as texture quality?


  #2  
Old August 23rd 04, 01:18 PM
GTX_SlotCar
external usenet poster
 
Posts: n/a
Default

"Mega Man" wrote in message
news:1A8Wc.3015$VY.2459@trndny09...
how does nvidia stack up against ati as far as texture quality?



Both good. I think it depends on which drivers you're using with each one.

Gary



  #3  
Old August 23rd 04, 06:06 PM
deimos
external usenet poster
 
Posts: n/a
Default

Mega Man wrote:

how does nvidia stack up against ati as far as texture quality?



Excellent, if not noticeably better. ATI's recent hardware has skimped on
internal texture filtering precision as a way of reducing total chip
workload. It helps overall chip efficiency, but the side effect is
increased aliasing.

Most image quality comparisons incorrectly interpret aliasing and
lightly filtered texels as "sharpness" in screenshots. That, however, is
not how the image should actually be rendered.

Internally, the R3xx and above use 5-bit precision for mipmap blending at
LODs greater than 1. That allows far fewer gradients and lower-quality
filtering overall as the angle of incidence to a plane decreases. NVIDIA
uses the de facto standard of 8-bit precision (established by SGI's OpenGL
reference implementation).
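A rough sketch of what that precision difference means for the blend between two mip levels (the 5-bit and 8-bit widths are the figures claimed above; the quantization scheme here is my own illustration, not vendor documentation):

```python
# Illustration: quantizing the trilinear blend fraction between two mip
# levels. With n bits of precision there are only 2**n distinct blend
# weights, so the transition between mip levels happens in coarser steps,
# which can show up as visible banding/aliasing at mip boundaries.

def quantize_lod_fraction(frac: float, bits: int) -> float:
    """Snap a [0, 1) mip blend fraction onto an n-bit grid."""
    steps = 1 << bits
    return int(frac * steps) / steps

# 5-bit filtering gives only 32 distinct blend weights; 8-bit gives 256.
print(len({quantize_lod_fraction(i / 1000, 5) for i in range(1000)}))  # 32
print(len({quantize_lod_fraction(i / 1000, 8) for i in range(1000)}))  # 256
```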

ATI also uses an adaptive trilinear technique that cannot be disabled
by the user or the developer. It can yield performance gains of up to 30%
when anisotropic filtering is enabled. NVidia has some similar techniques
(using a different method), but with one big difference: there's a
setting that lets you disable either the trilinear or the anisotropic
optimizations.

Differencing algorithms don't show a great difference between older
R250 in-game screenshots (which have no adaptive trilinear) and the R3xx,
but overall it contributes to much greater texture aliasing, especially
during movement.

Currently I'm using an FX5600 (the series that introduced "brilinear"
filtering as an aniso optimization), but with the newest drivers I can
disable both the trilinear and AF optimizations and it looks absolutely
beautiful.

Brilinear has evolved in the GF6 series, however, and you shouldn't think
twice about using the optimization on those cards, as it looks MUCH
better and doesn't detract from image quality when moving.
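For readers unfamiliar with the term: "brilinear" sits between bilinear and trilinear. A hypothetical sketch of the idea (the 0.25 band half-width is an invented illustration value, not an actual driver constant):

```python
# Hypothetical sketch of "brilinear" filtering: rather than blending
# between mip levels across the whole [0, 1) fraction range (full
# trilinear), blend only inside a narrow band around the mip transition
# and fall back to plain bilinear (one mip level) everywhere else.
# Most pixels then need texels from a single mip level, saving fetches.

def brilinear_weight(frac: float, band: float = 0.25) -> float:
    """Map a mip blend fraction to a brilinear blend weight.

    Returns 0.0 (lower mip only) or 1.0 (upper mip only) outside the
    transition band, and a linear ramp inside it.
    """
    lo, hi = 0.5 - band, 0.5 + band
    if frac <= lo:
        return 0.0
    if frac >= hi:
        return 1.0
    return (frac - lo) / (hi - lo)

print(brilinear_weight(0.1))  # 0.0 -> pure bilinear from the lower mip
print(brilinear_weight(0.9))  # 1.0 -> pure bilinear from the upper mip
print(brilinear_weight(0.5))  # 0.5 -> blending, as full trilinear would
```

The narrower the band, the cheaper the filtering, and the more visible the mip transition line becomes in motion.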
  #4  
Old August 23rd 04, 08:25 PM
GTX_SlotCar
external usenet poster
 
Posts: n/a
Default

......This allows for many fewer gradients and overall lower quality
filtering as the angle of incidence to a plane decreases.

But doesn't aliasing become less apparent as the angle decreases anyway?
That's what I've read and observed. So why not give up some filtering
quality that's not needed to gain extra performance?
Right now I have 2 ATI cards and 2 nVidia cards (Radeon 8500, Ti4400, Radeon
9800Pro and eVga 6800GT.) I figure I'll become brand loyal when one of those
companies becomes Me loyal.
One of the things I liked about ATI cards was that they seemed to have more
gradients (smoother transitions between colors) than the nVidia cards. Not
just in games, but in 2D also, in pictures and the desktop. Low quality
pictures, especially, looked better on the ATI.
Regardless, that's changed with the 62.xx series drivers, especially with
the 6800 series cards. The Ti4400 with 65.62 drivers is very nice, and the
6800gt is even better. I can honestly say that I think the texture quality
between ATI and nVidia is even now. Until ATI does something about the
poor performance of their OpenGL drivers, nVidia is definitely the way to go
for any newer games using that API.
Gary

--
Tweaks & Reviews
www.slottweak.com





  #5  
Old August 24th 04, 01:25 PM
deimos
external usenet poster
 
Posts: n/a
Default

GTX_SlotCar wrote:

But doesn't aliasing become less apparent as the angle decreases anyway?
That's what I've read and observed. So why not give up some filtering
quality that's not needed to gain extra performance?
[snip]


Perhaps I worded it badly, but by decreasing angle I'm referring to a
"sun on the horizon" situation. At a certain point all the mipmaps
become too small anyhow. But generally, in games with a 90-degree FOV,
we're observing at about a 30-degree angle to the floor plane, and that's
where I'm coming from.
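To make the angle argument concrete, here's my own back-of-the-envelope illustration (not from the post): at a shallow angle to a surface, a texel's on-screen footprint stretches roughly with 1/sin(angle), so the sampled mip LOD rises as the angle decreases:

```python
import math

# Illustration: the projected footprint of a texel stretches as the
# viewing angle to the surface plane shrinks, so the mip LOD (log2 of
# the footprint in texels per pixel) increases at grazing angles.

def approx_lod(texels_per_pixel_head_on: float, angle_deg: float) -> float:
    """Approximate mip LOD as log2 of the stretched texel footprint."""
    stretch = 1.0 / math.sin(math.radians(angle_deg))
    return math.log2(texels_per_pixel_head_on * stretch)

print(approx_lod(1.0, 90.0))  # ~0.0: head-on, base mip level
print(approx_lod(1.0, 30.0))  # ~1.0: one mip level coarser
```

So at the ~30-degree floor angle described above, filtering is already operating a full mip level down from the head-on case, which is exactly where blend-precision shortcuts become visible.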

The absolute ideal is a raytraced scene taking many point samples for
each pixel and oversampling, but trilinear filtering is used as an
approximation of that concept (the contribution of neighboring texels as
distance increases and apparent resolution decreases, like the ability to
discern details at X number of arc seconds).
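The approximation described above can be sketched in a few lines; this uses 1-D "mip levels" (plain lists of texel values) as a stand-in for real 2-D bilinear fetches, purely for brevity:

```python
# Sketch: trilinear filtering approximates oversampling by blending
# linearly-filtered samples from the two nearest mip levels.

def lerp(a: float, b: float, t: float) -> float:
    return a + (b - a) * t

def bilinear_1d(mip: list, u: float) -> float:
    """Filter between the two nearest texels (1-D stand-in for a
    bilinear fetch); u is a normalized [0, 1] coordinate."""
    x = u * (len(mip) - 1)
    i = min(int(x), len(mip) - 2)
    return lerp(mip[i], mip[i + 1], x - i)

def trilinear(mips: list, u: float, lod: float) -> float:
    """Blend the bilinear results of the two nearest mip levels by the
    fractional part of the LOD."""
    level = min(int(lod), len(mips) - 2)
    frac = lod - level
    return lerp(bilinear_1d(mips[level], u),
                bilinear_1d(mips[level + 1], u), frac)

mips = [[0.0, 1.0, 2.0, 3.0],  # base level
        [0.5, 2.5]]            # next level: averaged pairs
print(trilinear(mips, 0.5, 0.0))  # 1.5: pure base-level sample
print(trilinear(mips, 0.5, 1.0))  # 1.5: pure next-level sample
```

The adaptive/brilinear tricks discussed earlier all amount to skipping or narrowing the final `lerp` between the two levels.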

3dcenter.org has an excellent article detailing ATI's filtering methods.

You won't have to worry about 2D and overall image quality anymore with
NVidia-based cards. NVidia as a company has never actually built cards;
they've always sold chip designs. The chips are made by a foundry (TSMC,
IBM) and sold to card makers, who then take the reference design and
build their cards on it. As such, NVidia historically couldn't maintain
tight quality control over the final retail product.

That has changed, however. There are now set standards for which
components (2D RF filters, RAMDACs, etc.) and specifications a given
third-party manufacturer can choose. This has led to a huge increase in
overall consistent quality. The system was put in place around the time
of the GeForce FX and has been used ever since.
 



