HardwareBanter: a computer components & hardware forum



any point to setting LCD to 32bit color?



 
 
#1
October 28th 03, 05:48 PM
RL

See subject. TIA


#2
October 28th 03, 08:00 PM
Lenny


See subject. TIA


Yes, for many reasons, though they require quite some effort to explain why.

Just do it, it's better that way.


#3
October 29th 03, 04:29 AM
RL

Humor me and explain the many reasons. Just write slowly so I can follow.


"Lenny" wrote in message
...

See subject. TIA


Yes, for many reasons, though they require quite some effort to explain why.

Just do it, it's better that way.





#4
October 29th 03, 05:31 AM
phobos

RL wrote:
Humor me and explain the many reasons. Just write slowly so I can follow.

For 2D, the extra precision lets your card display full-color images without internal dithering or banding artifacts. Also, 16-bit color is 5-6-5 RGB; the extra bit goes to green, since the human eye is most sensitive to that wavelength. 24-bit color (32-bit is just 24-bit plus an 8-bit alpha channel) is 8-8-8-8 RGBA: 8 bits per channel, plus alpha for blending.
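The 5-6-5 vs. 8-bit-per-channel difference is easy to sketch in a few lines of Python (helper names here are illustrative, not from any real graphics API):

```python
def pack565(r, g, b):
    """Pack 8-bit R, G, B into a 16-bit 5-6-5 word; green keeps 6 bits."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def unpack565(word):
    """Expand a 5-6-5 word back to approximate 8-bit channels."""
    r = (word >> 11) & 0x1F
    g = (word >> 5) & 0x3F
    b = word & 0x1F
    # Replicate the high bits into the low bits so full scale maps to 255.
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

# A smooth 256-step grey ramp survives with only 32 distinct red/blue
# levels (and 64 green) -- that's the banding you see in 16-bit mode.
levels = {unpack565(pack565(v, v, v))[0] for v in range(256)}
print(len(levels))  # 32
```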

Even though an LCD may have a low contrast ratio or be able to display only so many thousands of colors (response time, etc.), all the color conversion should be done by the LCD itself. Its colorspace conversion will be much better, and images will look practically the same as they would on a good CRT fed a 24-bit signal.

For 3D games the extra precision is even more necessary. In 32-bit mode the Z-buffer gets extra bits to calculate precisely where any given vertex sits in space relative to your viewpoint. And 16-bit dithering looks horrible on all cards, even the best ones.

16 bits is limiting overall, so remember the old adage "Garbage In - Garbage Out" and start with the best source possible.

#5
October 29th 03, 08:20 AM
Sith Lord

On Tue, 28 Oct 2003 23:31:58 -0600 in
alt.comp.periphs.videocards.nvidia, phobos
wrote:

RL wrote:
Humor me and explain the many reasons. Just write slowly so I can follow.


[brilliant explanation snipped for brevity]

Gee, I remember upgrading my VGA card to a Tseng Labs ET4000 and
seeing a picture of Sharon Stone in 256 colours for the first time, I
thought it was great. :-)

#6
October 29th 03, 11:11 AM
Darthy

On Wed, 29 Oct 2003 08:20:00 +0000, Sith Lord bouncer@localhost wrote:

Gee, I remember upgrading my VGA card to a Tseng Labs ET4000 and
seeing a picture of Sharon Stone in 256 colours for the first time, I
thought it was great. :-)


yeah yeah... BIG DEAL... we were looking at 4096 color photos on our
AMIGAs back in the 1980s!! 256colors... blah!


--
Remember when real men used Real computers!?
When 512K of video RAM was a lot!

Death to Palladium & WPA!!
#7
October 29th 03, 11:12 AM
Darthy

On Tue, 28 Oct 2003 17:48:05 GMT, "RL"
wrote:

See subject. TIA



Looks better... and er... why not?

It can actually be SLOWER to run in 16-bit mode than 24/32-bit mode.


--
Remember when real men used Real computers!?
When 512K of video RAM was a lot!

Death to Palladium & WPA!!
#8
October 29th 03, 05:06 PM
Derek Wildstar


"phobos" wrote in message
...

16 bits is limiting overall, so remember the old adage "Garbage In - Garbage Out" and start with the best source possible.



It's about time someone with a clue has something to say.

Thanks for the detail.


#9
October 29th 03, 06:49 PM
Sith Lord

On Wed, 29 Oct 2003 11:11:53 GMT in
alt.comp.periphs.videocards.nvidia, Darthy
wrote:

yeah yeah... BIG DEAL... we were looking at 4096 color photos on our
AMIGAs back in the 1980s!! 256colors... blah!


I was looking at a lot of green back then, either on an Amstrad CPC664
or a Vax 11/750 (although some terminals were orange).

#10
October 30th 03, 04:10 AM
i'm_tired

phobos wrote:
[detailed explanation snipped for brevity]


Just some small additions to your well-described information, as it applies to 3D:

16 bits was indeed limiting. The Z-buffer is 16 bits while in 16-bit color mode, but it is 24 bits in 32-bit color mode. The stencil buffer then supplies the remaining 8 bits to fill out the 32. The stencil buffer is usually only used as a marker for re-usable pixels, or for something like a reflection or a perforated surface.

32-bit color rendering actually works out to 64 bits per pixel: 8 bits red, 8 bits green, 8 bits blue, 8 bits alpha, a 24-bit Z-buffer, and the remaining 8 bits can be programmed as the stencil buffer.

So, when comparing 32-bit and 16-bit, one might notice quickly that the difference in Z-bits is very important. The more Z-bits there are, the more accurately a pixel's depth can be represented. Z-depth tells the renderer whether any particular pixel blends with a transparent pixel, or sits in front of the transparent pixel and blocks it out.
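The depth-precision point can be sketched numerically. This is a simplified linear depth buffer (real hardware uses a non-linear mapping, so treat the numbers as illustration only):

```python
def quantize_depth(z, bits):
    """Map a depth value in [0, 1) to an integer depth-buffer cell."""
    return int(z * (1 << bits))

# Two surfaces that are almost co-planar:
a, b = 0.5, 0.5000001
print(quantize_depth(a, 16) == quantize_depth(b, 16))  # True: 16 bits can't tell them apart (z-fighting)
print(quantize_depth(a, 24) == quantize_depth(b, 24))  # False: 24 bits still separates them
```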

And just to explain what giving the extra bit to green in 16-bit actually means: there are 32 different shades of red and blue, but 64 shades of green. Even though the eye is more sensitive to green, this does leave the spectrum lopsided. So game programmers would tend to assign that extra bit to alpha instead (5-5-5-1), but that still has real limitations, because each pixel can then *only* be fully opaque or fully transparent and nothing else.
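The one-bit-alpha trade-off can be sketched the same way (again, the helper is illustrative, not a real API):

```python
def pack5551(r, g, b, a):
    """Pack 8-bit channels into 5-5-5-1; alpha collapses to on/off."""
    a1 = 1 if a >= 128 else 0
    return ((r >> 3) << 11) | ((g >> 3) << 6) | ((b >> 3) << 1) | a1

# Every intermediate 8-bit alpha value collapses to fully transparent (0)
# or fully opaque (1) -- no partial translucency is representable.
alphas = sorted({pack5551(10, 20, 30, a) & 1 for a in range(256)})
print(alphas)  # [0, 1]
```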

Lastly, even though it wasn't part of the topic, I'd like to mention 24-bit. 24-bit was the standard for a while, back when we had cards like the Diamond Stealth and others of that generation. RAM holds data in 64-bit "chunks", so 24-bit pixels waste memory bandwidth; the format is simply inefficient. Both 32-bit and 24-bit use the same 8 bits red, 8 bits green, and 8 bits blue. The remaining 8 bits in 32-bit are there for memory storage efficiency, but as it turns out they are programmed as alpha, and so now it is easy to create effects like smoke or fog (clouds, mist, whatever).
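Whatever the exact chunk size on any given card, the alignment cost of 3-byte pixels is easy to demonstrate. This sketch assumes 4-byte memory words for simplicity:

```python
def words_touched(pixel_index, bytes_per_pixel, word_bytes=4):
    """How many aligned memory words one pixel's bytes span."""
    start = pixel_index * bytes_per_pixel
    end = start + bytes_per_pixel - 1
    return end // word_bytes - start // word_bytes + 1

# With 3-byte (24-bit) pixels, half of all pixels straddle a word
# boundary and need two accesses; 4-byte (32-bit) pixels never do.
split_24 = sum(words_touched(i, 3) > 1 for i in range(1024))
split_32 = sum(words_touched(i, 4) > 1 for i in range(1024))
print(split_24, split_32)  # 512 0
```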


 






Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 HardwareBanter.
The comments are property of their posters.