A computer components & hardware forum. HardwareBanter

16 vs 32 bit color on LCD Monitors



 
 
  #1
July 20th 05, 07:27 PM
Pat

Hey everyone, I've got a question.

Since LCD monitors are only capable of 16 bit color, is there any
advantage to running your video card in 32 bit mode?

I know there used to be a considerable difference in 16 vs 32 bit
performance.

So bottom line question is, I'm using a Geforce FX 5900 XT and I'm about
to get a 19 inch LCD monitor. Should I change the card to 16 bit to
improve performance? And will that cause any change in what I see on the
screen?

Pat
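The question above turns on what 16-bit mode actually stores. A minimal sketch, assuming the common RGB565 layout for 16-bit mode (5 bits red, 6 green, 5 blue); the function names are just for illustration:

```python
# Sketch: quantize an 8-bit-per-channel color to 16-bit RGB565 and back,
# to illustrate the precision lost in 16-bit mode.

def to_rgb565(r, g, b):
    """Pack 8-bit R, G, B into a 16-bit 5-6-5 value (truncating low bits)."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def from_rgb565(c):
    """Expand a 16-bit 5-6-5 value back to approximate 8-bit channels."""
    r = (c >> 11) & 0x1F
    g = (c >> 5) & 0x3F
    b = c & 0x1F
    # Replicate high bits into the low bits so full-scale maps back to 255.
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

# A mid-grey survives only approximately:
print(from_rgb565(to_rgb565(200, 200, 200)))  # (206, 203, 206)
```

Nearby shades collapse to the same 16-bit value, which is what shows up on screen as banding in smooth gradients, regardless of monitor type.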

  #2
July 20th 05, 10:44 PM
Magnulus

There is a difference. 16 bit color doesn't look as good on an LCD.


  #3
July 21st 05, 08:58 PM
Robert Hancock

Pat wrote:
Hey everyone, I've got a question.

Since LCD monitors are only capable of 16 bit color, is there any
advantage to running your video card in 32 bit mode?


LCDs are capable of more than 16-bit color. 8-bit LCDs can do true
32-bit (really 24-bit) color. 6-bit LCDs effectively do 18-bit color and
interpolate it up to 24 bits.
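The arithmetic behind those figures (3 channels, 2^bits levels per channel) works out as follows:

```python
# Distinct colors at each panel depth: 2 ** (bits per channel * 3 channels).
for bits_per_channel in (5, 6, 8):
    total = 2 ** (bits_per_channel * 3)
    print(f"{bits_per_channel}-bit panel: "
          f"{bits_per_channel * 3}-bit color, {total:,} colors")
```

The jump from 262,144 colors (6-bit) to 16.7 million (8-bit) is why 6-bit panels need dithering tricks to fake the rest.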

--
Robert Hancock, Saskatoon, SK, Canada
http://www.roberthancock.com/
  #4
July 21st 05, 11:05 PM
First of One

Interesting how few people complain about 6-bit LCDs now. I remember the
Voodoo3 getting a lot of flak for not being able to display 32-bit color.

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."


"Robert Hancock" wrote in message
news:S7TDe.8328$5V4.5596@pd7tw3no...
LCDs are capable of more than 16-bit color. 8-bit LCDs can do true 32-bit
(really 24-bit) color. 6-bit LCDs effectively do 18-bit color and
interpolate it up to 24 bits.



  #5
July 22nd 05, 12:16 AM
Zulu

First of One wrote:
Interesting how few people complain about 6-bit LCDs now. I remember
the Voodoo3 getting a lot of flak for not being able to display
32-bit color.


Indeed interesting. But...
Most people simply don't know!

And manufacturers are not very keen to disclose which panels are used in
the different monitors.

(Fake) response times are all that matter to the average user.

It is in the same box as "The megahertz myth" and "The megapixel myth"...

Zulu


  #6
July 22nd 05, 04:41 AM
Robert Hancock

First of One wrote:
Interesting how few people complain about 6-bit LCDs now. I remember the
Voodoo3 getting a lot of flak for not being able to display 32-bit color.


Well, it's not quite as bad as that - I believe the displays flicker the
pixels back and forth between the two nearest values to approximate the
full 8 bits per color. The color rendition is still not as good, though.
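A toy model of that flickering (often called frame-rate control), assuming a 6-bit panel alternating between the two nearest levels; `frc_frames` is a made-up name for illustration, not actual panel firmware:

```python
# Sketch of frame-rate control (temporal dithering): approximate an 8-bit
# level on a 6-bit panel by alternating between the two nearest 6-bit levels.

def frc_frames(level8, n_frames=256):
    """Return the 6-bit level shown on each of n_frames frames."""
    low = level8 >> 2        # nearest 6-bit level at or below the target
    frac = level8 & 0x3      # remainder toward the next level (0-3)
    frames = []
    acc = 0
    for _ in range(n_frames):
        acc += frac
        if acc >= 4:         # carry: show the higher level this frame
            acc -= 4
            frames.append(min(low + 1, 63))
        else:
            frames.append(low)
    return frames

# Averaged over time, the eye sees the in-between value:
frames = frc_frames(130)     # 130 = 6-bit level 32, remainder 2
print(sum(frames) * 4 / len(frames))  # 130.0
```

The time average lands on the target, but each individual frame is still only 6-bit, which is one reason the color rendition can look slightly off or shimmery.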

--
Robert Hancock, Saskatoon, SK, Canada
http://www.roberthancock.com/
  #7
July 22nd 05, 04:48 AM
Arthur Hagen

Robert Hancock wrote:
Pat wrote:
Hey everyone, I've got a question.

Since LCD monitors are only capable of 16 bit color, is there any
advantage to running your video card in 32 bit mode?


LCDs are capable of more than 16-bit color. 8-bit LCDs can do true
32-bit (really 24-bit) color. 6-bit LCDs effectively do 18-bit color
and interpolate it up to 24 bits.


Also, the number of colours it can handle doesn't equate with the number
of *visible* colours. LCD displays still have a way to go before they
can display as many nuances as a good CRT, or with the same fidelity.

There are also video cards that can do 10+10+10 bits instead of 8+8+8, in
which case a CRT is currently the only choice.

Regards,
--
*Art

  #8
July 23rd 05, 02:35 AM
First of One

The good ol' Parhelia... I've always wondered under what circumstances
30-bit color can be useful. Common image formats like BMP, PNG, JPEG, etc.
only store 24 bits. Does TIFF or RAW allow 30-bit?



"Arthur Hagen" wrote in message
...
There are also video cards that can do 10+10+10 bits instead of 8+8+8, in
which case a CRT is currently the only choice.

Regards,
--
*Art



  #9
July 23rd 05, 03:28 AM
Phil Weldon

Some scanners use 36 bit (12,12,12) or even 48 bit color (16,16,16). For
these devices image quality is much more important than speed, and
supersampling color allows better downsampling.
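A minimal sketch of why those extra scanner bits help: converting 16-bit samples down to 8 bits with rounding keeps the result as close as possible, and any averaging or color correction done before this step benefits from the extra precision:

```python
# Sketch: reduce a 16-bit-per-channel scanner value to 8 bits by rounding,
# rather than truncating, so the extra precision improves the final result.

def downsample16_to_8(v16):
    """Round a 16-bit channel value (0-65535) to 8 bits (0-255)."""
    return (v16 * 255 + 32767) // 65535

print(downsample16_to_8(32768))  # 128
```

Truncating would just throw the low 8 bits away; rounding after all the high-precision processing is where supersampled color pays off.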

Phil Weldon


"First of One" wrote in message
...
The good ol. Parhelia... Always wondered under what circumstances 30-bit
color can be useful. Common image formats like BMP, PNG, JPEG, etc. only
store 24-bits. Does TIFF or RAW allow 30-bit?

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."


"Arthur Hagen" wrote in message
...
There's also video cards that can do 10+10+10 bits instead of 8+8+8, in
which case a CRT is currently the only choice.

Regards,
--
*Art





  #10
July 23rd 05, 04:50 AM
Arthur Hagen

First of One wrote:
The good ol. Parhelia... Always wondered under what circumstances
30-bit color can be useful. Common image formats like BMP, PNG, JPEG,
etc. only store 24-bits. Does TIFF or RAW allow 30-bit?


JPEG doesn't store 24 bits as such -- of course, if the source is 24-bit,
it can't be better than that, but if the source is higher quality (like
from a scanner) and your JPEG decompressor allows it, you can surely go
higher than 8+8+8. PNG is designed to be extensible, so I would be
surprised if it couldn't handle more than 8+8+8.
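For what it's worth, PNG does allow 16 bits per channel: the IHDR chunk's bit-depth byte can be 1, 2, 4, 8, or 16. A small stdlib-only sketch that reads the field, assuming IHDR is the first chunk as the spec requires; this is not a full PNG parser:

```python
import struct

def png_bit_depth(path):
    """Read the bits-per-channel field from a PNG file's IHDR chunk."""
    with open(path, "rb") as f:
        if f.read(8) != b"\x89PNG\r\n\x1a\n":
            raise ValueError("not a PNG file")
        # Each chunk starts with a 4-byte big-endian length and 4-byte type.
        length, ctype = struct.unpack(">I4s", f.read(8))
        if ctype != b"IHDR":
            raise ValueError("IHDR chunk not first")
        # IHDR: width (4), height (4), bit depth (1), color type (1), ...
        width, height, bit_depth, color_type = struct.unpack(">IIBB", f.read(10))
        return bit_depth  # 1, 2, 4, 8, or 16 bits per channel
```

So a 16-bit-per-channel (48-bit RGB) PNG is perfectly legal, even if most viewers and editors of the day only handled 8+8+8.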

Regards,
--
*Art

 






Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 HardwareBanter.
The comments are property of their posters.