PC DVI to HDTV question



 
 
#1 - December 14th 06, 08:40 PM - posted to alt.comp.periphs.videocards.nvidia, uk.tech.digital-tv
Doughboy

I'm trying to find out a bit about the 709 colour space that is used
to display HD material.

Does the PC adjust the colour to the 709 standard before it sends the
data, or does it only convert it from 32-bit to 24-bit? If it's done at
the PC end, is this done in software or by the graphics card?

Assuming the PC doesn't adjust the colour, is the PC (or any other
source device equipped with a DVI output) meant to send a signal to
the TV telling it to use the 709 colour space, or is the TV supposed
to always use that if it thinks it's receiving HD (720p/1080i)
material? If there is such a signal, is it sent by the software or
hardware?

If anyone can shed some light on these matters, it would be much
appreciated.

Doughboy
#2 - December 14th 06, 11:48 PM - posted to alt.comp.periphs.videocards.nvidia, uk.tech.digital-tv
First of One

Uhhh, I think you are over-analyzing the situation. If the TV has a DVI
digital input, the connected PC should see it as just another monitor. You
should then set it to the TV's native resolution and 32-bit color. All the
colorspace conversion crap should be done by the TV.

Whatever the TV uses internally may be entirely proprietary. With LCD
panels, there are 6-bit panels and 8-bit panels, and the PC doesn't care.
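
For what it's worth, the "32-bit vs 24-bit" part is less mysterious than it
sounds: Windows' 32-bit mode is 8 bits each of red, green and blue plus 8 bits
of alpha/padding, and a single-link DVI connection carries only the 24 RGB
bits. A rough Python sketch of that step (illustrative only, not any driver's
actual code):

def argb32_to_dvi24(pixel):
    # Split a 0xAARRGGBB desktop pixel into the 8-bit R, G, B values that
    # actually travel over the DVI link; the alpha/padding byte is dropped.
    r = (pixel >> 16) & 0xFF
    g = (pixel >> 8) & 0xFF
    b = pixel & 0xFF
    return r, g, b

print(argb32_to_dvi24(0xFF4080C0))   # -> (64, 128, 192)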

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."

"Doughboy" wrote in message
...
I'm trying to find out a bit about the 709 colour space that is used
to display HD material.

Does the PC adjust the colour to the 709 standard before it sends the
data, or does it only convert it from 32bit to 24bit? If it's done at
the PC end, is this done in software or by the graphics card?

Assuming the PC doesn't adjust the colour, is the PC (or any other
source device equipped with a DVI output) meant to send a signal to
the TV telling it to use the 709 colour space, or is the TV supposed
to always use that if it thinks it's receiving HD (720p/1080i)
material? If there is such a signal, is it sent by the software or
hardware?

If anyone can shed some light on these matters, it would be much
appreciated.

Doughboy



#3 - December 15th 06, 11:30 AM - posted to alt.comp.periphs.videocards.nvidia, uk.tech.digital-tv
Doughboy

On Thu, 14 Dec 2006 18:48:05 -0500, "First of One" root@localhost wrote:

> Uhhh, I think you are over-analyzing the situation. If the TV has a DVI
> digital input, the connected PC should see it as just another monitor. You
> should then set it to the TV's native resolution and 32-bit color. All the
> colorspace conversion crap should be done by the TV.
>
> Whatever the TV uses internally may be entirely proprietary. With LCD
> panels, there are 6-bit panels and 8-bit panels, and the PC doesn't care.


The Nvidia graphics driver detects the TV correctly as "Sony TV
(Digital Display)" and has an option "Treat display as HDTV", which is
ticked. I can then set it to 480p, 720p or 1080i.

The TV's service menu lets you switch between the 601 colour space
(used for SDTV) and the 709 colour space (used for HDTV). Playing the
colour bars from the AVIA test DVD, you can clearly see the colours
look different when switching between the two.

I've got a feeling that the TV does the colour decoding itself for
broadcasts received by its internal tuner, whilst external sources
have to do this before sending the video on to the TV.
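
The visible shift on the colour bars is consistent with the set simply
switching its YCbCr-to-RGB decode matrix: 601 and 709 use different luma
weights (0.299/0.587/0.114 vs 0.2126/0.7152/0.0722), so data encoded one way
and decoded the other lands on different RGB values. A rough Python sketch,
assuming full-range values purely to keep it short:

def ycbcr_to_rgb(y, cb, cr, kr, kb):
    # Decode normalised Y (0..1) and Cb/Cr (-0.5..0.5) to R'G'B' (0..1).
    kg = 1.0 - kr - kb
    r = y + 2.0 * (1.0 - kr) * cr
    b = y + 2.0 * (1.0 - kb) * cb
    g = (y - kr * r - kb * b) / kg
    return r, g, b

KR601, KB601 = 0.299, 0.114
KR709, KB709 = 0.2126, 0.0722

# Encode a saturated red (R=1, G=0, B=0) with the 601 coefficients...
y  = KR601
cb = (0.0 - y) / (2.0 * (1.0 - KB601))
cr = (1.0 - y) / (2.0 * (1.0 - KR601))

# ...then decode with the matching and the mismatched matrix.
print(ycbcr_to_rgb(y, cb, cr, KR601, KB601))   # ~ (1.00, 0.00, 0.00)
print(ycbcr_to_rgb(y, cb, cr, KR709, KB709))   # ~ (1.09, 0.10, -0.01) - visibly off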

Doughboy
#4 - December 15th 06, 04:16 PM - posted to alt.comp.periphs.videocards.nvidia, uk.tech.digital-tv
John Russell


"Doughboy" wrote in message
...
I'm trying to find out a bit about the 709 colour space that is used
to display HD material.

Does the PC adjust the colour to the 709 standard before it sends the
data, or does it only convert it from 32bit to 24bit? If it's done at
the PC end, is this done in software or by the graphics card?

Assuming the PC doesn't adjust the colour, is the PC (or any other
source device equipped with a DVI output) meant to send a signal to
the TV telling it to use the 709 colour space, or is the TV supposed
to always use that if it thinks it's receiving HD (720p/1080i)
material? If there is such a signal, is it sent by the software or
hardware?

If anyone can shed some light on these matters, it would be much
appreciated.

Doughboy


Windows has been using a "What you see is what you get" system for some
time. To facilitate this, printers and monitors either use standard colour
spaces and identify them via Plug and Play, or you download a file
containing a unique colour space definition. So yes, a PC outputs different
RGB values depending upon the colour space required by the monitor.

If you have PowerStrip you can change the monitor colour space, and one of
the options is HDTV.
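
To be concrete about what that conversion amounts to (a hedged sketch, not
Windows' actual ICM code): a colour-managed application linearises the source
values, re-mixes them through a 3x3 matrix derived from the source and
destination profiles, then re-applies the destination gamma. The matrix and
gamma values below are placeholders; real ones come from the ICC profiles.

SRC_TO_DST = [[1.0, 0.0, 0.0],      # placeholder matrix - in practice this is
              [0.0, 1.0, 0.0],      # built from the two profiles' primaries
              [0.0, 0.0, 1.0]]

def convert(rgb, gamma_src=2.2, gamma_dst=2.2):
    linear = [c ** gamma_src for c in rgb]                 # undo source gamma
    mixed = [sum(m * v for m, v in zip(row, linear))       # re-mix the primaries
             for row in SRC_TO_DST]
    return [max(0.0, min(1.0, c)) ** (1.0 / gamma_dst)     # re-apply dest gamma
            for c in mixed]

print(convert([0.50, 0.25, 0.75]))   # identity matrix, same gammas -> unchanged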


#5 - December 15th 06, 05:09 PM - posted to alt.comp.periphs.videocards.nvidia, uk.tech.digital-tv
Doughboy

On Fri, 15 Dec 2006 16:16:43 -0000, "John Russell" wrote:

> "Doughboy" wrote in message ...
>> I'm trying to find out a bit about the 709 colour space that is used
>> to display HD material.
>>
>> Does the PC adjust the colour to the 709 standard before it sends the
>> data, or does it only convert it from 32-bit to 24-bit? If it's done at
>> the PC end, is this done in software or by the graphics card?
>>
>> Assuming the PC doesn't adjust the colour, is the PC (or any other
>> source device equipped with a DVI output) meant to send a signal to
>> the TV telling it to use the 709 colour space, or is the TV supposed
>> to always use that if it thinks it's receiving HD (720p/1080i)
>> material? If there is such a signal, is it sent by the software or
>> hardware?
>>
>> If anyone can shed some light on these matters, it would be much
>> appreciated.
>>
>> Doughboy
>
> Windows has been using a "What you see is what you get" system for some
> time. To facilitate this, printers and monitors either use standard colour
> spaces and identify them via Plug and Play, or you download a file
> containing a unique colour space definition. So yes, a PC outputs different
> RGB values depending upon the colour space required by the monitor.
>
> If you have PowerStrip you can change the monitor colour space, and one of
> the options is HDTV.


So would you think, for example, that Windows knows that my Sony
34XBR800 CRT HDTV (my Nvidia graphics card recognises it as "Sony TV
(Digital Display)") uses the 709 colour space, and would adjust the RGB
values it sends to the DVI port (which the TV is connected to) on this
basis?

What's confusing is that the TV has a temporary service menu switch that
allows me to toggle the colour space between 601 and 709 (at least I
think that's what it's doing). Certainly the colours change in
appearance consistent with this theory, and this is what led me to
believe that a source device could tell the display which colour space
to use as necessary.

Or is it more likely that the TV just uses these matrices internally to
translate SD broadcasts to the display's native 709 colour space?
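
For reference, the two matrices in question (assuming the service-menu switch
really is choosing between the standard 601 and 709 decode matrices) can be
written out from the luma coefficients alone. A quick sketch, full range for
simplicity:

def decode_matrix(kr, kb):
    # Full-range YCbCr -> R'G'B' matrix built from the luma coefficients.
    kg = 1.0 - kr - kb
    return [[1.0,  0.0,                          2.0 * (1.0 - kr)],
            [1.0, -2.0 * kb * (1.0 - kb) / kg,  -2.0 * kr * (1.0 - kr) / kg],
            [1.0,  2.0 * (1.0 - kb),             0.0]]

for name, kr, kb in (("BT.601", 0.299, 0.114), ("BT.709", 0.2126, 0.0722)):
    print(name)
    for row in decode_matrix(kr, kb):
        print("  [%7.4f %7.4f %7.4f]" % tuple(row))
    # 601 gives the familiar 1.402 / 0.344 / 0.714 / 1.772 terms,
    # 709 gives 1.5748 / 0.1873 / 0.4681 / 1.8556.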

Doughboy
#6 - December 16th 06, 10:31 AM - posted to alt.comp.periphs.videocards.nvidia, uk.tech.digital-tv
John Russell


"Doughboy" wrote in message
...
On Fri, 15 Dec 2006 16:16:43 -0000, "John Russell"
wrote:


"Doughboy" wrote in message
. ..
I'm trying to find out a bit about the 709 colour space that is used
to display HD material.

Does the PC adjust the colour to the 709 standard before it sends the
data, or does it only convert it from 32bit to 24bit? If it's done at
the PC end, is this done in software or by the graphics card?

Assuming the PC doesn't adjust the colour, is the PC (or any other
source device equipped with a DVI output) meant to send a signal to
the TV telling it to use the 709 colour space, or is the TV supposed
to always use that if it thinks it's receiving HD (720p/1080i)
material? If there is such a signal, is it sent by the software or
hardware?

If anyone can shed some light on these matters, it would be much
appreciated.

Doughboy


Windows has been using a "What you see is what you get" system for some
time. To facilitate this printer and monitors either use standard colour
spaces and identify them via Plug and Play, or you download a file
containing a unique colour space definition. So yes, a PC outputs
different
RGB values depending upon the colour space required by the monitor.

If you have Powerstrip you can change the monitor colour space, and one of
those is HDTV.


So would you think, for example, that Windows knows that my Sony
34XBR800 CRT HDTV (my Nvidia graphics card recognises it as "Sony TV
(Digital Display)" uses the 709 colour space and would adjust the RGB
values it sends to the DVI port (which the TV is connected to) on this
basis?


When connected to the TV using DVI, check the Colour Management tab in
Display Properties. This will tell you what colour space profile is in use.

Nvidia may break the Windows rules and ignore the selected monitor profile
if you select HDTV in the Nvidia setup; I don't know.

SD material supplied to the TV can't change its colour profile. It doesn't
follow that even an HDTV set uses the HDTV colour profile at the RGB level.
All signals may get converted to the set's own RGB colour profile.
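
As a rough illustration of that last point (a sketch only, not any
manufacturer's actual processing): converting an incoming signal to the set's
own RGB is a matter of going through a device-independent space such as CIE
XYZ. The first matrix below is the published linear BT.709/sRGB-to-XYZ matrix
(D65); the second, standing in for "the set's own RGB", is simply its inverse
here, so the round trip returns the input.

RGB709_TO_XYZ = [[0.4124, 0.3576, 0.1805],
                 [0.2126, 0.7152, 0.0722],
                 [0.0193, 0.1192, 0.9505]]

XYZ_TO_PANEL = [[ 3.2406, -1.5372, -0.4986],    # placeholder: sRGB/709 inverse; a real
                [-0.9689,  1.8758,  0.0415],    # set would use its own panel primaries
                [ 0.0557, -0.2040,  1.0570]]

def apply(m, v):
    return [sum(a * b for a, b in zip(row, v)) for row in m]

linear_in = [0.25, 0.50, 0.75]                  # linear-light RGB (gamma already removed)
panel_rgb = apply(XYZ_TO_PANEL, apply(RGB709_TO_XYZ, linear_in))
print(panel_rgb)                                # ~ [0.25, 0.50, 0.75] since the spaces match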




#7 - December 16th 06, 11:56 AM - posted to alt.comp.periphs.videocards.nvidia, uk.tech.digital-tv
John Russell


"John Russell" wrote in message
...

"Doughboy" wrote in message
...
On Fri, 15 Dec 2006 16:16:43 -0000, "John Russell"
wrote:


"Doughboy" wrote in message
...
I'm trying to find out a bit about the 709 colour space that is used
to display HD material.

Does the PC adjust the colour to the 709 standard before it sends the
data, or does it only convert it from 32bit to 24bit? If it's done at
the PC end, is this done in software or by the graphics card?

Assuming the PC doesn't adjust the colour, is the PC (or any other
source device equipped with a DVI output) meant to send a signal to
the TV telling it to use the 709 colour space, or is the TV supposed
to always use that if it thinks it's receiving HD (720p/1080i)
material? If there is such a signal, is it sent by the software or
hardware?

If anyone can shed some light on these matters, it would be much
appreciated.

Doughboy

Windows has been using a "What you see is what you get" system for some
time. To facilitate this printer and monitors either use standard colour
spaces and identify them via Plug and Play, or you download a file
containing a unique colour space definition. So yes, a PC outputs
different
RGB values depending upon the colour space required by the monitor.

If you have Powerstrip you can change the monitor colour space, and one
of
those is HDTV.


So would you think, for example, that Windows knows that my Sony
34XBR800 CRT HDTV (my Nvidia graphics card recognises it as "Sony TV
(Digital Display)" uses the 709 colour space and would adjust the RGB
values it sends to the DVI port (which the TV is connected to) on this
basis?


When connected using DVI to the TV check the Colour Management TAb in
Display Properties. This will tell you what colour space profile is in
use.

Nvidia "may" break the windows rules and ignore the selected Monitor
Profile if you select HDTV in the Nvidia setup, I don't know.

SD material supplied to the TV can't change it's Colour Profile. It
doesn't follow that even a HDTV set uses the HDTV colour profile at the
RGB level. All signals may get converted to the sets own RGB Colour
Profile.




According to this site:
http://www.color.org/sRGB.html

...the sRGB IEC 61966-2.1 profiles were designed to be compliant with 709.
If they don't come up as profiles in the Colour Management tab, you can
download them from the site linked.
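
That matches my understanding: sRGB (IEC 61966-2.1) and BT.709 share the same
primaries and D65 white point; the main difference between the two specs is
the transfer function. A quick comparison in Python, using the published
curves:

def oetf_srgb(c):
    # sRGB encoding curve (IEC 61966-2.1)
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1.0 / 2.4) - 0.055

def oetf_709(c):
    # BT.709 encoding curve
    return 4.5 * c if c < 0.018 else 1.099 * c ** 0.45 - 0.099

for lin in (0.01, 0.1, 0.5, 0.9):
    print("linear %.2f -> sRGB %.3f, BT.709 %.3f"
          % (lin, oetf_srgb(lin), oetf_709(lin)))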


#8 - December 20th 06, 03:23 AM - posted to alt.comp.periphs.videocards.nvidia, uk.tech.digital-tv
Doughboy

On Sat, 16 Dec 2006 10:31:05 -0000, "John Russell" wrote:

> When connected to the TV using DVI, check the Colour Management tab in
> Display Properties. This will tell you what colour space profile is in use.


Windows will only let me look at these settings for the primary
monitor.

Nvidia "may" break the windows rules and ignore the selected Monitor Profile
if you select HDTV in the Nvidia setup, I don't know.


It seems that the 709 colour space is pretty close to the one used on
PCs, so there probably isn't a need to use a profile, although it's
possible that one might be useful if you needed an exact match.
Obviously, properly calibrating the monitor would be more important.
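
A rough way to put a number on "pretty close" (assuming the only difference
that matters here is the transfer curve, since the primaries are shared):
encode the same linear values with the sRGB and BT.709 curves and compare the
8-bit results.

def srgb(c):
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1.0 / 2.4) - 0.055

def bt709(c):
    return 4.5 * c if c < 0.018 else 1.099 * c ** 0.45 - 0.099

worst = max(abs(round(255 * srgb(i / 255.0)) - round(255 * bt709(i / 255.0)))
            for i in range(256))
print("largest 8-bit difference:", worst)   # roughly 15 codes, worst in the dark end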

> SD material supplied to the TV can't change its colour profile. It doesn't
> follow that even an HDTV set uses the HDTV colour profile at the RGB level.
> All signals may get converted to the set's own RGB colour profile.


Granted, but I imagine HDTVs expect to receive a standard 709 signal.

Doughboy
 



