A computer components & hardware forum. HardwareBanter



1080p LCD to PC, cabling: VGA / SVGA / UXGA



 
 
  #1  
Old August 28th 08, 02:13 PM posted to alt.comp.periphs.videocards.nvidia,alt.comp.hardware,24hoursupport.helpdesk
Dennis[_2_]

I recently switched to using a 47" LCD as my monitor. It's 1080p, so my
resolution is 1920x1080.

I'm currently using a normal PC cable, the same one I used for my old
1024x768 monitor. Things look okay to me. But I note that there are
different kinds of PC cables (HD15 / D-Sub). There are some called
VGA, then there is SVGA and UXGA. Each is supposedly intended for a
different resolution; I think I read that the VGA one is intended for
640x480, the SVGA one for 800x600, the UXGA one for 1600x1200,
something like that.

ie: http://www.infinitecables.com/vga.html

The questions: am I losing any video quality by using a normal PC
cable and not a UXGA one, or, better still, a DVI to HDMI cable? Can
any VGA cable handle high resolutions?

Any links or specific facts would be greatly appreciated.



  #2  
Old August 28th 08, 02:52 PM posted to alt.comp.periphs.videocards.nvidia,alt.comp.hardware,24hoursupport.helpdesk
Calab


"Dennis" wrote in message
...
[snip]

The questions: am I losing any video quality by using a normal PC
cable and not a UXGA one, or, better still, a DVI to HDMI cable? Can
any VGA cable handle high resolutions?


VGA is VGA. Unless you are going LOOOONG distances, you won't see a
difference.

You would probably see a difference if you switched to DVI/HDMI though,
since this is a digital signal.


  #3  
Old August 28th 08, 06:38 PM posted to alt.comp.periphs.videocards.nvidia,alt.comp.hardware,24hoursupport.helpdesk
Flyer


"Dennis" wrote in message
...
[snip]

The questions: am I losing any video quality by using a normal PC
cable and not a UXGA one, or, better still, a DVI to HDMI cable? Can
any VGA cable handle high resolutions?


if your graphics card has DVI, use that with a DVI-HDMI converter and feed
it into the new monitor

P.


  #4  
Old August 28th 08, 06:53 PM posted to alt.comp.periphs.videocards.nvidia,alt.comp.hardware,24hoursupport.helpdesk
Dennis[_2_]

On Aug 28, 9:52 am, "Calab" wrote:
"Dennis" wrote in message

...





I recently switched to using a 47" LCD as my monitor. It's 1080p so my
resolution is 1920x1080


I'm currently using a normal PC cable, the same one I used for my old
1024x768 monitor. Things look okay to me. But I note that there are
different kinds of PC cables (HD15 / Sub D) . There are some called
VGA, then there is SVGA and UXGA. Each is supposedly intended for a
different resolution, I think I read that the VGA one is intended for
600x480, the SVGA is intended for 800x600, UXGA is 1600x1200,
something like that.... etc etc


ie: *http://www.infinitecables.com/vga.html


The questions: am I losing any video quality by using a normal PC
cable and not an UXGA one, or better still, a DVI to HDMI cable? *Can
any VGA cable handle high resolutions?


VGA is VGA. Unless you are going LOOOONG distances, you won't see a
difference.

You would probably see a difference if you switched to DVI/HDMI though,
since this is a digital signal.- Hide quoted text -

- Show quoted text -



Can anyone confirm:

(1) It makes no difference which kind of VGA cable you use
(2) But using a DVI/HDMI cable you would notice a difference vs VGA

Agree?
  #5  
Old August 28th 08, 07:45 PM posted to alt.comp.periphs.videocards.nvidia,alt.comp.hardware,24hoursupport.helpdesk
Benjamin Gawert

Dennis:

(1) It makes no difference which kind of VGA cable you use


No, it makes no difference only if your monitor is junk or if your
eyesight is like Stevie Wonder's. In all other situations the
quality of the VGA cable does make a difference, and an even bigger one
at high resolutions (over 1280x1024). Analog signals like VGA are quite
sensitive to signal attenuation, crosstalk and the other funny effects that RF
signals usually suffer from. A crap cable will very likely lead to a
blurry picture lacking sharpness; a good one won't.

But as with all analog signals, it also comes down to how good the
signal coming from your gfx card is. Many newer gfx cards have
average to awful analog signal quality because the manufacturers save a
few bucks on the analog output filters on the card.

(2) But using DVI/HDMI cable you would notice a difference vs VGA

Agree?


Yes.

But besides the cabling issue you should also check that your LCD TV
supports disabling of overscan, as overscan leads to a blurry picture as
well. And no matter what the manual says the HDMI inputs are capable of,
use the *native* resolution of your LCD panel. If your TV is "HD ready"
and uses a 1366x768 panel, then use that resolution and not 1920x1080, as
the latter will be downscaled, which also looks blurry in desktop
applications.

Benjamin
  #6  
Old August 28th 08, 08:53 PM posted to alt.comp.periphs.videocards.nvidia,alt.comp.hardware,24hoursupport.helpdesk
Paul

Dennis wrote:
On Aug 28, 9:52 am, "Calab" wrote:
"Dennis" wrote in message

...

I recently switched to using a 47" LCD as my monitor. It's 1080p so my
resolution is 1920x1080
I'm currently using a normal PC cable, the same one I used for my old
1024x768 monitor. Things look okay to me. But I note that there are
different kinds of PC cables (HD15 / Sub D) . There are some called
VGA, then there is SVGA and UXGA. Each is supposedly intended for a
different resolution, I think I read that the VGA one is intended for
600x480, the SVGA is intended for 800x600, UXGA is 1600x1200,
something like that.... etc etc
ie: http://www.infinitecables.com/vga.html
The questions: am I losing any video quality by using a normal PC
cable and not an UXGA one, or better still, a DVI to HDMI cable? Can
any VGA cable handle high resolutions?

VGA is VGA. Unless you are going LOOOONG distances, you won't see a
difference.

You would probably see a difference if you switched to DVI/HDMI though,
since this is a digital signal.- Hide quoted text -

- Show quoted text -



Can anyone confirm:

(1) It makes no difference which kind of VGA cable you use
(2) But using DVI/HDMI cable you would notice a difference vs VGA

Agree?


1) The old scheme for PC monitors used status pins on the interface
to help the monitor signal a particular resolution to the PC. Apple
used a similar scheme on their video interfaces.

http://www.monitorworld.com/faq_pages/q17_page.html

2) Modern display devices use a DDC interface. That is a serial digital
interface, with signal names like SCL and SDA. One signal is a clock;
the other carries data.

http://martin.hinner.info/vga/pinout.html

3) If there are working SCL and SDA pins and wires on the cable,
then a utility like this one can display what the monitor
is sending to the computer (the info is used by the video driver):

http://www.entechtaiwan.com/util/moninfo.shtm

If you go to the store today, and buy a VGA cable, at a minimum it
should have RGBHV to handle the analog video signals and sync
signals. It should have the two wires for SCL and SDA. Those
support the basics.
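What travels over those SCL and SDA wires is the monitor's EDID block. As a
minimal sketch (the bit packing of the manufacturer ID follows the VESA EDID
standard; the sample bytes are a made-up fragment, not read from real hardware):

```python
def decode_edid_mfg(edid: bytes) -> str:
    """Decode the 3-letter manufacturer ID packed into EDID bytes 8-9.

    The 16-bit big-endian field holds three 5-bit letters (A = 1).
    """
    word = (edid[8] << 8) | edid[9]
    return "".join(
        chr(ord("A") - 1 + ((word >> shift) & 0x1F))
        for shift in (10, 5, 0)
    )

# Every EDID block starts with the fixed 8-byte header below, then the
# vendor/product section. 0x10AC is Dell's assigned manufacturer code.
sample = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00, 0x10, 0xAC])
print(decode_edid_mfg(sample))   # DEL
```

Utilities like the moninfo one above are essentially doing this decoding for
the whole 128-byte block, including the supported timings.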

There are cables where there is no room for SCL and SDA. For example,
my old CRT monitor uses one of these cables. It was a beautiful
monitor in its time, a Sony Trinitron that supported multisync, but
since it used this cable, there was no plug and play with it.
Your situation is unlikely to involve a cable like this.
Cabling schemes like this may continue to be used on
projection TV devices.

http://www.monitorworld.com/m_images...tovgaphoto.jpg

Modern VGA, DVI, and HDMI all have DDC serial interfaces on them,
and that means that the cable design can be generic. There shouldn't
be several different flavors of VGA cable needed now.

HDMI - SCL and SDA pins are listed.

http://en.wikipedia.org/wiki/Hdmi

DVI - In the picture here, the signals are called DDC Clock and DDC Data

http://en.wikipedia.org/wiki/Digital_Visual_Interface

VGA - This article makes no mention of the older "sense" definitions
of the pins, just the SCL and SDA.

http://en.wikipedia.org/wiki/VGA_connector

As for signal quality, the failings of the cables happen in different ways.

On VGA (analog), the signal environment is 75 ohm coax for the RGB lines.
If there are reflections or problems detecting the sync signals cleanly,
there can be visual artifacts like ghosting or a blurry picture.
And as the cable becomes longer, only the lower monitor resolutions
are sharply rendered. So if I run a 100 foot VGA cable, I might only
get a sharp picture at 1024x768. If the cable is 6 feet long, perhaps
I can run 1600x1200 and expect it to be nice and sharp.
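To put numbers on why resolution matters more than cable "flavor": the analog
bandwidth a VGA cable must carry scales with the pixel clock, which you can
estimate from the full timing (active pixels plus blanking). A quick sketch
using standard timing totals:

```python
def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock in MHz from the full frame timing (active + blanking)."""
    return h_total * v_total * refresh_hz / 1e6

# Standard timing totals for some common modes:
print(pixel_clock_mhz(1344, 806, 60))    # 1024x768@60  -> ~65 MHz
print(pixel_clock_mhz(2200, 1125, 60))   # 1920x1080@60 -> 148.5 MHz
print(pixel_clock_mhz(2160, 1250, 60))   # 1600x1200@60 -> 162.0 MHz
```

Roughly 2.5x the analog bandwidth goes down the same coax at 1600x1200 as at
1024x768, which is why a long cable blurs the high modes first.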

DVI and HDMI use the same method of transmission (the interface standards
have a lot in common). The interface is digital and differential
(it uses two wires, D+ and D-, to carry one information stream). Now, as you make
the cable longer, at first there is no visual artifact at all.
Digital doesn't degrade while the signal path is working within its margins.
So small increases in cable length have no effect on
resolution choices, or anything else for that matter.

But as the cable gets longer, the digital signal is attenuated by the
run of cable. Eventually the signal is too small to be detected well.
What appears on the screen is "colored snow". Each miscolored dot on
the screen represents a digital transmission error, due to poor
signal quality. The display will be "sparkly", because the position
of each errored dot is rather random.

Sometimes the snow happens even with a short cable. Some people have
bought new equipment, only to find the manufacturer bundled an inferior
HDMI cable with it. Changing the cable to a higher quality one
may fix the snow problem (of course, the price you pay for the
replacement cable may make your jaw drop). The Wikipedia HDMI
article suggests what range of cable lengths can reasonably be
expected to work.

With HDMI and DVI, the resolution and refresh rate affect the
data rate sent digitally across the cable. So in fact, when
"snow" happens, changing the resolution or refresh
rate may change the snow. To give you some idea just how fast
those signals are: when someone says the "clock" on the cable
is 165MHz, the differential data is actually going across the
cable at 1650Mbit/sec (ten bits are sent per "clock cycle").
That's about as fast as SATA I, only over longer cables.

Paul
 



