HardwareBanter: a computer components & hardware forum


Attenuation of Long DVI Cables



 
 
  #1  
Old December 26th 05, 11:10 PM posted to alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

Has anyone seen figures on how much loss of signal strength (attenuation) is
allowed for a DVI connection, for example between an X800 RADEON video card
and an LCD monitor? I'm looking for the "loss budget": the total
accumulated signal loss allowed between the video card and the LCD
display before image quality is seriously affected.

Then I would like to know how much signal loss occurs per foot of cable,
for a DVI cable and separately for an HDMI cable. Finally, what is the
additional signal loss if I convert one end of the cable through an adapter,
say from HDMI to DVI?

--
Will


  #2  
Old December 27th 05, 03:52 AM posted to alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

Remember, a DVI cable carries a digital signal, not an analog one. As long
as the display is working, there will be no decrease in the quality of the
image at all. Only when the signal is attenuated so much that the bits and
transitions cannot be recovered will you have issues, and the symptom will
not be loss of image quality, but total loss of signal and image integrity.

HDMI and DVI are electrically identical as far as the video signals are
concerned. The only thing an adapter changes is the connector.



  #3  
Old December 27th 05, 09:04 AM posted to alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

Okay, that's a good point that the link either works or fails. But my
questions are still valid:

What is the typical loss budget for a connection between the video card and
the LCD?

What is the typical attenuation loss per foot of cable?

What are the typical attenuation losses at the connectors, and for an
adapter that converts HDMI to DVI?

--
Will


"Barry Watzman" wrote in message
...
Remember, a DVI cable is digital, not analog. As long as the display is
working, there will be no decrease in the quality of the image at all.
Only when the signal is attenuated so much that the bits and transitions
cannot be recovered will you have issues, and the symptom will not be
loss of image quality, but total loss of signal and image integrity.

HDMI and DVI are electrically identical as far as the video signals are
concerned. The only thing that an adapter changes is the connector.


Will wrote:

Has anyone seen figures on how much loss of signal strength

(attenuation) is
allowed for a DVI connection, for example between an X800 RADEON video

card
and an LCD monitor? I'm looking for the "loss budget", or total
accumulated signal loss that is allowed between the video card and the

LCD
display before the image quality is affected seriously.

Then I would like to know how much signal loss occurs per foot of cable

for
a DVI cable and separately for an HDMI cable. Finally, what is the
additional signal loss if I convert one end of the cable through an

adapter
from say HDMI to DVI.



  #4  
Old December 27th 05, 12:16 PM posted to alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

Why? Are you wanting the figures for some course you're doing or are
you one of those ****wits who believes the salesman when he says gold
plated optical cables are better than bog standard ones?

If not, you simply need to ask if XXX length will work.


--
Conor

"This is my Kung Fu and it is strong."
  #5  
Old December 27th 05, 04:30 PM posted to alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

The loss per foot is going to vary with the particular cable. The
general rule from my post on analog LCD monitors holds here also: the
quality of a cable is almost directly proportional to its diameter; you
want a nice, fat cable. Unfortunately, as with analog cables, most of
the cables on the market are "cheap" ... I mean, if you need a 6 foot
cable and you have a choice (in a store or catalog) between one that is $5
and one that is $30 (and that can indeed be the magnitude of the price
difference), which are you going to buy? Remember, you don't get to do
a side-by-side comparison, and you don't get to "see" the difference.

Please understand, I'm not in favor of "monster" cables or the hype of
"oxygen-free copper" or "gold plated connectors" (although gold plated
connectors, at least, ARE better). Many of the premium cables being
sold for all applications (audio, video, computer) are nothing more than
"snake oil" designed to separate people from their money. But in the
case of video cables carrying multiple signal lines of high-frequency
video, with square waves in the 75 to 150 MHz range, there IS a
difference between good quality cables and cheap cables. Cheap cables
have a small diameter, have a lot of capacitance, and will round the edges
of square waves. They will also produce reflections from improper
impedance and termination, causing "ghosts" and "ringing". So video
cables are one place where paying for a premium cable may truly make a
difference. But this is much more true of analog connections than of
digital connections. With a digital connection, if the cable losses
are excessive, the display just won't work, and in that case the option
for the buyer is clear: return the cable. But with an analog
connection, the monitor does work, and the buyer may not even realize
how poor his image quality is compared to what it should be, or that the
"fuzziness" is caused by the cable.



  #6  
Old December 27th 05, 05:48 PM posted to alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

I understand that different cables yield different attenuations per foot.
I'd settle for some numbers on attenuation per foot for a high quality,
thick cable.

And the loss budget from the card to the LCD will be a constant regardless
of cable quality.

This is a useful exercise, because if you find, for example, that even the
best quality cable exceeds the loss budget after only 22 feet, then
25 feet is probably pushing it, and there would be no point in buying 50
feet in that case.
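
(A minimal sketch of that arithmetic; both numbers below are invented placeholders, not published DVI figures:)

# Toy calculation for the argument above: given a loss budget and a cable's
# per-foot attenuation, the longest workable run follows directly.
# Hypothetical numbers only; neither value comes from the DVI spec.
LOSS_BUDGET_DB = 12.0        # hypothetical card-to-LCD loss budget
LOSS_DB_PER_FOOT = 0.55      # hypothetical loss of a good, thick cable

print(f"Max run: about {LOSS_BUDGET_DB / LOSS_DB_PER_FOOT:.0f} feet")  # ~22 ft

for candidate_ft in (15, 25, 50):
    loss_db = candidate_ft * LOSS_DB_PER_FOOT
    verdict = "within budget" if loss_db <= LOSS_BUDGET_DB else "over budget"
    print(f"{candidate_ft} ft -> {loss_db:.1f} dB: {verdict}")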

--
Will


"Barry Watzman" wrote in message
...
The loss per foot is going to vary with the particular cable. The
general rule in my post on analog LCD monitors holds here also: The
quality of a cable is almost directly proportional to it's diameter, you
want a nice, fat cable. Unfortunately, as with analog cables, most of
the cables on the market are "cheap" ... I mean, if you need a 6 foot
cable and you have a choice (in a store or catalog) of one that is $5
and one that is $30 (and that can indeed be the magnitude of price
difference), which are you going to buy? Remember, you don't get to do
a side-by-side comparison, and you don't get to "see" the difference.



  #7  
Old December 27th 05, 06:04 PM posted to alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

You have to assume that if someone is selling a 50 foot cable, it will
work. Again, it's a digital signal .... it either works or it doesn't,
and unlike an analog signal there is no quality degradation if it does
work. Not many companies will produce a product that has a very high
return rate, and any cables that don't work "out of the box" are likely
to get returned. So if you find a 500 foot cable (you won't, but just for
the sake of argument ....), presumably it can be expected to work.



  #8  
Old December 27th 05, 06:28 PM posted to alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

You will need specs from the manufacturer of the connecting cable to answer
your questions about cable attenuation: either the specific information for
the overall assembly (connector/cable/connector), or the manufacturer of
each component along with the manufacturer's component identification, which
you can then use to retrieve the attenuation and impedance numbers.

Take a look at a Belden cable catalog. The frequency of the signal makes a
large difference in attenuation; differential phase change can also be a
problem with broadband signals. There is no single simple answer to your
question. In other words, you will have to do some work.
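
(One practical corollary of that frequency dependence: above a few MHz, attenuation in copper cable grows roughly with the square root of frequency because of skin effect, so a catalog figure quoted at one frequency can be crudely rescaled to another. The starting figure below is an invented example, not a Belden number.)

# Rough sqrt(f) skin-effect scaling of a cable attenuation figure.
# The reference figure is a made-up placeholder, not from any datasheet.
import math

def rescale_attenuation(db_per_100ft, f_ref_mhz, f_target_mhz):
    """Crude skin-effect scaling: attenuation grows about as sqrt(frequency)."""
    return db_per_100ft * math.sqrt(f_target_mhz / f_ref_mhz)

REF_LOSS_DB = 0.8   # hypothetical dB per 100 ft at 10 MHz
for f_mhz in (10, 50, 150, 825):   # 825 MHz ~ fundamental of 1.65 Gbps NRZ
    print(f"{f_mhz:>4} MHz: ~{rescale_attenuation(REF_LOSS_DB, 10, f_mhz):.1f} dB per 100 ft")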


Phil Weldon

"Will" wrote in message
...
|I understand that different cables yield different attenuations per foot.
| I'd settle for some numbers on attenuation per foot based on high quality
| thick cable.
|
| And the loss budget from the card to the LCD will be a constant regardless
| of cable quality.
|
| This is a useful exercise because if you find, for example, that even the
| best quality cable may exceed the loss budget after only 22 feet, then
| probably 25 feet is pushing it. There wouldn't be any point in buying 50
| feet in that case.
|
| --
| Will
|
|
| "Barry Watzman" wrote in message
| ...
| The loss per foot is going to vary with the particular cable. The
| general rule in my post on analog LCD monitors holds here also: The
| quality of a cable is almost directly proportional to it's diameter, you
| want a nice, fat cable. Unfortunately, as with analog cables, most of
| the cables on the market are "cheap" ... I mean, if you need a 6 foot
| cable and you have a choice (in a store or catalog) of one that is $5
| and one that is $30 (and that can indeed be the magnitude of price
| difference), which are you going to buy? Remember, you don't get to do
| a side-by-side comparison, and you don't get to "see" the difference.
|
|


  #9  
Old December 27th 05, 07:36 PM posted to alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

At this point I would settle for anyone publishing any figures for any cable
carrying a 1920x1080 signal from a video card running under Windows to a
typical LCD monitor.

I'm not expecting those figures to match my case exactly; I just want some
sense of the order of magnitude of the problem. I understand that there
are a lot of variables, but understanding that doesn't really help me get a
handle on what kind of loss budgets are realistic.

If we were talking fiber optic technology, I know that a typical loss budget
for a long run is around 25 dB, and it's easy enough to build other facts
around that to know whether you have any chance of making a run work. If you
know the loss budget is around 25 dB, and you are looking at a long run with
a 50 dB loss, then clearly that's not going to work out of the box.
Likewise, some manufacturers will claim they can span 35 dB, and some might
be limited to 20 dB, but no one is going to claim they can make up 250 dB of
loss. So even starting with a single data point somewhere between 20 dB
and 35 dB puts a defined metric on your knowledge of the subject, and
at least gives you some sense of scale.

For video cables running digital signals, I have NO sense of *scale*. I'm
not looking for precision. I'm looking for approximation and order of
magnitude.
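
(To make the dB scale concrete: a decibel figure is a logarithmic ratio, so a budget like the 25 dB fiber example translates directly into the fraction of the launched signal that must still be usable at the far end. A small sketch:)

# Decibels to plain ratios, to give the "sense of scale" asked for above.
def db_to_voltage_ratio(db_loss):
    """Fraction of signal voltage remaining after db_loss dB of attenuation."""
    return 10 ** (-db_loss / 20.0)

def db_to_power_ratio(db_loss):
    """Fraction of signal power remaining after db_loss dB of attenuation."""
    return 10 ** (-db_loss / 10.0)

for db in (3, 10, 25, 50):
    print(f"{db:>2} dB loss: {db_to_voltage_ratio(db):.2%} of voltage, "
          f"{db_to_power_ratio(db):.4%} of power remains")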

--
Will


"Phil Weldon" wrote in message
nk.net...
You will need specs from the manufacturer of the connecting cable to

answer
your questions about cable attenuation. Either the specific information

for
the overall assambly (connector/cable/connector) or the manufacturer for
each component along with the manufacturer component identification which
you can then use to retrieve the attenuation and impedance numbers.

Take a look at a Belden Cable catalog. The frequency of the signal makes

a
large difference in attenuation; differential phase change is can also be

a
problem with broadband signals. There is no single simple answer to your
question. In other words, you will have to do some work.


Phil Weldon

"Will" wrote in message
...
|I understand that different cables yield different attenuations per foot.
| I'd settle for some numbers on attenuation per foot based on high

quality
| thick cable.
|
| And the loss budget from the card to the LCD will be a constant

regardless
| of cable quality.
|
| This is a useful exercise because if you find, for example, that even

the
| best quality cable may exceed the loss budget after only 22 feet, then
| probably 25 feet is pushing it. There wouldn't be any point in buying

50
| feet in that case.
|
| --
| Will
|
|
| "Barry Watzman" wrote in message
| ...
| The loss per foot is going to vary with the particular cable. The
| general rule in my post on analog LCD monitors holds here also: The
| quality of a cable is almost directly proportional to it's diameter,

you
| want a nice, fat cable. Unfortunately, as with analog cables, most of
| the cables on the market are "cheap" ... I mean, if you need a 6 foot
| cable and you have a choice (in a store or catalog) of one that is $5
| and one that is $30 (and that can indeed be the magnitude of price
| difference), which are you going to buy? Remember, you don't get to

do
| a side-by-side comparison, and you don't get to "see" the difference.
|
|




  #10  
Old December 28th 05, 12:18 AM posted to alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

Straight from the horse's mouth:
http://www.ddwg.org/lib/dvi_10.pdf
http://www.ddwg.org/lib/DVI_TM_guide_REV1.pdf

Skimming through the documents, I think the standard calls for maximum loss/
jitter/overshoot, ideally measured with eye diagrams, but sets no attenuation
limit on a per-foot basis. It effectively limits most *standard-compliant*
copper DVI cables to 5 meters. Since the data rate is
extremely high, DVI is only meant for "local" use (think Serial ATA cables).
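
(For a sense of just how high the rate is: TMDS, the signaling scheme DVI and HDMI share, sends 10 bits on the wire for every 8-bit color component, on each of three data channels. The arithmetic below uses the standard 148.5 MHz pixel clock for 1920x1080 at 60 Hz.)

# Why the data rate is "extremely high": per-channel and total TMDS bit rates
# for 1080p60. Timing figures are the standard ones for this mode.
PIXEL_CLOCK_MHZ = 148.5        # 1920x1080 @ 60 Hz, including blanking
BITS_PER_PIXEL_PER_CHANNEL = 10
DATA_CHANNELS = 3

per_channel_gbps = PIXEL_CLOCK_MHZ * BITS_PER_PIXEL_PER_CHANNEL / 1000.0
print(f"Per TMDS channel: {per_channel_gbps:.3f} Gbps")                    # ~1.485
print(f"All three channels: {per_channel_gbps * DATA_CHANNELS:.2f} Gbps")  # ~4.46
# Single-link DVI tops out at a 165 MHz pixel clock, i.e. 1.65 Gbps per channel.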

For 20 m+ installations, you should be looking at alternative means, like
fiber optics. I found this one by Googling:
http://www2.dvigear.com/fiopca.html

Very expensive DVI fiber cables with fiber-optic converters on each end,
available in lengths from 10 meters ($485) to 100 meters ($2225).

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."

"Will" wrote in message
...
Has anyone seen figures on how much loss of signal strength (attenuation)
is
allowed for a DVI connection, for example between an X800 RADEON video
card
and an LCD monitor? I'm looking for the "loss budget", or total
accumulated signal loss that is allowed between the video card and the LCD
display before the image quality is affected seriously.

Then I would like to know how much signal loss occurs per foot of cable
for
a DVI cable and separately for an HDMI cable. Finally, what is the
additional signal loss if I convert one end of the cable through an
adapter
from say HDMI to DVI.

--
Will



 



