A computer components & hardware forum. HardwareBanter


problems with long VGA cable



 
 
  #1  
Old December 17th 06, 02:33 PM posted to alt.comp.periphs.videocards.nvidia
Markus.Humm
external usenet poster
 
Posts: 3
Default problems with long VGA cable

Hello,

my brother recently bought a new PC which includes a GeForce 7600 or so.
The display he uses is a Xerox 19" 1440x900 pixel LCD, and the OS is
Windows XP Media Center.

When he uses the supplied VGA cable he can set the resolution to
1440x900 as desired and the display drivers install as they should.

If he replaces that cable with a longer one (3 m, ferrites on both ends,
thicker than the original one and, according to its spec, properly
shielded - as far as I can tell; it cost around 28 euros, so it wasn't
cheap), he can't set the resolution any higher than about 1280x800
(keeping the correct aspect ratio), even if the Xerox display drivers
are reinstalled. If Windows is told to also show modes the display
doesn't support, he can use higher resolutions, but not the display's
native 1440x900 (which should be used on LCD-type displays for
sharpness).

Why is this? Do the display and graphics card do some kind of
auto-negotiation? The cable is missing one pin, namely the second one
from the left in the middle row of the 15-pin connector when the wider
side is down. If I look that pin up in the German Wikipedia, it says
it's the ground pin for the blue color? How can that be? How does one
even get a picture without this pin?

The reason for using a longer cable than the one originally supplied is
that the display should stand a bit further away from the PC, and the
original cable was maybe 50 centimeters too short.

Any hints/suggestions for me? He already looked for customization
settings in the NVidia driver but hasn't found anything usable yet.

Another question: why did NVidia change its driver's settings screen?

Greetings

Markus
  #2  
Old December 17th 06, 05:33 PM posted to alt.comp.periphs.videocards.nvidia
Pea
external usenet poster
 
Posts: 1
Default problems with long VGA cable

"Markus.Humm" wrote in
:

[snip]


I have an LG 19" LCD monitor (1280x1024). When using the supplied cable,
it is identified as "LG Electronics 1915S". If I use a cable like yours,
with the same pin missing, it becomes "Analogue monitor", although I
still get the same 1280x1024 resolution.

Maybe this missing pin is why you have the problem.
  #3  
Old December 18th 06, 04:00 AM posted to alt.comp.periphs.videocards.nvidia
First of One
external usenet poster
 
Posts: 312
Default problems with long VGA cable

"Markus.Humm" wrote in message
...
If he replaces that cable with a longer one (3m, ferrites on both ends,
thicker than the original one and according to the spec. of this cable a
properly shielded one - as far as I can tell, it cost around 28 euros so
it wasn't cheap)


28 euros may or may not imply a premium cable depending on where your friend
bought it. Small accessories like cables tend to carry the highest retail
markups, especially in large electronics stores. In any case, 3 m is a
pretty long distance.

The cable is missing one pin, namely the second one from the left in
the middle row of the 15-pin connector when the wider side is down. If I
look that pin up in the German Wikipedia, it says it's the ground pin
for the blue color? How can that be? How does one even get a picture
without this pin?


You probably have it backwards. Remember, the male and female connectors
are mirror images. The second pin from the left in the middle row is "no
connection". Many cables have a pin missing there.
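For reference, here is the standard VGA (DE-15) pinout as a quick lookup table. It is a sketch following the common VESA DDC pin assignments; counting direction depends on which side of the connector you look at, so treat the exact labels with care:

```python
# Standard VGA (DE-15) pinout: pins 1-5 (top row), 6-10 (middle row),
# 11-15 (bottom row), per the common VESA DDC assignments.
VGA_PINS = {
    1: "Red video",
    2: "Green video",
    3: "Blue video",
    4: "ID2 / reserved",
    5: "Ground",
    6: "Red ground",
    7: "Green ground",
    8: "Blue ground",
    9: "Key / no connect (+5 V on DDC2-capable cables)",
    10: "Ground / sync ground",
    11: "ID0 / monitor ID",
    12: "ID1 / DDC data (SDA)",
    13: "Horizontal sync",
    14: "Vertical sync",
    15: "DDC clock (SCL)",
}

# The pin commonly omitted from cables is pin 9, the key pin:
print(VGA_PINS[9])   # Key / no connect (+5 V on DDC2-capable cables)
```

Note that pins 12 and 15 carry the DDC/I2C channel, which is why a cable that skimps on those lines can break monitor identification while the picture itself still works.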

I suspect the longer cable just doesn't have the bandwidth, due to poor
construction. To verify this, set the color depth to 16-bit and drop the
refresh rate to 50 Hz. If 1440x900 suddenly becomes available, then the
cable is to blame and no amount of software tweaking will help you.
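To see why dropping the refresh rate is a useful test, you can estimate the required analog pixel clock. This is only a rough sketch: the ~25% blanking overhead is a typical figure, not an exact GTF/CVT timing calculation:

```python
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    """Rough analog pixel clock estimate in MHz.

    Real timing standards (GTF/CVT) compute the blanking intervals
    exactly; here we just assume a typical ~25% overhead on top of the
    active pixels.
    """
    active_pixels = width * height
    total_pixels = active_pixels * (1 + blanking_overhead)
    return total_pixels * refresh_hz / 1e6

# Lowering the refresh rate cuts the required bandwidth proportionally:
print(round(approx_pixel_clock_mhz(1440, 900, 60), 1))  # 97.2
print(round(approx_pixel_clock_mhz(1440, 900, 50), 1))  # 81.0
print(round(approx_pixel_clock_mhz(1280, 800, 60), 1))  # 76.8
```

So 1440x900@60 needs roughly a quarter more bandwidth than 1280x800@60; a marginal cable can pass the latter and smear the former.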

Any hints/suggestions for me? He already looked for any customization
settings in the NVidia driver but didn't yet find something useable.


If the cable is to blame, purchase a good, shielded DVI cable of the
required length. You get a fully digital image, and all the DDC/EDID
communication between the PC and monitor is preserved.

If, however, the cable cannot be isolated to be the source of the problem,
try Powerstrip, which should let you set any custom resolution.

Another question: why did NVidia change its driver's settings screen?


Because it was getting too cluttered for the "silent majority of stupid
users". ATi started the process with its bloated Catalyst Control Center.
Now nVidia has to follow suit...

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."


  #4  
Old December 18th 06, 12:51 PM posted to alt.comp.periphs.videocards.nvidia
ShutEye
external usenet poster
 
Posts: 95
Default problems with long VGA cable

If he replaces that cable with a longer one (3m, ferrites on both ends,
thicker than the original one and according to the spec. of this cable a
properly shielded one - as far as I can tell, it cost around 28 euros so
it wasn't cheap)


28 euros may or may not imply a premium cable depending on where your
friend bought it. Small accessories like cables tend to carry the highest
retail markups, especially in large electronics stores. In any case, 3 m
is a pretty long distance.


I have a projector connected via a 15 m VGA cable doing
1280x720x32bpp@60Hz with no problems. So 3 m really should be possible,
even at a higher resolution. In fact, a DVI (digital signal) cable of
the same length failed to complete the same task.

The cable is missing one pin, namely the second one from the left in
the middle row of the 15-pin connector when the wider side is down. If I
look that pin up in the German Wikipedia, it says it's the ground pin
for the blue color? How can that be? How does one even get a picture
without this pin?


You probably have it backwards. Remember the male and female connectors
are mirror images. Second pin from the left, in the middle row, is "no
connection". Many cables have a pin missing there.

I suspect the longer cable just doesn't have the bandwidth, due to poor
construction. To verify this, set the color depth to 16-bit and drop the
refresh rate to 50 Hz. If 1440x900 suddenly becomes available, then the
cable is to blame and no amount of software tweaking will help you.


Agreed. Try another (better) cable.

Any hints/suggestions for me? He already looked for customization
settings in the NVidia driver but hasn't found anything usable yet.


If the cable is to blame, purchase a good, shielded DVI cable of the
required length. You get fully digital image quality, and preservation of
all the DDC/EDID communication between the PC and monitor.


Maybe. Read my reply further up ...

If, however, the cable cannot be isolated to be the source of the problem,
try Powerstrip, which should let you set any custom resolution.

Another question: why did NVidia change its driver's settings screen?


Just change it back to 'classic' view.

Because it was getting too cluttered for the "silent majority of stupid
users". ATi started the process with its bloated Catalyst Control Center.
Now nVidia has to follow suit...


I hope not!


  #5  
Old December 18th 06, 08:07 PM posted to alt.comp.periphs.videocards.nvidia
Markus.Humm
external usenet poster
 
Posts: 3
Default problems with long VGA cable

First of One schrieb:
"Markus.Humm" wrote in message

[snip]

The cable is missing one pin, namely the second one from the left in
the middle row of the 15-pin connector when the wider side is down. If I
look that pin up in the German Wikipedia, it says it's the ground pin
for the blue color? How can that be? How does one even get a picture
without this pin?


You probably have it backwards. Remember the male and female connectors are
mirror images. Second pin from the left, in the middle row, is "no
connection". Many cables have a pin missing there.


Okay.


I suspect the longer cable just doesn't have the bandwidth, due to poor
construction. To verify this, set the color depth to 16-bit and drop the
refresh rate to 50 Hz. If 1440x900 suddenly becomes available, then the
cable is to blame and no amount of software tweaking will help you.


Maybe it can't keep up with the bandwidth, but the questions then are:

1. How does one set the display to 50 Hz? That setting is normally not
   available.

2. The desired screen resolution isn't available; it can't be set. Who
   determines what can be set and what can't? What caused the installed
   display type in Windows to change right after switching cables?
   Is there some measurement circuit in the display or graphics card
   which can dynamically determine the maximum tolerable bandwidth and
   thus restrict the available resolutions?

Any hints/suggestions for me? He already looked for customization
settings in the NVidia driver but hasn't found anything usable yet.


If the cable is to blame, purchase a good, shielded DVI cable of the
required length. You get fully digital image quality, and preservation of
all the DDC/EDID communication between the PC and monitor.


A DVI cable won't be of much use, since neither side has a DVI port.
This is simple consumer-type PC equipment.


If, however, the cable cannot be isolated to be the source of the problem,
try Powerstrip, which should let you set any custom resolution.


What is powerstrip? Some sort of driver?

Another question: why did NVidia change its driver's settings screen?


Because it was getting too cluttered for the "silent majority of stupid
users". ATi started the process with its bloated Catalyst Control Center.
Now nVidia has to follow suit...


Yes and no. There is still the possibility of using the old one, but it
says something about legal stuff...

Greetings

Markus
  #6  
Old December 19th 06, 06:38 AM posted to alt.comp.periphs.videocards.nvidia
First of One
external usenet poster
 
Posts: 312
Default problems with long VGA cable

"Markus.Humm" wrote in message
...
1. how to set the display to 50 Hz? You normally don't have this
setting available


50 Hz *should* be there if you uncheck "hide modes that my monitor can't
display".

Is there any measurement circuit in the display or graphics card
which can dynamically determine the maximum tolerable bandwidth and
thus restrict the available resolutions?


Maybe. :-) Keep in mind the communication between the video card and
monitor is two-way. It may well be that whatever is transmitted from the
monitor via the I2C pins is being attenuated by the long cable.
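For the curious: that two-way channel carries the monitor's 128-byte EDID block over I2C, and the native resolution sits in the first detailed timing descriptor at offset 54. A minimal decoding sketch (the EDID bytes below are synthetic, not from a real Xerox monitor):

```python
def native_resolution(edid: bytes):
    """Decode active width/height from the first detailed timing
    descriptor (offset 54, 18 bytes) of a 128-byte EDID block."""
    d = edid[54:72]
    width = d[2] | ((d[4] & 0xF0) << 4)    # low 8 bits + high 4 bits
    height = d[5] | ((d[7] & 0xF0) << 4)
    return width, height

# Build a fake descriptor advertising 1440x900 (all other fields zeroed):
desc = bytearray(18)
desc[2] = 1440 & 0xFF            # horizontal active, low 8 bits
desc[4] = (1440 >> 8) << 4       # horizontal active, high 4 bits
desc[5] = 900 & 0xFF             # vertical active, low 8 bits
desc[7] = (900 >> 8) << 4        # vertical active, high 4 bits
edid = bytes(54) + bytes(desc) + bytes(128 - 72)
print(native_resolution(edid))   # (1440, 900)
```

If this low-speed serial exchange gets corrupted by a marginal cable, Windows falls back to a generic monitor and a conservative mode list, which matches the symptom described here.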

A DVI cable won't be of much use, since neither side has a DVI port.
This is simple consumer-type PC equipment.


All the 19" widescreen LCDs on Xerox Europe's web site have a DVI port.
Most, if not all, GeForce 7600 cards have a DVI port. Yet you tell me
you can't use a DVI cable?

What is powerstrip? Some sort of driver?


The program Powerstrip, made by Entech, is a "universal" utility that works
with most video cards. It lets you customize resolutions, set up hotkeys,
overclock, etc.

Yes and no. There is still the possibility to use the old one but it says
something about legal stuff...


What legal stuff? And why should you care? I know overclocking was
considered immoral in Germany back when Tom Pabst first started his
hardware publication. :-) I thought there had been much progress in the
last decade.

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."




  #7  
Old December 19th 06, 08:07 PM posted to alt.comp.periphs.videocards.nvidia
Markus.Humm
external usenet poster
 
Posts: 3
Default problems with long VGA cable

First of One schrieb:
[snip]

A DVI cable won't be of much use, since neither side has a DVI port.
This is simple consumer-type PC equipment.


All the 19" widescreen LCDs at Xerox Europe's web site have a DVI port.
Most, if not all Geforce 7600 cards have a DVI port. Yet you tell me you
can't use a DVI cable?


That may well be, but this particular PC and Xerox 19" flat screen do
not have DVI ports. In the flat screen's box lay a short notice stating
explicitly that there is no DVI port, even though the manual says
otherwise! So that was that...

The NVidia card is in a Packard Bell Intel dual-core PC. Packard Bell is
NEC's consumer PC brand.


What is powerstrip? Some sort of driver?


The program Powerstrip, made by Entech, is a "universal" utility that works
with most video cards. It lets you customize resolutions, set up hotkeys,
overclock, etc.


Okay.


Yes and no. There is still the possibility to use the old one but it says
something about legal stuff...


What legal stuff?


I wasn't referring to overclocking or such things; rather, I seem to
remember that the NVidia driver said something like that when presenting
the dialog to choose which GUI to use.

Greetings

Markus
 





