HardwareBanter

HardwareBanter (http://www.hardwarebanter.com/index.php)
-   Homebuilt PC's (http://www.hardwarebanter.com/forumdisplay.php?f=36)
-   -   How GT 520 auto detect VGA,HDMI,DVI device ? Is there manual override ? (http://www.hardwarebanter.com/showthread.php?t=199560)

[email protected] May 6th 19 10:14 PM

How GT 520 auto detect VGA,HDMI,DVI device ? Is there manual override ?
 
How does the GT 520 auto-detect the connected device?

It has 3 connectors: VGA, HDMI and DVI.

What I want to do is:

1. Use VGA always for monitor.
2. Use HDMI only for audio connection to receiver.

The problem with this setup is that the GT 520 believes the receiver to be a monitor and switches to it automatically on boot, with different/weird results: either no screen, or a bad screen.

Only pulling out the HDMI cable and rebooting will restore the screen to VGA.

I have not yet tried the following solution:

1. Use DVI for monitor.
2. Use HDMI for receiver.

What would happen in this scenario?

I could not perform this experiment because I didn't think of it at the time, and the NVIDIA driver installation failed, complaining about some wizard already running after reboot, which was kind of weird.

I had installed an older driver over a newer (GT 1030) driver.

I have now replaced the GT 520 with the GT 1030 and will attempt to re-install the latest GT 1030 driver.

Hopefully this time it will work.

I would still like to know how the GT 520 does auto-detection, and whether it's somehow possible to force it to always use VGA or always use DVI... instead of having to resort to physical solutions like yanking/plugging HDMI cables and rebooting.

For now I assume this forcing/manual selection of the display device is not possible, but please enlighten me if I am wrong.

(The system uses a Socket 939 WinFast motherboard which unfortunately only supports one graphics card in normal mode... I would have liked to use both cards for experimentation and maybe even some CUDA in the future... perhaps it's the built-in SLI card-link thing that causes this, not sure.)

Bye,
Skybuck.

(Posted this on the NVIDIA forum too; perhaps this posting is clearer, shorter, and easier to find with Google ;))


Paul[_28_] May 7th 19 07:18 AM

How GT 520 auto detect VGA,HDMI,DVI device ? Is there manual override ?
 
[email protected] wrote:

[original question quoted in full; snipped]


Around page 15 of the manual:

GT 520 ---HDMI cable--- Denon receiver ---HDMI-out cable--- HDMI-to-VGA adapter ($30 dongle) --- VGA "monitor"

If the receiver has HDMI-out, you can pull a signal for the
computer monitor from that. The HDMI-out on the receiver
must be able to source sufficient power to run an HDMI-to-VGA-adapter
dongle.

The Denon 1909 appears to have strange ideas as to resolution
choices on HDMI-out, so there is no guarantee this will work.
But at least the plumbing is there for it. If your computer
monitor happens to be 1920x1080 VGA, this would be an ideal
accident. Other choices of native resolution on the computer
monitor, might not work so well, as the "scaler" in the Denon
is intended for Bluray/DVD players and the like. It wasn't
designed for computer-type HDMI signals.

In a "home theater" setup, the HDMI goes to the receiver
first, and then the video signal can go to a big screen TV.
I see no mention of 4K in the manual, so it would appear
the Denon 1909 is from the 1080p HD era.

I have the manual, from a previous question, sitting in my collection.

This is one of the reasons I keep an HDMI-to-VGA dongle here
as well as a DisplayPort-to-VGA dongle, because you never
know when some newer technology in the house will require it.

Paul

root[_6_] May 7th 19 04:30 PM

How GT 520 auto detect VGA,HDMI,DVI device ? Is there manual override ?
 
Paul wrote:

[reply quoted in full; snipped]


I come late to this thread, so I am unaware of the issues.
My arrangement is HDMI from the computer, to a Denon AVR-3313CI, to an LG 65" 4K OLED,
while also running the VGA output of the graphics card to
a second monitor.

I have experimented with a number of Nvidia graphics cards
including 210, 610, 710, 1030 from different manufacturers.
Most recently I have tried the PNY 710 and 1030 from Best Buy.

Regardless of who makes the graphics card, if only one output
is connected, the card will always direct output at boot
to that connector. If both VGA and HDMI connections are made
to the graphics card, different manufacturers' video cards
act differently, but most of those that I have tried will
display the boot sequence on the HDMI output only. It is
at boot time that the graphics card determines the characteristics
of the display.
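The boot-time detection described above is done by reading each connector's EDID block over the DDC/I2C lines; a connector counts as "connected" if a valid block comes back, which is why an AV receiver is indistinguishable from a monitor. A minimal sketch of that check in Python (illustrative only, not NVIDIA's actual firmware logic; the field offsets follow the EDID 1.3 base-block layout, and `build_test_edid` is a made-up helper for demonstration):

```python
# Sketch of the EDID handshake a card performs at boot: read a 128-byte
# block from each connector over DDC, validate it, and decode the
# display's preferred resolution from the first detailed timing descriptor.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_is_valid(block: bytes) -> bool:
    """A display counts as present if the fixed 8-byte header matches
    and all 128 bytes sum to zero modulo 256 (the EDID checksum rule)."""
    return (len(block) == 128
            and block[:8] == EDID_HEADER
            and sum(block) % 256 == 0)

def preferred_mode(block: bytes) -> tuple[int, int]:
    """Decode preferred horizontal/vertical resolution from the first
    detailed timing descriptor (bytes 54-71 of the base block)."""
    d = block[54:72]
    h = d[2] | ((d[4] & 0xF0) << 4)   # horizontal active pixels
    v = d[5] | ((d[7] & 0xF0) << 4)   # vertical active lines
    return h, v

def build_test_edid(h: int, v: int) -> bytes:
    """Build a minimal synthetic EDID block for demonstration."""
    block = bytearray(128)
    block[:8] = EDID_HEADER
    block[56] = h & 0xFF
    block[58] = (h >> 8) << 4
    block[59] = v & 0xFF
    block[61] = (v >> 8) << 4
    block[127] = (-sum(block)) % 256  # checksum byte makes the sum zero
    return bytes(block)

hdmi = build_test_edid(1920, 1080)   # e.g. a receiver reporting 1080p
print(edid_is_valid(hdmi), preferred_mode(hdmi))
```

A receiver answers this handshake exactly like a monitor would, advertising 1920x1080 as its preferred mode, so from the card's point of view there is nothing to tell them apart.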

I use Linux for the computer system, and if TwinView is enabled
in the xorg.conf file, both VGA and HDMI outputs will be displayed
when X is started. The resulting HDMI display will differ
according to whether the HDMI was connected at boot time.
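Since xorg.conf is already in play here, it may be worth noting that the proprietary NVIDIA driver also accepts per-device overrides that can pin output to one connector regardless of what is detected at boot. A sketch (the "ConnectedMonitor" and "UseDisplayDevice" option names come from the NVIDIA Linux driver README; the connector name "CRT-0" is an assumption and varies per card):

```
Section "Device"
    Identifier "GT520"
    Driver     "nvidia"
    # Pretend only the VGA head exists, even with HDMI plugged in
    # (connector names like CRT-0 / DFP-0 vary; check the X log).
    Option     "ConnectedMonitor" "CRT-0"
    # Send the desktop to that head regardless of auto-detection.
    Option     "UseDisplayDevice" "CRT-0"
EndSection
```

This only affects the X session, of course; the BIOS/boot screen still goes wherever the card's firmware decides.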

My Denon receiver has a setup option (I think it is HDMI Control)
that affects how the HDMI signal is processed. My Denon will
upscale an incoming signal to 4K if the display is 4K.

I apologize if this information is off topic.


