Thread: DVI ?
Old April 9th 17, 02:49 AM posted to alt.comp.hardware
Paul[_28_]

philo wrote:
On 04/08/2017 07:32 PM, Paul wrote:
philo wrote:
I mentioned about a week ago I built a new machine using an Asrock
J3455M mobo.

Got it all working using Win7 and a standard VGA output.


I then decided to use the DVI-D output rather than VGA and got no
signal at all.

I am using a DVI-D cable and confirmed it works because I can put in a
(junkbox) PCIe video card and it will drive the monitor.

My monitor is an old Sceptre X9S-NAGA which specifies simply DVI or
VGA input.

Is the monitor too old/ non-compliant in some way or is the port on
the mobo possibly bad?


Well, you know what the theory is: the DVI port should remain
backward compatible.

Even if the Sceptre didn't have an EDID, both ends should be
willing to do 640x480 or 800x600. It should light up, even if
it doesn't run at native resolution.
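If you want to check whether the monitor is even presenting an EDID, you can dump the blob (on Linux it shows up under /sys/class/drm/*/edid) and sanity-check it. Here's a minimal sketch; the function name is my own, but the fixed 8-byte header and the mod-256 checksum are what the EDID spec defines for the first 128-byte block.

```python
# Sketch: sanity-check a dumped EDID blob. The helper name is
# hypothetical; the header bytes and checksum rule come from the
# EDID 1.x structure (first 128-byte block).

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_looks_valid(blob: bytes) -> bool:
    """Return True if the first 128-byte EDID block starts with the
    fixed header and all 128 bytes sum to 0 modulo 256."""
    if len(blob) < 128:
        return False
    block = blob[:128]
    return block.startswith(EDID_HEADER) and sum(block) % 256 == 0
```

If this comes back False (or the file is empty), the DDC lines aren't delivering anything usable, which would fit an old monitor whose EDID the newer port refuses to talk to.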

Paul



It must just be too old. I tried another monitor and it works fine... I
just wanted to make sure the mobo is not defective.

I will probably run a PCIe video card anyway.


That Sceptre was really a great monitor in its day, but it's got to be
ten or fifteen years old by now.


Here's a sample thread of people having the same trouble.

https://communities.intel.com/thread/101571

There are a variety of theories as to why it isn't working.

And not much of a response from Intel. Sort of the quality
of answer you'd get in the Microsoft Answers forum from
some Microsoft contract answerer. The users have more theories
than Intel does.

Paul