Unstable Display Settings.

wei

I have three desktops connected to a DVI KVM. One and only one of
them consistently changes its screen settings to something unusable
whenever I use the KVM to toggle to a different desktop and back again.
I have to reset the settings manually, or reboot that machine, to set
things right again.

I have been suspecting KVM failure, and I still do, really. But I just
wonder... could it be the fault of that machine's video card? Or the
machine itself? Just wondering.

Wei
 
Paul


Video cards do two things.

1) Sense monitor presence via impedance. If the electrical load
at the end of the line "glitches", then the video card will
assume the monitor is unplugged, and disable the video output.
When it sees the load impedance present again, it can enable
the video output (especially if that was the only monitor on the
system).

2) The OS and video card query the monitor via the serial DDC bus
on the connector. That allows reading the monitor's EDID EEPROM
to see which resolutions are supported. Historically, the
video driver is not allowed to drive higher than a certain
resolution unless it first obtains information confirming that the
monitor actually supports that resolution. More than 20 years
ago, you could damage a monitor by using too high a resolution
or refresh rate.
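For the curious, the EDID block Paul mentions is just 128 bytes, and the "standard timings" the driver reads out of it are two bytes per mode (bytes 38-53 of the block). A minimal decoding sketch in Python, using the documented EDID 1.3 encoding (the sample byte pairs are standard encodings, not read from any particular monitor):

```python
# Decode one EDID "standard timing" descriptor (two bytes per mode,
# stored at bytes 38..53 of a 128-byte EDID 1.3 block).

# Aspect-ratio codes in bits 7-6 of the second byte (EDID 1.3).
ASPECT = {0: (16, 10), 1: (4, 3), 2: (5, 4), 3: (16, 9)}

def decode_standard_timing(b0, b1):
    """Return (width, height, refresh_hz), or None for an unused slot."""
    if b0 == 0x01 and b1 == 0x01:       # 0x0101 marks an unused slot
        return None
    width = (b0 + 31) * 8               # byte 0 encodes (width/8) - 31
    num, den = ASPECT[(b1 >> 6) & 0x03]
    height = width * den // num         # height follows from the aspect ratio
    refresh = (b1 & 0x3F) + 60          # low 6 bits encode refresh - 60
    return (width, height, refresh)

# 0xD1 0xC0 is the well-known pair for 1920x1080 @ 60 Hz.
print(decode_standard_timing(0xD1, 0xC0))  # (1920, 1080, 60)
print(decode_standard_timing(0x81, 0x80))  # (1280, 1024, 60)
```

If a KVM switch garbles or hides this data mid-query, the driver has nothing valid to decode and falls back to a safe default mode, which is one way "unusable" settings can appear.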

While a KVM could be designed to be completely seamless by
faking all the necessary responses, it's simply cheaper for the
makers not to do that, and instead to let the thing glitch and
rely on the automated responses to return things to normal.

Paul
 
wei


Except that it is not automated. I have to manually use Control
Panel > Display > Settings to get the right settings back. It gets tiresome.
Oh well.
Xiexie (thanks),
Wei
 
