KVM switch for DVI

kony

Benjamin Gawert said:
Your very basic thinking of degradation because of "flipping bits"
doesn't really fit to DVI because the TMDS signalling used in PanelLink
communications (the technology that is used in DVI) is more than the
plain transmission of a few bits. PanelLink uses quite complex data
words with ECC schemes which makes it even more robust.


Then to a certain extent I was wrong, but it still does NOT prevent
bit-flipping... even words are still subject to signal levels. It
would only cause a resend, but even then, it will require the bandwidth
to do it. If it cannot display the image in real time because of
resends, it is still a degradation.
 

Benjamin Gawert

* kony:
Then to a certain extent I was wrong, but it still does NOT prevent
bit-flipping... even words are still subject to signal levels. It
would only cause a resend, but even then, it will require the bandwidth
to do it. If it cannot display the image in real time because of
resends, it is still a degradation.

Again, you are thinking too simplistically. There simply is no "bit flipping" in
cables. There only is a certain amount of noise and signal deformation
because of the cable properties (the R-L-C combination). Even if there were
"bit flipping", the whole data word wouldn't have to be resent. There
is no lag, there is no image quality degradation, there is nothing
visible, period. If you increase DVI cable length the image remains
totally perfect until the TMDS controller is not able to extract data
any more and the image gets extremely distorted (massive artifacts) or
there isn't any image at all. With analog you can see the image getting
worse and worse as the cable length increases.

Benjamin
 

Bob Myers

Benjamin Gawert said:
PanelLink uses quite complex data words with ECC schemes which makes it
even more robust.

That would come as a surprise to the people (including myself)
who worked on the DVI spec. DVI does NOT include a parity
bit, checksum, or other error-detection/correction functionality
within the data stream, nor is there any provision for error handling
on the link. In short, should the signal degrade to the point where
errors are being introduced into the data (but the receivers can still
maintain synchronization and data decoding), there is no way for
the receiver to detect this.

However, it should be noted that display applications are generally
pretty forgiving when it comes to the occasional error, whether that
comes in via analog OR digital encoding of the video.

"Bit flipping" as you call it doesn't happen over cables. What happens,
though is that there are various types of influences (crosstalk, wave
effects, reflections, irradiation etc) that are effective in cables (and
any other type of signal transmission). One of the advantages of digital
transmission over analog transmission however is that digital
transmission is way more robust to these influences than analog
transmissions.

Yes and no. As I've said here before, there is really NO difference
between "analog" and "digital" when it comes to robustness in the presence
of noise IF THESE ARE COMPARED AT COMPARABLE DATA
RATES. There's simply no way to get around the Gospel According To
St. Shannon, which sets an absolute and unavoidable limit on the amount
of useful information you can get through any physical channel in the
presence
of a given amount of noise. What most people mean by a "digital"
transmission - wherein a given physical line carries a serial stream of
amplitude-encoded binary - is simply an example of trading off data
rate for noise margin, and that's all it is.
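
As a rough illustration of that limit (the numbers below are made up
for the example and are not DVI parameters), the Shannon-Hartley
capacity C = B * log2(1 + S/N) can be computed directly:

    import math

    def shannon_capacity_bps(bandwidth_hz, snr_db):
        # C = B * log2(1 + S/N); the signal-to-noise ratio is converted
        # from dB to a linear ratio before applying the formula.
        return bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10))

    # Illustrative only: a 1 GHz channel at 30 dB SNR tops out near
    # 10 Gbit/s no matter how the bits are encoded, "analog" or "digital".
    print(shannon_capacity_bps(1e9, 30) / 1e9)  # ~9.97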

Benjamin Gawert said:
Since unlike with analog
transmission the data integrity remains constant over a certain range of
noise levels, the image quality also remains constant, no matter if the
cable is, say, 1 m or 5 m, or if there is a KVM in the line or not.

However, additional "noise" (meaning anything in the signal which
is not "signal," including all forms of noise, distortion, etc.) is
unavoidable for longer cable runs. You WILL get a higher error
rate with longer cables, all else being equal, and that's unavoidable.
Benjamin Gawert said:
Your very basic thinking of degradation because of "flipping bits"
doesn't really fit to DVI because the TMDS signalling used in PanelLink
communications (the technology that is used in DVI) is more than the
plain transmission of a few bits. PanelLink uses quite complex data
words with ECC schemes which makes it even more robust.

Nope. PanelLink(TM) uses a proprietary 8-to-10-bit encoding
scheme whose primary functions are to DC balance the line and
minimize the number of transitions (and therefore signal-induced
noise) in the resulting data stream. It most definitely does NOT
add anything in the way of robustness in terms of the error rate.
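
For the curious, here is a rough Python sketch of that encode stage
(simplified from memory; consult the DVI 1.0 spec for the normative
description). It shows where all ten bits go - transition minimization
and DC balance, with nothing left over for parity or ECC:

    def tmds_encode(byte, cnt):
        # Encode one 8-bit value into a 10-bit TMDS symbol; cnt is the
        # running disparity carried from symbol to symbol. Returns the
        # ten bits (LSB first) and the updated disparity.
        d = [(byte >> i) & 1 for i in range(8)]
        # Stage 1: transition minimization via an XOR or XNOR chain,
        # chosen to reduce the number of 0->1/1->0 transitions.
        if sum(d) > 4 or (sum(d) == 4 and d[0] == 0):
            q = [d[0]]
            for i in range(1, 8):
                q.append(1 - (q[i - 1] ^ d[i]))   # XNOR chain
            q.append(0)                           # bit 8 flags XNOR
        else:
            q = [d[0]]
            for i in range(1, 8):
                q.append(q[i - 1] ^ d[i])         # XOR chain
            q.append(1)                           # bit 8 flags XOR
        # Stage 2: DC balance - conditionally invert the eight data bits
        # to keep the running disparity near zero; bit 9 records this.
        ones = sum(q[:8])
        zeros = 8 - ones
        if cnt == 0 or ones == zeros:
            q.append(1 - q[8])
            if q[8] == 0:
                q[:8] = [1 - b for b in q[:8]]
                cnt += zeros - ones
            else:
                cnt += ones - zeros
        elif (cnt > 0 and ones > zeros) or (cnt < 0 and zeros > ones):
            q.append(1)
            q[:8] = [1 - b for b in q[:8]]
            cnt += 2 * q[8] + (zeros - ones)
        else:
            q.append(0)
            cnt += -2 * (1 - q[8]) + (ones - zeros)
        return q, cnt

    # Ten bits out for every eight in - the two extra bits are consumed
    # by the encoding itself, which is why there is no room for ECC.
    symbol, cnt = tmds_encode(0x5A, 0)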

Bob M.
 

Bob Myers

Benjamin Gawert said:
Again, you are thinking too simplistically. There simply is no "bit flipping" in
cables. There only is a certain amount of noise and signal deformation
because of the cable properties (the R-L-C combination). Even if there were
"bit flipping", the whole data word wouldn't have to be resent. There
is no lag, there is no image quality degradation, there is nothing
visible, period. If you increase DVI cable length the image remains
totally perfect until the TMDS controller is not able to extract data
any more and the image gets extremely distorted (massive artifacts) or
there isn't any image at all. With analog you can see the image getting
worse and worse as the cable length increases.

Which is really just another way of saying that "analog" encoding
of information (here's where I generally throw in the observation
that there's really no such thing as an analog or digital SIGNAL -
it's always just voltage variations or whatever on a wire, and what
matters is how we're supposed to interpret those) degrades
"more gracefully" in the presence of increasing noise levels than
does most "digital" encoding. This happens because with
"analog" encoding, the most important parts of the information
(the MSBs) are also those which are most resistant to noise, and
the LSBs are lost first. In the typical "digital" scheme (although not
ALL "digital" representations suffer from this), all bits are equally
vulnerable to noise, and the image continues to look pretty darn
good (although NOT perfect - the data stream does NOT have to
be error-free to look good) until you hit a brick wall in terms of
noise level, at which point everything goes to hell at once. An
analog-video system will often deliver a very usable image well
beyond the point at which noise has completely knocked out a
simple binary-encoded digital interface.
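
A toy model (not a simulation of any real interface) makes the contrast
easy to see: perturb an 8-bit sample as an analog level, versus flipping
its bits at random.

    import random

    def analog_hit(value, sigma):
        # Noise moves the level itself: small noise only disturbs the
        # LSBs, so the error grows gracefully with the noise power.
        noisy = value + random.gauss(0.0, sigma)
        return abs(max(0, min(255, round(noisy))) - value)

    def digital_hit(value, bit_error_rate):
        # Every bit is equally exposed: a rare hit on the MSB costs 128
        # levels at once - the "brick wall" behaviour described above.
        out = value
        for bit in range(8):
            if random.random() < bit_error_rate:
                out ^= 1 << bit
        return abs(out - value)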

Of course, we also need to note that noise-induced errors in the
video signal are rarely a problem for either system, INCLUDING
analog; what is a much greater problem with current "analog video"
interfaces, when used with LCDs and similar fixed-pixel displays,
is not signal integrity but rather the ability of the receiving display
to generate a correct sampling clock. There's no pixel-level
synchronization information conveyed in analog video systems, so
synchronization/sampling (which is not really a "digital vs. analog"
issue) is a much bigger problem. Take the pixel clock away from any
"digital" interface, and see how well IT does... :)

Bob M.
 

Benjamin Gawert

* Bob Myers:
That would come as a surprise to the people (including myself)
who worked on the DVI spec. DVI does NOT include a parity
bit, checksum, or other error-detection/correction functionality
within the data stream, nor is there any provision for error handling
on the link.

Then these people probably never heard of things like TERC4 (TMDS Error
Correction Code, 4 bit) and BCH ECC, right? ;-)

Per the specification, error correction is not mandatory for DVI, but
it's still supported by most TMDS transmitters on common gfx cards and
in most monitors.

Benjamin
 

Bob Myers

Benjamin Gawert said:

Then these people probably never heard of things like TERC4 (TMDS Error
Correction Code, 4 bit) and BCH ECC, right? ;-)

Heard of it, but methinks you're confusing your interface standards,
and not realizing which parts of the various standards this sort
of thing DOES apply to.

TERC4 coding is recognized under the HDMI specification, which,
like DVI, is TMDS-based. However, under the good ol' There
Ain't No Such Thing As A Free Lunch rule, TERC4
requires 12 bits to be transmitted per pixel clock, not 10 - so
it's not usable under a purely DVI-specification-compliant
implementation, which permits only the 10-bit/clock encoding
to be used. And the original claim had to do with error
detection/correction under the DVI spec, NOT TMDS in general.
HDMI is sort-of DVI compatible, but they ARE two different
specifications. It's not surprising that a good number of
TMDS components provide TERC4 support these days, since
these very same components are often going to be used to
support HDMI - but strictly speaking, there is no provision
for this sort of thing under the DVI specification. Even under HDMI,
TERC4 (due to this 12-bit requirement) is used only during
the "Data Island" period, which carries the audio samples and
auxiliary data such as CEA-861 InfoFrames. It does not apply
to the active video data sent during the normal video data period,
which was also implicitly part of your original assertion. There's
simply no capacity for this sort of thing (including BCH or some
other form of ECC) there - as noted earlier, the 8-to-10 bit
encoding that IS used is already busy with the DC balancing
and transition-minimization chores.
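
To make the no-free-lunch point concrete with a generic textbook
example (a Hamming(7,4) code, which is NOT part of DVI or HDMI):
correcting even single-bit errors costs extra transmitted bits, and the
DVI 10-bit budget has none to spare.

    def hamming74_encode(nibble):
        # Four data bits cost seven transmitted bits once single-error
        # correction is added - overhead someone always has to pay for.
        d = [(nibble >> i) & 1 for i in range(4)]
        p1 = d[0] ^ d[1] ^ d[3]
        p2 = d[0] ^ d[2] ^ d[3]
        p4 = d[1] ^ d[2] ^ d[3]
        return [p1, p2, d[0], p4, d[1], d[2], d[3]]

    def hamming74_correct(bits):
        # The three parity checks form a syndrome that points directly
        # at a single flipped position (0 means no error detected).
        b = list(bits)
        s = ((b[0] ^ b[2] ^ b[4] ^ b[6])
             + 2 * (b[1] ^ b[2] ^ b[5] ^ b[6])
             + 4 * (b[3] ^ b[4] ^ b[5] ^ b[6]))
        if s:
            b[s - 1] ^= 1
        return b

    codeword = hamming74_encode(0b1011)
    codeword[5] ^= 1                      # flip one bit "in transit"
    assert hamming74_correct(codeword) == hamming74_encode(0b1011)
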
Benjamin Gawert said:
Per the specification, error correction is not mandatory for DVI, but
it's still supported by most TMDS transmitters on common gfx cards and
in most monitors.

Let us all pause for a moment to reflect on the fact that
"supported in a particular component" does NOT always
translate to "used in a particular implementation or for all
data."

Bob M.
 

kony

Benjamin Gawert said:

Nope. With analog transmission image quality is directly proportional to
the various influences over the transmission line. With digital, it's not.

I never wrote "image quality"; you are assuming more than was
written. Degradation can be other things, for example frame
updates... if the data is resent it can't then be *current*
unless there was ample (excess) bandwidth. So unlike analog,
where every pixel might be a little blurred towards adjacent
values, even a single pixel might remain wrong, left over from
the prior frames.


Benjamin Gawert said:
Right. So what? Of course you can't use as many extension cables as you
want with DVI. But that doesn't change the fact that digital
transmission, unlike analog transmission, has no image quality degradation
due to longer cables or the fact that there is a KVM switch in the line.

Depends on how you define degradation. "IF" the pixels get
there in time, yes they will be accurate.


Benjamin Gawert said:
Right, the _signal_quality_ degrades. But that doesn't mean the
_image_quality_ degrades, too.

Yes it does, if you don't take only that very narrow view of
what image quality is. Seeing timely information on screen
is in itself an aspect of image quality.
 

kony

Benjamin Gawert said:

Again, you are thinking too simplistically. There simply is no "bit flipping" in
cables. There only is a certain amount of noise and signal deformation
because of the cable properties (the R-L-C combination). Even if there were
"bit flipping", the whole data word wouldn't have to be resent. There
is no lag, there is no image quality degradation, there is nothing
visible, period. If you increase DVI cable length the image remains
totally perfect until the TMDS controller is not able to extract data
any more and the image gets extremely distorted (massive artifacts)


LOL, so now you're saying there IS degradation even of the
type you disputed earlier.


or
there isn't any image at all. With analog you can see the image getting
worse and worse as the cable length increases.

... and I never claimed otherwise, except that the same thing
happens with digital in a different way: the problematic
length of cable is just reached more abruptly instead of
gradually.
 

toronado455

Question: You guys are talking about signal degradation being caused
(or not caused) by the DVI cables, but I would think that the
electronics in the DVI KVM switch would be the more likely cause of any
signal or image quality degradation. Am I wrong?
 

kony

toronado455 said:
Question: You guys are talking about signal degradation being caused
(or not caused) by the DVI cables, but I would think that the
electronics in the DVI KVM switch would be the more likely cause of any
signal or image quality degradation. Am I wrong?

Either could be problematic. I doubt the KVM has any kind
of repeater functionality rather than just digital switching,
so it could be seen as the equivalent of additional cable
length, or worse if it has a poor design that subjects the
signal to further degradation... it all adds up. That's not to
imply KVMs are usually a problem, but rather that having one
doesn't mean the user can then string cables across their
house with it.
 

chrisv

kony said:
LOL, so now you're saying there IS degradation even of the
type you disputed earlier.

LOL, what he said was not a contradiction. You lost credibility, son. LOL
 

kony

chrisv said:
LOL, what he said was not a contradiction. You lost credibility, son. LOL

"image gets extremely distorted" is clearly the opposite of
"thus there is no image degradation when using a KVM switch
or extension cables"

Both direct quotes, the latter started the disagreement.

Thanks for trolling by though.
 

Bob Myers

toronado455 said:
Question: You guys are talking about signal degradation being caused
(or not caused) by the DVI cables, but I would think that the
electronics in the DVI KVM switch would be the more likely cause of any
signal or image quality degradation. Am I wrong?

Could go either way - it would depend on just what
"electronics" would be in a given KVM switch. If it
were to have back-to-back TMDS receiver and transmitter
circuits, the chances are it would actually result in an
improvement vs. just trying to run a single cable the
whole length. It would be relatively easy to switch
received-and-decoded digital video essentially
"losslessly," then re-encode and transmit it as a now
cleaned, rested, and refreshed TMDS set of signals.
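
A sketch of why that works (a hypothetical model, not any real
product's design): the receive half of such a switch recovers clean
bytes from the incoming symbols, so the re-encoded output waveform
carries none of the upstream cable's degradation. The decode below
simply inverts the DVI encode stage sketched earlier in the thread.

    def tmds_decode(symbol_bits):
        # Undo the DC-balance inversion (flagged by bit 9), then undo
        # the XOR/XNOR chain (chosen by bit 8) to recover the data byte.
        b = list(symbol_bits)
        if b[9]:
            b[:8] = [1 - x for x in b[:8]]
        d = [b[0]]
        for i in range(1, 8):
            x = b[i] ^ b[i - 1]
            d.append(x if b[8] else 1 - x)
        return sum(bit << i for i, bit in enumerate(d))

    # In a re-clocking KVM, each selected input symbol would be decoded
    # to a byte and immediately re-encoded on the output leg; the bytes,
    # and therefore the image, are bit-identical as long as the upstream
    # decode succeeds.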

Bob M.
 

toronado455

From what I can tell there doesn't seem to be much difference between
the ATEN CS-1764 that Benjamin has and the IOGear GCS-1764. Even the
model numbers are basically the same - clones. Anyone have a preference
between these two?
 

chrisv

kony said:
"image gets extremely distorted" is clearly the opposite of
"thus there is no image degradation when using a KVM switch
or extension cables"

Both direct quotes, the latter started the disagreement.

Suffering from reading comprehension problems? He clearly said
(paraphrasing) "no image degradation UP TO A POINT". Read again
<[email protected]> and stop embarrassing yourself.
 
