flat panel monitors

Mxsmanic

Bob said:
Bob, as much as I hate to disagree with you, I'm afraid
I'd have to vote "maybe" instead. For the most part, the
differences between an analog and a digital interface for
LCD monitors come down to questions of pixel timing,
which really have nothing at all to do with whether the
video information is in digital or analog form.

The best analog system will always beat the performance of the best
digital system. There's nothing about analog technology that makes it
intrinsically inferior to digital, so a good video card and a good
monitor should meet or beat any digital interface, I should think.

This is why the _best_ analog audio systems can consistently beat the
best digital systems. However, the superior performance comes at a
price that is usually all out of proportion with the increment of gain
over digital.
Oddly enough, the LCD is NOT inherently a "digital"
device as is often assumed - fundamentally, the control
of the pixel brightness in any LCD is an analog process.

Every interface between the digital world and the physical world is
analog, so all input and output devices are ultimately analog devices.
"Digital" only means something in the conceptual world of information
representation.
 
Mxsmanic

Bob said:
Well, that's the classic promise and peril of digital.
It's either as perfect as it ever gets, or it's not
there at all, whereas analog may never be perfect
enough, and opportunities for degradation abound.

Analog can also be more perfect than digital. In fact, it is always
possible to build an analog system that is superior to any given digital
system--if money is no object.
Umm, if the bits in the frame buffer are going thru a
DAC (which can introduce noise and distortion), then
thru a cable (which <ditto>), even if the LCD is not using
an ADC, and is using the analog signal directly, that
extra noise and distortion may show up on screen.

Sure, but the question is whether or not it actually does to any visible
extent in the real world.

I've found that, in many respects, PC video systems perform better than
they are supposed to. For all the noise one hears about the horrors of
analog systems, in real life they perform amazingly well. Look no
further than the continuing superiority of CRTs for most aspects of
image quality for proof.
I suspect it's irrelevant at this point. Analog is
the "economy" graphics connect now, and what we have
is sufficient for the market.

Economy perhaps, but that isn't always correlated with quality.
I think it more likely that the analog economy model
will be replaced by a digital economy model, where PC
main RAM is used for frame buffer, and the graphics
"card" (if any) is just a TMDS driver chip with a
DVI-D connector on the bulkhead, something like the
"ADD2" cards I see at <www.molex.com>.

I suspect the current "high-performance" digital models will become the
"digital economy" models, in time.
 
Mxsmanic

Bob said:
My guess is that because LCD subpixels are just barely
8-bit, a full correction might minimize color errors at
the expense of introducing visible terracing in gradients.

The incoming data might be 8-bit, but there's no reason why the internal
correction of the monitor can't be carried out with much higher
granularity.
 
Bob Niland

Mxsmanic said:
The best analog system will always beat the
performance of the best digital system.

Depending on how you define "best", as we saw with the
early debates about CD audio. Now that purists can get
48-bit, 96 kHz digital audio, I don't see that debate
anymore.
Every interface between the digital world and the
physical world is analog, ...

Not at the quantum level.
Expect the physicists to sail in here and dispute that :)

Is anyone prepared to argue that using an HD15 analog
connection to an LCD monitor provides a "better" presentation?

It's conceivable, due to the anti-aliasing provided by the
analog blur. I was actually a bit startled by how crisp
the screen was using the DVI-D connection. In my CAD work,
I now always see stair-casing of angled and curved lines,
whereas on the CRT monitor (same res), they were smooth.
 
Mxsmanic

Bob said:
They are way behind in 3D perf, and only just
announced their first PCI-Express card.

But are they ahead in 2D performance and image quality? I have a
Millennium II card in my oldest PC, which has always served very well.
 
Mxsmanic

Bob said:
I was actually a bit startled by how crisp
the screen was using the DVI-D connection. In my CAD work,
I now always see stair-casing of angled and curved lines,
whereas on the CRT monitor (same res), they were smooth.

I doubt that this is a result of switching to a digital connection.

Note also that aliasing is usually a sign of lower resolution, not
higher resolution.
 
Bob Niland

But are they ahead in 2D performance and image quality? I have a
Millennium II card in my oldest PC, which has always served very well.

It depends on your applications, operating system,
PC, and graphics slot (AGP, PCI, PCI-X or PCIe).
You need to hit some forums devoted to your key
apps and get advice.

The two most graphics-intensive things I do, Photoshop
and IMSI TurboCAD, seem to get no particular benefit
from the accelerations available on ATI and Nvidia cards,
and perform quite adequately on a Matrox Parhelia.

Photoshop is compute and bus-bound.

TC uses OGL, but only for modes where performance isn't
an issue anyway. In fully-rendered mode, it's doing that
entirely in host software, and is purely compute-bound.

If I ran games, the config might have been different.
 
Bob Niland

I doubt that this is a result of switching to a digital connection.

Re-running the comparison, I see that it was partly due
to going digital, but mostly due to switching to LCD.
The former CRT (same res) was providing some additional
de-crisping :)
Note also that aliasing is usually a sign of lower
resolution, not higher resolution.

In this case, I'm making no changes to the video setup
when I switch between CRT and LCD, or analog and digital
on the LCD.

Just playing around in analog mode on the LCD, I see
not only the pink halo on black-on-white objects, but
also some ghosting (or ringing). Likely a result of the
KVM switch and extra cable in that path.

And painting a test pattern with alternating single-pixel
white-black, the white is not pure (but, impressively,
the alignment of the data and display rasters is perfect);
no gray moire.
 
Mxsmanic

Bob said:
Re-running the comparison, I see that it was partly due
to going digital, but mostly due to switching to LCD.
The former CRT (same res) was providing some additional
de-crisping :)

Remember that, in theory, there's no fixed upper limit to horizontal
resolution on a CRT, although the mask or grille spacing imposes some
practical restrictions. So you could be seeing additional detail on the
CRT that the LCD cannot display, in some cases.
Just playing around in analog mode on the LCD, I see
not only the pink halo on black-on-white objects, but
also some ghosting (or ringing). Likely a result of the
KVM switch and extra cable in that path.

It has to be distortion of the signal. The panel is just going to
sample the signal, so if there's a pink halo on the screen, there's one
in the signal.

I'm happy to say that I see no such artifacts on my screen. I just have
a simple 2-metre cable betwixt PC and panel (the cable supplied with the
panel).
And painting a test pattern with alternating single-pixel
white-black, the white is not pure (but, impressively,
the alignment of the data and display rasters is perfect);
no gray moire.

Maybe you just need to remove the switch and cable.
 
Bob Myers

The monitor knows that the incoming data will be
pre-compensated to a gamma (log curve) in the 1.8 ... 2.6
range, or maybe be linear (no re-comp).

No, the monitor knows nothing about how the incoming
video data is biased; the video source (the host PC) MAY
apply a pre-compensation based on what it knows of the
monitor's response curve (based on the gamma value given
in EDID). But the "correction" the host applies to the
video data is not the issue here. (Whether or not any
correction SHOULD be applied is another matter, and one
that probably deserves some attention later on.) But all
the monitor really knows is that it's getting such-and-such
an input level.
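
(As an aside, and purely as a sketch: in an EDID 1.x block, the panel's advertised gamma lives in the byte at offset 0x17, stored as gamma x 100 minus 100. Something along these lines, in Python and with an invented EDID block rather than a real monitor's, is roughly what a host would do before deciding on any pre-compensation:

# Sketch only: pull the advertised gamma out of a raw 128-byte EDID 1.x block.
# Offset 0x17 stores (gamma * 100) - 100; 0xFF means "gamma defined elsewhere".
# The 'edid' bytes here are invented for illustration.

def edid_gamma(edid):
    raw = edid[0x17]
    if raw == 0xFF:
        return None                      # gamma not stored in this block
    return (raw + 100) / 100.0

edid = bytearray(128)
edid[0x17] = 120                         # a panel advertising gamma 2.2
print(edid_gamma(edid))                  # -> 2.2

Whether the host should then raise its output to the 1/gamma power is, as noted, a separate question.)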

The problem is that while the CRT provides, just by its
nature, a nice "gamma" curve (it's nice for a number of
reasons, not the least of which is that it's a very good match
to the inverse of the human eye's own response curve -
the bottom line result being that linear increases in the input
video level LOOK linear to the eye, even though the actual
output of light from the tube is varying in an objectively
non-linear fashion), the LCD does not do this. The LCD's
natural response curve, from a perceptual standpoint, is
ugly - an S-shaped curve which is sort of linear in the
middle and flattens out at both the black and white ends.

Why doesn't the look-up more fully adjust-out the
S-curve, so that color errors can be corrected
with the simple exponent adjustment of typical graphics
card gamma control menus?

My guess is that because LCD subpixels are just barely
8-bit, a full correction might minimize color errors at
the expense of introducing visible terracing in gradients.

Even if they're fully eight bits, that's not enough IF you
are also advertising to the outside world (i.e., to those
devices ahead of the LUT) that you're providing a true
eight-bit accuracy. You've already mapped some of those
values off what they're expected to be, which in effect
will compress the curve in some areas and cause, for
instance, two successive input values to result in the same
ONE output value. You need finer control of the pixel
gray level, relative to the specified accuracy of the input
data, to be able to both compensate the response curve
AND provide that specified accuracy at all levels.
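
(To put a toy number on that: take a made-up logistic S-curve as the panel's native response, build an 8-bit-in, 8-bit-out correction table aimed at gamma 2.2, and count how many distinct drive codes survive. The curve here is invented purely for illustration:

# Toy sketch: correcting a hypothetical S-shaped LCD response with an
# 8-bit look-up table, then counting how many distinct output codes remain.
import math

_LO = 1.0 / (1.0 + math.exp(5.0))
_HI = 1.0 / (1.0 + math.exp(-5.0))

def native(drive):
    """Invented S-curve response, rescaled so drive 0 -> light 0, drive 1 -> light 1."""
    s = 1.0 / (1.0 + math.exp(-10.0 * (drive - 0.5)))
    return (s - _LO) / (_HI - _LO)

lut = []
for code in range(256):
    target = (code / 255.0) ** 2.2        # the gamma-2.2 light level we want
    best = min(range(256), key=lambda d: abs(native(d / 255.0) - target))
    lut.append(best)

print("distinct drive codes used:", len(set(lut)), "out of 256")

The exact count depends on the curve you assume, but it comes out well short of 256 for any response that departs much from the target - several successive inputs collapse onto one drive value, which is the terracing.)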

Bob M.
 
Bob Myers

Mxsmanic said:
The incoming data might be 8-bit, but there's no reason why the internal
correction of the monitor can't be carried out with much higher
granularity.

The "granularity" of the look-up table data is not the
limiting factor; it's the number of bits you have at the
input to the panel, vs. the number of bits you claim to
have at the input to the overall system. If I map 8-bit
input data to, say, 10-bit outputs from the look up
table, I don't get as good a result as I want if the panel
itself has only 8 bits of accuracy. I need to at the very
least call in some additional tricks (which ARE available
- some frame-to-frame dithering can help, for example)
to be able to take advantage of the greater accuracy in
the middle of the chain.
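
(For example, a crude sketch of the frame-to-frame trick, with invented names and a hypothetical 8-bit-only panel: a 10-bit level gets approximated by alternating between the two adjacent 8-bit codes so that the average over a few frames lands on the finer value.

# Sketch of temporal (frame-to-frame) dithering: showing a 10-bit level on a
# hypothetical 8-bit panel by alternating adjacent codes across frames.

def dither_sequence(level10, frames=4):
    base, frac = divmod(level10, 4)      # 4 = 2^(10 - 8) sub-steps per 8-bit code
    # bump the code up by one in 'frac' of every 4 frames
    return [min(base + (1 if f < frac else 0), 255) for f in range(frames)]

seq = dither_sequence(513)               # a 10-bit level between 8-bit codes 128 and 129
print(seq)                               # -> [129, 128, 128, 128]
print(sum(seq) / len(seq) * 4)           # time-average ~513

Real implementations also vary the pattern from pixel to pixel so that any flicker stays below what the eye notices.)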

Bob M.
 
Bob Niland

Remember that, in theory, there's no fixed upper limit to
horizontal resolution on a CRT, although the mask or grille
spacing imposes some practical restrictions.

Not to mention circuit bandwidth, beam spot size,
beam focus and grille diffraction.
So you could be seeing additional detail on the
CRT that the LCD cannot display, in some cases.

My impression is less detail on the CRT. Each LCD triad
definitely represents one graphics card frame buffer pixel.
On the CRT, each fb pixel gets smeared into its neighbors
a bit, via one or more of the above mechanisms.
It has to be distortion of the signal. The panel is just
going to sample the signal, so if there's a pink halo on
the screen, there's one in the signal.

I've little doubt that the artifacts are due to the analog
connection outside the monitor. And they probably would
improve if I used a single shorter run of HD15 cable.
Maybe you just need to remove the switch and cable.

Normally, it's only used for temporary PC connections,
so it's not an on-going issue.
 
Bob Myers

But there are opportunities for the signal to get
visibly degraded if it goes to analog before it gets
to the LCD panel lattice. In the entirely unscientific
test I just ran, where I saw exactly what I expected to
see, the analog happened to be running through two 2m
lengths of HD15 cable and a KVM switch. The LCD image
went from pixel-perfect to slightly fuzzy, and perhaps
also reduced "contrast".

Oh, sure - but then, that's a bad thing to do to any connection.
Have you tried the corresponding experiment with a
digital interface running at its max. pixel rate? (Nope -
because passive switchboxes and the like simply don't
work with digital interfaces.) In an apples-to-apples
comparison, say a VGA vs. a DVI over the standard
2-3 meters of good quality cable in each case, the
differences you will see are due to sampling errors in the
analog case. Or in other words, the advantage of the digital
interface is that it brings its "sampling clock" along with
the data.
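
(A rough illustration of why that matters, using approximately the VESA figures for 1280x1024 at 60 Hz; the numbers are ballpark and only meant to show the scale:

# Ballpark sketch: the pixel clock an analog monitor must regenerate from
# HSYNC alone.  Timing figures approximate VESA 1280x1024 @ 60 Hz.
h_active, h_total = 1280, 1688
v_total, refresh  = 1066, 60

pixel_clock = h_total * v_total * refresh
print(f"pixel clock ~ {pixel_clock / 1e6:.0f} MHz")      # ~108 MHz

# DVI carries this clock with the data.  An analog monitor has to guess
# h_total and multiply HSYNC back up; guess it wrong by one pixel and the
# sampling point walks most of a pixel across each line:
drift = (1.0 / h_total) * h_active
print(f"sampling drift across the active line ~ {drift:.2f} pixels")

That drift is what shows up as the shimmer or soft vertical bands you sometimes see until the monitor's auto-adjust finds the right clock and phase.)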

Umm, if the bits in the frame buffer are going thru a
DAC (which can introduce noise and distortion), then
thru a cable (which <ditto>), even if the LCD is not using
an ADC, and is using the analog signal directly, that
extra noise and distortion may show up on screen.

Sure; the question is always going to be whether or not
that "noise and distortion" is below the level we care
about. Digital interfaces are not error-free, either; that
they are acceptable, when they are, is the result of the bit
error rate being below perceivable levels. Similarly, if the analog
interface delivers a stable image with the video data to
the desired level of amplitude accuracy (in most cases here,
to an 8 bit/sample level, or an accuracy of about +/- 1.5 mV
in "analog" terms), the difference between the two interfaces
will not be distinguishable. It is ALWAYS a matter of how
good is good enough, and neither type of connection is
ever truly "perfect."
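
(For the record, the arithmetic behind that figure, assuming the usual 0.7 V full-scale video swing:

# Where "+/- 1.5 mV" comes from: 8-bit accuracy over a 0.7 V video swing.
full_scale_v = 0.7
steps = 2 ** 8                            # 256 levels for 8 bits/sample
lsb = full_scale_v / steps                # one step ~ 2.7 mV
print(f"1 LSB ~ {lsb * 1e3:.2f} mV, half-LSB ~ +/- {lsb * 1e3 / 2:.2f} mV")

which lands in the same ballpark as the figure quoted above.)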

I sorta suspected that, but in the DVI-D model, the
signal remains digital until it hits the rows & columns, no?

Well, until it hits the column drivers, yes. On the other hand,
there HAVE been LCD panels made, notably by NEC,
which preserved the analog video signal in analog form clear
through to the pixel level.

Does the typical analog-only LCD have a DAC? Or does it
just sample the analog signal and route values to drivers?

It has an ADC right up front - it generally has to, especially
if it supports any sort of image scaling, which is definitely
something best done in the digital domain. Scaling does
not necessarily imply a full frame buffer; modern scalers
make do with a few lines' worth of buffering, unless
frame rate conversion is also required - in which case at
least a good deal of a frame's worth of data must be stored,
and in the best versions a full frame buffer or two of memory
is used.
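
(A toy version of that "few lines of buffering" idea, just to make it concrete: a vertical bilinear upscaler that never holds more than two source lines at a time. Real scalers use polyphase filters and handle both axes, but the buffering picture is the same.

# Toy sketch: vertical bilinear upscaling with only two line buffers,
# i.e. no full frame buffer.  Purely illustrative, not real scaler code.

def scale_lines(src_lines, src_h, dst_h):
    it = iter(src_lines)
    buf = [next(it)]
    buf.append(next(it, buf[0]))          # at most two source lines in memory
    top = 0                               # source index of buf[0]
    for y in range(dst_h):
        pos = y * (src_h - 1) / max(dst_h - 1, 1)
        upper = int(pos)
        frac = pos - upper
        while upper > top:                # slide the two-line window down
            buf[0], buf[1] = buf[1], next(it, buf[1])
            top += 1
        yield [a * (1 - frac) + b * frac for a, b in zip(buf[0], buf[1])]

src = [[0, 0], [100, 100], [200, 200]]    # a 3-line-tall, 2-pixel-wide "image"
print(list(scale_lines(src, 3, 5)))       # 5 output lines, smoothly interpolated

Frame-rate conversion is a different story, as noted, because then whole frames have to be stored.)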


Even if the clocks align, there's also the matter of
whether or not the analog signal has completely slewed
to the value needed. If the DAC-cable-ADC path has
bandwidth-limited (softened) the transitions, or
introduced color-to-color skews, that will show up.
I see it, or something like it, doing analog on my LCD.

Sure - but you can't really lay the blame for having a BAD
analog interface on analog connections in general. The
point is that a very good interface is still most definitely possible
in the analog domain, and is in fact achieved quite often. There
are also analog systems which take advantage of the rather
forgiving nature of analog to enable truly cheap and nasty
cables, connectors, etc., at the expense of performance.
Digital, as noted, either works or it doesn't - which is a big
part of the reason that digital interfaces are not as inexpensive
as the cheapest (and lowest quality!) of the analog types.
You simply HAVE to meet a certain minimum level of
performance with digital, or you don't get to play AT ALL.
I suspect it's irrelevant at this point. Analog is
the "economy" graphics connect now, and what we have
is sufficient for the market.

Possibly; we'll see how it plays out. While digital
interfaces are becoming a lot more popular, analog
connections still account for well over 80% of the
video actually being used in the desktop monitor
market, even though LCDs took over from CRTs
as the unit volume leader this past year. As you know,
a gargantuan installed base has certain advantages
(or problems, which is often a different word for the
same thing! :)).

Bob M.
 
Bob Myers

Mxsmanic said:
Analog can also be more perfect than digital. In fact, it is always
possible to build an analog system that is superior to any given digital
system--if money is no object.

Exactly. Both are simply means of encoding information
for transmission; when comparing "analog" to "digital," the
best that you can ever do is to compare one given
implementation of "analog" vs. a given implementation of
"digital." Neither "analog" nor "digital" is inherently
superior to the other, per se. Each has its own advantages
and disadvantages, and there is a lot of misunderstanding
as to just what those are in each of these.

Bob M.
 
Bob Myers

Mxsmanic said:
The best analog system will always beat the performance of the best
digital system.

Unfortunately, I'm going to have to disagree with that, as
well; as I noted in another response here, neither type of
interface, per se, is inherently superior to the other.
Both are ultimately limited by the Gospel According to
St. Shannon, which puts strict limits on how much data
you can get through a given channel REGARDLESS of
how that data is encoded. Now, a particular sort of
a digital interface may or may not be superior to a
particular sort of analog; it depends on the specific
characteristics of the interfaces in question, and just what
is important, in a given application, in determining
"superior."

This is why the _best_ analog audio systems can consistently beat the
best digital systems.

That's not the only reason for this; high-end audio also
incorporates huge dollops of what can only be seen as
"religious" beliefs, with no basis in reasoning or evidence,
re a given individual's views on what is "superior." (I
mean no disrespect to religion in saying this; I am simply
noting that there is a difference in kind between a belief
held solely on faith, and one arrived at through a careful
and objective consideration of evidence.) In the case of
audio, an awful lot of what has been claimed for the various
"digital" and "analog" systems is quite simply wrong.
(This isn't the place for that discussion - I'm sure it
continues, unfortunately quite healthy after all these years,
over in rec.audio.high-end, a group I left a long time ago
for just this reason. There's just no sense in discussing
something when very few are interested in anything
other than argument by vigorous assertion.)


Every interface between the digital world and the physical world is
analog, so all input and output devices are ultimately analog devices.

No. This is a common misconception regarding what is
meant by the term "analog." It does NOT necessarily mean
a system which is "continuous," "linear," etc., even though
in the most common forms of analog systems these are
often also true. "Analog" simply refers to a means of encoding
information in which one parameter is varied in a manner
ANALOGOUS TO (and hence the name) another - for
example, voltage varying in a manner analogous to the original
variations in brightness or sound level. The real world is
not "analog" - it is simply the real world. "Analog" points
to one means of describing real-world events, as does
"digital."
"Digital" only means something in the conceptual world of information
representation.

"Digital" is simply another means of representing information;
one in which the information is described as a series of
"digits" (numbers), and again, this is reflected in the name.
It is neither inherently less accurate nor more accurate than
"analog" per se - that comparison always depends on the
specifics of the two implementations in question.

If you want a truly painful and detailed treatment of this
question (well, it HAS been one of my hot buttons), I
spent a whole chapter on the subject in my book
"Display Interfaces: Fundamentals & Standards."


Bob M.
 
Bob Myers

Bob Niland said:
Is anyone prepared to argue that using an HD15 analog
connection to an LCD monitor provides a "better" presentation?

Sure - but first, you have to define "better." :)

Bob M.
 
Bob Myers

Mxsmanic said:
Note also that aliasing is usually a sign of lower resolution, not
higher resolution.

Well, no - "aliasing," if that's truly what a given observation
is all about, is always a sign of improper sampling, whether
it's in an analog situation or a digital one. See "Nyquist
sampling theorem" for further details.

The classic sort of "aliasing" in displays is the good ol'
moire pattern common to CRTs. What few people realize
is that such patterns were in the past seen as GOOD things
when a CRT maker was testing a new tube, as being able
to see the moire pattern was a visible indication that the
tube was able to focus sufficiently well!
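
(The arithmetic version of the same point, with arbitrary numbers: sample anything above half the sampling rate and it folds back down as a lower "alias" frequency, which is all a moire pattern is, spatially.

# Aliasing in a couple of lines: sampling at fs folds a frequency f back
# into the band 0..fs/2.  The numbers are arbitrary, purely for illustration.
import math

fs, f = 100.0, 70.0                       # sample rate, and a frequency above fs/2
alias = abs(f - round(f / fs) * fs)       # folded (alias) frequency -> 30.0
print(f"{f} sampled at {fs} is indistinguishable from {alias}")

samples       = [round(math.cos(2 * math.pi * f * n / fs), 3) for n in range(6)]
alias_samples = [round(math.cos(2 * math.pi * alias * n / fs), 3) for n in range(6)]
print(samples == alias_samples)           # True -- the sample values are identical

Once the samples are taken, there is no way to tell the two frequencies apart.)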

Bob M.
 
Bob Niland

Possibly; we'll see how it plays out. While digital
interfaces are becoming a lot more popular, analog
connections still account for well over 80% of the
video actually being used in the desktop monitor
market, even though LCDs took over from CRTs
as the unit volume leader this past year. As you know,
a gargantuan installed base has certain advantages
(or problems, which is often a different word for the
same thing! :)).

Does NAVI bring any benefits to the installed base of
CRTs? Does it matter if it does?

If it does bring benefits to LCD via analog connect,
does that matter? I suspect the users who care about
whatever NAVI promises, will tend to go digital.

And I have a suspicion that the temptation on entry-
level PCs in the near future will be an analog-free
connection. A dumb UMA frame buffer, exposed thru a
TMDS chip thru a DVI-D (only) port on the back panel,
thru a DVI-D (only) cable, to a DVI-D (only) monitor.
Omits a couple of buffers, a DAC, an ADC (maybe) and
some copper. Maybe only runs at native res. Does DVI
allow captive cable at display?

The entire concept of "high end CRT" is already dead,
and increasingly what remains of new CRTs in the market
will tend toward junk (or be seen as so). The momentum
to flat panel (LCD or not) may cause the entire analog
graphics connection to go the way of the impact printer
before NAVI can get a foothold.
 
Bob Myers

Mxsmanic said:
Remember that, in theory, there's no fixed upper limit to horizontal
resolution on a CRT,

No, there is ALWAYS an upper limit to the resolution of
a CRT - for the simple reason that, even in theory, an
infinite bandwidth channel is not possible. Any limitation
on bandwidth in the video signal path represents a
resolution limit. And with respect to the CRT specifically,
other resolution limits come in due to the lower limits on the
physical spot size and the ability of the tube to switch the
beam on and off (i.e., you can't make a CRT without
capacitance in the gun structure, so you can never get an
infinitely short rise/fall time unless you can come up with a
video amp that's a perfect voltage source, capable of
delivering infinite current when needed).
although the mask or grille spacing imposes some
practical restrictions. So you could be seeing additional detail on the
CRT that the LCD cannot display, in some cases.

And the mask or grille, along with the phosphor dot structure,
places some very similar limits on the resolution available
from the CRT as does the physical "pixel" structure of the
LCD or other FPD type. (Whether or not the limits are the
SAME for a given pixel pitch is really more a question of
such things as whether or not the LCD in question permits
sub-pixel addressing, which few so far do.)
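
(To give a feel for the bandwidth side of this, with deliberately rough numbers: take something like 1600x1200 at 85 Hz and assume typical blanking overheads. Alternating single-pixel detail puts the video fundamental at half the pixel clock, which is already past what many monitor amplifiers will pass cleanly.

# Ballpark only: how amplifier bandwidth caps usable CRT resolution.
# Blanking overheads (~30% horizontal, ~5% vertical) are rough assumptions.
h_active, v_active, refresh = 1600, 1200, 85
h_total = int(h_active * 1.30)
v_total = int(v_active * 1.05)

pixel_clock = h_total * v_total * refresh
fundamental = pixel_clock / 2             # 1-pixel-on / 1-pixel-off pattern
print(f"pixel clock ~ {pixel_clock / 1e6:.0f} MHz, "
      f"alternating-pixel fundamental ~ {fundamental / 1e6:.0f} MHz")

Any roll-off up at those frequencies softens exactly the finest detail being discussed.)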


Bob M.
 
