flat panel monitors

Bob Niland

No, the monitor knows nothing about how the incoming
video data is biased; the video source (the host PC) MAY
apply a pre-compensation based on what it knows of the
monitor's response curve (based on the gamma value given
in EDID).
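
For reference, the gamma value carried in EDID is a single byte in the
base block, encoded as (gamma * 100) - 100. A minimal decode sketch,
assuming edid holds the raw 128-byte base EDID block (the helper name
is made up for illustration):

    # Sketch: read the factory gamma from a raw 128-byte base EDID block.
    # Byte 23 stores (gamma * 100) - 100; 0xFF means "defined elsewhere".
    def edid_gamma(edid):
        raw = edid[23]
        if raw == 0xFF:
            return None             # gamma not stored in the base block
        return (raw + 100) / 100.0  # e.g. raw 120 -> gamma 2.2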

I was using "know" in the metaphorical sense. The
monitor maker knows that the signal is apt to be
either linear, or pre-comped in the 1.8 - 2.6 gamma
range ...

... and that if the user has any tool for dealing with
a mismatch of expectations, it's apt to be just a simple
exponent control, and maybe ganged (can't separately
adjust R, G and B).
(Whether or not any correction SHOULD be applied is
another matter, and one that probably deserves some
attention later on.)

Is a gamma standard a topic of any of the follow-on
standards to DVI? Packet? Send-changed-data-only?
Even if they're fully eight bits, that's not enough IF you
are also advertising to the outside world (i.e., to those
devices ahead of the LUT) that you're providing a true
eight-bit accuracy. You've already mapped some of those
values off what they're expected to be, which in effect
will compress the curve in some areas and cause, for
instance, two successive input values to result in the same
ONE output value. You need finer control of the pixel
gray level, relative to the specified accuracy of the input
data, to be able to both compensate the response curve
AND provide that specified accuracy at all levels.
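
One way to see that effect concretely: push all 256 input codes through
a correction curve and count how many distinct output codes survive,
once at 8-bit and once at 10-bit output precision. A rough sketch; the
1/2.2 exponent is only an illustrative correction curve, not any
particular panel's LUT:

    # Sketch: how many distinct output codes survive a response-curve remap?
    def distinct_codes(exponent, in_bits=8, out_bits=8):
        in_max = (1 << in_bits) - 1
        out_max = (1 << out_bits) - 1
        outputs = {round(((i / in_max) ** exponent) * out_max)
                   for i in range(in_max + 1)}
        return len(outputs)

    print(distinct_codes(1 / 2.2, 8, 8))   # fewer than 256: adjacent inputs collide
    print(distinct_codes(1 / 2.2, 8, 10))  # 256: extra output bits keep every input distinct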

No problem, just do error-diffused dithering in the
monitor's full-frame buffer :)
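
For what that (half-serious) suggestion would amount to, here is a
minimal one-dimensional sketch - real error diffusion (Floyd-Steinberg
and friends) spreads the error in two dimensions, while this just
pushes it along a scanline, turning 10-bit values into 8-bit drive
levels whose average tracks the finer value:

    # Sketch: quantize 10-bit pixel values to 8-bit drive levels, carrying
    # the rounding error forward so the average follows the 10-bit intent.
    def diffuse_line(line_10bit):
        out, err = [], 0.0
        for v in line_10bit:
            target = v + err                                  # value plus carried error (10-bit units)
            q = min(255, max(0, round(target * 255 / 1023)))  # nearest 8-bit drive level
            err = target - q * 1023 / 255                     # residual pushed to the next pixel
            out.append(q)
        return out

    print(diffuse_line([512] * 8))   # mostly 128s and 127s; their mean matches 512 rescaled to 8 bits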

Now this could be done in the host, but then we'd need
some new VESA standard for reading back the tables of
stuck values.
 
Mxsmanic

Bob said:
Possibly; we'll see how it plays out. While digital
interfaces are becoming a lot more popular, analog
connections still account for well over 80% of the
video actually being used in the desktop monitor
market, even though LCDs took over from CRTs
as the unit volume leader this past year.

If my memory serves me correctly, the earliest monitor connection
interfaces for PCs (CGA and EGA, for example) were _digital_ connection
interfaces. VGA went "backwards" to analog to provide higher
resolutions and color depths, and greater flexibility.
 
Mxsmanic

Bob said:
The entire concept of "high end CRT" is already dead ...

Not for the most critical uses. A high-end CRT is still the best image
quality overall, if you really need the absolute best.

CRTs also still dominate at the low end, since they are ten times
cheaper than flat panels.

As in so many other domains, the advantages of digital do not involve
actual quality, but instead they involve convenience. And in the
imaging field, the usual cost advantage of digital doesn't exist,
either--digital imaging equipment is at least as expensive as analog
equipment, because of the bandwidths required.
and increasingly what remains of new CRTs in the market
will tend toward junk (or be seen as so).

CRTs are projected to be clear leaders on the market for years to come.
Flat panels receive all the media hype, but they are not actually
running the show. It all reminds me of the very similar situation in
"digital" (electronic) photography vs. film photography.
The momentum
to flat panel (LCD or not) may cause the entire analog
graphics connection to go the way of the impact printer
before NAVI can get a foothold.

Not likely any time soon. The inertia of the computer industry today is
enormous; things no longer change overnight. The VGA interface may be
around indefinitely, and some users are still using earlier interfaces
(which, ironically, were digital).
 
Mxsmanic

Bob said:
Not to mention circuit bandwidth ...

Circuit bandwidth places an even greater restriction on digital
transmission. For any given channel speed, the real-world capacity of
the channel is always lower for digital transmission than for analog
transmission.

Remember that digital transmission is nothing more than declaring an
arbitrary signal level as a noise threshold, and considering anything
below it as noise and anything above it as information. Inevitably,
this reduces the information-carrying capacity of the channel.
... beam spot size, beam focus and grille diffraction.

True, but CRT manufacture is extremely mature, and amazing things can be
done.

There was a time when NTSC meant "never the same color," but even NTSC
is amazingly precise these days--more so than many people would have
ever thought possible.
My impression is less detail on the CRT. Each LCD triad
definitely represents one graphics card frame buffer pixel.
On the CRT, each fb pixel gets smeared into its neighbors
a bit, via one or more of the above mechanisms.

The total information content on the screen is the same, though.

Some high-end CRTs for broadcast video use have a mode that deliberately
reduces bandwidth in order to produce a more natural-looking image
through the filtering of high-frequency signal that bandwidth
restriction produces. CRTs can handle extremely high resolutions if
need be.
 
Mxsmanic

Bob said:
No, there is ALWAYS an upper limit to the resolution of
a CRT - for the simple reason that, even in theory, an
infinite bandwidth channel is not possible.

But I said no _fixed_ upper limit. The upper limit depends on the
performance of all the components in the chain. Ideally it is equal to
or better than the design limit of those components.

So a CRT might be designed to provide x resolution, but in fact it might
stretch to x+10% resolution. Of course, when digital elements are
present in the chain, the extra resolution, if any, is wasted.
 
Mxsmanic

Bob said:
Unfortunately, I'm going to have to disagree with that, as
well; as I noted in another response here, neither type of
interface, per se, is inherently superior to the other.

But all digital systems are simply analog systems operated in a
predefined way that declares anything below a certain threshold to be
noise. So the capacity of a digital system is always inferior to that
of an analog system with similar components and bandwidth.

Furthermore, the physical interface at either end of any system is
_always_ analog, so the system as a whole is never better than the
analog input and output components.

It's possible to surpass analog if you are building a system that does
not interface with the physical world. For example, if the system
handles _only_ information (such as accounting data), then you can
easily surpass analog performance with digital methods. But for any
system that requires a physical interface--audio, video, etc.--no
digital system can ever be better than the best possible analog system.
This is inevitable because all digital systems of this kind are just
special cases of analog systems.
Both are ultimately limited by the Gospel According to
St. Shannon, which puts strict limits on how much data
you can get through a given channel REGARDLESS of
how that data is encoded.

Yes. If the channel is analog, the limit of the channel's capacity is
equal to the limit imposed by Shannon. But if the channel is digital,
the limit on capacity is always below the theoretical limit, because you
always declare some portion of the capacity to be noise, whether it
actually is noise or not. This is the only way to achieve error-free
transmission, which is the advantage of digital.

In analog systems, there is no lower threshold for noise, but you can
use the full capacity of the channel, in theory, and in practice you're
limited only by the quality of your components. In digital systems, you
declare _de jure_ that anything below a certain level is noise, so you
sacrifice a part of the channel capacity, but in exchange for this you
can enjoy guaranteed error-free transmission up to a certain speed.
That's not the only reason for this; high-end audio also
incorporates huge dollops of what can only be seen as
"religious" beliefs, with no basis in reasoning or evidence,
re a given individual's views on what is "superior."

Not necessary. Ultimately, audio systems (and imaging systems) depend
on analog devices for input and output. So no system can ever be better
than the best analog system. This is inevitable for any system that
requires interfaces with the physical world, such as displays,
microphones, speakers, etc., all of which _must_ be analog.

The real problem with analog is not its ability to provide quality
(which is limited only by the limits of information theory) but the
extremely high expense and inconvenience of obtaining the best possible
quality. Digital provides a slightly lower quality for a dramatically
lower price.

Just look at flat panels: they provide defect-free images at a fixed
resolution, but they don't provide any higher resolutions. CRTs have no
fixed upper limit on resolution, but they never provide defect-free
images.
No. This is a common misconception regarding what is
meant by the term "analog." It does NOT necessarily mean
a system which is "continuous," "linear," etc., even though
in the most common forms of analog systems these are
often also true. "Analog" simply refers to a means of encoding
information in which one parameter is varied in a manner
ANALOGOUS TO (and hence the name) another - for
example, voltage varying in a manner analogous to the original
variations in brightness or sound level. The real world is
not "analog" - it is simply the real world. "Analog" points
to one means of describing real-world events, as does
"digital."

Analog reduces to using the entire channel capacity to carry
information, and tolerating the losses if the channel is not noise-free.
Digital reduces to sacrificing part of channel capacity in order to
guarantee lossless transmission at some speed that is below the maximum
channel capacity. With digital, you sacrifice capacity in order to
eliminate errors. With analog, you tolerate errors in order to gain
capacity.

Only analog systems can reach the actual limits of a channel in theory,
but ironically digital systems usually do better in practice. Part of
this arises from the fact that analog systems introduce cumulative
errors, whereas digital systems can remain error-free over any number of
components in a chain, as long as some of the theoretical capacity of
the chain is sacrificed in exchange for this.

I used to go with the "analogy" explanation for digital vs. analog, but
since everything in reality can be seen as _either_ a digital or analog
representation, this explanation tends to break down under close
examination. The explanation I give above does not, and it is
compatible with other explanations (for example, representing things
with symbols is just another form of the arbitrary threshold for noise
that I describe above).
 
Mxsmanic

Bob said:
The "granularity" of the look-up table data is not the
limiting factor; it's the number of bits you have at the
input to the panel, vs. the number of bits you claim to
have at the input to the overall system. If I map 8-bit
input data to, say, 10-bit outputs from the look up
table, I don't get as good a result as I want if the panel
itself has only 8 bits of accuracy.

But the panel is driving analog pixels. If you get a 10-bit value from
the LUT, why can't you just change this directly to an analog voltage
and drive the pixels from it? You'll still be limited to 256 discrete
luminosity levels for a pixel, but each of those levels can be chosen
from a palette of 1024 steps between black and white. So you have more
precise control of gamma on output. You could use more bits to make it
even more precise.
 
Bob Niland

From a market standpoint, I hasten to add.

Sony, for example, has ditched all but one of
their CRTs, most recently the GDM-FW900 24" wide,
even though it sold for less than the 23" LCD
that replaced it. The entire Sony entry-level,
mid-range and hi-end consumer and business CRT
product line is done for. Sony was selling CRTs
using a "higher quality" positioning. The customers
took the extra cash and spent it on LCD.
Not for the most critical uses. A high-end CRT
is still the best image quality overall, if you
really need the absolute best.

And you pay dearly for that. The remaining Sony GDM-C520K
is a $2000 product. But customers other than graphics
professionals, who have $2K to spend, are spending
it on LCD. The wider market for "quality" CRTs is gone.
CRTs also still dominate at the low end, since
they are ten times cheaper than flat panels.

Not 10x. LCD prices have been collapsing. Using Wal-Mart
as a low-end reseller, their low-end 17" LCD is only
1.3x the price of their low-end 17" CRT. True, you can get into a
CRT for $70, and their cheapest LCD is $188, but that's
still only 2.7x.

You can watch the Asian press lament the near-daily LCD
pricing collapse at: <http://www.digitimes.com/>
As in so many other domains, the advantages of digital
do not involve actual quality, but instead they involve
convenience.

It has ever been thus. In addition to being trendy and
cool, LCDs are cheaper to ship, use less power, turn on
faster, are easier to install and move around, take up
less space and are less of a problem at disposal time.
The small premium they still command is something an
increasing number of average users are willing to pay.
CRTs are projected to be clear leaders on the market for
years to come.

Only if someone is still making them.
It all reminds me of the very similar situation in
"digital" (electronic) photography vs. film photography.

Yep. I dumped all my 35mm gear on eBay last year, went
all-digital, and haven't regretted it for a moment.
Silver halide is racing CRT to the exit, but both will
be around for a while yet.
The VGA interface may be around indefinitely, and some
users are still using earlier interfaces (which,
ironically, were digital).

Yep, we've come full circle to CGA and EGA :)
 
Mxsmanic

Bob said:
From a market standpoint, I hasten to add.

Even that I wonder about. Flat panels are the rage in developed
countries, but CRTs still have a market elsewhere, since they are so
cheap.
Sony, for example, has ditched all but one of
their CRTs, most recently the GDM-FW900 24" wide,
even though it sold for less than the 23" LCD
that replaced it.

I'm not sure that this was a good decision on Sony's part, but then
again, Mr. Morita has been dead for quite a while now.
The entire Sony entry-level,
mid-range and hi-end consumer and business CRT
product line is done for. Sony was selling CRTs
using a "higher quality" positioning. The customers
took the extra cash and spent it on LCD.

So all the Artisan buyers chose LCDs instead? That's hard to believe.
And you pay dearly for that. The remaining Sony GDM-C520K
is a $2000 product.

About the same as any decent mid-range LCD. My little flat panel cost
that much.
Not 10x. LCD prices have been collapsing.

You can get CRTs for $60 or so.
True, you can get into a
CRT for $70, and their cheapest LCD is $188, but that's
still only 2.7x.

For a large segment of the market, that's a lot.
You can watch the Asian press lament the near-daily LCD
pricing collapse at: <http://www.digitimes.com/>

Why do they have a problem with it? I thought margins were small.
Only if someone is still making them.

They will likely be made in Asia for quite some time. There are still
several billion people there without monitors.
Yep. I dumped all my 35mm gear on eBay last year, went
all-digital, and haven't regretted it for a moment.

I still shoot film.
Silver halide is racing CRT to the exit, but both will
be around for a while yet.

The demise of CRTs has been predicted for forty years, and we are still
waiting.
Yep, we've come full circle to CGA and EGA :)

A lot of the people making the decisions today are too young to remember
CGA and EGA, so they think they're inventing something new.
 
Bob Niland

Mxsmanic said:
So all the Artisan buyers chose LCDs instead?

No, the last remaining Sony CRT, the GDM-C520K,
is an Artisan.
You can get CRTs for $60 or so.

Even though CD audio media was higher priced than
LP, and CD players were substantially higher priced
than turntables, CD still killed LP surprisingly rapidly.
Just because the old stuff is cheaper, and arguably
"better", may not save it. Market forces have a
logic of their own that isn't necessarily logical.
The demise of CRTs has been predicted for forty
years, and we are still waiting.

Well, flat panel TV had been only ten years away
for the last 50 years. It's here now. When the
existing TVs in this household fail, they'll get
replaced by something flat, for any number of
reasons.

Note Bob Myers' observation that LCD sales eclipsed
CRT within the last year. That's a fairly important
event, and won't go unnoticed by industry planners.

Curiously, I also note that Apple has entirely
dropped CRTs from their product line. That really
surprised me, because I'm not convinced that LCD
is really ready yet for pre-press, broadcast DCC,
video post and movie post (entirely apart from
the recent user complaints about the color
uniformity and stability of the Cinema 23).
 
Mxsmanic

Bob said:
No, the last remaining Sony CRT, the GDM-C520K,
is an Artisan.

I had read elsewhere that even production of that had stopped.
Even though CD audio media was higher priced than
LP, and CD players were substantially higher priced
than turntables, CD still killed LP surprisingly rapidly.

But that's not a valid analogy.

CDs and LPs are storage media, not input or output devices. There are
tremendous practical advantages to digital storage over analog storage;
these advantages ensured success for the CD format.

For input and output devices, the situation is different. For one
thing, they are all analog, whether they are called digital or not. And
because of this, there's no intrinsic advantage to moving to "digital"
devices such as electronic cameras or flat-panel displays. You're
really just exchanging one analog technology for another. The
advantages of a newer technology cannot be taken for granted; it may or
may not be superior to the old technology. And even in the best of
cases, it may take a very long time to become dominant over the older
technology. And most importantly of all, none of it is really
"digital." LCDs depend on variable voltages just as CRTs do.
Permanently dividing the screen into discrete pixels does help for
things like geometry, but it hurts for things like resolution (only one
resolution works if the pixels are fixed on the screen).
Just because the old stuff is cheaper, and arguably
"better", may not save it. Market forces have a
logic of their own that isn't necessarily logical.

That doesn't mean that one must throw up one's hands and follow the
market.
Well, flat panel TV had been only ten years away
for the last 50 years. It's here now. When the
existing TVs in this household fail, they'll get
replaced by something flat, for any number of
reasons.

Not unless the flat panels cost about the same as the tubes. The
majority of TV owners in the world can barely afford a tube TV, much
less a flat panel.

And in computerland, there are still people out there running Windows
3.1 on 80386 machines. They aren't going to rush out and buy flat
panels.
Note Bob Myers observation that LCD sales eclipsed
CRT within the last year. That's a fairly important
event, and won't go unnoticed by industry planners.

It's important not to overestimate the significance of short-term
trends. CRTs are a replacement market; flat panels are often new
purchases (either unnecessary replacements or completely new
acquisitions). Digital photography is seeing the same thing.
Curiously, I also note that Apple has entirely
dropped CRTs from their product line. That really
surprised me, because I'm not convinced that LCD
is really ready yet for pre-press, broadcast DCC,
video post and movie post (entirely apart from
the recent user complaints about the color
uniformity and stability of the Cinema 23).

Professionals who need more quality probably weren't buying their
monitors from Apple to begin with. There are lots of specialized
manufacturers who probably do a better job than Apple in this domain.

Come to think of it, I can't remember the last time I saw an Apple
CRT--the iMac maybe? I don't look much at Apple machines, though.
 
Bob Myers

Mxsmanic said:
But the panel is driving analog pixels. If you get a 10-bit value from
the LUT, why can't you just change this directly to an analog voltage
and drive the pixels from it?

That's exactly the problem; there are as of yet no
10-bit column drivers (which are the components within
the LCD panel that convert the digital input information
into an analog voltage to drive the pixels) in mainstream use.
10-bit drivers have only recently been introduced at ALL,
and I have heard from some manufacturers that so far they're
not seeing acceptable noise performance from these. And
obviously, they're also more expensive than 6-bit or 8-bit
drivers, which are the mainstream components right now.
But these problems very clearly are being addressed, and it's
only a matter of time before we have 10-bit control at the
pixel level.
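
One related workaround, not raised in the thread but common in
practice, is temporal dithering ("frame rate control"): approximate a
10-bit level on an 8-bit driver by alternating the two nearest 8-bit
codes across successive frames. A toy sketch:

    # Sketch: spread a 10-bit target over N frames of 8-bit drive codes so the
    # time-averaged drive approximates the finer level.
    def frc_pattern(target_10bit, frames=4):
        base = target_10bit >> 2     # drop the low 2 bits: nearest-below 8-bit code
        frac = target_10bit & 0b11   # low 2 bits decide how many frames go one code higher
        return [min(255, base + (1 if f < frac else 0)) for f in range(frames)]

    print(frc_pattern(513))   # [129, 128, 128, 128]: averages 128.25, i.e. 513/4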

Bob M.
 
Bob Myers

Bob Niland said:
I was using "know" in the metaphorical sense. The
monitor maker knows that the signal is apt to be
either linear, or pre-comped in the 1.8 - 2.6 gamma
range ...

Still "no," with the exception for TV products intended
for use with a given broadcast TV standard. It doesn't
really matter much for the purposes of this discussion,
since the bottom line remains the same - no matter what
application we're talking about, we'd LIKE it for LCD
monitors to deliver a "CRT-like" response curve with a
gamma of around 2.2-2.5. The problem is how that
might be achieved.
... and that if the user has any tool for dealing with
a mismatch of expectations, it's apt to be just a simple
exponent control, and maybe ganged (can't separately
adjust R, G and B).

Most likely, and in fact some products do provide
such controls to the user. The problem remains that
there is not a sufficiently fine degree of control provided
by the current hardware to allow the curve to be matched
well while NOT impacting the accuracy of the response
elsewhere.

Is a gamma standard a topic of any of the follow-on
standards to DVI? Packet? Send-changed-data-only?

Not with regard to the interface standards themselves, no.
There ARE, of course, various standards which define the
gamma that "should" be provided by the display, such as
sRGB and the aforementioned broadcast TV signal standards.


Bob M.
 
Bob Myers

Bob Niland said:
Does NAVI bring any benefits to the installed base of
CRTs? Does it matter if it does?

With respect to the first question - yes, there are some
features in NAVI that would be useful to CRTs, if any were
to implement them. Among these are what amounts to an
"automatic gain control" feature (i.e., the display could
automatically compensate for signal amplitude errors,
including cable losses) and a channel for carrying digital
audio information over the VGA interface.

I'm not sure how to answer the second question.

If it does bring benefits to LCD via analog connect,
does that matter? I suspect the users who care about
whatever NAVI promises, will tend to go digital.

Oddly enough, one of the other features that the NAVI
standard provides is a means to do true "digital"
transmission of the video information over the VGA
interface.
And I have a suspicion that the temptation on entry-
level PCs in the near future will be an analog-free
connection. A dumb UMA frame buffer, exposed thru a
TMDS chip thru a DVI-D (only) port on the back panel,
thru a DVI-D (only) cable, to a DVI-D (only) monitor.
Omits a couple of buffers, a DAC, an ADC (maybe) and
some copper. Maybe only runs at native res. Does DVI
allow captive cable at display?

I think so - but the biggest problem in the above scenario
is that the PC industry has yet to get away from the model
that says the display is a very flexible device in terms of the
range of formats and timings it should be expected to accept.
God knows we ARE trying to get there, though...


Bob M.
 
Bob Myers

Mxsmanic said:
countries, but CRTs still have a market elsewhere, since they are so
cheap.

The question, though, is to a very large degree NOT
whether a market exists, but if there will be anyone left
willing to make the CRTs. CRT production is not something
that some little niche player will be able to crank up and keep
going in their garage; it takes a pretty sizable commitment of
capital. Not as much as an LCD fab, I grant you, but it's
still something that's going to be the domain of the big boys
- and once they're tired of making them, they're GONE.
So all the Artisan buyers chose LCDs instead? That's hard to believe.

Who says that they will have a choice? You can't buy a
product which is no longer manufactured.



Why do they have a problem with it? I thought margins were small.

Exactly...and the farther the prices collapse, the worse the
margins get. And SOMEONE has to pay for those nice
shiny new billion-dollar fabs that are driving these prices
down in the first place.
They will likely be made in Asia for quite some time. There are still
several billion people there without monitors.

That CRTs are made SOMEWHERE is no guarantee that
CRTs of the type and quality we've been speaking of here are
still available in the market. What you're going to see coming
out of the "new" Asian sources (e.g., mainland China) are almost
certainly going to be entry-level products only. The high end
WILL go to non-CRT technologies, by necessity.


Bob M.
 
Bob Myers

Mxsmanic said:
I had read elsewhere that even production of that had stopped.

I believe you have read correctly.
But that's not a valid analogy.

I disagree; even though the CD and LP are both
examples of storage technologies, the example DOES
apply in the sense of a new, incompatible technology
displacing an older one. The discs themselves may be
simply "storage devices," but to move to CD the user also
had to buy a new sort of player that FUNCTIONALLY
did exactly what the old one (the turntable) did - it "read"
the information from the storage medium and converted
it to an audio signal. That's a very close analogy to the
LCD vs. CRT situation. The LCD fundamentally DOES
what a CRT does - it presents images to the viewer -
but through a completely different technology, and one which
does not co-exist in the same manufacturing environment
as its predecessor. It is a displacement sort of change in the
market, rather than simply an "upgrade" to a "new and better"
version of the same sort of thing (as occurs, for instance,
every time a new model of car is introduced).

For input and output devices, the situation is different. For one
thing, they are all analog, whether they are called digital or not.

Well, no, not really, but then I've already said enough about that
elsewhere. There are most certainly "digital" display devices (i.e.,
those that fundamentally deal with "numbers" rather than analog
control) - the LCD just doesn't happen to be one of them.

Permanently dividing the screen into discrete pixels does help for
things like geometry, but it hurts for things like resolution (only one
resolution works if the pixels are fixed on the screen).

What do you think the phosphor triads of a CRT do?

That doesn't mean that one must throw up one's hands and follow the
market.

Actually, in many cases it DOES. If the forces of the market
as a whole wind up dictating that the product you want to buy
is no longer produced, and you yourself do not have the resources
to continue to make that product on your own, you ARE forced
to follow the market. That situation is very rapidly coming to be
in the case of the high-end CRT market. Very soon, there simply
won't be any such things to be had. (An analogy: no matter how
much you might want to buy a factory-fresh Ford Model T, there
simply is no such thing on the market these days.)

Not unless the flat panels cost about the same as the tubes. The
majority of TV owners in the world can barely afford a tube TV, much
less a flat panel.

Which is why the CRT will continue for quite some time as the
entry-level display of choice, and similarly will continue to dominate
the under-30" classes of TV products. But again, that's not the
market we've been talking about here.

It's important not to overestimate the significance of short-term
trends.

But this is not a short term trend. There is NO ONE in
the business today who expects the CRT to regain any
market share in the desktop monitor arena, and the trend
of growing LCD share and declining CRT has been going
on for ten years now (it simply accelerated a lot in the
last few years, as LCD prices came down through certain
"magic" levels). Things are looking a bit better for the CRT
in the TV space, where it will continue to hold on to significant
market share at least through this decade and likely into the
next, but again no one is expecting the trend line to change
direction.


Bob M.
 
Bob Myers

Mxsmanic said:
If my memory serves me correctly, the earliest monitor connection
interfaces for PCs (CGA and EGA, for example) were _digital_ connection
interfaces. VGA went "backwards" to analog to provide higher
resolutions and color depths, and greater flexibility.

Technically, the very earliest monitor connections for
PCs were analog - since the very earliest PCs, not having
an established "monitor" market of their own, simply provided
either TV outputs on an RF connection or baseband video,
and used the home TV as the display. CGA and EGA were
"digital", but then why would VGA be "going backwards" simply
because it was analog? It can be argued that the very earliest
form of electronic (well, "electric," at least) communication was
digital in nature - everyone remember Mr. Samuel F. B. Morse?

Bob M.
 
Bob Myers

Mxsmanic said:
But all digital systems are simply analog systems operated in a
predefined way that declares anything below a certain threshold to be
noise. So the capacity of a digital system is always inferior to that
of an analog system with similar components and bandwidth.

No. Fundamentally, there is no such thing as an "analog"
system or a "digital" system - there is just electricity, which
operates according to the same principles whether it is
carrying information in either form. The capacity of a given
communications channel is limited by the bandwidth of the
channel and the level of noise within that channel, per Shannon;
but Shannon's theorems do NOT say what the optimum form
of encoding is, in the sense of whether it is "analog" or
"digital." My favorite example of a device which pushes channel
capacity limits about as far as they can go is the modem - and
do you call the signals that such a device produces "analog"
or "digital"? The answer is that they are purely digital - there
is absolutely nothing in the transmitted signal which can be
interpreted in an "analog" manner (i.e., the level of some
parameter in the signal is directly analogous to the information
being transmitted). The signal MUST be interpreted as symbols,
or in simplistic terms "numbers," and that alone makes it
"digital." The underlying electrical current itself is neither analog
nor digital from this perspective - it is simply electricity.
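
A toy version of that "symbols" point: a four-level line code (4-PAM,
chosen here purely as an example) carries two bits in every transmitted
level, and in general an M-level symbol alphabet carries log2(M) bits
per symbol:

    # Sketch: pack bit pairs into 4-PAM amplitude levels (Gray-coded so that
    # adjacent levels differ in only one bit). Two bits ride on each symbol.
    LEVELS = {0b00: -3, 0b01: -1, 0b11: +1, 0b10: +3}

    def bits_to_symbols(bits):
        bits = list(bits)            # assumed even-length for this sketch
        return [LEVELS[(bits[i] << 1) | bits[i + 1]]
                for i in range(0, len(bits), 2)]

    print(bits_to_symbols([1, 0, 1, 1, 0, 0]))   # [3, 1, -3]: six bits in three symbols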

Furthermore, the physical interface at either end of any system is
_always_ analog, so the system as a whole is never better than the
analog input and output components.

This is also an incorrect assumption; I can give several examples
of I/O devices which behave in a "digital" manner.
Yes. If the channel is analog, the limit of the channel's capacity is
equal to the limit imposed by Shannon. But if the channel is digital,
the limit on capacity is always below the theoretical limit, because you
always declare some portion of the capacity to be noise, whether it
actually is noise or not. This is the only way to achieve error-free
transmission, which is the advantage of digital.

No. Shannon's theorems set a limit which is independent of the
signal encoding, and in fact those real-world systems which come
the closest to actually meeting the Shannon limit (which can never
actually be achieved, you just get to come close) are all currently
digital. (The digital HDTV transmission standard is an excellent
example.) Conventional analog systems do not generally approach
the Shannon limit; the reason for this becomes apparent once the
common misconception that an analog signal is "infinitely"
variable is disposed of.

In analog systems, there is no lower threshold for noise,

And THAT is the fancy version of the above misconception.
There is ALWAYS a noise floor in any real-world channel,
and there is always a limit to the accuracy with which an analog
signal can be produced, transmitted, and interpretated which
the various noise/error floors set. It is simply NOT POSSIBLE,
for example, for the commonanalog video systems over typical
bandwidths to deliver much better than something around 10-12 bit
accuracy (fortunately, that's about all that is ever NEEDED, so
we're OK there). I defy you, for instance, to point to an example
in which analog information is transmitted at, say, 24-bit (per
component) accuracy over a 200 MHz bandwidth.
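
To put rough numbers on that (the SNR figures below are assumptions,
not measurements): inverting the ideal-quantizer relation
SNR_dB = 6.02*n + 1.76 gives an estimate of the "effective bits" a
channel's noise floor allows:

    # Sketch: effective bits of accuracy implied by a given signal-to-noise ratio,
    # via the ideal-quantizer relation SNR_dB = 6.02*n + 1.76.
    def effective_bits(snr_db):
        return (snr_db - 1.76) / 6.02

    for snr_db in (60, 70):   # assumed, roughly typical analog video SNRs
        print(snr_db, "dB ->", round(effective_bits(snr_db), 1), "bits")   # ~9.7 and ~11.3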

but you can
use the full capacity of the channel, in theory, and in practice you're
limited only by the quality of your components.

But even in theory, those components cannot be "perfect." A
transistor or resistor, for example, MUST produce a certain
level of noise at a minimum. You can't escape this; it's
built into the fundamental laws of physics that govern these
devices. The short form of this is There Is No Such Thing As
A Noise Free Channel EVER - not even in theory.

Not necessary. Ultimately, audio systems (and imaging systems) depend
on analog devices for input and output. So no system can ever be better
than the best analog system.

That does not logically follow, for a number of reasons. What
DOES logically follow is that no system can ever be better
than the performance of the input and output devices (which
we are assuming to be common to all such systems), but this
says nothing about the relative merits of the intermediate components.
If it is possible for the best "digital" intermediate to better the
best "analog" intermediate, then the digital SYSTEM will be
better overall, unless BOTH intermediates were already so good
that the limiting factors were the common I/O devices. This is
not the case in this situation. (For one thing, it's not quite a
case of the chain being as good as its weakest link -
noise is ADDITIVE, just to note one problem with that model.)

Just look at flat panels: they provide defect-free images at a fixed
resolution, but they don't provide any higher resolutions. CRTs have no
fixed upper limit on resolution, but they never provide defect-free
images.

As has already been shown, CRTs most definitely have fixed
upper limits on resolution.
Analog reduces to using the entire channel capacity to carry
information, and tolerating the losses if the channel is not noise-free.
Digital reduces to sacrificing part of channel capacity in order to
guarantee lossless transmission at some speed that is below the maximum
channel capacity.

No. Here, you are actually comparing two particular versions
of "analog" and "digital," not fundamental characteristics of these
encodings per se. And the most common examples of "analog"
signalling do NOT, in fact, use the full channel capacity. (Even
if they did, a "digital" signalling method can also be devised which
does this - again, see the example of the modern modem or
HDTV transmission.)
With digital, you sacrifice capacity in order to
eliminate errors. With analog, you tolerate errors in order to gain
capacity.

"Capacity" is only meaningful if stated as the amount of information
which can be carried by a given channel WITHOUT ERROR.
Any error is noise, and represents a loss of capacity. What I
THINK you mean to say here is probably something like "quality,"
but in a very subjective sense.
I used to go with the "analogy" explanation for digital vs. analog, but
since everything in reality can be seen as _either_ a digital or analog
representation,

NO. Let me be more emphatic: HELL, NO. Reality is reality;
it is neither "digital" nor "analog." Those words do NOT equate
to "discrete" or "sampled" or "linear" or "continuous" or any other
such nonsense that often gets associated with them. They have
quite well-defined and useful meanings all on their own, and they
have to do with how information is encoded. Nothing more and
nothing less. The world is the world; "analog" and "digital" refer to
different ways in which we can communicate information ABOUT
the world (and they are not "opposites," any more than saying that
"red is the opposite of blue" is a meaningul statement).


Bob M.
 
Bob Myers

Mxsmanic said:
But I said no _fixed_ upper limit. The upper limit depends on the
performance of all the components in the chain. Ideally it is equal to
or better than the design limit of those components.

We must be using different meanings for the word "resolution",
then. I most certainly see, for instance, the phosphor dot and
shadow mask structure of the typical color CRT as imposing a
very fixed limit on resolution. Can that CRT be used to display
image formats which supposedly provide a greater number of
"pixels"? Sure - but that's not the same as RESOLVING them.

Bob M.
 
Bob Myers

Mxsmanic said:
Circuit bandwidth places an even greater restriction on digital
transmission. For any given channel speed, the real-world capacity of
the channel is always lower for digital transmission than for analog
transmission.

You are assuming the "digital transmission" must always
equate to simple binary encoding, one bit per symbol
for a given physical channel. That is not the case.
Remember that digital transmission is nothing more than declaring an
arbitrary signal level as a noise threshold, and considering anything
below it as noise and anything above it as information. Inevitably,
this reduces the information-carrying capacity of the channel.

But no more than the capacity is reduced by the channel anyway;
if a given level of noise exists in the channel, then the level of an
analog signal cannot be determined more precisely than the limit
imposed by the noise. It is exactly the same limit, for exactly the
same reasons, no matter what form of encoding is used. (It's
interesting to note that the Shannon limit is most meaningfully
expressed in units of bits/second or similar, but that the use of
such units does NOT imply that a "digital" system must be used
to transmit the information.)
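
For concreteness, the Shannon-Hartley form of that limit is
C = B * log2(1 + S/N), and it says nothing about how the signal is
encoded. A quick sketch with assumed figures (200 MHz echoes the
bandwidth mentioned earlier in the thread; 60 dB is purely
illustrative):

    # Sketch: channel capacity in bits/s from bandwidth and SNR, per Shannon-Hartley.
    from math import log2

    def capacity_bps(bandwidth_hz, snr_db):
        return bandwidth_hz * log2(1 + 10 ** (snr_db / 10))

    print(capacity_bps(200e6, 60) / 1e9)   # ~4 Gbit/s for this channel, whatever the encoding
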
True, but CRT manufacture is extremely mature, and amazing things can be
done.

To quote my favorite fictional engineer: "Ye canna change the
laws o' physics, Cap'n!" :) The limits of what you can do with a
CRT are pretty well known at this point, in large part because it
IS such a mature technology. We understand pretty well what can
be done in terms of shaping, accelerating, and directing a beam
of electrons.


Bob M.
 
