flat panel monitors

Mxsmanic

Bob said:
I disagree; even though the CD and LP are both
examples of storage technologies, the example DOES
apply in the sense of a new, incompatible technology
displacing an older one.

But CDs succeeded because of the clear superiority of digital storage
over analog storage. Flat-panel displays and similar devices are not
digital but analog, and they are input-output devices, not storage
devices, so their advantages, if any, are far less evident. One cannot
plausibly predict that they will succeed simply because there is
something "digital" about them in some respect (the only thing they
have in common with CDs).
The LCD fundamentally DOES
what a CRT does - it presents images to the viewer -
but through a completely different technology, and one which
does not co-exist in the same manufacturing environment
as its predecessor.

But the LCD has no "digital advantage." It does have other advantages
(and disadvantages), but being "digital" is not among them. It's still
an analog device, like any other display.
Well, no, not really, but then I've already said enough about that
elsewhere. There are most certainly "digital" display devices (i.e.,
those that fundamentally deal with "numbers" rather than analog
control) - the LCD just doesn't happen to be one of them.

Every display device eventually produces an analog output, and the
quality of this analog output usually determines most of the quality of
the displayed image.
What do you think the phosphor triads of a CRT do?

They are tiny, and the raster that excites them is adjustable. By
adjusting the raster you can perform very smooth interpolation with no
special circuitry. On an LCD display, you need special logic to
explicitly perform interpolation for non-native resolutions.

This was never much of an advantage for me because I always ran CRTs at
the highest resolution I could, but for people who like to run lower
resolutions, it's very handy.
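
Just to illustrate the point, here is a minimal sketch (in Python, purely
illustrative) of the bilinear arithmetic a scaler has to perform for
every output pixel at a non-native resolution; real scaler chips use
fancier filter kernels, but the principle is the same:

# Minimal bilinear scaling sketch - what an LCD's scaling logic must
# approximate for non-native resolutions (illustrative only).
def scale_bilinear(src, out_w, out_h):
    in_h, in_w = len(src), len(src[0])
    out = [[0.0] * out_w for _ in range(out_h)]
    for y in range(out_h):
        for x in range(out_w):
            # Map the output pixel back into source coordinates.
            fx = x * (in_w - 1) / max(out_w - 1, 1)
            fy = y * (in_h - 1) / max(out_h - 1, 1)
            x0, y0 = int(fx), int(fy)
            x1, y1 = min(x0 + 1, in_w - 1), min(y0 + 1, in_h - 1)
            dx, dy = fx - x0, fy - y0
            # Blend the four nearest source pixels by their weights.
            out[y][x] = (src[y0][x0] * (1 - dx) * (1 - dy)
                         + src[y0][x1] * dx * (1 - dy)
                         + src[y1][x0] * (1 - dx) * dy
                         + src[y1][x1] * dx * dy)
    return out

On a CRT, the beam and the phosphors do the equivalent blending for free.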
 
Mxsmanic

Bob said:
The question, though, is to a very large degree NOT
whether a market exists, but if there will be anyone left
willing to make the CRTs.

From what I hear, places like China can produce them dirt cheap.
CRT production is not something
that some little niche player will be able to crank up and keep
going in their garage; it takes a pretty sizable commitment of
capital.

Nothing compared to building LCDs. So in poor markets with
underdeveloped industrial infrastructures, it would make sense to
continue building CRTs. I expect the Third World may continue to do
this for some time; indeed, they may be the only ones building
Trinitron-style CRTs soon (and to think how exclusive Sony used to
consider their technology!).
Who says that they will have a choice?

Why would anyone stop producing something that people are buying,
especially something with fat margins like a professional CRT?
Exactly...and the farther the prices collapse, the worse the
margins get. And SOMEONE has to pay for those nice
shiny new billion-dollar fabs that are driving these prices
down in the first place.

Someone must be making money if someone is still building the
fabrication facilities.
That CRTs are made SOMEWHERE is no guarantee that
CRTs of the type and quality we've been speaking of here are
still available in the market.

Maybe. But tomorrow's Chinese CRTs might be the equal of today's
Artisan CRTs. There's no fundamental obstacle preventing this. And the
demand might well exist in China or India.

Of course, if and when LCDs can be made to match or exceed the best
possible CRTs at similar price points, it won't matter. Certainly in
that case I won't care what happens to CRTs.
 
Mxsmanic

Bob said:
CGA and EGA were
"digital", but then why would VGA be "going backwards" simply
because it was analog?

It wouldn't, at least not in my mind. But for many people today, analog
= backward, and digital = perfection.
 
Mxsmanic

Bob said:
You are assuming the "digital transmission" must always
equate to simple binary encoding, one bit per symbol
for a given physical channel. That is not the case.

No, I'm stating that the distinguishing characteristic of a digital
communications channel is that it places an arbitrary dividing line
between noise and useful signal. Anything below the line is noise;
anything above it is signal. The advantage is that zero loss can be
achieved up to a certain bandwidth. The disadvantage is that the full
bandwidth of the physical channel can never be used.
But no more than the capacity is reduced by the channel anyway;
if a given level of noise exists in the channel, then the level of an
analog signal cannot be determined more precisely than the limit
imposed by the noise.

True, but analog channels are extremely variable in their
characteristics. If the noise drops dramatically in an analog channel,
you can communicate more information. If it drops dramatically in a
digital channel, nothing changes--your upper limit on channel capacity
does not increase.
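
To put rough numbers on that: Shannon's formula, C = B log2(1 + S/N),
ties the ceiling to the actual signal-to-noise ratio. A quick sketch
(the 1 MHz bandwidth and the SNR steps are arbitrary, for illustration
only):

import math

def capacity_bps(bandwidth_hz, snr_linear):
    # Shannon-Hartley: C = B * log2(1 + S/N)
    return bandwidth_hz * math.log2(1 + snr_linear)

B = 1e6                      # hypothetical 1 MHz channel
for snr_db in (20, 40, 60):  # the noise floor dropping in 20 dB steps
    snr = 10 ** (snr_db / 10)
    print(snr_db, "dB:", round(capacity_bps(B, snr) / 1e6, 2), "Mb/s")

Each 20 dB drop in noise raises the ceiling by about 6.6 Mb/s; a
digital link designed around the worst case simply leaves that on the
table.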
To quote my favorite fictional engineer: "Ye canna change the
laws o' physics, Cap'n!" :) The limits of what you can do with a
CRT are pretty well known at this point, in large part because it
IS such a mature technology. We understand pretty well what can
be done in terms of shaping, accelerating, and directing a beam
of electrons.

And what can be done still beats flat panels, for now.
 
Mxsmanic

Bob said:
We must be using different meanings for the word "resolution",
then. I most certainly see, for instance, the phosphor dot and
shadow mask structure of the typical color CRT as imposing a
very fixed limit on resolution.

Sure, but a lot of CRTs are of such poor quality that they can never hit
that limit, anyway.

And for years I actually used my CRT at a resolution that was just
slightly higher than that permitted by the grille spacing. It wasn't
high enough to produce obvious artifacts, though, especially under
normal viewing conditions. The pixel size was slightly smaller than a
phosphor stripe triad.
 
Mxsmanic

Bob said:
Conventional analog systems do not generally approach
the Shannon limit ...

Only because systems cannot be made noise-free. If you declare an
arbitrary noise limit (thus making the system digital), you know when
errors occur or you can at least design systems that are error-free (and
thus do not accumulate errors). If you set the limit at zero (an analog
system), you never really know.
I defy you, for instance, to point to an example
in which analog information is transmitted at, say, 24-bit (per
component) accuracy over a 200 MHz bandwidth.

You can approach 200 MHz as closely as you wish with either digital or
analog. With analog, you can't be sure how many errors you'll get.
With digital, you can predict how many errors you'll get.
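
That predictability is easy to demonstrate. For a textbook scheme like
BPSK over an ideal Gaussian channel (a standard formula, not a model of
any particular product), the error rate follows directly from the
signal-to-noise ratio:

import math

def bpsk_ber(ebn0_db):
    # Coherent BPSK in AWGN: Pb = Q(sqrt(2*Eb/N0)) = erfc(sqrt(Eb/N0))/2
    ebn0 = 10 ** (ebn0_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebn0))

for db in (4, 8, 10):
    print(db, "dB:", f"{bpsk_ber(db):.2e}")

At 10 dB of Eb/N0 that predicts a bit error rate of about 4e-6, and in
those idealized conditions a real link will track the prediction
closely.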
If it is possible for the best "digital" intermediate to better the
best "analog" intermediate ...

It isn't.

Furthermore, in systems that do not involve a chain of components that
can accumulate errors, there's virtually no advantage to digital.
As has already been shown, CRTs most definitely have fixed
upper limits on resolution.

You've never used a monochrome CRT?
"Capacity" is only meaningful if stated as the amount of information
which can be carried by a given channel WITHOUT ERROR.
Any error is noise, and represents a loss of capacity. What I
THINK you mean to say here is probably something like "quality,"
but in a very subjective sense.

Digital is simply a way of keeping errors within a predictable range.
 
Bob Myers

Mxsmanic said:
From what I hear, places like China can produce them dirt cheap.

Yes, they can. But, having recently visited a number
of mainland-China display manufacturers, I would have
to also note that they're a long way from producing
a competitive high-end CRT display. Entry-level
stuff, sure.
Nothing compared to building LCDs. So in poor markets with
underdeveloped industrial infrastructures, it would make sense to
continue building CRTs. I expect the Third World may continue to do
this for some time; indeed, they may be the only ones building
Trinitron-style CRTs soon (and to think how exclusive Sony used to
consider their technology!).

The latest Trinitron technology IS still exclusive; the only
other manufacturer to ever produce it was NEC (as it
appeared in the NEC-Mitsubishi monitors), and that was
under license from Sony. I don't know if Sony still holds
any patents that would prohibit someone else from making
ANY sort of aperture-grille product, but they certainly still
own the IP that covers that technology in its recent forms.
And it is not that easy a tube to make...
Why would anyone stop producing something that people are buying,
especially something with fat margins like a professional CRT?

Because of the high overhead necessary to continue to make
the basic technology (in this case, the CRT) in light of a rapidly
diminishing market. You don't think Sony and NEC have stopped
making these things just because they were tired of playing
in the professional market, do you?

CRT plants ARE fairly expensive things to keep running, and it
simply is not feasible to run them for production runs that are
going to be measured in the tens of thousands per year, tops.
Someone must be making money if someone is still building the
fabrication facilities.

More accurately, someone has convinced their board of
directors that the investment in the new fab is a good idea,
financially, over the long haul. New fabs are money LOSERS
at the start - that sort of investment is a big hole to climb out
of, and it takes some time to get through enough revenue-
generating production to pay it all off. Recently, there have
been a LOT of companies making the bet that they can make
a new, large-size fab pay off, and as a result the industry may
be facing an oversupply situation. That, though, drives prices
down to the point where the less-financially-secure of these
companies may not be able to survive in the market, or at
least be able to continue to operate their nice, shiny new fab
on their own. LCDs are NOT guaranteed moneymakers for
everyone who gets into the market, and certainly are not
moneymakers at all in terms of getting a really quick
return on your investment. You have to be willing to commit
to the long haul, and then be able to actually survive over
that period.
Maybe. But tomorrow's Chinese CRTs might be the equal of today's
Artisan CRTs. There's no fundamental obstacle preventing this. And the
demand might well exist in China or India.

True, there's nothing fundamentally blocking this - but I
doubt that it will actually happen, for a number of reasons.
For one thing, remember that China is very keen on developing
their own high-end LCD capabilities - and the LCDs in general
keep getting more and more competitive in all markets, all the
time.

Bob M.
 
Bob Myers

Mxsmanic said:
But CDs were successful because of the clear superiority of digital
storage over analog storage. Flat-panel displays and similar devices
are not digital but analog, and they are input-output devices, not
storage devices, and so their advantages, if any, are far less patent,
and one cannot plausibly predict that they will succeed simply because
there is something "digital" about them in some respect (the only thing
they have in common with CDs).

Perhaps, but if you've read carefully here you'll note that
I'm not the one (if there IS anyone) claiming that LCDs
will succeed simply because "there is something digital
about them."
Every display device eventually produces an analog output, and the
quality of this analog output usually determines most of the quality of
the displayed image.

Well, in this we are still disagreeing as to what "analog" means. I
don't see anything "analog" about the actual image - it's just an
image; it is neither digital nor analog per se.
They are tiny, and the raster that excites them is adjustable.

They are just about exactly the same size as the physical pixels
of an LCD. Now, assume that the LCD in question provides
sub-pixel addressing (i.e., the image is not restricted to have its
"pixels" align with the pixel boundaries of the LCD, but rather
on the sub-pixel boundaries) - how is the effect of the "pixelated"
LCD screen on the image any different from the effect of the
phosphor triad structure? (This is not to say that the two images
have the same "look", but in terms of the limits on what can
be resolved, is there any real difference?)


Bob M.
 
Bob Myers

Mxsmanic said:
Only because systems cannot be made noise-free. If you declare an
arbitrary noise limit (thus making the system digital), you know when
errors occur or you can at least design systems that are error-free (and
thus do not accumulate errors). If you set the limit at zero (an analog
system), you never really know.

An analog system never ever has "the limit at zero"; that's the
whole point. You do not "declare an arbitrary noise limit" in
"making a system digital"; digital systems can and have been
made which adapt themselves (through varying the number of
transmitted bits per symbol) to the level of noise in the channel
at any given moment, so there's no real reason that either mode
suffers from a capacity limitation before the other.
You can approach 200 MHz as closely as you wish with either digital or
analog.

Sorry, you missed the important part of that: can you maintain 24-bit
accuracy in an analog system which fills a 200 MHz bandwidth,
in any current practical example? (And yes, it IS meaningful to
speak of the capacity of an analog system in bits/second, or
the accuracy of such a system in bits; this is basic information
theory.)
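
For a sense of scale, here's a back-of-envelope Shannon calculation
(assuming Nyquist-rate samples at 24 bits each; a sketch, not a
design):

import math

B = 200e6                          # channel bandwidth, Hz
rate = 2 * B * 24                  # Nyquist-rate samples, 24 bits apiece
snr_needed = 2 ** (rate / B) - 1   # invert C = B * log2(1 + SNR)
print(rate / 1e9, "Gb/s")                 # 9.6 Gb/s
print(10 * math.log10(snr_needed), "dB")  # about 144.5 dB of SNR

Holding roughly 144 dB of SNR across a full 200 MHz is far beyond
anything analog practice has delivered.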
It isn't.

But so far, this is merely an assertion on your part; you have
not given any real theoretical or practical reason why this
should be so. Again, practical examples exist of "digital"
systems which come very, very close to the Shannon limit
for their assumed channels. So what's this basic, inherent
limit that you seem to be assuming for "digital" transmissions?
You've never used a monochrome CRT?

Not only used them, I've designed around them. There
is most definitely an upper limit on the resolution of ANY
CRT; it's just more obvious in the case of the standard
tricolor type.


Bob M.
 
Bob Myers

Mxsmanic said:
And for years I actually used my CRT at a resolution that was just
slightly higher than that permitted by the grille spacing. It wasn't
high enough to produce obvious artifacts, though, especially under
normal viewing conditions. The pixel size was slightly smaller than a
phosphor stripe triad.

Sure - which means that you DID NOT fully resolve those
pixels. Did you get a nice, stable image? Sure. Was it
even a very good-looking image? Perhaps. But were you
past the resolution limit of the tube? Definitely. Resolution
has a very well-defined, long-accepted meaning, and there
are a number of tests you can perform to see if a given
display actually "resolves" an image to a given level. You
DO NOT get to go beyond the limit imposed by, for
instance, the phosphor triad size without introducing
artifacts (errors). In general, you're actually getting errors
well before you even approach that size (aliasing, color
and luminance errors, and so forth). Resolution is NOT
just a question of whether or not you still think the image
looks "acceptably sharp."

Bob M.
 
Bob Myers

Mxsmanic said:
And what can be done still beats flat panels, for now.

In some respects; in others, the CRT doesn't come close to
matching the LCD's performance, and hasn't for some time.
What is "best" is very much a matter of your personal preferences,
with respect to your use of these devices and the applications
and images you are dealing with. Which is all I have said, all
along, in the ongoing "CRT vs. LCD" debates.


Bob M.
 
Mxsmanic

Bob said:
They are just about exactly the same size as the physical pixels
of an LCD. Now, assume that the LCD in question provides
sub-pixel addressing (i.e., the image is not restricted to have its
"pixels" align with the pixel boundaries of the LCD, but rather
on the sub-pixel boundaries) - how is the effect of the "pixelated"
LCD screen on the image any different from the effect of the
phosphor triad structure? (This is not to say that the two images
have the same "look", but in terms of the limits on what can
be resolved, is there any real difference?)

Pixels (or subpixels) on a flat panel have a constant luminosity
across their entire area. On a CRT, the luminosity varies depending
on the colors of the adjacent pixels and the bandwidth of the
monitor. This can make aliasing less obvious on a CRT. I'm not saying
that this is good or bad, just that it happens.

It's interesting that ClearType makes text on an LCD look much better in
most cases, even though it makes the pixels "messier" with its low-level
antialiasing. I haven't tried ClearType on a CRT, so I don't know what
that does (I doubt that it works very well).
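
A toy illustration of the idea (assuming the usual R,G,B stripe order;
real ClearType also filters the result to tame the color fringes):

# Toy subpixel rendering: treat each R, G, B stripe as its own sample,
# tripling horizontal addressability at the cost of color fringing.
def to_subpixels(coverage):
    # 'coverage' holds per-stripe glyph coverage in [0, 1], three
    # entries per output pixel; dark text on a white background.
    pixels = []
    for i in range(0, len(coverage) - len(coverage) % 3, 3):
        r, g, b = (1 - c for c in coverage[i:i + 3])
        pixels.append((r, g, b))
    return pixels

print(to_subpixels([0.0, 0.5, 1.0, 1.0, 0.5, 0.0]))

Each output pixel ends up slightly tinted, which is exactly the
"messier" low-level result that nevertheless reads as sharper text.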
 
Mxsmanic

Bob said:
The latest Trinitron technology IS still exclusive; the only
other manufacturer to ever produce it was NEC (as it
appeared in the NEC-Mitsubishi monitors), and that was
under license from Sony. I don't know if Sony still holds
any patents that would prohibit someone else from making
ANY sort of aperture-grille product, but they certainly still
own the IP that covers that technology in its recent forms.

What recent changes have been made to Trinitrons? The original patents
expired in 1990, I think. I recall that the Trinitron was important
enough to merit a special technical Emmy award for Sony in 1973. Of
course, in those days Mr. Morita was still around ...

In any case, it always seemed to blow all the competition away.

I was considering the Diamondtron for a time, but I understand that,
although it's an aperture grille like a Trinitron, it apparently just
doesn't have the quality of a true Trinitron. I guess that's all
becoming increasingly academic now.
Because of the high overhead necessary to continue to make
the basic technology (in this case, the CRT) in light of a rapidly
diminishing market. You don't think Sony and NEC have stopped
making these things just because they were tired of playing
in the professional market, do you?

I think they stopped making them out of misguided business decisions.
CRT plants ARE fairly expensive things to keep running, and it
simply is not feasible to run them for production runs that are
going to be measured in the tens of thousands per year, tops.

The vast majority of monitors being sold today are still CRTs. This is
even more true for television sets.
 
Mxsmanic

Bob said:
Sure - which means that you DID NOT fully resolve those
pixels.

Yes. But often important elements on the screen were composed of
multiple pixels, so the lack of clearly visible individual pixels wasn't
that much of a problem. And, unlike a flat panel, you can still see
pixels smaller than triads on a CRT screen--they are just a bit blurry
or partially resolved.
 
Mxsmanic

Bob said:
Sorry, you missed the important part of that: can you maintain 24-bit
accuracy in an analog system which fills a 200 MHz bandwidth,
in any current practical example?

I'm not sure what you mean by "24-bit accuracy." How many bits per
second?

You can always maintain at least the accuracy of the equivalent digital
system.

If you can push 200 Mbps through a digital channel, you can also get at
least 200 Mbps through the same channel with analog encoding (and
typically more). However, the analog equipment may cost more.
But so far, this is merely an assertion on your part; you have
not given any real theoretical or practical reason why this
should be so.

Information theory proves it.
Again, practical examples exist of "digital"
systems which come very, very close to the Shannon limit
for their assumed channels. So what's this basic, inherent
limit that you seem to be assuming for "digital" transmissions?

The basic limit is the fact that you declare anything below a certain
level to be noise. You thus sacrifice any actual signal below that
level, and in doing so you also sacrifice part of your bandwidth. You
don't make this arbitrary distinction in an analog system, so your
bandwidth is limited only by the _actual_ noise in the channel.
Not only used them, I've designed around them. There
is most definitely an upper limit on the resolution of ANY
CRT; it's just more obvious in the case of the standard
tricolor type.

What limits resolution in a monochrome CRT? Scanning electron
microscopes prove that electron beams can be pretty finely focused.
 
J. Clarke

Mxsmanic said:
I'm not sure what you mean by "24-bit accuracy." How many bits per
second?

You can always maintain at least the accuracy of the equivalent digital
system.

If you can push 200 Mbps through a digital channel, you can also get at
least 200 Mbps through the same channel with analog encoding (and
typically more). However, the analog equipment may cost more.


Information theory proves it.


The basic limit is the fact that you declare anything below a certain
level to be noise. You thus sacrifice any actual signal below that
level, and in doing so you also sacrifice part of your bandwidth. You
don't make this arbitrary distinction in an analog system, so your
bandwidth is limited only by the _actual_ noise in the channel.

This is one of the silliest arguments I have ever seen. Noise is not
signal. If your signal is dropping significantly below the noise threshold
then you've got a problem.

I think you have the properties of analog _measurement_ systems confused
with the properties of analog _communication_ systems.
What limits resolution in a monochrome CRT?

Generally the cost, but beyond that the grain size of the phosphor.
Scanning electron
microscopes prove that electron beams can be pretty finely focused.

At low energy levels in a cavity between multiple magnets.
 
J. Clarke

Mxsmanic said:
What recent changes have been made to Trinitrons? The original patents
expired in 1990, I think. I recall that the Trinitron was important
enough to merit a special technical Emmy award for Sony in 1973. Of
course, in those days Mr. Morita was still around ...

In any case, it always seemed to blow all the competition away.

I was considering the Diamontron for a time, but I understand that,
although it's an aperture grille like a Trinitron, it apparently just
doesn't have the quality of a true Trinitron. I guess that's all
becoming increasingly academic now.


I think they stopped making them out of misguided business decisions.

And of course you are a better marketing guy than the people at Sony. So why
ain't you running a 50 billion dollar corporation if you're such an expert?
The vast majority of monitors being sold today are still CRTs. This is
even more true for television sets.

That's odd, it was my understanding that this year LCDs outsold CRTs for the
first time. Or maybe that was last year.

Television sets are not computer monitors. And there are very few CRT
televisions that can display full HD.
 
Mxsmanic

J. Clarke said:
And of course you are a better marketing guy than the
people at Sony.

Time will tell.
So why ain't you running a 50 billion dollar corporation if
you're such an expert?

I haven't been hired for such a position, and I don't own such a
corporation myself. Then again, I have no ambition to do this type of
work, either.
That's odd, it was my understanding that this year LCDs outsold CRTs for the
first time. Or maybe that was last year.

Neither, as I recall.
Television sets are not computer monitors.

Same technology.
And there are very few CRT televisions that can display full HD.

So?
 
Mxsmanic

J. Clarke said:
This is one of the silliest arguments I have ever seen.

In other words, you disagree.
Noise is not signal.

Yes. But the definition of both is arbitrary.
If your signal is dropping significantly below the noise threshold
then you've got a problem.

Not if there is no noise obscuring it.
I think you have the properties of analog _measurement_ systems confused
with the properties of analog _communication_ systems.

Same thing.
Generally the cost, but beyond that the grain size of the phosphor.

And how does the grain size compare to LCD pixel sizes?
At low energy levels in a cavity between multiple magnets.

Not unlike a vacuum tube.
 
J. Clarke

Mxsmanic said:
Time will tell.


I haven't been hired for such a position,

Why not, if you're so smart? What are you doing that pays better?
and I don't own such a
corporation myself.

Why not, o marketing genius? You being so much smarter than the people at
Sony and all it should be easy for you to build a business to that size.
Then again, I have no ambition to do this type of
work, either.

So what work do you have ambition to do? If it's anything in electronics,
don't quit your day job.
Neither, as I recall.

Well, I wouldn't expect you to get that right. Try googling "LCD CRT Sales"
and I think you'll find that your recollection is considerably in error.
Same technology.

So? Bicycle tires and car tires are "the same technology" but the fact that
steel-belted radials are making few inroads into the bicycle market does
not mean that they aren't selling well in the car market. The two markets
are different.

So they're clearly obsolescent if they can't display the current broadcast
standard.
 
