flat panel monitors


Bob Myers

J. Clarke said:
Still, I would think that the $8000 price tag was more of an obstacle to its
widespread acceptance than any concern over the manner in which it displays
Windows icons.

Oh, it definitely was, as was the limited number of graphics
systems which could drive this beast.

But I've seen Windows displayed on it, and it ain't pretty...:)

Bob M.
 

J. Clarke

Bob said:
Only in that the topic of higher display resolutions
has arisen. No one today would consider buying a
raster printer of less than 600 dpi (@ 1 bit depth).
"Photo" quality is widely considered to be at least
200 dpi (24b).

It would probably surprise most people, if they ran
the numbers, to discover that the majority of available
displays are "only" 100 dpi. The industry takes no pains
to point this out, of course :)


Let me restate that. The limitations on what you can
sell into the retail market, in any economic quantity,
are set by MS. Computer artisans might well want to
have something closer to "photo" res on screen, but
if the GUI/dialogs are unusable ...

No, the limitation on what you can sell into the retail market, in any
economic quantity, is how low you can get the price. Right now, high res
displays cost more than lower res displays and larger displays with a given
res cost more than smaller ones, and this has a lot more to do with the
ability of the manufacturers of the panels to tune their production
processes than it does with Microsoft.
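
For the record, the "100 dpi" figure quoted above is easy to check. A
quick Python sketch (the first two monitor sizes are illustrative; the
last is the IBM T221 discussed just below):

import math

def ppi(h_pixels, v_pixels, diagonal_inches):
    # Pixels per inch along the diagonal of a display.
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

print(round(ppi(1280, 1024, 17.0)))  # typical 17" LCD  -> ~96 ppi
print(round(ppi(1600, 1200, 20.1)))  # typical 20" LCD  -> ~100 ppi
print(round(ppi(3840, 2400, 22.2)))  # IBM T221         -> ~204 ppi
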
Specialty markets certainly exist. I'm speaking of
displays that CDW might offer.

Well, let's see. CDW offers Apple displays and Apples won't run any
Microsoft operating system.

Zounds. IBM doesn't make the technical details on the T221
very prominent on their web site, so I didn't see that. My
guess, of course, is that they don't want affluent
but otherwise ordinary end users buying these things
and then discovering the gotchas (starting with: who
makes a quad-link card? Matrox?)

There's a whole list of boards that can drive it on the IBM site--all
workstation boards IIRC.
 

Mxsmanic

Bob said:
How is this arbitrary?

Because anything can be either information or noise, depending on how
you define information in your specific application.
It's fundamental to information theory -
signal is what you want, and noise is what you don't want.

Sure, but "what you want" is your own arbitrary decision, not something
that is decided by the theory itself.
 

Mxsmanic

Bob said:
While this is certainly true, it is of academic interest
only. There is no practical display technology which
challenges the limits of human spatial acuity in all
situations. There is most certainly nothing available
for desktop monitors which does this, given the fact
that it's not all that hard to put my eyes six inches from
the screen.

The assumption is normal viewing distance, which usually means a
distance no less than the diagonal of the image.

With an image that measures 20" diagonally, a person with perfect vision
can resolve about 340 ppi, which corresponds to a screen resolution of
5440x4080. In practice, though, you can divide this by 2 or 3, which
comes remarkably close to 1600x1200 for a 20" monitor. And many people
seem to prefer significantly lower resolutions, for some reason.
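
The arithmetic behind those figures, as a quick Python sketch (assuming
a 4:3 screen, a viewing distance of one diagonal, and the 30-arcsecond
acuity limit that comes up later in the thread):

import math

view_dist = 20.0                        # inches; equal to the diagonal
acuity = math.radians(30 / 3600)        # 30 arcseconds, in radians
pixel = view_dist * math.tan(acuity)    # smallest resolvable pixel size
print(round(1 / pixel), "ppi")          # -> 344 ppi, i.e. "about 340"

# A 20" 4:3 screen measures 16" x 12", so at 340 ppi:
print(16 * 340, "x", 12 * 340)          # -> 5440 x 4080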
 

Mxsmanic

Bob said:
Again, a nice assertion, but there's no evidence,
theory, or reasoning to back it up.

Describe a non-analog display system.

True. But since the noise level cannot be zero even in
theory, this is about as meaningful in terms of discussions of
real-world systems as the time-honored frictionless surfaces
and massless pulleys of freshman physics.

One of the things it means is that the bandwidth of a channel is a
function of its S/N ratio. The greater you can expand that ratio, the
greater the bandwidth.

For example, a voice-grade telephone line that might only have 4800 Hz
of nominal bandwidth can carry much more than 4800 bps if you can get
the noise level low enough. Modems depend on this, which is how you can
reach 56 kbps on a voice-grade line, if it's quiet enough.

I must have missed the part where the phone company
magically removed noise from the standard telephone
line.

They haven't removed noise, but noise has declined considerably.

The standard voice subscriber line carries pretty much
the same bandwidth and noise specs today as it did thirty
years ago.

The same specs, but not necessarily the same noise.

For example, overseas telephone calls today are audibly less noisy than
they were decades ago, even though they have the same specs.
 

chrisv

Bob Myers said:
But this is not a short term trend. There is NO ONE in
the business today who expects the CRT to regain any
market share in the desktop monitor arena

Much as it saddens me, Bob is right. When you see companies like Sony
completely bailing-out of the CRT market, the writing isn't just on
the wall, it's written on your contact lenses, where it can't possibly
be avoided.
 

chrisv

Bob Myers said:
This year. Actually, several quarters ago. It was last
year if only certain markets (U.S., Europe, Japan) are
considered. This year, the switch occurred in the worldwide
numbers.

Is that in number of units, or in dollars?
 

chrisv

Bob Niland said:
Depending on how you define "best", as we saw with the
early debates about CD audio. Now that purists can get
48-bit 96 KHz digital audio, I don't see that debate
anymore.

Even though 16 bits is all that is needed anyway, and even the 44kHz
sampling is really a non-issue with modern, high-quality DACs used in
conjunction with over-sampling and/or up-sampling.

I admit that 24-bit/192 kHz is a good marketing tool! Me? I'll
pass and stay with CD - remarkably in the middle, with one end of the
market going toward 24/192k and the other end going toward MP3. 8)

Vinyl? While some may prefer its sound, it is NOT "better" than CD.
It's inferior in every measurable way.
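
The back-of-the-envelope numbers behind those claims, as a Python
sketch (the 6.02N + 1.76 dB rule assumes ideal quantization of a
full-scale sine wave):

# Nyquist: 44.1 kHz sampling captures content up to 22.05 kHz,
# just above the ~20 kHz limit of human hearing.
print(44100 / 2, "Hz")                  # -> 22050.0 Hz

# Ideal dynamic range of an N-bit quantizer:
for bits in (14, 16, 24):
    print(bits, "bits ->", round(6.02 * bits + 1.76, 1), "dB")
# 14 -> 86.0 dB, 16 -> 98.1 dB, 24 -> 146.2 dB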
 

Bob Myers

Mxsmanic said:
Bob Myers writes:

The assumption is normal viewing distance, which usually means a
distance no less than the diagonal of the image.

That's often a fair assumption, but you might be surprised how
various people use their displays.

With an image that measures 20" diagonally, a person with perfect vision
can resolve about 340 ppi, which corresponds to a screen resolution of
5440x4080. In practice, though, you can divide this by 2 or 3, which
comes remarkably close to 1600x1200 for a 20" monitor. And many people
seem to prefer significantly lower resolutions, for some reason.

The "screen resolution" (pixel format) you mention would be
correct if one is speaking of the human spatial acuity limit
for luminance information only (i.e., a "black and white"
image), and I assume you're using the common rule of thumb
of "60 cycles per visual degree" as that limit (it's not a hard
limit, but OK, that will do). The typical acuity in terms of
chroma information is significantly lower, which may be
what you meant in the "divide by 2 or 3" comment. And
while all of this analysis is correct, in terms of the ability to
distinguish the sort of details on which these original acuity
tests were run (basically, the ability to distinguish bright and
dark areas in a simple line pattern or better, a sinusoidal
variation), it doesn't necessarily mean that the typical viewer
will consider a 100-300 dpi image as "realistic." The issue
is considerably more complicated than that. This sort of
basic analysis IS useful in terms of determining practical limits
for display performance (i.e., the point beyond which additional
spatial information is likely not worth the cost), but it should
not be confused with saying that the display is at or beyond the
limits of human vision.


Bob M.
 

Bob Myers

Describe a non-analog display system.

Well, again, using the distinction between "analog" and
"digital" which I have been promoting all along, there are
a number of display systems which use what is effectively
a "digital" drive at the pixel level. The Texas Instruments
DMD (Digital Micromirror Device) comes to mind, and
it could be argued that most plasma displays would also
behave in this manner. You seem to favor more of the
common meaning of "digital" (in which "digital" essentially
equates to "binary"), and I would be curious to know how
you would say that such technologies are not "digital" in
that sense as well.

One of the things it means is that the bandwidth of a channel is a
function of its S/N ratio. The greater you can expand that ratio, the
greater the bandwidth.

Well, DUH. At least, "DUH" if you correct one error, and that
is to substitute "information capacity" for the commonly-misused
"bandwidth." ("Bandwidth," in its proper usage, has a very well-
defined meaning which has nothing to do with the noise level; it
is the range of frequencies occupied by the signal information, or
better, the range of frequencies over which a given channel can
deliver a signal without attenuating it to less than half its reference
or "midband" power level.) That's what I've been saying all along - you
DID look up the reference to Shannon, didn't you? To make
things simple, the Shannon formula for the capacity of any
band-limited channel in the presence of noise (read: "any
real-world channel") is

Capacity (bps) = BW * log2 (1 + S/N)

(Note: "log2" should be read as the base 2 logarithm; too
bad I can't do a subscript here.)

That rather clearly states that the capacity is dependent on
both bandwidth and the relative levels of signal and noise,
exactly as one would expect.
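
Plugging numbers into that formula makes the point concrete. A quick
Python sketch (the 3000 Hz figure is the voice-grade bandwidth given
just below; the S/N values are illustrative):

import math

def shannon_capacity(bandwidth_hz, snr_db):
    # Capacity in bps of a band-limited channel with the given S/N.
    snr = 10 ** (snr_db / 10)    # convert dB to a power ratio
    return bandwidth_hz * math.log2(1 + snr)

for snr_db in (30, 40, 50):
    print(snr_db, "dB ->", round(shannon_capacity(3000, snr_db)), "bps")
# -> roughly 30k, 40k, and 50k bps respectively
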
For example, a voice-grade telephone line that might only have 4800 Hz
of nominal bandwidth can carry much more than 4800 bps if you can get
the noise level low enough. Modems depend on this, which is how you can
reach 56 kbps on a voice-grade line, if it's quiet enough.

Here again, you confuse "capacity" with "bandwidth." A
voice-grade line ALWAYS has a minimum guaranteed
bandwidth of a given value (it's actually a lot closer to 3000 Hz
than 4800 Hz), AND it has an assumed typical noise level.
It is THOSE values which determine the capacity, per the above
formula. A modem does NOT magically reduce the noise
level on the line; instead, it adapts the information rate in use
to the conditions seen at the time (which is a large part of what
all that "negotiation" is before your dial-up connection is
finally established).

They haven't removed noise, but noise has declined considerably.

Nope. Well, OK, in some markets, it can be argued that
the average level of noise on the line is less today than it was
decades ago, an improvement which has come primarily through
the switch to digital transmission in much of the intermediate
path (with the quantization noise of the digital representation
significantly lower than the other noise sources previously
seen in these sections). But the improvement in modem performance
has not primarily come from improvements on the typical
voice line, but rather through the design of better ("smarter")
modems. There will not be significant advancement beyond the
current 56k level on standard lines, though, since that
performance is now very close to the Shannon limit, given
the standard line specs.

For example, overseas telephone calls today are audibly less noisy than
they were decades ago, even though they have the same specs.

No, they do not have the same specs. The "specs" in question
include minimum expected bandwidth AND the typical/maximum
permissible noise levels. I'm not talking about specs that you as
a customer would normally be seeing.

Bob M.
 

Bob Myers

chrisv said:
Is that in number of units, or in dollars?

Units. The crossover in terms of revenue happened some
time ago, due to the higher average selling price of the
LCD monitors.

Bob M.
 

Mxsmanic

chrisv said:
Even though 16 bits is all that is needed anyway, and even the 44kHz
sampling is really a non-issue with modern, high-quality DACs used in
conjunction with over-sampling and/or up-sampling.

Sixteen bits is probably enough, but 44.1 kHz is a bit tight.

Vinyl? While some may prefer its sound, it is NOT "better" than CD.

It theoretically could be better, like any analog recording method. In
practice, it never is.

But most music is ruined when it is mixed, anyway, so it doesn't matter
how it is recorded.
 

Mxsmanic

Bob said:
... I assume you're using the common rule of thumb
of "60 cycles per visual degree" as that limit ...

I'm using 120 cycles. Thirty seconds of arc is the usual recognized
limit of acuity.

In other words, divide viewing distance by 6875, and that's the size of
the smallest pixel that can be visually resolved at that distance.
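
That 6875 divisor follows directly from the 30-arcsecond figure; in
Python:

import math

# A pixel subtending 30 arcseconds at distance d has size d * tan(30"),
# so the divisor is simply 1 / tan(30 arcseconds):
print(round(1 / math.tan(math.radians(30 / 3600))))   # -> 6875
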
The typical acuity in terms of
chroma information is significantly lower, which may be
what you meant in the "divide by 2 or 3" comment.

I just meant that it requires really good viewing conditions to reach
the 30-second limit.

And
while all of this analysis is correct, in terms of the ability to
distinguish the sort of details on which these original acuity
tests were run (basically, the ability to distinguish bright and
dark areas in a simple line pattern or better, a sinusoidal
variation), it doesn't necessarily mean that the typical viewer
will consider a 100-300 dpi image as "realistic."

Perhaps, but I find that a very clean image displayed at that resolution
looks beautiful and extremely sharp. In most cases where the image
doesn't look sharp, it's because it isn't (small details are larger than
a single pixel).

For example, tonight I was amazed to see Apple Cinema Displays in a
store that were displaying really rotten, low-resolution JPEGs. How can
anyone appreciate the monitor if it's showing a fundamentally crummy
image? I'm surprised that stores don't find and display the biggest,
sharpest images they can get, in order to show monitors in the best
possible light.
 

Mxsmanic

Bob said:
Well, again, using the distinction between "analog" and
"digital" which I have been promoting all along, there are
a number of display systems which use what is effectively
a "digital" drive at the pixel level.

But all that really means is that the driving circuitry can't produce
pixels at more than a few discrete brightness levels.

The Texas Instruments
DMD (Digital Micromirror Device) comes to mind, and
it could be argued that most plasma displays would also
behave in this manner. You seem to favor more of the
common meaning of "digital" (in which "digital" essentially
equates to "binary"), and I would be curious to know how
you would say that such technologies are not "digital" in
that sense as well.

Display technologies typically can be driven in an analog (that is,
infinitely variable) way, but they aren't, when digital circuits are
behind them. DMDs are an exception, I think.

That's what I've been saying all along - you
DID look up the reference to Shannon, didn't you?

I read his famous paper long ago.
 

jjnunes

Mxsmanic said:
chrisv writes:
Sixteen bits is probably enough, but 44.1 kHz is a bit tight.

For playback only 16 bits is plenty. It's really <very> difficult to tell
the difference between 14 and 16 bits and often impossible. But 16 bits
gives some headroom. (or 'insurance' if you will)


Sample rate:

It's often claimed that vinyl has better HF bandwidth, but in practice that's
not really true. With precious few exceptions, vinyl is truncated at around
15KHz, otherwise there is risk of burning out the very expensive cutter
head. Vinyl also has summed mono in the bass, otherwise the records would
be untrackable.

It theoretically could be better, like any analog recording method. In
practice, it never is.

As far as signal integrity is concerned, vinyl is vastly inferior to CD. There
is simply no rational argument otherwise. Some folks prefer vinyl b/c some
of the euphonic distortions can sound pleasant and a skilled producer/recording
engineer can use those euphonic distortions to give the <illusion> of greater
overall fidelity due to the limitations of microphone pickup patterns,
loudspeaker colorations, technical limitations of soundfield reproduction
and so on. But it is very unpredictable, even discounting personal taste.

But most music is ruined when it is mixed, anyway, so it doesn't matter
how it is recorded.

Well, I prefer acoustic music, which ideally shouldn't be mixed, but as
with vinyl's euphonic distortions, in some cases adding some artificial
reverberation or accent mikes can make for a more convincing overall result.
Judicious use of modern digital reverberation can really help sometimes.
BTW, when doing digital signal manipulation, 24 bits and a good noise
shaper are an absolute requirement.
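
A minimal sketch of that last point, assuming NumPy (the tone frequency
and level are illustrative): requantizing to 16 bits without dither
simply erases detail below the last bit, while TPDF dither, a crude
stand-in for the noise shaping mentioned above, preserves it as benign
noise.

import numpy as np

rng = np.random.default_rng(0)
t = np.arange(44100) / 44100
step = 2 / 2**16                                  # 16-bit quantization step
tone = 0.4 * step * np.sin(2 * np.pi * 440 * t)   # a tone quieter than 1 LSB

# Plain truncation: the tone vanishes entirely.
plain = np.round(tone / step) * step
print("undithered output power:", np.mean(plain ** 2))     # -> 0.0

# TPDF dither before requantizing: the tone survives, buried in noise.
dither = (rng.random(t.size) - rng.random(t.size)) * step
dithered = np.round((tone + dither) / step) * step
print("tone/output correlation:",
      round(float(np.corrcoef(tone, dithered)[0, 1]), 2))  # -> ~0.5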
 

Bob Myers

Finally getting caught back up on this after being computer-free over
the holidays...

Mxsmanic said:
But all that really means is that the driving circuitry can't produce
pixels at more than a few discrete brightness levels.

Not really, any more than the opposite would be true for
an "analog" drive system, but let's hold that for a moment...

Display technologies typically can be driven in an analog (that is,
infinitely variable) way, but they aren't, when digital circuits are
behind them. DMDs are an exception, I think.

That you believe "analog" equates to "infinitely variable"
(when it not only does not, but CAN not) explains much of the
difficulty we've had in this thread.

Bob M.
 
