Adam Russell
I've heard that flat panel monitors are easier on the eyes. Is this true?
If so, then what would be a good choice for a 17" for gaming?
John Oliver said: just haven't decided yet if having a DVI interface is worth another $100
Michael said: No question, I wouldn't buy one without it.
Mxsmanic said: No question, I wouldn't buy one without it. [DVI-D]
Is there really that much difference?
Bob Niland said: Yes. Several considerations:

There is noise resulting from taking the original digital raster to analog and back to digital. This might display, for example, as horizontal artifacts, unstable picture regions, etc.

Square waves? No chance. Think of a pattern of alternating white/black 1-pix dots. In analog, these need to exhibit sharp transitions and flat tops to emulate what you get for free with DVI-D. Bandwidth limits in the analog channels are apt to smear this fine detail.
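To make the smearing concrete, here is a minimal numpy sketch that models the analog channel as a simple moving-average low-pass filter. The kernel width is an arbitrary assumption for illustration, not a measured bandwidth:

    import numpy as np

    # Alternating white/black 1-pixel dots, 8 "analog" samples per pixel.
    pixels = np.tile([255.0, 0.0], 32)
    analog = np.repeat(pixels, 8)                 # ideal flat-topped waveform

    # Stand-in for limited analog bandwidth: a moving-average low-pass.
    # Kernel width (12 samples = 1.5 pixels) is invented for illustration.
    smeared = np.convolve(analog, np.ones(12) / 12.0, mode="same")

    # Resample at pixel centers, as the monitor's ADC would.
    recovered = smeared[np.arange(64) * 8 + 4][2:-2]
    print(recovered.min(), recovered.max())       # 85.0 170.0, not 0 .. 255

The 0-255 dots come back as roughly 85-170: the reduced-contrast smear described above.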
Group delay with analog introduces some risk that the pixel data won't precisely align with the LCD triads upon reconstruction. Suppose the analog signal has a little group delay (time shift) from the DAC, or in the cable, or in the ADC (or just one of the colors does). Our hypothetical white and black dots might become a gray moiré morass.
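The per-channel case can be sketched too. Delaying just the red channel by a quarter pixel, an amount invented purely for illustration, shows the fringing mechanism:

    import numpy as np

    # Black-on-white line art: R, G and B carry identical waveforms,
    # 8 "analog" samples per pixel, low-passed to give sloped edges.
    pix = np.tile([255.0, 255.0, 0.0, 0.0], 16)   # 2 white px, 2 black px, ...
    wave = np.convolve(np.repeat(pix, 8), np.ones(6) / 6.0, mode="same")

    centers = np.arange(4, len(wave) - 8, 8)
    g = wave[centers]          # green (and blue) sampled on time
    r = wave[centers + 2]      # red arrives 1/4 pixel late (assumed)

    # Near an edge the channels now disagree: a colored fringe on
    # what should be neutral black/white.
    edge = int(np.argmax(np.abs(r - g) > 1.0))
    print(g[edge], r[edge])    # 255.0 vs 170.0 at the pixel beside an edge

Whether the fringe reads as pink or cyan depends on which channel is late; the point is that neutral edges stop being neutral.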
I just compared the same screen in HD15 analog and DVI-D digital on my LCD, and the analog image has less "contrast", text characters are not as sharp, and due to the grayscale tracking limits of this monitor, black characters on white backgrounds have a tiny annoying pink fringe in analog.

Go DVI-D. By the way, expect images and text to perhaps be startlingly sharper until you get used to it. The limitations of analog were providing some full-screen anti-aliasing at no extra charge.
Mxsmanic said: But the panel is just doing an analog-to-digital conversion anyway, and the connection is analog even when it's DVI-D, so doesn't it all just wash?

The image on mine is really sharp, it seems, and contrast is excellent. No artifacts even under a loupe. It makes me wonder how much better it could get.

Doesn't the panel correct the time base for incoming analog signals or something, in order to avoid this? Like the TBC in some video equipment? I don't know if my video card provides it.
Bob Niland said: No. With a DVI-D connection, the discrete pixel digital values are preserved from creation in the frame buffer by the graphics driver all the way out to the individual pixel drivers for the LCD triads. TMDS, as I recall: Transition-Minimized Differential Signaling. Ones and zeros. Everything is either a 1, a 0, or ignored.

As for correcting the time base, I'd like to think so, but I wouldn't assume it. Clearly, when we feed the monitor a non-native res, it cannot match pixels, because the rasters don't map.

Most cards provide DVI-I ports, which have both one link of DVI-D digital and RGB analog (sometimes called DVI-A, plus a breakout-to-HD15 cable for analog use). By DVI-D, I mean use the card's DVI port, and a DVI cable, and assure yourself that if both signals are present, the monitor is using the digital, and not the analog.
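Since TMDS came up: the transition-minimizing idea is roughly the following. This is a simplified sketch of the spec's first encoding stage only; the DC-balancing second stage and the actual serial link are omitted:

    def tmds_stage1(byte):
        # Chain the 8 data bits with XOR or XNOR, whichever yields fewer
        # 0->1 / 1->0 transitions; bit 8 records the choice so the
        # receiver can invert the process.
        bits = [(byte >> i) & 1 for i in range(8)]
        use_xnor = sum(bits) > 4 or (sum(bits) == 4 and bits[0] == 0)
        q = [bits[0]]
        for b in bits[1:]:
            q.append(1 - (q[-1] ^ b) if use_xnor else (q[-1] ^ b))
        q.append(0 if use_xnor else 1)
        return q

    # 0b10101010 toggles on every bit; after stage 1 it has 4
    # transitions instead of 7.
    print(tmds_stage1(0b10101010))   # [0, 0, 1, 1, 0, 0, 1, 1, 0]

Fewer transitions on the wire means less EMI and cleaner recovery at the receiver, which is the whole point of the encoding.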
Mxsmanic said: How many levels per pixel? An analog connection can have any number of levels, depending only on the quality of the connection and hardware. My card already generates 32-bit color, although my flat panel can't use all that resolution.

I'll look again and see if there's a DVI-D plug, but I rather doubt it.
Bob Niland said: I haven't looked at the DVI spec since 1.0, but at that time, single-link DVI was limited to 24 bpp, or 8 bits per R, G, or B. It's not clear to me that contemporary LCD panels can even deliver 24-bit color. They accept such signals, but what they paint on screen is another matter.

The DVI(-I) connector is a D-sub, usually white-bodied, with an 8x3 grid of pins, plus a 2x2 grid with cruciform ground planes for the RGB. If the connector is DVI-D (digital only), it omits the 2x2 grid and has only one of the ground blades.

If your card only has 15-pin D-sub(s) (usually blue), then it only has analog video out. You cannot use a pure digital connection, although you could invest in a monitor with DVI, and use it in analog mode until your next graphics card upgrade.
Mxsmanic said: 24-bit color isn't good enough for some applications. It doesn't provide enough levels in some parts of the gamut, such as blue (the eye is extremely sensitive to differences in intensity in the blue end of the spectrum, so any kind of posterization is very easy to spot).

True for many cheaper CRTs, too. But it worries me that the [DVI] standard apparently was designed in a way that permanently limits it to 24-bit color.
Bob said: That doesn't sound familiar at all.
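Mxsmanic's posterization worry, at least, is easy to demonstrate numerically. A small sketch quantizing a smooth dark-blue ramp to 8 bits per channel (the ramp endpoints are arbitrary choices for illustration):

    import numpy as np

    # A smooth dark-blue gradient across a 1024-pixel-wide region,
    # spanning only the bottom 1/8 of the channel's range.
    x = np.linspace(0.0, 0.125, 1024)
    blue = np.round(x * 255).astype(np.uint8)   # quantize to 8 bits

    print(len(np.unique(blue)))   # 33 distinct levels -> ~31-px-wide bands
    print(2 ** 24)                # 16777216 colors in 24-bit "truecolor"

Thirty-three steps across a thousand pixels is exactly the banding the eye picks up in dark gradients, no matter how many total colors the format advertises.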
Bob Myers said: Bob, as much as I hate to disagree with you, ...
For the most part, the differences between an analog and a digital interface for LCD monitors come down to questions of pixel timing, which really have nothing at all to do with whether the video information is in digital or analog form. (And please consider how truly awful the digital interface would be if the pixel clock information were removed from it - it would be totally unusable.)

Nope; all of the above have to do with the timing of the pixel sampling process, not with noise in the video.
(Oddly enough, the LCD is NOT inherently a "digital" device as is often assumed - fundamentally, the control of the pixel brightness in any LCD is an analog process.)

If we were talking about a display that actually shows those edges, you'd have a point - but the LCD doesn't work that way. Remember, we are dealing with a SAMPLED analog video stream in this case; if the sample points happen at the right time (which again is a question of how well the pixel clock is generated), the pixel values are taken right "in the middle" of the pixel times - making the transitions completely irrelevant.
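This sampling point can be sketched the same way as the earlier examples: with a correct pixel clock, the slopes are simply never sampled. The sample rate and the half-pixel phase error below are invented for illustration:

    import numpy as np

    # Alternating 255/0 pixels with sloped (non-square) transitions,
    # 10 "analog" samples per pixel.
    line = np.repeat(np.tile([255.0, 0.0], 16), 10)
    line = np.convolve(line, np.ones(5) / 5.0, mode="same")  # finite rise time

    good = np.arange(5, len(line) - 10, 10)   # samples at pixel centers
    bad = good + 5                            # half-pixel clock phase error

    print(line[good][:4])   # [255. 0. 255. 0.] - exact despite the slopes
    print(line[bad][:4])    # [102. 153. 102. 153.] - lands on the slopes

With the clock right, the transitions are indeed irrelevant; with it wrong, no amount of video bandwidth saves the picture.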
... the New Analog Video Interface standard, or simply NAVI. ... It's not clear yet how well NAVI will be accepted in the industry, but it IS available if anyone chooses to use it.
Bob Myers said: In either an analog or digital interfaced LCD monitor, there is typically a look-up table in the monitor's front end which converts these values into somewhat different ones, in order to correct for the rather S-shaped (as opposed to a nice CRT-like "gamma" curve) native response of the LCD.
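As a sketch of what such a table might do - the logistic S-curve standing in for the panel's native response and gamma 2.2 as the target are both assumptions for illustration:

    import numpy as np

    # Hypothetical native response of the LC cell: an S-curve (logistic),
    # normalized so drive 0 -> 0.0 and drive 255 -> 1.0 brightness.
    x = np.arange(256) / 255.0
    s = 1.0 / (1.0 + np.exp(-10.0 * (x - 0.5)))
    native = (s - s[0]) / (s[-1] - s[0])

    # For each input code, pick the drive level whose native brightness
    # best matches a CRT-like gamma-2.2 target.
    target = x ** 2.2
    lut = np.array([int(np.abs(native - t).argmin()) for t in target])

    print(lut[0], lut[128], lut[255])   # 0 96 255 - mid-codes pulled down

Feeding the panel lut[code] instead of code straightens the S into the familiar gamma curve.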
Bob said: Here's a lousy photo of a bulkhead with both DVI and HD15: