Hyundai L90D+ -- DVI problem??


Matthew.DelVecchio

Hello,

I just got the Hyundai L90D+, which comes with its own DVI cable --
kickass! However, I can't get it to work -- which sucks.

Is there anything special to it? I just plugged one end into the
panel and the other into my video card, yet the monitor reports no
signal.

My video card is an NVIDIA GeForce4 Ti 4400. I checked the display
adapter settings and didn't see any option for enabling the DVI
port. I'm running the 43.25 drivers, so I'm going to install the
71.89 drivers and see if that helps.

The analog signal works great, but I'd really like to get the 100%
digital solution going.


Thanks!
Matt
 

Matthew.DelVecchio

Got it. Perhaps it was the software; I installed the latest version
of the NVIDIA drivers, and then the DVI input showed up in the
adapter settings. Who knows!

Seems cool. It's hard for me to compare the two, though, as the
input toggle is buried so deep in the awkward controls.


Matt
 

DW

I have the same monitor. The difference is basically one of clarity;
with the DVI input, you're seeing the pixels as they 'should' be seen.
They're precise, and square.

Some people don't like this because it looks jaggier than they'd like;
I know people who prefer the analog cable, as it makes things look
slightly smoother.

I'm curious: how consistent is the brightness on yours? Could you
track down an app called Dead Pixel Buddy and give it a run? Mine's
_slightly_ darker at the edges, more so along the right, but it's
unnoticeable unless the screen is almost all one solid color.
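
If you can't track down Dead Pixel Buddy, a full-screen solid-color
test is easy to improvise. Here's a minimal Python/tkinter stand-in
(my own quick sketch, not the actual app): it fills the screen with
one solid color; click to cycle colors, Escape to quit.

    # Quick stand-in for a dead-pixel / uniformity test: fills the
    # screen with solid colors so dark edges or stuck pixels show up.
    import tkinter as tk

    COLORS = ["black", "white", "red", "green", "blue"]

    root = tk.Tk()
    root.attributes("-fullscreen", True)
    index = 0
    root.configure(bg=COLORS[index])

    def next_color(event):
        # Advance to the next solid color on mouse click.
        global index
        index = (index + 1) % len(COLORS)
        root.configure(bg=COLORS[index])

    root.bind("<Button-1>", next_color)
    root.bind("<Escape>", lambda e: root.destroy())
    root.mainloop()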

DW
 

Bob Myers

DW said:
I have the same monitor. The difference is basically one of clarity;
with the DVI input, you're seeing the pixels as they 'should' be seen.
They're precise, and square.

Some people don't like this because it looks jaggier than they'd like;
I know people who prefer the analog cable, as it makes things look
slightly smoother.

Ummm...the pixels on the panel are "square" and at a single
value no matter what; it is not possible to have shading or
blur across a given pixel. Unless you're seeing the result of
unstable or otherwise improper sampling (which would cause the
sampled pixel values to change from one frame to
the next, possibly resulting in a flickering appearance on edges),
just how would the analog input make things look "slightly
smoother" on a discrete-pixel display like an LCD?

Bob M.
 

DW

I hear what you're saying, and I agree that the pixels can't be
partially shaded or blurred. As I say, this is 3rd-party, anecdotal
stuff. But in theory, I see it as similar to other translations from
digital to analog - be it the transfer of signal from a video adapter
to a monitor or recording music from master to a CD. Your endpoints
(video adapter / LCD monitor) may be digital in each case, but any analog
conversion between the two (IMHO) _must_ result in some degradation in
quality, however marginal that may be.

LCD monitors are digital, but is it not the case that when you're using
VGA on an LCD the signal is transformed to analog? And if so, would
there not be some necessary loss in fidelity?

Myself, I went from a 19" CRT via VGA to a 19" LCD via DVI. Very
apples-and-oranges.

But in response to your point, there must be a quantifiable reason for
having DVI in the first place, some better clarity or preservation of
original signal or something (and which must be borne out in the
visible image on the display), which VGA does not allow for - otherwise
there'd be no reason to use it.

DW
 

Bob Myers

DW said:
I hear what you're saying, and I agree that the pixels can't be
partially shaded or blurred. As I say, this is 3rd-party, anecdotal
stuff. But in theory, I see it as similar to other translations from
digital to analog - be it the transfer of signal from a video adapter
to a monitor or recording music from master to a CD. Your endpoints
(video adapter / LCD monitor) may be digital in each case, but any analog
conversion between the two (IMHO) _must_ result in some degradation in
quality, however marginal that may be.

LCD monitors are digital, but is it not the case that when you're using
VGA on an LCD the signal is transformed to analog? And if so, would
there not be some necessary loss in fidelity?

Surprisingly, the answer is no.

First, LCDs are not, contrary to a popular misconception, fundamentally
"digital" devices, although the panel-level interface most typically is.
(Not always, though - there have been monitors that preserved the
video in analog form through to the pixel level.) The basic drive of
the LCD subpixels is analog (a varying voltage applied to the
column drive lines sets the "gray level" of the subpixels). It's true that
LCDs and other flat-panel technologies are fixed-format devices,
meaning they have a discrete X-Y array of pixels, but that's not the
same as being "digital." It just means that you need an accurate
sampling clock to assign the video information to the proper location
(which is true for either analog or digital video - try running a "digital"
interface with no clock, and you'll see what I mean).
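
For a sense of scale, assuming the L90D+ runs a native 1280x1024 at
60 Hz with the standard VESA timing (my assumption - check your
mode): on the VGA input the monitor sees only sync pulses, and its
PLL has to multiply hsync back up to the pixel rate before the ADC
can sample each pixel in the right place.

    # Pixel-clock arithmetic for VESA 1280x1024@60 (standard DMT timing).
    # On the VGA input the monitor sees only hsync; its PLL must multiply
    # that by h_total to recover the pixel-rate sampling clock.
    h_active, h_total = 1280, 1688   # visible pixels, total incl. blanking
    v_active, v_total = 1024, 1066
    refresh_hz = 60

    pixel_clock = h_total * v_total * refresh_hz
    hsync_rate = pixel_clock / h_total

    print(f"pixel clock : {pixel_clock / 1e6:.1f} MHz")   # ~108.0 MHz
    print(f"hsync rate  : {hsync_rate / 1e3:.1f} kHz")    # ~64.0 kHz
    print(f"PLL multiple: x{h_total}")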

Digital signals can generally be communicated with no loss of
fidelity, IF the discrete levels which define the various bits of
the transmission are guaranteed to always be above the noise
level. (Generally, this is not perfectly met, which is why most
digital transmission protocols include error detection at least,
if not error correction.) But there is not NECESSARILY a loss
of information in making the transition from "digital" to "analog"
forms. If, as is most often the case with digital interfaces for
monitors, there's only 6 or 8 bits per color per pixel to begin
with, it's not all that difficult to maintain the requisite degree of
accuracy in the analog domain. While it's true that any analog
transmission channel inevitably adds noise to the information,
this does not affect the accuracy of the recovered information
as long as the cumulative effect of the noise remains below the
effective LSB value for the information being conveyed.
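
A toy model of that last point, with numbers of my own choosing
(VGA's nominal 0.7 V video swing carrying 8-bit levels, and channel
noise held under half an LSB), shows exact recovery:

    # Toy model: an analog link adds noise, but as long as the
    # accumulated noise stays below half the effective LSB, requantizing
    # at the receiver recovers every 8-bit value exactly.
    import numpy as np

    rng = np.random.default_rng(0)
    FULL_SCALE = 0.7                 # VGA's nominal 0.7 V video swing
    LSB = FULL_SCALE / 255           # ~2.75 mV per 8-bit step

    codes = rng.integers(0, 256, size=100_000)     # original pixel values
    volts = codes * LSB                            # DAC: digital -> analog

    noise = rng.uniform(-0.49 * LSB, 0.49 * LSB, size=codes.size)
    received = volts + noise                       # noisy analog channel

    recovered = np.clip(np.rint(received / LSB), 0, 255).astype(int)  # ADC
    print("errors:", np.count_nonzero(recovered != codes))  # -> 0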

DW said:
Myself, I went from a 19" CRT via VGA to a 19" LCD via DVI. Very
apples-and-oranges.

Agreed. Oddly enough, purely digital-interface CRT displays
have been tried (and might eventually have managed to get
some market share, had it not been for the fact that CRT
monitors in general have been displaced by LCDs and,
eventually, other technologies).

DW said:
But in response to your point, there must be a quantifiable reason for
having DVI in the first place, some better clarity or preservation of
original signal or something (and which must be borne out in the
visible image on the display), which VGA does not allow for - otherwise
there'd be no reason to use it.

Yes, there are reasons for digital interfaces, just as there are
reasons for analog interfaces. I spent quite some time serving
on the committee which developed and supported the DVI
specification in the first place, so I'm not "anti-digital" at all.
I'm just trying to point out that the reasons for using either of
these sorts of interfaces aren't quite as clear-cut as many think,
and often are completely different from those we might first
imagine.

In the case of digital display interfaces, right now the leading
reasons driving what adoption has happened (beyond the
perception, often in error, of improved image quality) have to
do with the relative ease of integrating digital outputs into the
graphics controllers, the ability to more easily integrate audio
and other supplemental information into the video interface
(without requiring additional conductors, etc.), and the ease with
which content-protection systems can be applied to a digital
data stream. The disadvantages of the current digital interfaces,
at least, include relatively limited data capacity (bandwidth)
vs. a decent analog connection (which makes sense - look at
how many bits you get to transmit in a given time period for
an "analog" signal vs. one that can only be in one of two states)
and, generally, some difficulty achieving acceptable results over
longer cable runs. Up to at least the UXGA format, even the basic
VGA interface can provide very good image quality, and it could be
improved even further with some simple techniques that have already
been described in various industry standards.
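
For what it's worth, the arithmetic behind the UXGA remark, using
the standard VESA 1600x1200 at 60 Hz timing and single-link DVI's
165 MHz TMDS clock ceiling:

    # Where "up to at least UXGA" comes from: single-link DVI tops out
    # at a 165 MHz pixel clock, and UXGA at 60 Hz sits just under that,
    # while RAMDACs on VGA outputs of the era ran well past it.
    h_total, v_total = 2160, 1250   # VESA 1600x1200@60 totals, incl. blanking
    refresh_hz = 60

    uxga_clock = h_total * v_total * refresh_hz
    print(f"UXGA pixel clock: {uxga_clock / 1e6:.1f} MHz")   # 162.0 MHz
    print("single-link DVI : 165.0 MHz ceiling")
    print(f"headroom        : {(165e6 - uxga_clock) / 1e6:.1f} MHz")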

Bob M.
 

DW

Well - fascinating read - a little low-level for my wittuw bwain to
absorb it all, but it does put some things in perspective.

Thanks for that!

DW
 
