1080p LCD to PC, cabling: VGA / SVGA / UXGA

Dennis

I recently switched to using a 47" LCD as my monitor. It's 1080p so my
resolution is 1920x1080

I'm currently using a normal PC cable, the same one I used for my old
1024x768 monitor. Things look okay to me. But I note that there are
different kinds of PC cables (HD15 / D-sub). There are some called
VGA, then there is SVGA and UXGA. Each is supposedly intended for a
different resolution; I think I read that the VGA one is intended for
640x480, SVGA for 800x600, UXGA for 1600x1200, something like
that... etc etc

e.g.: http://www.infinitecables.com/vga.html

The questions: am I losing any video quality by using a normal PC
cable and not a UXGA one, or better still, a DVI to HDMI cable? Can
any VGA cable handle high resolutions?

Any links or specific facts would be greatly appreciated.
 
Calab

Dennis said:
I recently switched to using a 47" LCD as my monitor. It's 1080p so my
resolution is 1920x1080
[snip]
The questions: am I losing any video quality by using a normal PC
cable and not a UXGA one, or better still, a DVI to HDMI cable? Can
any VGA cable handle high resolutions?

VGA is VGA. Unless you are going LOOOONG distances, you won't see a
difference.

You would probably see a difference if you switched to DVI/HDMI though,
since this is a digital signal.
 
Flyer

Dennis said:
I recently switched to using a 47" LCD as my monitor. It's 1080p so my
resolution is 1920x1080
[snip]
The questions: am I losing any video quality by using a normal PC
cable and not a UXGA one, or better still, a DVI to HDMI cable? Can
any VGA cable handle high resolutions?

If your graphics card has a DVI output, use that with a DVI-to-HDMI
adapter and feed it into the new monitor.

P.
 
Dennis

Calab said:
VGA is VGA. Unless you are going LOOOONG distances, you won't see a
difference.

You would probably see a difference if you switched to DVI/HDMI though,
since this is a digital signal.


Can anyone confirm:

(1) It makes no difference which kind of VGA cable you use
(2) But using DVI/HDMI cable you would notice a difference vs VGA

Agree?
 
Benjamin Gawert

* Dennis:
(1) It makes no difference which kind of VGA cable you use

No, that's only true if your monitor is just shit or your eyesight is
like Stevie Wonder's. In all other situations the quality of the VGA
cable does make a difference, and an even bigger one at high
resolutions (over 1280x1024). Analog signals like VGA are quite
sensitive to attenuation, crosstalk and the other funny effects that
RF signals usually suffer from. A crap cable will very likely lead to
a blurry picture lacking sharpness; a good one won't.

But as with all analog signals, it also comes down to how good the
signal coming from your gfx card is. Most newer gfx cards have
average-to-awful analog signal quality because the manufacturers save
a few bucks on the analog output filters on the card.
(2) But using DVI/HDMI cable you would notice a difference vs VGA

Agree?

Yes.

But besides the cabling issue you should also check that your LCD TV
supports disabling overscan, as overscan leads to a blurry picture as
well. And no matter what the manual says the HDMI inputs are capable
of, use the *native* resolution of your LCD panel. If your TV is "HD
ready" and uses a 1366x768 panel, then use that resolution and not
1920x1080, as the latter will be downscaled, which also looks blurry
in desktop applications.
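
To see why a non-native mode ends up blurry, here is a quick
back-of-the-envelope sketch (plain Python, added for illustration;
the numbers are just the two resolutions Benjamin mentions):

# Scaling a 1920x1080 desktop onto a 1366x768 "HD ready" panel:
# each panel pixel must represent a non-integer number of source
# pixels, so the scaler has to blend neighbours, hence the blur.
signal = (1920, 1080)   # resolution sent over the cable
panel = (1366, 768)     # native resolution of the panel
for axis, s, p in zip(("horizontal", "vertical"), signal, panel):
    print(f"{axis}: {s} -> {p}, {s / p:.3f} source pixels per panel pixel")
# horizontal: 1920 -> 1366, 1.406 source pixels per panel pixel
# vertical: 1080 -> 768, 1.406 source pixels per panel pixel
# Only a 1:1 (native) mapping avoids this resampling entirely.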

Benjamin
 
Paul

Dennis said:
Can anyone confirm:

(1) It makes no difference which kind of VGA cable you use
(2) But using DVI/HDMI cable you would notice a difference vs VGA

Agree?

1) The old scheme for PC monitors used status pins on the interface
to help the monitor signal a particular resolution to the PC. Apple
used a similar scheme on their video interfaces.

http://www.monitorworld.com/faq_pages/q17_page.html

2) Modern display devices use a DDC interface. That is a serial digital
interface, with signal names like SCL and SDA. One signal is a clock
and the other signal carries data.

http://martin.hinner.info/vga/pinout.html

3) If there are working SCL and SDA pins and wires on the cable,
then a utility like this one can display what the monitor
is sending to the computer (the info is used by the video driver).

http://www.entechtaiwan.com/util/moninfo.shtm
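
What travels over those SCL/SDA wires is the monitor's 128-byte EDID
block, and it is simple enough to decode by hand. A minimal sketch
(plain Python; this parser and the dump file are illustrative, not
part of the utility above):

import sys

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def parse_edid(block: bytes):
    """Validate an EDID base block and return the preferred (native)
    resolution from the first detailed timing descriptor."""
    if len(block) < 128 or block[:8] != EDID_HEADER:
        raise ValueError("not an EDID base block")
    if sum(block[:128]) % 256 != 0:
        raise ValueError("EDID checksum failed")
    dtd = block[54:72]  # first detailed timing descriptor
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return h_active, v_active

if __name__ == "__main__":
    with open(sys.argv[1], "rb") as f:  # a raw EDID dump (hypothetical)
        w, h = parse_edid(f.read())
    print(f"preferred mode: {w}x{h}")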

If you go to the store today and buy a VGA cable, at a minimum it
should have RGBHV wires to handle the analog video and sync
signals, plus the two wires for SCL and SDA. Those
support the basics.

There are cables with no room for SCL and SDA. For example,
my old CRT monitor uses one of these cables. It was a beautiful
monitor in its time, a Sony Trinitron that supported multisync, but
since it used this cable there was no plug and play.
Your situation is unlikely to involve a cable like this;
cabling schemes like this may continue to be used on
projection TV devices.

http://www.monitorworld.com/m_images/Page_graphics/bnctovgaphoto.jpg

Modern VGA, DVI, and HDMI all have DDC serial interfaces on them,
and that means that the cable design can be generic. There shouldn't
be several different flavors of VGA cable needed now.

HDMI - SCL and SDA pins are listed.

http://en.wikipedia.org/wiki/Hdmi

DVI - In the picture here, the signals are called DDC Clock and DDC Data

http://en.wikipedia.org/wiki/Digital_Visual_Interface

VGA - In this article, there is no mention of the older "sense" definitions
of the pins. Just the SCL and SDA.

http://en.wikipedia.org/wiki/VGA_connector

As for signal quality, the failings of the cables happen in different ways.

On VGA (analog), the signal environment is 75 ohm coax for the RGB.
If there are reflections, or problems detecting the sync signals
cleanly, there can be visual artifacts like ghosting, or a blurry
picture. And as the cable becomes longer, only the lower monitor
resolutions are sharply rendered. So if I run a 100 foot VGA cable,
I might only get a sharp picture at 1024x768; if the cable is 6 feet
long, perhaps I can run 1600x1200 and expect it to be nice and sharp.
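
One way to see why resolution and cable length trade off like that is
the pixel clock the coax has to carry. A rough sketch (plain Python;
the ~25% blanking overhead is a typical ballpark, not a figure from
this thread):

# Approximate pixel clock for a few modes at 60 Hz refresh.
# Real VESA timings add blanking intervals; ~25% total overhead
# is used here as a rough rule of thumb.
BLANKING_OVERHEAD = 1.25
REFRESH_HZ = 60
for w, h in [(1024, 768), (1600, 1200), (1920, 1080)]:
    clock_mhz = w * h * REFRESH_HZ * BLANKING_OVERHEAD / 1e6
    print(f"{w}x{h}@60: ~{clock_mhz:.0f} MHz pixel clock")
# 1024x768@60:  ~59 MHz  (actual VESA timing: 65 MHz)
# 1600x1200@60: ~144 MHz (actual VESA timing: 162 MHz)
# 1920x1080@60: ~156 MHz (actual CEA timing: 148.5 MHz)
# Roughly triple the analog bandwidth of 1024x768 has to survive
# the same run of coax, which is why cable quality matters more
# at high resolutions.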

DVI and HDMI use the same method of transmission (the interface
standards have a lot in common). The interface is digital and
differential (two wires, D+ and D-, carry one information stream).
Now, as you make the cable longer, at first there is no visual
artifact at all. Digital doesn't degrade as long as the signal path
is working within its margins. So small increases in cable length
have no effect on resolution choices, or anything else for that matter.

But as the cable gets longer, the digital signal is attenuated by the
run of cable. Eventually the signal is too small to be detected well.
What appears on the screen is "colored snow". Each miscolored dot on
the screen represents a digital transmission error, due to poor
signal quality. The display will be "sparkly", because the position
of each errored dot is rather random.

Sometimes the snow happens even with a short cable. Some people have
bought new equipment, only to find the manufacturers (cheap bastards)
included an inferior HDMI cable with it. Changing the cable to a
higher quality one may fix the snow problem (of course, the price you
pay for the replacement cable may make your jaw drop). The Wikipedia
HDMI article suggests what range of cable lengths can reasonably be
expected to work.

With HDMI and DVI, the resolution and refresh rate affect the
data rate sent digitally across the cable. So in fact, when
"snow" happens, it is possible that the resolution and refresh
rate will affect the snow. To give you some idea just how fast
those signals are: when someone says the "clock" on the cable
is 165MHz, the differential data is actually going across the
cable at 1650Mbit/sec (ten bits are sent per "clock cycle").
Effectively about as fast as SATA I, only over longer cables.
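
Paul's arithmetic, spelled out (plain Python; the three-data-pair
aggregate is standard single-link TMDS, added here for context):

# TMDS link rate arithmetic for single-link DVI / HDMI.
# Each data pair carries 10 bits per clock cycle (TMDS encodes
# 8 data bits into 10), and there are three data pairs (R, G, B).
clock_hz = 165e6        # maximum single-link TMDS clock
bits_per_clock = 10     # 8 data bits encoded into 10 link bits
pairs = 3               # three differential data channels

per_pair = clock_hz * bits_per_clock    # 1.65 Gbit/s per pair
total = per_pair * pairs                # 4.95 Gbit/s on the link
payload = clock_hz * 8 * pairs          # 3.96 Gbit/s of video data

print(f"per pair:  {per_pair / 1e9:.2f} Gbit/s")
print(f"aggregate: {total / 1e9:.2f} Gbit/s")
print(f"payload:   {payload / 1e9:.2f} Gbit/s")
# SATA I runs at 1.5 Gbit/s, so each TMDS pair really is in the
# same ballpark as Paul says, just over much longer cables.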

Paul
 
