Adjusting an analog VGA LCD monitor (was LCD Question)

Barry Watzman

To properly adjust the dot clock frequency and phase of an analog LCD
monitor, download this test program:

www.winsite.com/bin/Info?500000030936

or (same site)

http://ns.winsite.net/bin/Info?500000030936

This program is variously known as CRTAT, CRTAT2, and CRT Align
(crtalign), and was written by Stephen Jenkins in about 1992 or 1993.
It is a very old Windows 3.1 program written in Visual Basic, yet it
runs under XP just fine, absolutely perfectly in fact, even with
today's high resolution monitors. You do need VBRUN300.DLL (the
Visual Basic version 3 runtime DLL), which may or may not come with
the program depending on where you download it; if you don't have
VBRUN300.DLL, it can easily be found on the web.

This program is totally non-invasive; its "installation" makes NO
changes to your registry or to ANY system components or files. In fact,
if you just unzip the program and double-click the exe file, it will run
fine without actual "installation" (but the program and the help file
need to be in the same directory, and VBRUN300.DLL needs to be available
in \Windows\System).

To use the program for this purpose, after installation, select the
leftmost of the 3 functions in the "Test" group (or "resolution" in the
drop-down menu) and then check both "mode" check-boxes.

When you display this pattern, you should see an absolutely perfect and
uniform field of alternating (but very, very fine) black and white
vertical bars, each only a single pixel wide. If you see "moiré"
distortion or smearing, your display isn't adjusted correctly. Digital
monitors (with DVI interfaces) will always be "perfect". Analog
monitors will usually show an initial moiré distortion pattern until
they are adjusted (dot clock frequency and phase). In most cases,
perfect adjustment can be achieved (and is "remembered" by the display),
but in some cases it can't be. Note that the "auto" (auto-adjust)
function on almost all analog LCD monitors gets "close" but usually
does not reach the best possible adjustment.
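
(If you can't find the program, an equivalent pattern is easy to generate
yourself. Below is a minimal sketch in Python, assuming the Pillow imaging
library; the file name and resolution are my own placeholder choices, not
part of CRTAT. Display the result full-screen at 100% zoom.)

    # Minimal sketch: alternating black/white vertical bars, each exactly
    # one pixel wide. Requires Pillow (pip install Pillow).
    from PIL import Image

    WIDTH, HEIGHT = 1280, 1024   # placeholder: use your panel's native resolution

    img = Image.new("L", (WIDTH, HEIGHT), 0)   # 8-bit grayscale, all black
    px = img.load()
    for x in range(1, WIDTH, 2):               # every second column white
        for y in range(HEIGHT):
            px[x, y] = 255
    img.save("bars.png")                       # view unscaled, full-screen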

[On many monitors, the dot clock frequency adjustment is called
"Horizontal size" or "Width". Phase is usually just called "Phase".]

If you have an analog monitor and you don't use this program to adjust
your monitor, you are doing yourself a real disservice.

Two other comments:

First, you MUST run the video card only at the native pixel resolution
of the LCD panel. NO EXCEPTIONS OF ANY KIND ON THIS POINT, PERIOD. If
this makes things too small for your taste, DO NOT CHANGE THE
RESOLUTION; Windows has separate settings (font size and DPI scaling)
to make things bigger without changing the resolution.

Second, poor quality video cables are a huge issue with analog LCD
monitors. The problem itself is self-explanatory, but MOST of the analog
cables offered for sale are "poor quality". You can almost judge the
quality by the thickness of the cable. You want something significantly
larger than a number 2 pencil ... maybe even approaching the size of a
garden hose (there are 5 individual coax cables inside a good analog
video cable, and the larger their individual diameters, the lower their
loss and capacitance). Unfortunately, really good video cables are both
hard to find and expensive.
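
(For reference, the standard coax transmission-line formulas, textbook
physics rather than anything from the original post:

    C' = \frac{2\pi\varepsilon}{\ln(D/d)}
    \qquad
    \alpha_c \propto \frac{\sqrt{f}}{d\,\ln(D/d)}

where d is the center conductor diameter, D the shield's inner diameter,
ε the dielectric permittivity, and f the signal frequency. A fatter cable
allows both a larger D/d ratio, giving lower capacitance per meter, and a
thicker center conductor, giving lower skin-effect loss, which is what
matters at the ~100 MHz pixel clocks involved.)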
 
Michel R. Carleer

Another interesting program was provided (and maybe still is) with Philips
LCDs; it is called FPAdjust.
In addition to clock frequency and phase, it also lets you properly adjust
brightness and contrast and image centering on the screen.
One important thing to remember: the video clock frequency changes slightly
with temperature, so before making any adjustment, wait about half an hour
after switching on both the screen and the PC.

 
Benjamin Gawert

* Michel R. Carleer:
Another interesting program was provided (and maybe still is) with Philips
LCDs; it is called FPAdjust.
In addition to clock frequency and phase, it also lets you properly adjust
brightness and contrast and image centering on the screen.
One important thing to remember: the video clock frequency changes slightly
with temperature,

No, it doesn't. If it does, the gfx card is just defective.
so before making any adjustment, wait about half an hour
after switching on both the screen and the PC.

That's only necessary when using a CRT, and even there not because of a
changing video clock frequency.

Benjamin
 
Michel R. Carleer

Clock chips and quartz crystals do change slightly in frequency as they
heat up. That's plain physics. And for the LCD's clock to drift out of
phase with the card's clock, only a very, very small frequency shift is
sufficient.
Unfortunately, both the LCD and, to a greater extent, the graphics card do
indeed heat up during operation.
Of course, if you use a DVI interface this does not happen, as the DVI
cable transmits the video clock alongside the video signal, so the LCD
circuitry does not have to regenerate it.
 
Benjamin Gawert

* Michel R. Carleer:
Clock chips and quartz crystals do change slightly in frequency as they
heat up. That's plain physics.

Today, clock signals are generated by temperature-compensated PLL
circuits. None of the RAMDACs in ATI and Nvidia GPUs of the last 8 years
or so have a pixel clock shift of more than 1/100 Hz or so over the
complete temperature range.
And for the LCD's clock to drift out of phase with
the card's clock, only a very, very small frequency shift is sufficient.

Well, no. The analog circuitry in TFTs (and CRTs) uses a guided PLL to
lock onto the pixel clock. Small variations in the pixel clock just don't
matter and are corrected by the PLL circuitry.

This btw is different from TV sets where even slight frequency shifts do
cause noticeable distortion.
Unfortunately, both the LCD and, to a greater extent, the graphics card do
indeed heat up during operation.

They do, but this doesn't affect the image. If it does, then something is
not working as it's supposed to.
Of course, if you use a DVI interface this does not happen, as the DVI
cable transmits the video clock alongside the video signal, so the LCD
circuitry does not have to regenerate it.

Sorry, but that's nonsense. DVI doesn't transfer a video signal. DVI
uses a digital bus (TMDS) to transport data streams. These data streams
contain no video signal, but bitmapped image data divided into data
words. A totally different kind of beast.

Benjamin
 
Michel R. Carleer

Benjamin Gawert said:

Today, clock signals are generated by temperature-compensated PLL circuits.
None of the RAMDACs in ATI and Nvidia GPUs of the last 8 years or so have a
pixel clock shift of more than 1/100 Hz or so over the complete
temperature range.

What you are telling me here is that the pixel clock of a ~$100 gfx card
has a stability of the order of 1 part in 10 billion (the pixel clock is of
the order of 100 MHz or even more). That is comparable to the stability of
an early atomic clock serving as a time standard.
Sorry, I don't believe your numbers.
Quartz oscillators are usually stable to 1 part per million, unless they
are specially made to serve as secondary time standards. Then they are
stable to something like 1 part per 20 million, provided they are kept in
a temperature-controlled casing (or in a constantly worn wristwatch, the
body regulating the temperature in that case).
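
Putting numbers on this (my own arithmetic, using the figures quoted above):

    \frac{0.01\ \text{Hz}}{100\ \text{MHz}} = \frac{10^{-2}}{10^{8}}
    = 10^{-10} \approx 0.1\ \text{ppb},
    \qquad\text{whereas}\qquad
    1\ \text{ppm of } 100\ \text{MHz} = 100\ \text{Hz}.

So the claimed 1/100 Hz shift would be roughly four orders of magnitude
tighter than an ordinary ~1 ppm quartz oscillator.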
Well, no. The analog circuitry in TFTs (and CRTs) uses a guided PLL to
lock onto the pixel clock. Small variations in the pixel clock just don't
matter and are corrected by the PLL circuitry.

At the expense of a change in the relative phase of the input and output
signals. See below for why this is important.
This btw is different from TV sets where even slight frequency shifts do
cause noticeable distortion.


They do, but this doesn't affect the image. If it does, then something is
not working as it's supposed to.

As I said, even a slight change in the frequency will affect the image,
because you have to take into account the time taken by the receiving
circuitry to perform the analog-to-digital conversion. Of course, the
complete conversion must take place during the time of one pixel, and not
extend into the next one. Even though the LCD's PLL circuit will
compensate for changes in the frequency, the result is a change in the
relative phase, with the possibility that the A/D conversion takes place
during the transition between two pixels.
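
A back-of-the-envelope calculation puts numbers on the time scales
involved. The sketch below uses the standard VESA 1280x1024@60 timing; the
drift figures are purely illustrative, not measurements from this thread.
It computes the pixel period and how far the sampling point would drift
over one scanline for a given frequency error (the monitor's PLL re-locks
on each horizontal sync pulse):

    # Illustrative timing arithmetic (VESA 1280x1024@60 figures; the
    # drift values are hypothetical examples).
    PIXEL_CLOCK_HZ = 108.0e6        # standard SXGA pixel clock
    PIXELS_PER_LINE = 1688          # total pixels per line, incl. blanking

    pixel_period_s = 1.0 / PIXEL_CLOCK_HZ           # ~9.26 ns per pixel
    line_time_s = PIXELS_PER_LINE * pixel_period_s  # ~15.6 us per line

    for drift_ppm in (0.0001, 0.01, 1.0):  # 0.1 ppb, 10 ppb, 1 ppm
        # sampling-point error accumulated over one line, before the PLL
        # re-locks on the next horizontal sync pulse
        err_s = line_time_s * drift_ppm * 1e-6
        print(f"{drift_ppm:g} ppm -> {err_s / pixel_period_s:.5f} pixel per line")

Even a 1 ppm error moves the sampling point by only ~0.002 of a pixel per
line; what produces visible smearing is a static phase setting that parks
the sample near the transition between pixels.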
Sorry, but that's nonsense. DVI doesn't transfer a video signal. DVI uses
a digital bus (TMDS) to transport data streams. These data streams contain
no video signal, but bitmapped image data divided into data words.
A totally different kind of beast.

You have a very restrictive definition of a video signal. Talking about a
video signal in no way means that it is an analog signal. A video signal
is a signal that transmits video content, no matter whether analog or
digital.
Oh, and by the way, the DVI-I standard transfers both a digital *and* an
analog signal; DVI-D only transmits digital.

Highly interesting discussion anyway.
Michel
 
Benjamin Gawert

* Michel R. Carleer:
What you are telling me here is that the pixel clock of a ~$100 gfx card
has a stability of the order of 1 part in 10 billion (the pixel clock is of
the order of 100 MHz or even more). That is comparable to the stability of
an early atomic clock serving as a time standard.
Sorry, I don't believe your numbers.

Then just go ahead and do the measurements yourself. Or, if you have
access to them, look at the specification sheets of modern GPUs. You'd be
surprised how accurate these cheap thingies are.
As I said, even a slight change in the frequency will affect the image,
because you have to take into account the time taken by the receiving
circuitry to perform the analog-to-digital conversion. Of course, the
complete conversion must take place during the time of one pixel, and not
extend into the next one.

Nope, it doesn't (see below why).
Even though the LCD's PLL circuit will compensate for
changes in the frequency, the result is a change in the relative phase,
with the possibility that the A/D conversion takes place during the
transition between two pixels.

Of course you're right about the relative phase. However, jitter isn't
much of a problem. As the analog input comes in, it gets converted into
digital data (usually by a simple sample & hold circuit). The data samples
are then fed into a field memory (field buffer) which holds the complete
screen content for one field/frame. Only after this memory has been
filled completely is the content read out and passed to the display
logic. There is a certain time limit (depending on resolution) for the
memory to be filled; if this limit expires, the circuitry waits for the
beginning of the next frame to start over.

Since TFTs are field-oriented displays, jitter (within some limits, of
course) doesn't cause any problems. If the jitter becomes too big, the
time limit for the field buffer expires before it gets filled, and
nothing is passed to the display logic. If the jitter persists, the TFT
won't show any image (but it will probably show a message like "unknown
signal" or "out of range").
You have a very restrictive definition of a video signal. Talking about a
video signal in no way means that it is an analog signal.

Of course not, a video signal can be analog or digital. But there are
several standards (e.g. CCIR/ITU-R, ANSI) which define what a video
signal is. And packet-oriented transmissions like those over TMDS are not
a video signal, but just a data stream (which could of course contain a
video signal, but in the case of DVI doesn't).
A video signal
is a signal that transmits video content, no matter whether analog or
digital.

Well, no. There are several important differences between an analog
signal and a digital bus that uses a packet mechanism more comparable to
Ethernet, especially with regard to jitter.
Oh, and by the way, the DVI-I standard transfers both a digital *and* an
analog signal; DVI-D only transmits digital.

On DVI-I, the four pins for the analog signal carry the same signal as a
VGA connector (same signal, different connector). They are only there so
that analog displays can be used via DVI-VGA adapters or DVI-A cables
without wasting additional space on separate VGA ports.
Highly interesting discussion anyway.

That's for sure.

Benjamin
 
Michel R. Carleer

Benjamin Gawert said:

Then just go ahead and do the measurements yourself. Or, if you have
access to them, look at the specification sheets of modern GPUs. You'd be
surprised how accurate these cheap thingies are.

I won't be able to measure it, as I don't have a frequency meter with a
precision of 1 part in 10 billion at my disposal.
Nope, it doesn't (see below why).


Of course you're right about the relative phase. However, jitter isn't
much of a problem. As the analog input comes in, it gets converted into
digital data (usually by a simple sample & hold circuit). The data samples
are then fed into a field memory (field buffer) which holds the complete
screen content for one field/frame. Only after this memory has been
filled completely is the content read out and passed to the display
logic. There is a certain time limit (depending on resolution) for the
memory to be filled; if this limit expires, the circuitry waits for the
beginning of the next frame to start over.

Since TFTs are field-oriented displays, jitter (within some limits, of
course) doesn't cause any problems. If the jitter becomes too big, the
time limit for the field buffer expires before it gets filled, and
nothing is passed to the display logic. If the jitter persists, the TFT
won't show any image (but it will probably show a message like "unknown
signal" or "out of range").

You understand that I am not talking about jitter, but about very slow,
long-term frequency changes due to temperature changes.
Of course not, a video signal can be analog or digital. But there are
several standards (e.g. CCIR/ITU-R, ANSI) which define what a video
signal is. And packet-oriented transmissions like those over TMDS are not
a video signal, but just a data stream (which could of course contain a
video signal, but in the case of DVI doesn't).


Well, no. There are several important differences between an analog
signal and a digital bus that uses a packet mechanism more comparable to
Ethernet, especially with regard to jitter.
Of course analog and digital signals are different. But over the DVI
standard only video information is transmitted, so to me it is also a
video signal. But that's just a matter of word definitions, not a
fundamental issue of long-term frequency and phase stability.
 
