LCD TV as Monitor?


David Maynard

Conor said:
It's what I found on Google.

Ah. Well, that's the encoding resolution of a DVD, but they're sampling above
what NTSC can actually do, and then there's the matter of what a TV can do.

NTSC channel bandwidth is 6 MHz and the duration of a horizontal line is
63.556 microsec. However, that includes sync with the potentially visible
portion being 52.66 microsec (I say potentially because TVs overscan to
mask alignment errors so you don't see black bands on the screen edge).

Now, a "Hertz" is a swing from one polarity to the other and back so, if we
make a loose translation to 'pixels', that corresponds to 2 pixels, one
light and one dark, then the next cycle could be light and dark again, etc.
(in TV terms they talk about "lines" [one being light and the other dark]
of resolution because that takes into account horizontal scan jitter [it
won't be a vertical line if they don't line up all the way down the screen]).

Dividing it out, roughly 12 million pixels per second, max, over the 52.66
microsec visible portion is about 632 pixels and the number usually gets
rounded to 640.
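For anyone who wants to check the arithmetic, here's the back-of-the-envelope calculation in Python (the 'two pixels per cycle' translation is the loose one described above, not an exact NTSC figure):

```python
# Back-of-the-envelope NTSC horizontal 'pixel' estimate.
# Loose assumption: one cycle of video bandwidth ~ two pixels
# (one light, one dark).
bandwidth_hz = 6_000_000        # NTSC channel bandwidth
visible_line_s = 52.66e-6       # visible portion of one scan line

pixels_per_second = 2 * bandwidth_hz             # ~12 million 'pixels'/sec
pixels_per_line = pixels_per_second * visible_line_s

print(round(pixels_per_line))   # ~632, usually rounded up to 640
```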

A similar consideration of the vertical blanking interval gives a vertical
of 480-486 lines (pixels) out of the 525 total.

That's interlaced, however, and TVs have a difficult time getting every
line to interlace 'perfectly' in-between the previous frame's lines so the
visual effect is usually less than the 480. But, for the moment, let's say
it's perfect so the theoretical capability is 640x480, in MONOCHROME, I.E.
black and white.

Color (difference) information is quadrature modulated on a 3.58 MHz
subcarrier that's placed inside the monochrome signal, and the strange
frequencies (such as the nominal 60 Hz vertical rate actually being 59.94 Hz)
were chosen so that the color subcarrier information looks like perpetually
shifting 'random noise' in the B&W picture and, as a result, it becomes
'invisible' to the eye because it doesn't persist in any location long
enough for the eye to notice it.
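For the curious, the 'strange' numbers all fall out of a few exact ratios in the NTSC color standard; a quick sketch of the standard relationships (everything is locked to the 4.5 MHz sound carrier so subcarrier, line rate, and field rate stay interleaved):

```python
# Standard NTSC color frequency relationships.
sound_carrier = 4_500_000                 # Hz
line_rate = sound_carrier / 286           # ~15,734.27 Hz (was 15,750 in B&W)
field_rate = line_rate / 262.5            # 525 lines / 2 interlaced fields
subcarrier = line_rate * 455 / 2          # ~3.579545 MHz

print(round(field_rate, 2))               # 59.94, not 60
print(round(subcarrier))                  # 3579545 Hz, the '3.58 MHz'
```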

However, the bandwidth of the subcarrier color information is extremely
limited and nowhere even close to 640x480. It only appears reasonable
because the human eye is much more sensitive to intensity variations than
it is to color change and if you get up close to a TV picture you will see
that color 'smears' all over the place. It just looks decent, at normal
viewing distances, because your eye keys off the B&W content. I.E. if Billy
Bob is wearing a red jacket against a blue sky your eye tells you the red
of his jacket ends where the blue sky begins because it can 'see' the B&W
intensity change between his dark jacket and the bright sky but if you put
your eyeball up to the screen you'll see the red and blue terribly smeared
across the line of demarcation.

So, saying 640x480 resolution is wholly inappropriate when talking about a
COLOR picture.

Now, to get to what a TV can do we need to note one other thing, the sound
information is placed on a 4.5MHz carrier in the 6MHz video bandwidth. The
point being, we've got what the NTSC signal is *made* of but the TV has to
untangle all that stuff back out of it.

Unfortunately, the idea of the color chroma subcarrier looking like 'noise'
on a B&W set doesn't work on a color one because that 'noise' IS the color
information and would interact directly with the decoded version of it on
the screen. The 'simple' way, which early TV receivers used because complex
electronics was expensive, is to simply say to hell with it and roll off
the monochrome signal at 3MHz, under the 3.58MHz subcarrier, and feed the
upper part to the color decode circuitry. That, however, cuts the
monochrome bandwidth in half to 320 pixels per line.
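Running the same loose 'two pixels per cycle' arithmetic on the rolled-off signal shows where that 320 figure comes from:

```python
# Same loose estimate as before, but with luma rolled off at 3 MHz.
rolled_off_hz = 3_000_000
visible_line_s = 52.66e-6

pixels_per_line = 2 * rolled_off_hz * visible_line_s
print(round(pixels_per_line))   # ~316, half the ~632 figure, called '320'
```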

The fancy way is to use some kind of complex filtering, now that
electronics is 'cheap' and we can pop complex ICs all over the place, to
'comb' out the 3.58 MHz subcarrier, leaving in the rest. And that's what's
typically used nowadays, a "comb filter." It's not perfect either but it's
a hell of a lot better than rolling it off at 3 MHz. And then we've got to
get the 4.5 MHz sound out of it too.

The point is, you're going to lose some resolution off the theoretical
maximum, which isn't all that great to begin with, no matter what mechanism
you use to decode it.

I.E. NTSC, and TVs, stink as computer monitors.

So what can one 'gain' with a PC and a tuner card? Not much in the way of
the tuner because it's got to decode that lousy broadcast NTSC signal. A
'better' signal just ain't there.

What you *can* gain gets back to your 720x480 DVD encode. Assuming it's
encoded better than 'broadcast' you have better video than could come
through an NTSC signal. So now the problem is, what to display it on?

A computer monitor would be great as it has much better resolution than an
NTSC TV, and the signal in the PC can go straight to it uncorrupted.

The other alternative is to feed it to a TV/monitor with better resolution
than a 'normal' TV using an interface that is better than NTSC, like
component video. (Of course, a standalone DVD player could do the same thing)

You will even gain a slight amount, for DVD/VCD, using S-video, or even
composite video, because it doesn't have to go through the RF
modulation/demodulation phases but it's going to suffer from color chroma
subcarrier modulation and stripping just like broadcast video because
that's how NTSC works. Where you can potentially gain is if they don't
strictly limit to a 6 MHz bandwidth because you're on wire and don't have to
meet FCC channel assignments.

But there's not much you can do with a tuner card because it's dealing with
the same signal a TV tuner is. Unless you get into post processing filters
but, even then, you can't get more resolution than is in the signal to
begin with.
 

Conor

That's for DVD. TV broadcast in North America is low res until we all
have HDTV. I believe the UK has the highest broadcast TV res in the
world.

http://www.videohelp.com/forum/userguides/94382.php
720 X 576 (480 NTSC). Used by most DVD.

http://members.aol.com/ajaynejr/vidres.htm

http://members.aol.com/ajaynejr/vidcolor.htm

NTSC Broadcasts (composite): 120 lines best, 40 - 48 lines typical
for reddish orange and greenish blue; 40 - 48 lines for most other
color transitions.
Cheers for that. It made no mention of being DVD.
 

Fisher

Unless you get into post processing filters
but, even then, you can't get more resolution than is in the signal to
begin with.

Yea, and this is what the Viewsonic N6 does. It doesn't make much
difference but it does make a difference.
 

JAD

Fisher said:
That's for DVD. TV broadcast in North America is low res until we all
have HDTV. I believe the UK has the highest broadcast TV res in the
world.


HDTV is all but dead...............
 

T Shadow

well that's what I want to do as well. I sold my TV
awhile back in effort to downsize a bit

I want to use my PCs as a "TV"

Hence the question abt using an internal TV tuner such
as the Hauppauge cards

But I assumed an LCD display would be fine for that

Matter of fact I was thinking abt buying the big Dell
24" wide screen LCD display for this purpose

I have several computers with TV cards. Two of them now have LCD monitors. I
would suggest getting a monitor with at least a 700:1 contrast ratio.
Otherwise you'll constantly be adjusting contrast/brightness.
PQ of the NTSC signal starts to get ugly when you go above 1024X768 in a
standard aspect monitor. You really won't get any benefit, as far as the TV
goes, from a higher resolution unless you get a Digital TV card.
YMMV
 

me

I set up a Sony HS95P 19" LCD monitor recently with a WinTV card.

How you liking that monitor so far?

I like the design of the new Sony LCD monitors with the
adjustable foot and all
 

me

The Viewsonic N6 allows you to watch TV on your monitor without having
your computer on so no computer noise while watching TV. It also does
16:9 ratio too.

Are you happy with the unit above?
 

David Maynard

Any advice on what TV/monitor above?

As I don't have one, no. Look for one that explicitly states it is a
TV/Monitor and gives the supported resolutions.
 

Sail

You guys are confusing the lines of scanning (or for digital displays
the lines & columns of pixels) with the *resolution* a display is
capable of showing (or media capable of delivering).

The digital ratio of pixels for NTSC is 720 across and 480 down.
For analog it is 525 lines scanned horizontally, of which about 490 are
visible. For analog PAL (European standard) it is 625 scanning lines, of
which about 560 or so are visible.

RESOLUTION is a different matter altogether. Vertical resolution means
how many vertical lines can be discerned when scanned horizontally.
(Think about it. you have a bunch of vertical lines. It is the
horizontal scan which will discern between them.) The number of
"vertical lines of resolution" therefore has nothing to do with the
number of pixels arranged vertically.

The resolution of even analog signals is quite good if viewed on the
master tape, played off a time base stabilized vcr, viewed on a
broadcast control monitor (those used to cost about $5,000). But by the
time a signal is passed to affiliates, via cable or satellite,
rerecorded on inferior equipment, played back again and piggybacked on a
radio frequency, passed through a transmitter, received by an antenna at
your home (or by one at your cable company, then passed through miles of
cable with dozens of signal amplifiers) and finally received by your
cheap home tv and decoded from a radio frequency back to a video
signal... it is amazing we see anything at all. It certainly does not
match what we see when working with the Master.
 

David Maynard

You guys are confusing the lines of scanning (or for digital displays
the lines & columns of pixels) with the *resolution* a display is
capable of showing (or media capable of delivering).

Not so.
The digital ratio of pixels for NTSC is 720 across and 480 down.

There is no 'digital ratio' for NTSC broadcast video (the topic for
tuners). It's an analog system.

What you're describing is, as was said, the DVD encoding and that isn't the
same thing.
For analog it is 525 lines scanned horizontally, of which about 490 are
visible. For analog PAL (European standard) it is 625 scanning lines, of
which about 560 or so are visible.

The electron beam is going horizontally back and forth to make the
horizontal scan lines but how many of them it stacks going from top to
bottom is the vertical resolution (at best). And NTSC does 525 of them by
the time it goes from top to bottom (twice: 262.5 each half interleaved
with the other half).

RESOLUTION is a different matter altogether.

You've got it switched.
Vertical resolution means
how many vertical lines can be discerned when scanned horizontally.

No. "Vertical resolution" is how many scan lines there are, assuming
interleave is working perfectly.

How many "vertical lines can be discerned" is the horizontal resolution.
(Think about it. you have a bunch of vertical lines.

No, you think about it. 'We' didn't have a "bunch of vertical lines" till
you decided to claim that's what we meant when we didn't.
It is the
horizontal scan which will discern between them.) The number of
"vertical lines of resolution" therefor has nothing to do with the
number of pixels arranged vertically.

Only because you're jumbling the words midstream. No one meant "vertical
lines" as in lines going up and down on the screen. They meant the vertical
resolution is how many scan lines there are.

The resolution of even analog signals is quite good if viewed on the
master tape, played off a time base stabilized vcr, viewed on a
broadcast control monitor (those used to cost about $5,000). But by the
time a signal is passed to affiliates, via cable or satellite,
rerecorded on inferior equipment, played back again and piggybacked on a
radio frequency,

And there is the problem because the signal is encoded NTSC to get on that
radio frequency and NTSC bandwidth, hence the available resolution when
decoded, stinks.
passed through a transmitter, received by an antenna at
your home (or by one at your cable company, then passed through miles of
cable with dozens of signal amplifiers) and finally received by your
cheap home tv and decoded from a radio frequency back to a video
signal... it is amazing we see anything at all. It certainly does not
match what we see when working with the Master.

All of which is moot because you can't GET the 'master signal' with a PC
tuner card. It's stuck getting the same crap signal a TV is.
 

Sail

Dave, I have read your posts on the IDE Channels thread and agree with
your reasoning there. But here you are off base because of misuse of
terms that lead to misconceptions. Sorry. Please read to the end before
writing any reply.

Yes so.
There is no 'digital ratio' for NTSC broadcast video (the topic for
tuners). It's an analog system.

Analog video as seen in television sets and analog monitors has NO
pixels. They are only a factor in digital displays. Thus the term
"digital ratio" which is the number of columns vs rows.

What you're describing is, as was said, the DVD encoding and that
isn't the same thing.

I am not describing "DVD encoding" at all. I am talking about displays,
analog and digital.

The electron beam is going horizontally back and forth to make the
horizontal scan lines but how many of them it stacks going from top to
bottom is the vertical resolution (at best).

No it is not. You are confusing the number of horizontal scanning lines
with "vertical resolution", which is a measurement. (I have worked in
professional television production and post production for 25 years).
Vertical resolution is a measurement of the number of vertical lines
that can be resolved in a display. IOW, how many individual vertical
lines can be seen on a display. Thus "vertical resolution" depends on a
horizontal scan, passing across each line.
And NTSC does 525 of them
by the time it goes from top to bottom (twice: 262.5 each half
interleaved with the other half).



You've got it switched.

No, I have it right, as described above. I've made my living in this biz
for a long, long time.

No. "Vertical resolution" is how many scan lines there are, assuming
interleave is working perfectly.

Wrong. See above.

Perhaps better, consider the meaning of the term "resolution". If we
accept your erroneous interpretation for sake of discussion, how many
horizontal lines can be discerned in a 500 horizontal scan line display?
(we will exclude any consideration of lines only carrying broadcast tech
data apart from visual information). 500? Nope. That would be a solid
display. 500 black lines would be a black screen. 500 white lines would
be a white screen. "Resolution" means how many individual lines can be
resolved. Theoretically in this case the max answer would be 250 lines,
alternating black and white, so that the individual lines can be seen.
In real world applications this level is not reached.

How many "vertical lines can be discerned" is the horizontal resolution.

(Think about it. you have a bunch of vertical lines.

No, you think about it. 'We' didn't have a "bunch of vertical lines" till
you decided to claim that's what we meant when we didn't.

You may not mean it, but the professional world of video defines
vertical resolution as the ability to resolve individual vertical lines.
It's part of the SMPTE definitions and standards and has been for
decades. That's the world I come from and the standards we use.

Only because you're jumbling the words midstream. No one meant
"vertical lines" as in lines going up and down on the screen. They
meant the vertical resolution is how many scan lines there are.

Exactly. Read your words closely.

Take a breath because I am seeking understanding, not a fight.

Your last line reads, "They meant the vertical resolution is how many
scan lines there are."

Those are Two Different Things, which has been my whole point. "How many
scan lines there are" is, well, "how many scan lines there are".

"Vertical Resolution" is the ability to discern vertical details.

Those are two completely different issues. My posting was about the
correct definitions of those two issues.

The rest of your post quotes my comments on the loss of resolution when
passing through copied generations from the master down to copies,
thence through transmitters, into receivers, and eventual home display.
We have no apparent disagreement there.

And you note that:

"And there is the problem because the signal is encoded NTSC to get on
that radio frequency and NTSC bandwidth, hence the available resolution
when decoded, stinks."

Yes, and you are on the same page with me if you will consider your use
of terms here.

My point in talking about that procession is that the number of scanning
lines, or rows of pixels on a digital display, do not change. But the
quality of the picture degrades substantially - it loses resolution.
Because, once again, scanning lines and resolution being separate
matters. Resolution is a quality issue. Scanning lines, or digital pixel
ratios, are technical specifications. You can have a very high number of
scan lines, such as 1080 for HD, but still have a very low _Resolution_
image displayed (perhaps as low as "180 vertical lines of resolution").

So an NTSC system will not have "525 lines of vertical resolution". Nor
will PAL have "625 lines of vertical resolution". They do have those
numbers of "scanning lines". (Not all of which are for visual image
capture but that's a whole different discussion).

Hope this clarifies. When talking tech it is important to be careful and
precise in the use of terms. The short cut memory is that scanning
lines, or rows & columns of pixels, are a technical specification.
Resolution is a quality issue.

FWIW the confusion between the two is not new. Been going on among the
lay for about 60 years.
 

David Maynard

Dave, I have read your posts on the IDE Channels thread and agree with
your reasoning there. But here you are off base because of misuse of
terms that lead to misconceptions. Sorry. Please read to the end before
writing any reply.

I have read to the end and you're misinterpreting digital terms into analog
terms.


No, not so.
Analog video as seen in television sets and analog monitors has NO
pixels. They are only a factor in digital displays. Thus the term
"digital ratio" which is the number of columns vs rows.

Which is what I just said with "It's an analog system."
I am not describing "DVD encoding" at all. I am talking about displays,
analog and digital.

The number of pixels you mentioned was DVD resolution.

No it is not. You are confusing the number of horizontal scanning lines
with "vertical resolution", which is a measurement.

Not in the computer display world. "Vertical resolution" is how many pixels
are displayed top to bottom and that is one per visible scan line. E.g. the
original VGA 640x480 monitor displays 480 horizontal scan lines and that is
the '480' portion of the 640x480 resolution spec. And you can easily
calculate the minimum video card frame memory needed to create it, 640
pixels per line times 480 scan lines times the bits color depth. For a 256
color display, 1 byte per pixel, that's 640 x 480 = 307,200 bytes and you'd
need a 512k video card to do it (256k being too small).
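The frame-memory arithmetic above is easy to verify:

```python
# Minimum frame buffer for 640x480 at 256 colors (1 byte per pixel).
width, height, bytes_per_pixel = 640, 480, 1

frame_bytes = width * height * bytes_per_pixel
print(frame_bytes)                  # 307200 bytes
print(frame_bytes > 256 * 1024)     # True: 256k is too small, so 512k card
```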


(I have worked in
professional television production and post production for 25 years).
Vertical resolution is a measurement of the number of vertical lines
that can be resolved in a display. IOW, how many individual vertical
lines can be seen on a display. Thus "vertical resolution" depends on a
horizontal scan, passing across each line.

The very confusion is that you've worked for years in television production
and are trained on the 'lines' method of resolution, but no one here is
speaking of 'lines' in that context. That is the error you're making:
presuming that when someone speaks of 'lines' it has anything to do with
the 'lines of resolution' you are used to working with.

No, I have it right, as described above. I've made my living in this biz
for a long, long time.

Which is why you have it switched: The computer world isn't that biz and
the terminology is different.

Wrong. See above.

A computer monitor is not designed nor spec'd for 'broadcast TV', and
computer people do not use 'broadcast TV' terms when speaking of
"resolution." It is a different world than the one you are used to.

In the computer monitor world, the number of visible scan lines is, indeed,
the vertical resolution and not the 'discernible lines' you are used to.

Perhaps better, consider the meaning of the term "resolution". If we
accept your erroneous interpretation for sake of discussion, how many
horizontal lines can be discerned in a 500 horizontal scan line display?

Which is precisely the problem. 'Resolution' in the context of computer
displays is not measured by 'discernible lines'.
(we will exclude any consideration of lines only carrying broadcast tech
data apart from visual information). 500? Nope. That would be a solid
display. 500 black lines would be a black screen. 500 white lines would
be a white screen.

Yes, it would be. Which is irrelevant as it still takes 500 (vertical)
pixels, all the same illumination, to create that solid display and the
computer must process and send all 500 to create that 'solid display'.

"Resolution" means how many individual lines can be
resolved.

Not in the computer display world. "Resolution" is how many pixels can be
displayed and whether you make 'lines' with them, or not, is irrelevant.
Theoretically in this case the max answer would be 250 lines,
alternating black and white, so that the individual lines can be seen.

That is, indeed, how the broadcast TV industry determines 'lines of
resolution' but it is not how the computer display world speaks of resolution.

In real world applications this level is not reached.

How many "vertical lines can be discerned" is the horizontal resolution.

(Think about it. you have a bunch of vertical lines.

No, you think about it. 'We' didn't have a "bunch of vertical lines" till
you decided to claim that's what we meant when we didn't.

You may not mean it, but the professional world of video defines
vertical resolution as the ability to resolve individual vertical lines.
It's part of the SMPTE definitions and standards and has been for
decades. That's the world I come from and the standards we use.

I understand the 'TV' world you are in but that is not the one the computer
display is in.

When you see a 17 inch monitor spec'd with 1280x1024 maximum resolution
they are not speaking of your TV world 'lines'. They are speaking of pixels
and if they're all the same illumination then, yes, you'd have a 'blank'
screen of that illumination.

And to get the 1024 pixels of vertical resolution it takes 1024 scan lines,
one per pixel, (plus off screen and sync). And that is precisely how it is
sent to the monitor.

Exactly. Read your words closely.

Take a breath because I am seeking understanding, not a fight.

Your last line reads, "They meant the vertical resolution is how many
scan lines there are."

Those are Two Different Things, which has been my whole point. "How many
scan lines there are" is, well, "how many scan lines there are".

"Vertical Resolution" is the ability to discern vertical details.

Not in the computer display world it isn't or, rather, it's inherent in the
monitor's pixel resolution specification since those pixels can, of course,
be 'discerned'.

Those are two completely different issues. My posting was about the
correct definitions of those two issues.

The rest of your post quotes my comments on the loss of resolution when
passing through copied generations from the master down to copies,
thence through transmitters, into receivers, and eventual home display.
We have no apparent disagreement there.

And you note that:

"And there is the problem because the signal is encoded NTSC to get on
that radio frequency and NTSC bandwidth, hence the available resolution
when decoded, stinks."

Yes, and you are on the same page with me if you will consider your use
of terms here.

My point in talking about that procession is that the number of scanning
lines, or rows of pixels on a digital display, do not change. But the
quality of the picture degrades substantially - it loses resolution.

It shouldn't or, rather, doesn't have to. Vertical resolution is entirely
preserved as each scan line is converted one to one. I.E. first visible scan
line is converted to a digital scan line, second to the second,... up to
the maximum 'visible' of 480. There is no loss to interleave jitter,
phosphor blooming, or anything else.

Horizontal is affected by the sample rate (and clock stability) vs the
Nyquist frequency of the original content and the typical computer tuner
card is
sampling lower than optimum but, in theory, there's no reason why it
couldn't be sampled at, say, 2048 per horizontal line even though the video
content is lower than that. It isn't typically done in your 'normal' tv
tuner card because that would require some serious digital processing and
simply sampling at the corresponding horizontal pixel rate matching the
vertical scan lines, e.g. 640(720)x480, is a lot simpler as it translates
directly to the computer screen it's eventually going to. It doesn't need
'conversion' past the digitizing and de-interlace (which is why some cards
get really nasty and just pull 240).
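As a rough illustration of those sample rates (using the 52.66 µs visible-line figure from earlier in the thread; the 13.5 MHz comparison is the standard-definition digital video clock, added here only for context):

```python
# Sample clocks implied by the figures discussed above.
visible_line_s = 52.66e-6

rate_720 = 720 / visible_line_s     # ~13.7 MHz, close to the 13.5 MHz
                                    # clock standard-definition digital
                                    # video uses
rate_2048 = 2048 / visible_line_s   # ~38.9 MHz for the hypothetical
                                    # heavy-oversampling case

print(round(rate_720 / 1e6, 1))     # 13.7
print(round(rate_2048 / 1e6, 1))    # 38.9
```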

Because, once again, scanning lines and resolution being separate
matters. Resolution is a quality issue.

It isn't in computer displays or, rather, it's inherent in the pixel
resolution specification (barring how 'sharply' the pixels are displayed)

Put another way, both a 'low quality' and a 'high quality' 1280x1024
monitor will display 1280x1024 pixels but the 'high quality' one may be
'sharper' than the other. The 'high quality' monitor's tube might, in
theory, be capable of displaying more of your 'lines' but it can't because
of the scan rates. e.g. the video amplifier bandwidth is sufficient to do
1280 horizontal pixels and there are 1024 visible scan lines making up the
1024 vertical resolution, in both the 'high quality' and 'low quality'
monitors.

You simply never have the 'computer world' equivalent of your 'TV world'
case where "this crappy 1280x1024 monitor only does '400 lines'." Computer
displays are specified by pixel resolution and if it only does '400 lines'
then it isn't a 1280x1024 monitor.

Scanning lines, or digital pixel
ratios, are technical specifications. You can have a very high number of
scan lines, such as 1080 for HD, but still have a very low _Resolution_
image displayed (perhaps as low as "180 vertical lines of resolution").

You're speaking in TV terms again. A computer display spec'd at 1080 will
do 1080. What you send to it, e.g. a low resolution TV signal, is another
matter entirely.

Again, you simply don't have a computer world equivalent to the 'TV world'
case where vertical resolution is lost to jitter, overlapping scan lines,
or any of the other 'TV monitor' maladies. It wouldn't be a 1080 monitor.

So an NTSC system will not have "525 lines of vertical resolution". Nor
will PAL have "625 lines of vertical resolution". They do have those
numbers of "scanning lines". (Not all of which are for visual image
capture but that's a whole different discussion).

The point here is that when you hear people in the computer world speak of
'lines of vertical resolution', in that context, they do, indeed, mean the
scan lines since that is precisely how computer displays operate, how
they're spec'd, and is the meaning of 'resolution' in their context.

Hope this clarifies. When talking tech it is important to be careful and
precise in the use of terms. The short cut memory is that scanning
lines, or rows & columns of pixels, are a technical specification.
Resolution is a quality issue.

Hopefully you see what I am talking about too.
FWIW the confusion between the two is not new. Been going on among the
lay for about 60 years.

Except there weren't any PC monitors to cause the particular kind of
confusion we have here.
 
