Screen resolution vs DVD resolution

  • Thread starter: Mitch Gallant

Mitch Gallant

Typical computer aspect ratio is about 4:3, or 1.333 (e.g. 1024 x 768 pixels),
which is also about the same as typical standard TV set aspect ratios.

Why was the standard DVD-video resolution set at 1.5 (720x480)?

And SVCD is 480x480! How were these standards established?

- Mitch
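
(For the arithmetic behind those ratios: the short answer, as far as I know,
is that DVD pixels are not square. The 720x480 storage grid gets stretched on
playback to fill a 4:3 screen. A small illustrative Python sketch, using the
standard NTSC DVD figures; the simple ratio works out to 8:9:)

    from fractions import Fraction

    # Storage dimensions vs. intended display shape (standard NTSC DVD figures).
    storage_w, storage_h = 720, 480
    storage_ratio = Fraction(storage_w, storage_h)   # 3/2 = 1.5
    display_ratio = Fraction(4, 3)                   # what the 4:3 TV shows

    # Pixel aspect ratio: how much each stored pixel is squeezed so that
    # 720x480 fills a 4:3 screen.  display = storage * PAR
    par = display_ratio / storage_ratio              # 8/9, narrower than square

    print(f"storage ratio: {float(storage_ratio):.3f}")   # 1.500
    print(f"display ratio: {float(display_ratio):.3f}")   # 1.333
    print(f"pixel aspect : {par} = {float(par):.3f}")     # 8/9 = 0.889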
 
Until a few years ago, the two had nothing to do with each other. Well,
other than that most picture tubes (CRTs) were manufactured with the 4:3 ratio.

You have to keep in mind just how very different a computer display is from
a TV, and that they work in completely different ways. In fact, the
conversion from computer display to TV (NTSC) is one of the most demanding
conversions I can think of. Quality "studio grade" hardware to do this
starts at around $10K and goes up from there. (Our $200 or so PC video card
is NOT going to do that.)

A television works at about 30 frames a second. But in reality it draws 60
fields a second, because the signal is interlaced: first the even-numbered
lines are drawn, then the odd-numbered lines, and the two fields together
make up one frame. (This of course is based on 60-cycle-per-second AC
current. In other parts of the world it's different. In Europe it's 50
cycles a second, and they use the PAL standard for TV.)
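
(To make the field/frame relationship concrete, here's a small illustrative
sketch; the "scanline" strings are stand-ins, but the even/odd split is
exactly what interlacing does:)

    # Illustrative sketch of interlacing: one frame = two fields.
    # NTSC draws ~60 fields/sec, which pair up into ~30 frames/sec.

    frame = [f"scanline {n}" for n in range(480)]   # one full-height frame

    even_field = frame[0::2]   # lines 0, 2, 4, ... drawn in one pass
    odd_field  = frame[1::2]   # lines 1, 3, 5, ... drawn in the next pass

    assert len(even_field) == len(odd_field) == 240
    print("fields per second:", 60, "-> frames per second:", 60 // 2)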

Also keep in mind that there is a lot more to the TV picture than what you
see. Ever seen an old TV where the picture "rolled" and you had a band
between the two pictures? That band is the vertical blanking interval, a
data stream (in analog format) that can be and is used for all sorts of
things, such as closed captions. When you create an NTSC-compliant signal,
this too must be taken into account. Some apps do it well, others don't.

And finally, a TV doesn't really have a set resolution like a computer
display does. It can display a very low-resolution image as well as a
high-resolution image. But in no case will it ever have the resolution of a
computer screen, even a low-res computer screen.

Oops, one more. You can't really compare the two, but if you converted a
broadcast analog signal to digital and captured ALL of it, it would be
around 150 megs a second. Most video cards compress the heck out of the
signal (lossily) when you capture to a hard drive, even if you say "no
compression." I don't think there is a PC made that could handle the data
flow otherwise.
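
(For scale, the back-of-the-envelope arithmetic, with the hedge that the
"150 megs" figure above reads most plausibly as megabits per second;
uncompressed standard-definition video lands in that neighborhood:)

    # Rough data rates for uncompressed standard-definition NTSC video.
    width, height, fps = 720, 480, 30   # visible picture, approximately

    # 4:2:2 chroma subsampling at 8 bits/sample averages 2 bytes per pixel.
    rate_422 = width * height * 2 * fps
    print(f"4:2:2: {rate_422 / 1e6:5.1f} MB/s = {rate_422 * 8 / 1e6:5.1f} Mbit/s")

    # Full 8-bit RGB (3 bytes per pixel) for comparison.
    rate_rgb = width * height * 3 * fps
    print(f"RGB  : {rate_rgb / 1e6:5.1f} MB/s = {rate_rgb * 8 / 1e6:5.1f} Mbit/s")

    # Roughly 166 and 249 Mbit/s respectively -- either way, far beyond what
    # a consumer hard drive of the era could sustain without lossy compression.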

Yes, there really is a reason you can buy an entire 21" TV for $149 while a
decent 21" monitor costs 4 or 5 times that much. <g>


Austin Myers
MS PowerPoint MVP Team

Provider of PFCMedia http://www.pfcmedia.com
 
Thanks Austin,
Yes, I understand the difference in technology used by TVs versus computer
monitors. I was wondering more about how DVD authoring programmes handle the
mapping from, say, WMV to the video display when a DVD-video is generated.
This still seems a bit confusing to me in the DVD authoring conversion
tools.
- Mitch
 

That's like asking Coke for their formula. <g> All kidding aside, how it's
done is what separates the really bad apps from the really good ones.
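
(For readers who land here later: in broad strokes, a DVD authoring tool has
to rescale the source to 720x480, conform the frame rate to 29.97 fps
interlaced, and re-encode to MPEG-2 at a DVD-legal bitrate. A minimal sketch
using the ffmpeg command-line tool, whose "-target ntsc-dvd" preset bundles
those constraints; the filenames are hypothetical:)

    # Sketch: drive ffmpeg to turn a WMV into DVD-compliant MPEG-2.
    # Assumes ffmpeg is installed on the PATH; "-target ntsc-dvd" sets
    # 720x480, 29.97 fps, MPEG-2 video, and DVD-legal bitrates in one flag.
    import subprocess

    def wmv_to_dvd_mpeg2(src: str, dst: str) -> None:
        subprocess.run(
            ["ffmpeg", "-i", src, "-target", "ntsc-dvd", dst],
            check=True,
        )

    wmv_to_dvd_mpeg2("presentation.wmv", "presentation_dvd.mpg")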
 
