GeForce2 MX400 TV-Out (4 pins S-Video)

Red Cloud

I need 1024x768 TV-out, which most ATI AGP cards offer. I have to use
a GeForce2 MX400 on an Intel-chipset Asrock motherboard, since most
early ATI Radeon AGP cards are incompatible with Intel boards. The
problem is that the MX400's TV-out tops out at 800x600, and the
picture does not fill the TV screen the way it does with an ATI Radeon
card. That is what I like about the ATI card for multi-tasking, like
using it for DVD watching. I tried the Nvidia graphics utility but no
luck. Display Properties offers two resolution options (600x400 and
800x600), and neither zooms to fit the TV screen. I don't know whether
this is a graphics software problem or the GeForce2 card itself simply
is not good for DVD watching. I have only tried an early Nvidia
utility program and haven't used the latest one, since my card is an
old version.
 
Flasherly

Check the MX400 drivers. I did get some improvement with color
temperature issues (along with problems I didn't have with a prior ATI
board) on a motherboard's onboard GeForce chip when I updated the
drivers. I would avoid S-Video if at all possible, in favor of the
better options a VGA connector on the monitor gives.
 
Flasherly

Correction. I'm still using that ATI AGP 97/9600 series on this
computer. Going on 10 years old, with a similar 32" LCD via VGA (it
used to be in place of where the GeForce now is). A 37" LCD is on the
motherboard's GeForce now, although you still want VGA connectors over
S-Video. Much simpler, for compliance purposes, to run a digital
interface rather than the old analog standards. (I also run both
analog and a somewhat newer, separate audio signal encoded digitally
over a red-laser carrier inside an S/PDIF cable. Not your average amp
that handles that.)
 
Red Cloud

Nope, I don't have an extra LCD or LED. I like to use my big old Sony
analog TV, which has an S-Video input.
 
John Doe

A racist waste of energy...

 
Flasherly

Good cards in their time, those ATI Radeon AGP series. I had two or
three of the models.
 
Flasherly

Btw, ATI's AGP cards are also old enough to be eclipsed, in terms of
raw GPU processing power for encodes, by effectively any motherboard's
onboard video. I know mine (one of the last/latest in the series of
AGP Radeons) won't stand up to the newer inexpensive boards' onboard
GeForce, which I've run with both low-end single-core and decent
dual-core operation. Saw a 32" Toshiba LED for $199 last week -- newer
LCDs, many of the lesser brands especially, are now breaking like
hell. I suppose somebody got bored or forgot all about that 50K-hr.
materials longevity. Now, of course, what they're saying is LED
technology should hold up longer.
 
GMAN

LED refers to the backlighting; they are still just LCD panels,
backlit by LEDs.
 
Flasherly

Yea, I've got fluorescent tubes in all my LCD monitors -- tubes on all
sides surrounding a surface of transistor-to-transistor logic driving
"liquid" crystalline film arrays, a display 1360 units across by 768
tall natively, if not splitting or combining any two of them; each
exhibiting five states: clear/white, opaque/black, and within three
substrates of cyan, magenta, or yellow. I'll presume, for the sake of
discussion, that LEDs are practically the sole difference: better-
qualified light-emitting diodes, similarly arranged in outset banks to
supplant the base white that came from a gaseous filament in
yesteryear's LCDs. Not that I'm going anywhere in the foreseeable soon
-- all my LCDs flat-assed won't quit: 1) the earliest, a Samsung 19"
PC display with height/swivel adjustment, for word processing; 2) the
next, one of the first-made 32" Olevia monitor/TV LCDs, bought after
the first sub-$1000 pricings (I got it around $600); 3) the last,
within a year or two of the Olevia, a commercial-application NEC
monitor/TV sold for continuous use in airports, bus terminals, or
operating rooms using microscope- or camera-oriented diagnostic tools.
 
Paul

There are a number of effects to consider, before selecting a
high resolution on S-Video.

First, the NTSC or PAL standards for baseband, have limited bandwidth.
When a TV station transmits an "analog" signal, the bandwidth is about 6MHz.
I think the video content is set up, to fit within a 4MHz bandwidth,
with perhaps some appropriately sized guard bands on the side.

This affects the video resolution. The DAC on a video card may have
a stated bandwidth of 400MHz or so (when driving a signal over the VGA
connector), and the resolution that goes with that could be 2048x1536
at 60Hz or some other rather large resolution. That gives a rough idea
of how the channel bandwidth may affect the resolution and frame rate.
A 4MHz path simply can't be expected to pass a resolution like that.

If the TV type signal is band limited, to fit within 4MHz (even when
being sent to computer type equipment), then you can't expect the
equivalent resolution to be that high.
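
As a rough illustration of that mismatch, here is a back-of-the-
envelope sketch. The 25% blanking overhead and 60Hz refresh are
assumptions for illustration, not exact video timings:

    # Rough pixel-clock estimate; blanking overhead and refresh rate
    # are nominal assumptions, real modelines differ per standard.
    def pixel_clock_mhz(width, height, refresh_hz=60, blanking=1.25):
        return width * height * refresh_hz * blanking / 1e6

    for width, height in [(640, 480), (800, 600), (1024, 768), (2048, 1536)]:
        print((width, height), round(pixel_clock_mhz(width, height)), "MHz")

    # Even 640x480 at 60 Hz works out to a pixel clock on the order of
    # 20-25 MHz, far beyond what a ~4 MHz baseband TV path can carry
    # without heavy filtering.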

The TV signal uses interlaced scanning. 262.5 lines are painted
on the screen for the odd field, then the other 262.5 lines are painted
on the screen for the even field. The actual vertical resolution can't
really be higher than that, as the scanning process (the familiar
15 kHz whine the TV makes) is fixed. So no matter how you set the
resolution on the video card (640x480, 800x600, 1024x768), the
interlaced scanning pattern doesn't change. A TV set is not multisync
like a computer monitor is. It only works at that approx. 15 kHz rate.
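
For concreteness, here is the arithmetic behind that fixed scan rate,
using nominal NTSC numbers as an assumption (PAL differs):

    # NTSC scan arithmetic: the line rate is fixed regardless of the
    # desktop resolution the video card is set to.
    lines_per_frame = 525            # two interlaced fields of 262.5 lines
    frames_per_sec = 30000 / 1001    # ~29.97 Hz
    line_rate = lines_per_frame * frames_per_sec
    print(round(line_rate))          # ~15734 lines/s -> the ~15 kHz whine

    visible_lines = 480              # roughly what survives vertical blanking
    print(visible_lines)             # hard cap on vertical detail, whether
                                     # the card is set to 600 or 768 lines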

And then this brings up the issue of aliasing. The video card design
knows it's taking a "higher resolution thing" and stuffing it into
a display format with much lower limitations. On scan converters,
they use convolution (the mathematical process similar to a moving
average), to take 3, 5, or 7 adjacent lines, and use the information
to compute what should show up on a scan line. The video card has
to do something similar (and the video card may be fixed in terms
of its convolution steps - I don't know the details of where in the
video card that is done or how).
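
A toy sketch of that kind of line averaging (not the actual filter any
particular card or converter uses) shows why thin horizontal features
get diluted:

    # Each output line becomes a weighted average of a few adjacent
    # input lines, as a scan converter's vertical filter does before
    # resampling down to the TV's line count.
    def filter_lines(lines, kernel=(0.25, 0.5, 0.25)):
        half = len(kernel) // 2
        out = []
        for i in range(len(lines)):
            acc = 0.0
            for k, weight in enumerate(kernel):
                j = min(max(i + k - half, 0), len(lines) - 1)  # clamp edges
                acc += weight * lines[j]
            out.append(acc)
        return out

    # A one-pixel-high horizontal line gets smeared across its
    # neighbours, which is why line art looks thin and washed out.
    print(filter_lines([0, 0, 1, 0, 0]))   # [0.0, 0.25, 0.5, 0.25, 0.0]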

If you watch the output of an actual scan converter (we had one at
work), you'll find that horizontal lines look "thin", and the image
may look washed out. And you might blame that on the image processing,
including the convolution, for the result. A scan converter may support
input resolutions as high as 1600x1200, but that's really too high for
line art (computer output) to show properly.

So now, why do we use the higher settings?

First, depending on the TV set, there are the notions of "overscan"
and "underscan". A TV from 1950 might have a signal that scans 30%
past the edge of the screen. Only the center portion of the scan
process on the TV set was "linear", and the image at the edges of the
screen is thrown away. The signal splatters past the edge of the
screen. If you have an LCD TV set, it's perfectly linear and doesn't
need overscan to work. On the video card, there should be a control
for setting the overscan, so that the image is scaled properly for the
type of TV set. If it's a really old CRT TV set, you would want the
computer S-Video signal to have the overscan enabled.
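
The overscan control amounts to a scale factor. A minimal sketch, with
the 10% figure assumed purely for illustration:

    # Underscan compensation shrinks the image so it lands inside the
    # part of the raster the TV actually shows.
    def underscan(width, height, compensation=0.10):
        return (round(width * (1 - compensation)),
                round(height * (1 - compensation)))

    print(underscan(800, 600))   # (720, 540): the desktop is shrunk ~10%
                                 # so the edges aren't lost off-screen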

You select 1024x768, to attempt to squeeze more information on the
screen. You do that, when viewing computer program output, line art
and the like. Setting the scaling of the screen output
("DPI output" in the Display control panel), might also affect
that though. My screen might be set to 120 DPI, in an attempt
to make print larger on my LCD computer monitor. If I cranked down
the DPI setting, it might cause more text characters to be visible
in the TV screen at 800x600.
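
Roughly, glyph size in pixels scales with the DPI setting, so a lower
DPI leaves room for more text at 800x600. The numbers below are
nominal, just to show the effect:

    # 10 pt text rendered at two common Windows DPI settings.
    def glyph_px(point_size, dpi):
        return point_size * dpi / 72

    for dpi in (96, 120):
        rows = int(600 // glyph_px(10, dpi))   # rows fitting in 600 lines
        print(dpi, "DPI:", round(glyph_px(10, dpi), 1), "px glyphs,",
              rows, "text rows")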

With the resolution limitations, it's almost impossible to read
text on the TV screen. I've tried to use a TV set as a console
for a Linux box, and it was a miserable experience. I don't consider
it to be a practical solution for that purpose.

However, the human eye is quite tolerant of that sort of thing,
when displaying movie content. Even at 800x600, you can view a
movie, and get most of the information content.

Now, my experience with cheap TV sets is, I can get a better
looking movie output, if I actually connect the computer
output to a channel 3 RF modulator, then go through the RF
path on the TV set. Don't ask how that works! Or how it can
possibly look better. My TV sets don't seem to handle the
color well, over composite or S-Video, and yet if I go through
the antenna input instead, I get a usable image for movie
viewing. This is an example of an "RF modulator", that takes
a TV-type video signal, and sends it on channel 3 or 4. It
seems like the RF modulator has better processing of an
incoming TV signal, than the TV set does. At $29, you
could try this, and see if it looks better. Maybe your
TV sets work better than mine do, and you won't need this
path to get good results.

RF Modulator $29
http://www.radioshack.com/product/index.jsp?productId=2103095

So the only reason for switching to 1024x768 is that it "makes
more computer output, icons and text, appear on the screen".
For movie viewing, you might do just as well at 800x600, given
that the scanning process uses a relatively low resolution and
the "sharpness" of the signal is even poorer than that.

On TV sets, people would use things like this image, to
measure the actual sharpness. The idea is to determine
the "just-resolvable lines". And that number is quite
poor on a TV set. But for movie viewing, you aren't
resolving lines, and the human eye makes up for
the actual equivalent display resolution.

http://www.cdr-zone.com/forum/files/sharpness_134.jpg

Try adjusting the overscan/underscan. That should be
enough to get a useful image on the set. And don't expect
text and line art to render on the TV, because the
6MHz channel spacing determined a lot of the characteristics
of such transmission paths.

If you can use "component" output to the TV, such as YPbPr, that
has a much higher bandwidth and doesn't have the TV-type
limitations. Newer TV sets have inputs like that on the back.
The connector colors might look like RGB, but the signal format
is YPbPr. I doubt an MX400 has "component", but some newer
cards do. The very latest video cards no longer bother with the
DIN connector on the faceplate, and put other connectors there
instead (like DisplayPort) that are useless for this purpose.

http://en.wikipedia.org/wiki/YPbPr

Paul
 
Red Cloud

ATI Radeon card, S-Video output at 800x600 = fills the whole TV
screen by zooming.

Nvidia card, S-Video output at 800x600 = leaves blank borders on all
four sides of the TV.

The ATI card automatically fills the whole TV screen, but the Nvidia
card gives a clearer, more detailed picture. I don't know why the
Nvidia, with half the memory of the ATI and an older model, produces
a better picture. I don't know what's going on here. The big
difference is that the Nvidia is in an Intel-board computer with Intel
embedded video; the ATI card is in an AMD-board machine.
 
Paul

Actually, one of the reasons is the nature of the external TV driver chip.
There are various brands of chips, and you can Google on Chrontel
for an example.

http://nvidia.hardwareforumz.com/TV-ftopict32893.html

Another example.

http://en.community.dell.com/support-forums/laptop/f/3519/t/4567195.aspx

"For nVidia cards with a tv encoder from Chrontel (version CH7001-CH7008),
a third party utility, called TV-Tool, exists to switch to overscan."

Keep digging and you might find a solution.

HTH,
Paul
 
