What determines monitor resolution?


Walter R.

I am running Windows XP SP3, and it is working fine. My computer is eight
years old, with an Intel Pentium 4 at 1500 MHz.

I use a 19" LED monitor and have set it to run at 1024x768 resolution in
the Windows desktop properties.
The maximum resolution Win XP will let me run is 1280x1024.

I am using an nVidia GeForce4 MX 4000 PCI video card. (The video card that
came with the computer gave up the ghost, and I replaced it with this one.)

My question is: I would like to buy a 22" LED widescreen monitor (a Hyundai
on sale at $149). It has a native resolution of 1680x1050.

Will my current card and the new monitor let me increase the resolution to
1680x1050? What determines the resolution capability: the monitor, the video
card, the driver, or XP?

I would hate to buy a new monitor and then have to return it.

Thank you
 

smlunatick


Resolution is controlled by a few different things. First, the video
card's specification states its maximum resolution. Second, the video
card drivers tell Windows how to access the card's maximum resolution.
Third, the LCD monitor's internal logic stores its supported resolutions
in the EDID.
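
To make the EDID point concrete, here is a hedged sketch of pulling the native mode out of an EDID block. The byte offsets follow the published EDID 1.3 detailed-timing layout, but `SAMPLE_EDID` and the function name are my own constructions for illustration, not a dump from any real monitor:

```python
# Sketch: decode the first detailed timing descriptor of an EDID block,
# which by convention describes the monitor's preferred (native) mode.
# SAMPLE_EDID below is a hypothetical fragment, hand-built for the demo.

def parse_first_dtd(edid: bytes):
    """Return (width, height, pixel_clock_mhz) from the first DTD."""
    d = edid[54:72]                                     # first 18-byte descriptor
    clock_mhz = int.from_bytes(d[0:2], "little") / 100  # stored in 10 kHz units
    h_active = d[2] | ((d[4] & 0xF0) << 4)              # low 8 bits + high nibble
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    return h_active, v_active, clock_mhz

# Hypothetical 128-byte EDID whose first descriptor encodes 1680x1050
# with a 119 MHz pixel clock (roughly reduced-blanking timing for 60 Hz).
SAMPLE_EDID = bytes(54) + bytes(
    [0x7C, 0x2E,        # pixel clock: 0x2E7C = 11900 x 10 kHz = 119 MHz
     0x90, 0x00, 0x60,  # h active: 0x690 = 1680 (low byte, blanking, high nibble)
     0x1A, 0x00, 0x40]  # v active: 0x41A = 1050
) + bytes(66)

print(parse_first_dtd(SAMPLE_EDID))   # (1680, 1050, 119.0)
```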

Check nVidia's web site for the specifications of the GeForce4 MX 4000
video card and for its video card drivers.

As for the LCD, if you connect it to a DVI connector on the video
card, you will not be able to set the resolution. You might be able
to set the resolution if you can use the VGA port of the monitor, if
there is one.
 

SC Tom


"As for the LCD, if you connect it to a DVI connector on the video
card, you will not be able to set the resolution. You might be able
to set the resolution if you can use the VGA port of the monitor, if
there is one."

That's not correct. I have an LCD monitor connected to the DVI port on my
nVidia 9800GT, and I can change the resolution at will. Maybe some are
locked and some aren't?

SC Tom
 

Jasper


The maximum resolution for the 64 MB card is 2048 x 1536. If you have one
with more RAM, all the better. Also, the new monitor will come with a driver
disk to set up the monitor for your system.
 

Paul


There are some sample specifications for an MX4000, on this page.

http://www3.dealtime.com/xPF-Evga-e...-DDR-Video-Card-w-TV-Out-Retail-Free-2nd-Day-

Many of the MX4000 examples I could find had VGA connectors on them. The
advert above mentions "RAMDAC Speed 350 MHz" and "Max. Screen Resolution 2048 x 1536".

It is not actually a speed; it is RAMDAC bandwidth. The bandwidth of a
signal determines, in part, how fast the signal can rise or fall. To make
sharp pixels on the screen, the signal has to rise and fall quickly. There is
a relationship between the claimed signal bandwidth and the maximum resolution
supported, so in principle, given one of those pieces of information, a
person could work out the other.
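
That relationship can be roughed out numerically. This is a back-of-envelope sketch, not real GTF/CVT timing math: the 1.35 blanking-overhead factor and the helper name are my own assumptions, chosen only to show the shape of the calculation.

```python
# Rough estimate: analog blanking adds roughly a third on top of the
# active pixel count, so required pixel clock ~= w * h * refresh * 1.35.
# The overhead factor is an assumption, not a standard timing formula.

def approx_pixel_clock_mhz(width, height, refresh_hz, overhead=1.35):
    return width * height * refresh_hz * overhead / 1e6

RAMDAC_MHZ = 350  # advertised RAMDAC limit for the MX4000

for w, h in [(1680, 1050), (2048, 1536)]:
    need = approx_pixel_clock_mhz(w, h, 60)
    verdict = "OK" if need <= RAMDAC_MHZ else "too fast"
    print(f"{w}x{h}@60: ~{need:.0f} MHz needed -> {verdict}")
# Both modes land under 350 MHz, consistent with the advertised
# 2048 x 1536 maximum for this card's VGA output.
```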

Even with that information though, there is the issue of the quality of the
connection. At extremely high resolution settings, a VGA connection to a
monitor gives a less than perfect picture. There could be reflections
or distortion on the cable, and the results might visibly affect the
picture. The picture might be a bit fuzzy, or have other imperfections.
The thing is, with VGA, every imperfection counts, as they may all
still be visible by the time the signal gets to the monitor.

The DVI digital connection offers the ability to keep the signal perfect
from video card to monitor. There can still be imperfections, but they
don't count unless they are severe enough. For example, if you use
too long a DVI cable, you start to see "colored snow" on the screen.
Each dot of colored snow is a transmission error. As you shorten the
cable, such that it is in spec, the picture improves until each pixel
received is exactly the right value. DVI tends to have a sharper
split between perfect operation and not-so-perfect operation; VGA
is bad all the time, comparatively speaking.

There is a summary of DVI resolutions supported here. I'll pick a couple
examples from the table.

http://en.wikipedia.org/wiki/Digital_Visual_Interface

"Example display modes (single link):

* HDTV (1920 × 1080) @ 60 Hz with CVT-RB blanking (139 MHz)"

A single-link DVI connector uses only half the pins that fit in the
connector shell. The 1920 x 1080 mode is an example of the limit for
this interface. The "139 MHz" is the clock rate on the cable;
it can go up to 165 MHz. The actual data carried on the wires
is ten times faster, so if the cable is operated at 165 MHz, the
RGB signals work at 1650 megabaud serially.

R     0123456789      10 bits on the cable give 8 usable data bits
G     0123456789
B     0123456789      End result: 165 million 24-bit pixels per second
        ____      ____
Clock _|    |____|    |____
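
The single-link arithmetic above can be spelled out. Nothing here goes beyond the figures already quoted (165 MHz link clock, 10 bits per channel on the wire, 8 usable, three channels); the variable names are mine:

```python
# Single-link DVI (TMDS) arithmetic: three colour channels each carry
# 10 bits per link clock, of which 8 are usable data bits, so one
# 24-bit pixel moves per clock cycle.

link_clock_mhz = 165      # single-link DVI ceiling
bits_per_channel = 10     # bits on the cable per clock
usable_bits = 8           # data bits after TMDS decoding
channels = 3              # R, G, B

serial_rate_mbaud = link_clock_mhz * bits_per_channel  # per-channel line rate
payload_bits_per_pixel = usable_bits * channels
pixels_per_second = link_clock_mhz * 1e6               # one pixel per clock

print(serial_rate_mbaud)        # 1650 megabaud per channel
print(payload_bits_per_pixel)   # 24-bit pixels

# The 1920x1080@60 CVT-RB mode quoted above needs a 139 MHz pixel
# clock, comfortably inside the 165 MHz single-link limit.
assert 139 <= link_clock_mhz
```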

The "dual link" DVI uses two sets of RGB signals, and handles
monitors up to the Apple 30" LCD display. From the table:

" (2560 × 1600) @ 60 Hz with GTF blanking (2 × 174 MHz) "

Cards with dual-link capability are a later generation of silicon,
able to surpass the 165 MHz interface limit. In fact, the fastest
cable rate offered digitally now is on certain HDMI digital
interfaces, at 340 MHz. So digital RGB transmission goes a lot
faster now than it used to.

If we go back in time for a moment, there is an era that
the video card makers would rather forget. What happened is
that they shipped video cards with DVI digital interfaces
that did not meet spec. For example, the card on the page
below operates only up to 141 MHz on the DVI interface,
supporting 1600x1200. Sometimes an older driver for the video
card may limit the maximum resolution, so that the user cannot
see the lack of specification compliance. And in one case,
the driver writer made a math mistake, such that a certain
resolution could not be selected when in fact it would have worked.

http://www.tomshardware.com/reviews/tft-connection,931-18.html

In those kinds of pictures, the "pulse template" is the dark
blue colored region. A normalized DVI signal is not allowed
to touch the inner or the outer blue region, so it passes a
particular test, at a particular clock speed, if the
multicolored part doesn't touch the dark blue part. A person
running that instrument would crank up the video card's
"resolution" setting until the compliance test failed, and
that would give the maximum frequency the interface could handle.

On the video card, the main chip is called the GPU. It
can drive the DVI cable directly. On those early
GPU chips, the ones that couldn't make it all the way
to 165 MHz, you could blame the available
silicon technology. After all, the output driver
would have to run at 1650 megabaud, with bandwidth
to spare, and that is a high requirement.

If the video card company needed a second DVI output,
in some cases they would use a separate external
chip from Silicon Image. For example, a SIL164 is a
TMDS transmitter, and unlike the GPU, is more likely
to be spec compliant. In other words, when an older
card uses an external DVI transmitter chip, it can
then make it all the way to 165MHz, as it is supposed to.

So that is a little trivia about DVI on older video cards.
The video card may not have an actual spec sheet, saying
it is non-compliant. The company involved doesn't want you
to know that. The driver may "magically" restrict the
output resolution on DVI to 1600x1200, when it should
go to 1920 x 1080 at 60Hz refresh.

So for that particular vintage of older card, the VGA
output may be able to go to 2048 x 1536, while the
DVI may go to 1600 x 1200.

The monitor's native resolution of 1680x1050 is, hopefully,
the number of pixels on the screen. The monitor works its
sharpest when driven at that resolution. Being multisync,
and capable of resampling, the monitor can also work the
cable interface at other resolution settings. If you drove
it at 1280x1024, it would resample the pixels in some way
to drive the 1680x1050 panel. That cannot be done without
some visual artifacts. This is not an issue if you're
watching a movie on the new screen, but it can affect your
ability to work in Microsoft Word, with lots of text on the screen.
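
A quick calculation shows why driving the panel below its native resolution causes artifacts; the variable names here are mine, not from any spec:

```python
# Why 1280x1024 looks soft on a 1680x1050 panel: the horizontal and
# vertical scale factors are neither integers nor equal to each other,
# so the monitor must interpolate pixels AND slightly stretch the
# 5:4 source image to fill the 16:10 panel.

src_w, src_h = 1280, 1024        # desktop resolution sent to the monitor
native_w, native_h = 1680, 1050  # pixels physically on the panel

sx = native_w / src_w   # 1.3125  -- fractional: pixels get interpolated
sy = native_h / src_h   # ~1.0254 -- differs from sx: aspect distortion

print(sx, sy)
```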

That is a bit of a long description, but I hope it prepares you
for some of the issues. Many of the cheaper monitors now
come with only a DVI connector on them, so you'd at least
want a video card with a DVI connector, as it offers the
possibility of a blemish-free picture. VGA will still work,
but as the resolution choice gets higher, blemishes become
more significant, and anything wrong with cable or connector
quality becomes more apparent.

Finding a PCI card that does everything well now is pretty hard.
The trick is to find a newer generation of card, something
which probably won't have DVI problems and yet still works
properly from an applications perspective.

http://www.newegg.com/Product/ProductReview.aspx?Item=N82E16814131082

HTH,
Paul
 

Twayne

Walter said:
I am running Windows XP SP3, and it is working fine. My computer is
eight years old, with an Intel Pentium 4 at 1500 MHz.

I use a 19" LED monitor and have set to run at 1024x768 resolution in
Windows desktop properties.
The maximum resolution Win XP will let me run is 1280x1024.

Then you would not be able to drive a monitor at anything higher
than that with your current video card. 1680 x 1050 is larger than your
card will handle if 1280 x 1024 is its highest available setting.
Basically, the video card has to be able to support the number of
pixels the new monitor wants.
It might work OK at your lower resolution, but you would be missing
the advantages provided by the recommended resolution, and it may not be
as clear, nor display the sizes of icons etc. as you expect them to.

At 1500 MHz, with a new card that can handle the recommended resolution
for the monitor, you are also likely to see a noticeable slowdown in
overall system speed.

Then again, everything might look fine and the speed drop not a problem
for you; kind of hard to say.

HTH,

Twayne
 

smlunatick


My Acer 22 inch X223w seems to always revert back to its native
resolution.
 
