Forcing Screen Resolution

Guest

| On 12 Oct 2007 03:48:51 GMT, (e-mail address removed)
| wrote:
|
|>| On Wed, 10 Oct 2007 04:00:28 GMT, Grinder
|>|
|>|>I know that someone else has to have had this same problem, but I've
|>|>been unable to find a discussion that's both on point and with a solution.
|>|>
|>|>A new 20" LCD (Westinghouse L2045NV) has been purchased that has a
|>|>native resolution of 1400x1050. There is no monitor "driver" available
|>|>for that particular model, as far as I can tell.
|>|>
|>|>There is no 1400x1050 option for "Screen Resolution" even though the
|>|>graphics card (Radeon 9200) should have plenty of memory to accomplish
|>|>it. I see no way in the (current) version of Catalyst to force a
|>|>specific resolution.
|>|>
|>|>How can I get the system to drive the monitor at its native resolution?
|>|
|>| If there's nothing in the Catalyst Control Center that will
|>| allow this res., try an nVidia card... just tried it on this
|>| system with one (FX5700) and it supports 1400x1050 as a
|>| custom resolution even with the now old 78.05 Detonator
|>| version I have to use to keep the onboard tuner/capture
|>| drivers happy.
|>
|>I have an old Matrox Millennium video card that lets me run 1400x1050 just
|>fine ... as long as I am willing to accept a lower frame rate due to the
|>fact that this ancient technology didn't have a very high pixel clock :)
|
| So in other words, it doesn't do it just fine.

It is perfectly capable of the geometry. A video card that cannot do the
geometry, whether one of that vintage or one made today, is indicative of
bad engineering. Just pure bad engineering.

Higher clock rates and higher DAC rates are a legitimate area of design
difficulty. That has been worked out over the years and today we do have
higher clock rates at reasonable price points. We didn't back then.

RAM size is also a limitation. But that wasn't an issue even back when
the card I have was made. The Matrox Millennium (1) can do it with 16
bits per pixel. The G400 can do it with 32 bits per pixel.
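
The framebuffer arithmetic behind that is quick to check; a minimal C
sketch (nothing card-specific assumed, just the multiplication):

    #include <stdio.h>

    int main(void)
    {
        unsigned w = 1400, h = 1050;

        /* framebuffer bytes needed at 16 and 32 bits per pixel */
        printf("16 bpp: %.2f MB\n", w * h * 2 / (1024.0 * 1024.0));
        printf("32 bpp: %.2f MB\n", w * h * 4 / (1024.0 * 1024.0));
        return 0;
    }

That comes to about 2.8 MB at 16 bpp and 5.6 MB at 32 bpp, which matches
the 16 bpp and 32 bpp split described above.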

One area of limitation where there is absolutely no excuse whatsoever is
the logic of the design. There is no reason we cannot have video cards
that can do 65536 x 65536, given enough RAM. There is no reason we cannot
have video cards that can do _any_ combination of vertical and horizontal
resolution, at least within a range not exceeding 32768 or so in each axis,
which would need no more than 16 bits each to load the clock dividers.
There is no reason we cannot do any resolution within a range that could
be allowed to quite some extremes.
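
To make that concrete: a hypothetical CRTC whose geometry registers are
plain 16-bit values could be loaded with _any_ resolution up to 65535 in
each axis. A minimal C sketch (the register layout and names here are
invented for illustration, not any real chip):

    #include <stdint.h>

    /* hypothetical CRTC with plain 16-bit geometry registers */
    struct crtc_regs {
        volatile uint16_t h_active;  /* visible pixels per line */
        volatile uint16_t h_total;   /* pixels per line incl. blanking */
        volatile uint16_t v_active;  /* visible lines per frame */
        volatile uint16_t v_total;   /* lines per frame incl. blanking */
    };

    /* load an arbitrary geometry -- no table of "approved" modes */
    static void set_geometry(struct crtc_regs *crtc,
                             uint16_t ha, uint16_t ht,
                             uint16_t va, uint16_t vt)
    {
        crtc->h_active = ha;
        crtc->h_total  = ht;
        crtc->v_active = va;
        crtc->v_total  = vt;
    }

Nothing in an interface like that forces a short list of blessed modes;
any limitation on top of it is in the driver, not the silicon.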

There's also no reason we can't have a couple more bits on the clock divider
in the LCD monitor ... but that's another thread.

My point is: _artificial_ limitations are a bad thing.


| However, I don't think we've yet determined that the OP's
| video card is definitely the problem, as there is still some
| question as to whether the monitor is handling what it
| receives correctly.

Or whether his driver is the component suffering from bad engineering.
If the driver has no means to allow the user to specify exact geometry
or specific modelines, it is deficient in design. Maybe that can be
done only through the registry? That might be worth checking into.


|>Fortunately, the G450 has sufficient pixel clock frequency to let me get
|>it up to 50.5 Hz vertical. The older Millennium (1) could only get up to
|>about 40 Hz vertical ... which would display just fine in LCD technology
|>(don't try this on CRT), if only embedded software in LCD monitors would
|>allow direct synthesizer divider programming.
|
| New Higher-Res Widescreen LCD - $250
| Used Matrox G450 - $5

Cost increase to have designed the LCD to handle down to 23.976 Hz - $5
Savings by not having to buy commercial software from the Northwest - $100's
Benefit that monitor could _also_ have in viewing 1080p24 movies - priceless
(though the movies are not what *I* want it for)


| It would be more cost effective for us all to pay you 20
| cents each to offset the loss of your video card instead of
| bearing the higher cost of implementing this change and
| added support to modern higher-res LCDs... support which
| most of us don't need.

If you can find a video card that works at least as well as the G450 does
in my Linux system, have at it. Hint: that rules out everything made by,
or with chipsets from, both ATI and nVidia (but in several months ATI may
not be in that list ... remains to be seen).

I'm sure you understand video modelines. Work out what it takes to drive
a high resolution LCD monitor and figure out the needed dot clock to get
the vertical rate the monitor is picky about. Just make sure you know the
CPU instruction steps needed to _load_ those modelines (and to load fonts
in the case of text mode).
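
To make the arithmetic concrete, a small C sketch of that modeline math
(the blanking totals here are illustrative guesses, not a standard VESA
timing; the 50.5 Hz figure is the rate mentioned above):

    #include <stdio.h>

    int main(void)
    {
        /* 1400x1050 active area plus assumed modest blanking */
        double h_total = 1560.0;  /* pixels per line incl. blanking */
        double v_total = 1080.0;  /* lines per frame incl. blanking */
        double vert_hz = 50.5;    /* vertical rate the monitor accepts */

        /* dot clock = pixels/line * lines/frame * frames/second */
        double dot_clock = h_total * v_total * vert_hz;

        printf("needed dot clock: %.2f MHz\n", dot_clock / 1e6);
        printf("horizontal rate:  %.2f kHz\n", v_total * vert_hz / 1e3);
        return 0;
    }

With those numbers the card has to produce roughly 85 MHz, which is why
an older RAMDAC runs out of headroom long before the geometry itself is
a problem.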
 
kony

|>So in other words, it doesn't do it just fine.
|
| It is perfectly capable of the geometry. A video card that cannot do the
| geometry, whether one of that vintage or one made today, is indicative of
| bad engineering. Just pure bad engineering.

But in the end, it doesn't do fine, as you already conceded:
any normal monitor would have to be reengineered to
downgrade itself.

| My point is: _artificial_ limitations are a bad thing.

So is trying to reuse an old video card for something beyond
its capability.

|>New Higher-Res Widescreen LCD - $250
|>Used Matrox G450 - $5
|
| Cost increase to have designed the LCD to handle down to 23.976 Hz - $5

That seems likely to be a number made up out of thin air.
Just redesigning the monitor, programming new firmware,
taking down the production line and redoing it all would likely
cost more than $5 per unit, without even considering the actual
hardware change.

| If you can find a video card that works at least as well as the G450 does
| in my Linux system, have at it. Hint: that rules out everything made by,
| or with chipsets from, both ATI and nVidia (but in several months ATI may
| not be in that list ... remains to be seen).

Lots of people use nVidia on Linux; you might ask them how
they do what you need done.
 
Guest

| On 12 Oct 2007 13:17:21 GMT, (e-mail address removed)
| wrote:
|
|
|>| So in other words, it doesn't do it just fine.
|>
|>It is perfectly capable of the geometry. A video card that cannot do the
|>geometry, whether one of that vintage or one made today, is indicative of
|>bad engineering. Just pure bad engineering.
|
| But in the end, it doesn't do fine as you already conceded
| any normal monitor would have to be reengineered to
| downgrade itself.

I wasn't even talking about the need to support a wider range of frame
rates in my statement you quoted. I was saying that any video card that
cannot be set to any geometry, or at least all the standard and common
ones, within its pixel-clock limits, is badly engineered.

As for downgrading of monitors, you clearly do not yet understand that
expanding the frequency range of the analog-to-digital sampling clock
is not a downgrade. A downgrade would be if it were changed in such a
way that some frequency it otherwise could have done cannot now be done
as a result of a change. That kind of change is not what is needed to
support the cases I was talking about in another thread, which you do
not seem to fully grasp. The kind of change needed to support them is to
_expand_ the range of frequencies, plus a wee bit of software change:

instead of:

    /* reject anything outside the original narrow range */
    if ( vert_hz < 50.0 || vert_hz > 120.0 ) {
        display_error( "video out of range" );
    }

change the software to:

    /* same check, wider limits -- every old case still passes */
    if ( vert_hz < 20.0 || vert_hz > 240.0 ) {
        display_error( "video out of range" );
    }

With such changes, a monitor can now handle more cases than before, and
still handle every case it could have before. That's not a downgrade.



|>My point is: _artificial_ limitations are a bad thing.
|
| So is trying to reuse an old video card for something beyond
| its capability.

Even the "old" video cards _could_ do what the OP wanted.

Today's video cards might look great when you read specs. But they have
some serious problems, particularly in the closed interface designs and
poorly coded driver software. While we might well have faster GPUs and
more RAM on board, the quality of things like drivers and other software
is going down.

There is absolutely no excuse whatsoever for any driver to refuse to allow
setting any user supplied video geometry within its range (which is a very
wide range), and no excuse for it to not default to the geometry that is
native to the monitor when none has been specified. Yet these problems
continue to persist.
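
Defaulting to the native geometry is not even hard: the monitor reports
it in the first detailed timing descriptor of its EDID block. A minimal
C sketch, assuming the 128-byte EDID base block has already been fetched
over DDC:

    #include <stdint.h>
    #include <stdio.h>

    /* Print the native (preferred) mode from a 128-byte EDID base
       block. The first detailed timing descriptor starts at byte 54
       and, on most monitors, carries the panel's native timing. */
    static void print_native_mode(const uint8_t edid[128])
    {
        const uint8_t *d = edid + 54;

        unsigned clk_khz  = (d[0] | (d[1] << 8)) * 10;  /* 10 kHz units */
        unsigned h_active = d[2] | ((d[4] & 0xF0) << 4);
        unsigned v_active = d[5] | ((d[7] & 0xF0) << 4);

        printf("native: %ux%u, dot clock %u kHz\n",
               h_active, v_active, clk_khz);
    }

A driver that can run that little bit of parsing has no excuse for
coming up at 640x480 on a 1400x1050 panel.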

The quality of video cards and their drivers _is_ going down in areas other
than GPU and clock speeds, and RAM size. Yet those factors are treated as
the only basis for choosing a video card.


|>| New Higher-Res Widescreen LCD - $250
|>| Used Matrox G450 - $5
|>
|>Cost increase to have designed the LCD to handle down to 23.976 Hz - $5
|
| That seems likely to be a number made up out of thin air.
| Just redesigning the monitor, programming new firmware,
| taking down the production line and redoing it all would likely
| cost more than $5 per unit, without even considering the actual
| hardware change.

If you want to take back and retrofit an old design, of course. But I
never said such a thing. I'm saying that the $5 difference is the most
there would be had this been done DURING THE ORIGINAL DESIGN. It would
be more like $2. The software (see above example C code) could have
been done with zero cost by having just typed different numbers at the
time it was originally done. The ADC clock generator might have needed
to use a different chip that supported a larger divide ratio, but these
are all within a tight price range. A chip with ONE more bit on the
clock divider halves the lowest frequency its output can reach.
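
A quick C illustration of that scaling, with a made-up reference clock:

    #include <stdio.h>

    int main(void)
    {
        double f_ref = 200e6;  /* example reference clock, invented */

        for (int bits = 8; bits <= 12; bits++) {
            unsigned long max_div = (1UL << bits) - 1;  /* largest ratio */
            printf("%2d-bit divider: lowest output %.1f kHz\n",
                   bits, f_ref / (double)max_div / 1e3);
        }
        return 0;
    }

Each added bit roughly doubles the largest divide ratio, and so halves
the lowest frequency the synthesizer can reach.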

The next monitor product being designed should include such support of a
wider range ... not for me specifically ... but for support of 23.976 and
24 fps video standards that do exist. Such support would have been costly
for a CRT and required a video upconverter to a higher frame rate. But
LCD has no such need, so there is virtually no cost involved (LCD "converts"
all video to its own "static format").


|>If you can find a video card that works at least as well as the G450 does
|>in my Linux system, have at it. Hint: that rules out everything made by,
|>or with chipsets from, both ATI and nVidia (but in several months ATI may
|>not be in that list ... remains to be seen).
|
| Lots of people use nVidia on linux, you might ask them how
| they do what you need done.

Merely using nVidia on Linux and actually getting the video card to do what
you want are two different things. So far no one I have talked to that uses
nVidia is able to achieve much beyond what the video defaults to. That
means that using nVidia on Linux is a "narrow experience".

I got 2 new PCs in at work a few months ago. They had nVidia video chips
built into the motherboard. Ubuntu would not go above 640x480. Fedora
managed to get 1024x768. Text console would not go above 80x50. They
were sent back to IT. Then came a couple with ATI video chips. At least
a reverse engineered driver was available (not the crap from ATI) for Xorg.
Text mode was still stuck at 80x50. So I'm basically using the new PCs as
ssh-connected build engines and have 2 older PCs with G450 running Xorg at
1400x1050 (the Acer 20" LCD has its sampler clock at the very lowest Hz it
can go, and I boosted the video card as high as I could get it, and it just
made it with spot-on pixels). Text mode is at a nice 144x88 and still fast
enough at scrolling to not slow me down (text under X is too slow for my
fast programming style).

Let me know when a video card comes out that _is_ quality in _all_ aspects
of operation and usage.
 
