Curious discovery about lcd monitors

johns

This is a Conspiracy Theory I have just stumbled on, but
I have some agreement from other web sites discussing
this. I have a Viewsonic 2025 wide screen lcd monitor
that has a native resolution of 1680x1050 or 16:10 as
opposed to the standard 4:3 resolution of non-wide
lcd monitors. I'm driving it with an nVidia GF7900GTO
video card ( one of the best ), and I'm using the driver
dated Feb 06. That driver allows me to set custom
resolutions that I calculate to keep the same 16:10
proportions .. like 1280x800. That works fine, and I
use Coolbits to set it. All my apps and games look
great, and I have a very high frame rate. For certain
reasons, I decided to update my video driver to the
latest 91.xx . I tried both the Sept 06, and the Nov
06 drivers, and discovered in both that I can no longer
set in a 16:10 custom resolution, and get full screen.
Instead I get a 3 inch high black bar all the way
across the bottom of the screen, and nothing will
fix that. I called tech support, and they are seeing it
too, and have no fix. The Conspiracy Theory is this:
That truncated screen is actually an HDTV setting,
and the new nVidia 91.xx drivers are no longer
supporting "wide-screen" lcd monitors .. except
for the native resolution. Instead, those drivers are
attempting to force "us" to replace our monitors
with HDTVs, and just use THEM as monitors.
The new resolutions in the 91.xx drivers are for
HDTVs, and our wide screen market is closing
out in favor of the huge upcoming move to digital
TV, and "blue" DVD at the 16:9 resolutions. An
even bigger hint that this theory might be true
is Vista requiring a tv-tuner in order for it to do
a full install. And I'm seeing the $79 per month
offer that the cable people are giving for both
cable-network and cable-digitalTV, tossing in
HBO, and a few more HDTV stations that actually
use the 16:9 broadcast resolution ... straight to
your desktop. NVidia absolutely will not admit
it, and they actually claim that my monitor or
video card is broke. Beyond stupid when it runs
super good on the older ( Feb 06 ) driver. And,
all for free, I DO have a tv-tuner, and I run TV
in the HDTV modes, or any mode I want .. on
my present 2025W lcd monitor. Bite me! NVidia.
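
Coming back to the custom-resolution arithmetic near the top of this post: the
sketch below only illustrates how modes like 1280x800 fall out of a 1680x1050
native panel; it is not anything the driver or Coolbits actually runs, and the
helper names are made up for the example.

# Python sketch: list scaled-down modes that keep a 1680x1050 panel's
# native aspect ratio (which reduces to 8:5, i.e. 16:10).
from fractions import Fraction

NATIVE_W, NATIVE_H = 1680, 1050
aspect = Fraction(NATIVE_W, NATIVE_H)        # reduces to 8/5, i.e. 16:10

for width in (1680, 1440, 1280, 1120, 960, 800, 640):
    height = width * aspect.denominator // aspect.numerator
    print(f"{width}x{height}  exact 16:10: {Fraction(width, height) == aspect}")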

johns
 
Tomcat (Tom)

Well, I do think the line between HDTV's and PC monitors is getting
fuzzy as many HDTV's now support 1080p resolution, but in no way do I
think there is any effort by graphics card makers to move away from
supporting PC monitors. The fact is, most PC's are still connected to
monitors not TV's and it will continue to be that way for a long time.
Only 25-30 percent of households even have an HDTV, let alone have it
hooked up to their PC.
 
Mike T.

johns said:
This is a Conspiracy Theory I have just stumbled on, but
I have some agreement from other web sites discussing
this. I have a Viewsonic 2025 wide screen lcd monitor
that has a native resolution of 1680x1050 or 16:10 as
opposed to the standard 4:3 resolution of non-wide
lcd monitors. I'm driving it with an nVidia GF7900GTO
video card ( one of the best ), and I'm using the driver
dated Feb 06. That driver allows me to set custom
resolutions that I calculate to keep the same 16:10
proportions .. like 1280x800. That works fine, and I
use Coolbits to set it. All my apps and games look
great, and I have a very high frame rate. For certain
reasons, I decided to update my video driver to the
latest 91.xx . I tried both the Sept 06, and the Nov
06 drivers, and discovered in both that I can no longer
set in a 16:10 custom resolution, and get full screen.
Instead I get a 3 inch high black bar all the way
across the bottom of the screen, and nothing will
fix that. I called tech support, and they are seeing it
too, and have no fix. The Conspiracy Theory is this:
That truncated screen is actually an HDTV setting,
and the new nVidia 91.xx drivers are no longer
supporting "wide-screen" lcd monitors .. except
for the native resolution. Instead, those drivers are
attempting to force "us" to replace our monitors
with HDTVs, and just use THEM as monitors.

(snip)

First, you need to understand that what is output from your video card is
more than just an arrangement of pixels into a squarish or rectangle-ish
shape. If your monitor could display 1280x800 (to YOUR liking) at one time
or another, then it still can, unless it is now defective. So you have to
answer, WHAT ELSE changed (besides the driver version) when you updated the
video drivers? Or put more simply, garbage in = garbage out.

Previously, your LCD could parse 1280 X 800 (according to you). But, what
was the refresh rate at that resolution? It's somewhat likely that after
you upgraded the video card drivers, if you select 1280 X 800 custom
resolution, the output of the video card is now at a higher refresh rate,
and the monitor (Viewsonic 2025) doesn't handle this specific combination of
vertical/horizontal/refresh as elegantly as it did at a lower refresh rate.

Or less likely, the older video drivers were running your custom resolutions
at a higher refresh rate.
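To see why refresh rate matters and not just the pixel count, here is a rough
sketch of the signal the monitor has to lock onto: the pixel clock grows with
both resolution and refresh. The 25% blanking overhead below is only an
assumed ballpark, not a real GTF/CVT timing calculation, and the function name
is just for illustration.

# Python sketch: approximate pixel clock for a mode (resolution + refresh).
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    total_pixels = width * height * (1 + blanking_overhead)   # active + assumed blanking
    return total_pixels * refresh_hz / 1e6

for hz in (60, 72, 75):
    print(f"1280x800 @ {hz} Hz ~ {approx_pixel_clock_mhz(1280, 800, hz):.0f} MHz pixel clock")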

Whatever, the output of your video card has changed in a way that your
monitor doesn't handle it as gracefully as you would like it to. I think
the newer drivers though are a red herring. That is, if you were to set
your video card to the EXACT output that worked fine under the older
drivers, you'd probably find that the newer drivers work, also.

As for your conspiracy theory (someone crippling drivers, forcing you to buy
an HDTV instead of a computer monitor) . . . if such a conspiracy existed,
it would be focused on lcd computer monitors with greater resolution than a
Viewsonic 2025. The native resolution of that monitor is not capable of
displaying HDTV content. Theoretically, you could use it to display 720P.
But at best, this would be a blurry/distorted approximation of 720P, or it
would not cover the whole viewable area of the screen. 1080i/P would be
even worse on that Viewsonic 2025, as it would be distorted AND scaled down,
assuming that your monitor would even -attempt- to display 1080x HDTV input.
Not all monitors will accept input resolutions higher than max or native.
1080i/p HDTV input would exceed the physical limitations of your monitor,
though it *might* try to display it anyway (greatly distorted and scaled
down). No, nobody has to cripple your monitor with funky video card drivers
to get you to favor an HDTV to display HDTV content... the Viewsonic 2025
has significant HARDWARE limitations, as far as displaying HDTV content
goes.
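
For a concrete picture of those hardware limitations, here is a small sketch of
aspect-preserving ("letterbox") scaling, which is roughly what a monitor or
driver scaler would have to do to fit 16:9 HDTV frames onto a 16:10, 1680x1050
panel. The numbers are plain arithmetic under that assumption, not measurements
of what the Viewsonic 2025 actually does, and the helper names are made up.

# Python sketch: fit a 16:9 source onto a 1680x1050 (16:10) panel.
PANEL_W, PANEL_H = 1680, 1050

def letterbox(src_w, src_h, dst_w=PANEL_W, dst_h=PANEL_H):
    scale = min(dst_w / src_w, dst_h / src_h)    # preserve aspect ratio
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    bar = (dst_h - out_h) // 2                   # black bar top and bottom, in pixels
    return out_w, out_h, bar

for name, (w, h) in {"720p": (1280, 720), "1080i/p": (1920, 1080)}.items():
    out_w, out_h, bar = letterbox(w, h)
    print(f"{name}: displayed at {out_w}x{out_h} with {bar}px bars top/bottom")

Either way the image gets rescaled and does not map onto the panel exactly,
which is the hardware limitation being described.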

Before you get too carried away pushing this conspiracy theory, you should
talk to owners of monitors with native resolution of 1920 X 1200. If those
users can't display 1080i/p due to newer video card drivers crippling HDTV
resolutions, then you might have something. :) -Dave
 
Mike T.

<verbiage snipped>

I don't think it was a troll, I think it was just someone confused enough to
think that any LCD computer monitor should be capable of displaying HDTV
content, and that newer drivers must have been deliberately crippled
somehow, to not allow custom resolutions. Silly, but it's easy to imagine
someone being that confused. -Dave
 
johns

I simply went back to the Feb 06 driver, and
everything is happy again ... esp my 1280x800
resolution at full screen. I'm probably one of
the very few out here who can run Gothic 3
with HIGH settings in 16:10 mode. That nails
it down. My hardware is first rate, and nothing
changed except the 91.xx drivers ... which
many sites are saying do not support wide-
screen PC monitors. They think it is a bug
that nVidia will fix. It is not. NVidia wants the
digitalTV market, and we can take a hike.

johns
 
johns

Mike T. said:
Previously, your LCD could parse 1280 X 800 (according to you). But, what
was the refresh rate at that resolution? It's somewhat likely that after
you upgraded the video card drivers, if you select 1280 X 800 custom
resolution, the output of the video card is now at a higher refresh rate,
and the monitor (Viewsonic 2025) doesn't handle this specific combination of
vertical/horizontal/refresh as elegantly as it did at a lower refresh rate.

No. I tried 60, 72, and 75, and my monitor supports those just fine
under the Feb 06 driver ... but not the 91.xx drivers.

Mike T. said:
Or less likely, the older video drivers were running your custom resolutions
at a higher refresh rate.

It worked at 60 and 75, but I could not see any difference, so
I used 60.

Mike T. said:
Whatever, the output of your video card has changed in a way that your
monitor doesn't handle it as gracefully as you would like it to. I think
the newer drivers though are a red herring. That is, if you were to set
your video card to the EXACT output that worked fine under the older
drivers, you'd probably find that the newer drivers work, also.

No way, and I had help from BFG tech support on the
phone. We tried everything, and BFG upped the case
to nVidia for an explanation.

Mike T. said:
As for your conspiracy theory (someone crippling drivers, forcing you to buy
an HDTV instead of a computer monitor) . . . if such a conspiracy existed,
it would be focused on lcd computer monitors with greater resolution than a
Viewsonic 2025. The native resolution of that monitor is not capable of
displaying HDTV content.

NOW THAT IS EXACTLY RIGHT !!!!!!! THINK ABOUT IT !!!!!
That is THE Conspiracy Theory :)

johns
 
johns

NVidia is relying on you guys to not be smart
enough to put the pieces together, and realize
that their goal is to provide support for a new
market that is a million times more profitable
than the present one ... merely gamers. They
want in to the digitalTV market, and those
of us with PVRs and wide screen lcd monitors
can already do it for next to nothing. That
undercuts billions of $$$ in profits. NVidia
probably thinks WE will eventually all move
to the consoles for our games, and the market
for the big cards will be gone. Obviously,
nVidia and ATI will go to support of PVRs
under Vista, and so their drivers are already
anticipating that. Oopsy! Maybe a little too
soon ?

johns
 
Tomcat (Tom)

johns said:
NVidia wants the digitalTV market, and we can take a hike.
Yes, they want some of the digitalTV market but NOT at the expense of
throwing away support for monitors. TV's still aren't optimal for use
as general use computer monitors and won't be for a long time. TV's
max out at 1920x1080 and that won't change anytime soon. There is still
a big need for high-res (1920 x 1200 and beyond) display devices for
computer applications, and video card makers will need to
continue to support them.
 
OSbandito

johns said:
This is a Conspiracy Theory I have just stumbled on, but
I have some agreement from other web sites discussing
this. I have a Viewsonic 2025 wide screen lcd monitor
that has a native resolution of 1680x1050 or 16:10 as
opposed to the standard 4:3 resolution of non-wide
lcd monitors. I'm driving it with an nVidia GF7900GTO
video card ( one of the best ), and I'm using the driver
dated Feb 06. That driver allows me to set custom
resolutions that I calculate to keep the same 16:10
proportions .. like 1280x800. That works fine, and I
use Coolbits to set it. All my apps and games look
great, and I have a very high frame rate. For certain
reasons, I decided to update my video driver to the
latest 91.xx . I tried both the Sept 06, and the Nov
06 drivers, and discovered in both that I can no longer
set in a 16:10 custom resolution, and get full screen.
Instead I get a 3 inch high black bar all the way
across the bottom of the screen, and nothing will
fix that. I called tech support, and they are seeing it
too, and have no fix. The Conspiracy Theory is this:
That truncated screen is actually an HDTV setting,
and the new nVidia 91.xx drivers are no longer
supporting "wide-screen" lcd monitors .. except
for the native resolution. Instead, those drivers are
attempting to force "us" to replace our monitors
with HDTVs, and just use THEM as monitors.
The new resolutions in the 91.xx drivers are for
HDTVs, and our wide screen market is closing
out in favor of the huge upcoming move to digital
TV, and "blue" DVD at the 16:9 resolutions. An
even bigger hint that this theory might be true
is Vista requiring a tv-tuner in order for it to do
a full install. And I'm seeing the $79 per month
offer that the cable people are giving for both
cable-network and cable-digitalTV, tossing in
HBO, and a few more HDTV stations that actually
use the 16:9 broadcast resolution ... straight to
your desktop. NVidia absolutely will not admit
it, and they actually claim that my monitor or
video card is broke. Beyond stupid when it runs
super good on the older ( Feb 06 ) driver. And,
all for free, I DO have a tv-tuner, and I run TV
in the HDTV modes, or any mode I want .. on
my present 2025W lcd monitor. Bite me! NVidia.

johns

When you say nothing will fix the black bar--which tech support did you
call, graphics card or monitor? If I understand the situation, there's
nothing wrong with the hardware or the drivers. It sounds like a simple
front-panel (buttons) adjustment on the monitor for height, width and
linearity after adjustments on the graphics control panel have been made.
Find a resolution/scan rate on the graphics control panel which produces a
roughly correct image shape and size on the screen, then go to the
front-panel adjustments on the monitor.
http://www.anandtech.com/displays/showdoc.aspx?i=1855&p=12
 
johns

I thought so too, but on the Viewsonic 2025wide, the manual
screen resolution is greyed out ????? It is THERE, but not
active. When I try to select it with the buttons, the cursor
jumps over it. Nothing in the handbook about it either.
I imagine what I'm trying to do is available, but in a way
I've never heard of ... like setting in the HDTV resolutions,
and scaling them ??? So far, I've talked to BFG, Aspyr,
nVidia forum, and several sites talking about driver problems
and interaction with video card problems. One site talks
about how the 7900GTX was overclocked in a way that
parts of the card could not keep up with other parts of
the card, and that produced screen lags and lockups.
The solution there was to slow the card down. I have
the 7900GTO card. Maybe it has a similar problem
if it is pushed too hard in a game like Gothic 3. Still
that does not explain the resolution problems, or why
nVidia has chosen to drop custom 16:10 settings in
their new drivers.

johns
 
Phisherman

Tomcat (Tom) said:
Well, I do think the line between HDTV's and PC monitors is getting
fuzzy as many HDTV's now support 1080p resolution, but in no way do I
think there is any effort by graphics card makers to move away from
supporting PC monitors. The fact is, most PC's are still connected to
monitors not TV's and it will continue to be that way for a long time.
Only 25-30 percent of households even have an HDTV, let alone have it
hooked up to their PC.


My computer has an LCD and a CRT (two monitors). I have a theater
room with a plasma display and 7.1 channel sound. My computers are
networked, but I have yet to get my theater and computers tied
together. The current link is still DVDs.
:(
 
johns

I have a PVR that will do everything in one system
that takes all of your systems to do ... and mine
costs less than $2k. Those companies out there
would like to part it out to specialized devices for
TV, gaming, and production work, and sell us all
3. I have all of that, and more, in one system.

johns
 
