HIGH Screen resolution kills performance in WIN/XP?


Frank McCoy

AFAIK the only way to do this is to reconfigure each application to
use larger font sizes. That may defeat the gain from using a larger
resolution. Icon sizes can be changed by right-clicking the desktop
and selecting Properties => Settings => Advanced. If you have an ATI
graphics card, you can also do this with the free ATI Tray Tools
utility.
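For anyone who'd rather check these settings programmatically than click
through dialogs, here is a minimal Python sketch. The registry paths are
XP-era locations recalled from memory, so treat them as assumptions and
verify them before relying on this:

    # Sketch: read XP's font DPI and desktop icon size from the registry.
    # ASSUMPTION: key paths recalled from memory, not verified here.
    # (On Python 2 this module was called _winreg.)
    import winreg

    # System font DPI: 96 = "Normal", 120 = "Large Fonts".
    dpi_key = r"SYSTEM\CurrentControlSet\Hardware Profiles\Current\Software\Fonts"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, dpi_key) as key:
        dpi, _ = winreg.QueryValueEx(key, "LogPixels")
        print("Font DPI:", dpi)

    # Desktop icon size in pixels (stored as a string).
    metrics_key = r"Control Panel\Desktop\WindowMetrics"
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, metrics_key) as key:
        size, _ = winreg.QueryValueEx(key, "Shell Icon Size")
        print("Icon size:", size)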

Well, actually, most programs pay attention to the Windows setting.
Word, for one, expands to fit your display size (one of the few nice
things I have to say about Word), thus making your typing easier to see.

In Agent, on the other hand, I had to manually reset all the font sizes
to what I like for this resolution. (Options => Display Preferences =>
Fonts)
 

kony

The term ghosting is accepted lingo since it is an excellent metaphor
for describing what occurs in the conditions when it occurs.

Well no, it's not "accepted", because it was already the term used to
describe a different monitor problem. While some might say "it looks
like a similar kind of problem, so it's fairly called ghosting", it is
only reasonable to use the term loosely, to describe perceptions of
visual appeal, not to describe the phenomenon as an understanding of
what's causing it.


Also, to make the point: you can't get ghosting with a static image; it
requires the scene to be in motion, since ghosting is caused by the
pixels switching on and off too slowly.

Wrong. Ghosting has been and will always be possible with
excessive analog signal degradation, on a static image.
Being ignorant of what ghosting means as a term does not
mean you can just reject the term and reuse it for something
else. That's why terms exist; you can't just take one that
is already claimed in the same field, except, as mentioned
above, loosely, to describe the visual impression rather than
the actual state of the output technology.
 

kony

!!!!!NNNOOOOO!!!!!!!
Geesh, CAN'T YOU READ????

I get almost *90 degree* horizontal shift with *NO* degradation.


You do not, and even your monitor's manufacturer would have to
concede it. As written, the horizontal degradation is very
slight compared to the vertical, but it is nevertheless present.

There is no existing LCD technology that overcomes this.
Your monitor is not made of mythic pixie dust, it can only
perform as well as the sum of the parts.

Perhaps your eyes just can't see the difference, but
measurement tools and other people's eyes can.
 

Frank McCoy

In alt.comp.hardware.pc-homebuilt kony said:
You do not, and even your monitor's manufacturer would have to
concede it. As written, the horizontal degradation is very
slight compared to the vertical, but it is nevertheless present.

There is no existing LCD technology that overcomes this.
Your monitor is not made of mythic pixie dust, it can only
perform as well as the sum of the parts.

Perhaps your eyes just can't see the difference, but
measurement tools and other people's eyes can.

I get *more* degradation with a CRT!
Why?
Because the *thick glass* gets in the way when the angle gets great.
The glass front on an LCD is pretty damned thin in comparison.
The leaded and quite thick glass *grays out* the CRT display far more
than the slight shift in color off-axis on the LCD display.

Yes, there's *always* degradation no matter what technology you use when
not viewed at the design angle. But LCD displays (at least *desktop*
displays) no longer have much of the old problems that made them such a
pain-in-the-butt a few years ago.

If you don't believe me, come here, and I'll give you a side-by-side
demonstration.

Don't tell ME what I see when you aren't here to see it.

Just go down to any computer place, Best Buy, or other store that sells
monitors and TRY one yourself. You'll see you're complaining about a
problem that was pretty much *solved* about two years ago.

Geesh.
I don't know what they did, or how the new technology works ... I just
know it DOES; and the issue of poor off-axis viewing for LCD monitors is
a dead horse!

Quit beating on it.

The issues of poor color depth and contrast are pretty dead too.
Technology *DOES* advance; whether some people wish to admit it or not.
You'd *think* in a group about building your own computers, people would
*know* that!

Yes, LCD panels *do* have some minor problems left; including
most-especially response time (though that's improving fast also).

But color-depth, contrast, and most-especially off-axis viewing are no
longer issues worth even considering when buying an LCD monitor.
(OK ... correction: they ARE worth considering if buying an OLDER monitor.)

Native resolution, aspect-ratio, physical size, and response-time ARE.
Also, lifetime, come to think about it.
I'm not too sure of the lifetime of LCD monitors yet; though mine is
doing just fine after over six months now. (I got it in the "Black
Friday" sale last Thanksgiving weekend, for Christmas.)

CRTs *do* have a decent history of holding up fairly well.
With the new TFT displays, who knows?
Nothing *inherently* in the design to make them go pop; but ....
We'll just have to see.
 

hummingbird

Well, actually, most programs pay attention to the Windows setting.
Word, for one, expands to fit your display size (one of the few nice
things I have to say about Word), thus making your typing easier to see.

Well, I never go near s/w from MS except XP.
I didn't look at all my apps - only those which I thought might need
re-configuring. I guess some of them would be ok.

In Agent, on the other hand, I had to manually reset all the font sizes
to what I like for this resolution. (Options => Display Preferences =>
Fonts)

Indeed, but I have two instances of it running, and then there are my
other apps. E.g.: in Avant Browser I configured the top toolbar to
contain only those small icons I wanted, and to fill up the width using
spacer bars etc. Under 1280x1024 there's an ugly space across the top
and the icons are very small. Re-configuring ZtreeWin is a whole new
ball game, requiring the .pif font size to be edited, plus command-line
syntax etc.
 

kony

I get *more* degradation with a CRT!
Why?
Because the *thick glass* gets in the way when the angle gets great.
The glass front on an LCD is pretty damned thin in comparison.

That could be quite true; any typical consumer display is
meant to be viewed straight-on from in front, and those with
a reflective surface also need the ambient lighting adjusted
more, to minimize reflections.
 

Frank McCoy

[...]
I guess the full switchover to digital TV, due by next year, is
pushing flat-panel displays more than anything else. That, of
course, means LCD panels *will* get faster, just for TV use if
nothing else ... the main drawback now to LCDs.

I must disagree with you that LCD speed is the issue. See
http://www.xbitlabs.com/articles/other/display/lcd-parameters.html for a
detailed explanation of why.
Um ... A large part of his discussion about "persistence of vision"
actually turns out to be related to LCD speed versus CRT.

What he doesn't take into account is the persistence of the *phosphor*
on the screen; he assumes that the effect seen is purely due to the
human eye.

It's not.

The problem he describes with LCD panels, he correctly attributes to the
fact that an LCD pixel is turned on *all the time* during a frame,
versus the temporary showing and fading-out on a CRT, especially when
showing Television pictures.

But, instead of being "persistence of vision" causing the problem, if
you examine it closely and mathematically, the root problem is
"persistence of pixel". When an LCD panel switches pictures (or, more
correctly, when an LCD pixel switches brightness states) the old setting
*remains* and *doesn't change* quickly; because of the delay in changing
states for the LCD itself. Sometimes the remnants of a previous picture
can persist over several *frames*.

As evidence of this, Plasma Panels just don't *have* this problem; even
though they also maintain brightness for the full time of a frame. They
however, switch *instantly* to the new brightness level; and thus don't
have the same problem. If his analysis was right, then Plasma Panels
would have the exact same problem ... and they don't.

No, proper speed in switching *will* solve the problem ... eventually.
Only the switching-speed has to get up to at least three times the
frame-rate before the described "problem" he's covering will go away;
NOT the same rate as the frames are updated. There, he's quite correct,
that raising the speed to frame-rates won't make the problem go away.
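
To put rough numbers on this "persistence of pixel" argument, here is a
toy first-order-lag model in Python; the time constants are invented for
illustration, not measured from any panel:

    # Toy model: treat an LCD cell as a first-order lag with time
    # constant tau. All numbers are illustrative, not measurements.
    import math

    frame_ms = 1000 / 60   # ~16.7 ms per frame at 60 Hz

    for tau_ms in (25.0, frame_ms / 3):   # slow pixel vs. 3x-frame-rate pixel
        print(f"tau = {tau_ms:.1f} ms:")
        level = 1.0                        # old frame: fully bright
        for frame in range(1, 4):
            # exponential decay toward the new (dark) target each frame
            level *= math.exp(-frame_ms / tau_ms)
            print(f"  after frame {frame}: {level:.3f} of old brightness left")

With a 25 ms time constant, about half the old brightness is still on
screen a full frame later; with the time constant at a third of the frame
time (switching roughly three times the frame rate, as above), the
remnant drops to about 5% within a single frame.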

The technical "solution" he documents that some manufacturers of LCD
televisions are doing: Blacking out between frames with the backlight,
to my notion causes more problems than it solves; not the least of which
is bringing back flicker ... a defect that LCD and Plasma Panels got rid
of nicely. I certainly would never buy an LCD panel with that sort of
thing making my eyes ache! I'd rather put up with the smearing.
 

Frank McCoy

In alt.comp.hardware.pc-homebuilt kony said:
That could be quite true; any typical consumer display is
meant to be viewed straight-on from in front, and those with
a reflective surface also need the ambient lighting adjusted
more, to minimize reflections.

Sometime look up what goes into a modern color CRT.
Especially to increase contrast and minimize reflection.
They *waste* over 3/4 of the brightness by adding black surrounds,
darkening the glass, and other tricks, so that while the actual
brightness of the tube is reduced, the output *looks* brighter in
comparison to unwanted reflections. They make the dots smaller (dots,
not pixels; pixels are made up of many dots) and the black area bigger,
simply because the phosphor, when unexcited, is white and reflects room
lighting. So they then have to excite the phosphor that much more in
compensation. However, CRTs and their new electron guns are well up to
the job, so the customer never notices, simply seeing what *looks* like
a brighter picture when it isn't. It just has more contrast to the
black background and darkened glass on the front of the tube.

Modern TV screens have glass that is actually dark gray, for this
reason: ambient light goes through the glass TWICE, and the dark gray of
the glass will dampen light going through it by more than half in each
direction in some models. That reduces reflected glare by up to three
times. The light going out, however, only gets reduced once; so it ends
up *looking* twice as bright ... in comparison, of course, to the
reflection. Look also at the surface of most TVs these days. They are
*not* the smooth "glassy" surface you might expect, with what's known as
"specular" reflection. Instead they're textured to reduce reflections;
and might even have real anti-reflection coatings. (That last I'm not
sure of. It's an expensive procedure for something mass-produced like
CRTs.)
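
The arithmetic behind the dark-glass trick is easy to sketch; the 50%
transmittance used here is an assumed figure, chosen only to match the
"more than half each direction" claim above:

    # Worked arithmetic for tinted CRT glass.
    # ASSUMPTION: the glass passes 50% of the light per crossing.
    t = 0.5

    glare = t * t   # ambient light crosses the glass twice
    picture = t     # emitted light crosses it only once

    print(f"reflected glare kept:    {glare:.2f}")             # 0.25
    print(f"picture brightness kept: {picture:.2f}")           # 0.50
    print(f"picture-to-glare gain:   {picture / glare:.1f}x")  # 2.0x

At t = 0.5 the glare drops to a quarter while the picture only drops by
half, so the picture gains a factor of two against the reflections; that
is the "looks twice as bright" effect.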

CRTs are *amazing* things.
That's why it's taken so long to replace them. They just kept getting
BETTER all the time; and a stern-chase is always a long one.

Most of the early problems color CRTs had with bad purity, convergence,
alignment, and poor showing in bright rooms have, while not exactly been
"solved", certainly improved so much that nobody even notices the small
remaining defects they still have ... until, of course, they're compared
side-by-side with a digital panel display, and such things as
misconvergence, minute variations in focus, and similar defects become
obvious ... though, to many people, only when viewed under at least
medium magnification.

LCD panels still have their own problems; as do Plasma Panels.
However, their improvement rate is *staggering*; while CRTs have pretty
much got about as much improvement in as they can. There's little left
in the technology that hasn't already been tried.

I don't really expect LCD panels or Plasma either to be the display of
the future. Some kind of FED (field-emission) display will *eventually*
take over ... once they get something that actually emits, and does so
reliably over
long periods of time. Of course, by then, perhaps another dark horse
will come along and sweep the field. OLEDs, perhaps?
http://en.wikipedia.org/wiki/OLED

Whatever. In any case, digital and flat-panel displays are here to
stay; and I don't really expect the CRT to last much longer than another
decade, if that. The new displays are getting cheaper, faster, and
better each day; and some day will replace cathode-ray-tubes the way
semiconductor memory has long-since replaced core.

(THAT, BTW, took decades longer than some people predicted.)
Eventually however, everything will be digital; just like music went
from analog scratches in a plastic record to digital dimples in a
plastic CD.
 

Frank McCoy

Well, I never go near s/w from MS except XP.
I didn't look at all my apps - only those which I thought might need
re-configuring. I guess some of them would be ok.



Indeed, but I have two instances of it running, and then there are my
other apps. E.g.: in Avant Browser I configured the top toolbar to
contain only those small icons I wanted, and to fill up the width using
spacer bars etc. Under 1280x1024 there's an ugly space across the top
and the icons are very small. Re-configuring ZtreeWin is a whole new
ball game, requiring the .pif font size to be edited, plus command-line
syntax etc.

All I can say is:
Do it, and you'll find the decrease in eye-strain with higher resolution
fonts WELL worth the extra effort.

Rather like switching from DOS screen-fonts to TrueType.

The finer grain of the resulting fonts is *much* easier on the eye.
Once, of course, you do get everything changed over.
 

Ed Medlin

DRS said:
Mr.E Solved! said:
kony wrote:
[...]
As an aside, I do wish people would stop referring to motion
artifacts as ghosting. The VESA Flat Panel Display Manual defines
ghosting as the problem of interference over the signal, resulting
in an "echoed" image. It's quite different to motion blur.

I agree it would be nice if people didn't refer to it as
ghosting but there are many reviewers who use the term like
that so it keeps getting perpetuated.

The term ghosting is accepted lingo since it is an excellent metaphor
for describing what occurs in the conditions when it occurs.

It already has a defined meaning. By misusing it as you did, you help to
confuse people not aware of the differences between ghosting and motion
blur.
Also, to make the point: you can't get ghosting with a static image; it
requires the scene to be in motion, since ghosting is caused by the
pixels switching on and off too slowly.

No, it isn't. Ghosting is possible with a static image, which is why you
shouldn't use it to refer to motion blur.
Anyone here who remembers trying to adjust an indoor TV antenna to get rid
of that "double" image will know what ghosting is. At least that is what I
always considered ghosting. Motion blur is what you see (more and more
rarely as LCD monitors are improving) when gaming on LCDs with fast moving
objects.


Ed
 

Frank McCoy

Anyone here who remembers trying to adjust an indoor TV antenna to get rid
of that "double" image will know what ghosting is. At least that is what I
always considered ghosting. Motion blur is what you see (more and more
rarely as LCD monitors are improving) when gaming on LCDs with fast moving
objects.

"Ghosting" is now more properly called, "Multipath reception"; and
applies to both TV and FM signals.
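
Multipath is simple to model: the receiver sees the direct signal plus a
delayed, attenuated copy that bounced off a building or hillside, and
that copy paints the offset "double" image. A toy sketch, with all
numbers invented:

    # Toy multipath model: received = direct + delayed, attenuated echo.
    # The delay and attenuation values are invented for illustration.
    import numpy as np

    signal = np.zeros(20)
    signal[5] = 1.0              # one bright feature on a scan line

    delay, attenuation = 4, 0.4  # echo arrives 4 samples late, weaker

    received = signal.copy()
    received[delay:] += attenuation * signal[:-delay]

    print(np.nonzero(received)[0])   # [5 9]: the feature plus its ghost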

Digital TV is removing that problem. With digital TV, either you get
the station with *good* reception; or you don't see anything except
sporadically. Makes it a bit difficult to adjust a TV antenna; though
the digital sets usually *do* have a signal-strength bar or "meter" to
help ... somewhere in their "tools" or "channel" menus.

One of the nice things about analog, though, was: if you could get even
a very WEAK signal, you could still watch a snowy picture. No more.
 

kony

"Ghosting" is now more properly called, "Multipath reception"; and
applies to both TV and FM signals.

It just happens that in a different discipline,
computers/monitors, it has a different cause but a similar
result. If it hadn't already been a standard computer
term it might be applied to LCD artifacts; but since it is,
reusing the term in the same discipline defeats the point
of using a term for a defined meaning.
 

Brooks Moses

Frank said:
Um ... A large part of his discussion about "persistence of vision"
actually turns out to be related to LCD speed versus CRT.

What he doesn't take into account is the persistence of the *phosphor*
on the screen; he assumes that the effect seen is purely due to the
human eye.

It's not.

I'm reminded of the old black-and-white 12" CRT I was playing around
with recently. It's limited to 640x480 at 60Hz, and when I realized
that it wouldn't go any faster, I immediately figured it would be
terribly annoying to use -- I'm one of the people who can see 60Hz
flicker, and it gives me a headache pretty quickly.

But not with this CRT. Its phosphors have a remarkably long delay time,
which means that at 60Hz it looks rock steady. On the other hand, the
mouse cursor leaves quite clear trails when it's moving around.

- Brooks
 

Frank McCoy

In alt.comp.hardware.pc-homebuilt Brooks Moses said:
I'm reminded of the old black-and-white 12" CRT I was playing around
with recently. It's limited to 640x480 at 60Hz, and when I realized
that it wouldn't go any faster, I immediately figured it would be
terribly annoying to use -- I'm one of the people who can see 60Hz
flicker, and it gives me a headache pretty quickly.

But not with this CRT. Its phosphors have a remarkably long delay time,
which means that at 60Hz it looks rock steady. On the other hand, the
mouse cursor leaves quite clear trails when it's moving around.
Uhuh. My LCD, as well, looks rock-steady; and the hand doesn't even
flicker when moved in front of it (like it does in front of a CRT).

However, the mouse *does* leave trails behind it. Not long ones, even
compared to the cursor-size; but definite trails. I'd guess about three
frames worth. When moved fast, it seems to stagger a bit from place to
place. That last I'm not so sure is the monitor. It looks more like
what I'd expect from software delays in posting a new position.

Don't really notice either though, in normal use.
Only when I look real hard at the thing.
 

Vittorio Janus

Why don't you put a color photo on them side by side.
And then move your head a bit around.
And enjoy the horrible color depth of the LCD.

How old are the LCDs you are looking at? I can remember that as a
major problem when I saw the Crimean War Exhibition that the War Museum
in London put on but that was a couple of years ago and the screens
were pretty old then.

I can promise you that there is no such problem on my Samsung
SyncMaster 214T - not even at really stupid angles of vision.

Regards,
vj
 

Ed Medlin

Vittorio Janus said:
How old are the LCDs you are looking at? I can remember that as a
major problem when I saw the Crimean War Exhibition that the War Museum
in London put on but that was a couple of years ago and the screens
were pretty old then.

I can promise you that there is no such problem on my Samsung
SyncMaster 214T - not even at really stupid angles of vision.

Regards,
vj

I find the same with my 244T if you take the time to correctly calibrate it.
I found the default settings a bit on the bright side. I matched the color
settings with my laptop for photo editing and I love this thing. LCDs have
come a long way in the last few years especially with color depth and
viewing angle issues.

Ed
 

Frank McCoy

I find the same with my 244T if you take the time to correctly calibrate it.
I found the default settings a bit on the bright side. I matched the color
settings with my laptop for photo editing and I love this thing. LCDs have
come a long way in the last few years especially with color depth and
viewing angle issues.
Amen!
I personally like the settings of my LCD panel as it came right out of
the box. It looks almost exactly the same as my previous CRT did.

I suppose I *could* tweak the settings to match my color-laser-printer
instead; but I like the video settings better ... they're more
"lifelike". I suppose if I needed to see what things would look like
when printed very often, I'd probably quickly change my mind on that.
 

Ed Medlin

Frank McCoy said:
Amen!
I personally like the settings of my LCD panel as it came right out of
the box. It looks almost exactly the same as my previous CRT did.

I suppose I *could* tweak the settings to match my color-laser-printer
instead; but I like the video settings better ... they're more
"lifelike". I suppose if I needed to see what things would look like
when printed very often, I'd probably quickly change my mind on that.

I do a lot of outdoor photography and carry my laptop out in the field,
and may do some basic editing in the woods (plugging it into the aux
power plug on my ATV). I like to get the colors matched for when I get
back home and do
any final touchup work. I found that the default settings were just a bit
bright. I just softened it a little and used a color card to match up the
two screens and it is about as close as it gets now. My old eyes really like
this 244T.......:). There is a lot of real estate on this thing.


Ed
 

Frank McCoy

In alt.comp.hardware.pc-homebuilt someone said:
At $700, you might as well spend some more and buy the new 32" Sharp
Gamer 1080P and get a real HDTV out of the deal.
There's some logic to that.
My StarLogic only cost me $200 ... Of course, THAT was a "Black Friday
Special". The cheapest I've seen anything even close since has been
about $260; with most over $300.

But with the Sharp, you only get ONE screen for the extra money.
There are advantages to two screens.
 
