Which monitor has high resolution?

J David Ellis

Currently using a Dell Inspiron E1705 that has a 1920x1200
19" (diagonal) display. I'd like to buy a 22" or 23"
flat-panel computer monitor that has no less pixel density
than the E1705. Does any manufacturer offer one?
--David
 
Christopher Quigley

J David Ellis said:
Currently using a Dell Inspiron E1705 that has a 1920x1200 19" (diagonal)
display. I'd like to buy a 22" or 23" flat-panel computer monitor that has
no less pixel density than the E1705. Does any manufacturer offer one?
--David

It is difficult enough now even to find a 22" desktop monitor at 1920x1200
(Lenovo's L220x) or 23" (Apple's M9178LL/A), never mind a higher density
monitor at that size. What you are looking for did exist, but it was not
cheap, especially not early on. According to
http://en.wikipedia.org/wiki/T221, IBM produced the T220 and later the T221
in 2001, both 22.2" displays at WQUXGA (3840×2400) resolution. There is a user
group at http://tech.groups.yahoo.com/group/IBM_T2X_LCD/. Rebadged versions
were produced by Iiyama, ViewSonic and IDTech. They are no longer for sale
new, but there are some used models available.
 
Mike Ruskai

Currently using a Dell Inspiron E1705 that has a 1920x1200
19" (diagonal) display. I'd like to buy a 22" or 23"
flat-panel computer monitor that has no less pixel density
than the E1705. Does any manufacturer offer one?

That's the highest resolution you're going to see until you go to 30",
where it's 2560x1600. So, any larger widescreen monitor is going to
have larger pixels than you currently have, though it's likely you
won't notice unless you try.
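The pixel-density comparison running through this thread is easy to check. Here is a minimal sketch of the pixels-per-inch arithmetic, using the sizes and resolutions quoted in the posts (including the OP's stated 19" diagonal):

```python
import math

def ppi(w_px, h_px, diag_in):
    """Pixels per inch along the diagonal."""
    return math.hypot(w_px, h_px) / diag_in

# Figures as quoted in the thread
displays = {
    'Dell E1705 (19", 1920x1200)':   (1920, 1200, 19.0),
    '22" 1920x1200 (Lenovo L220x)':  (1920, 1200, 22.0),
    '23" 1920x1200 (Apple)':         (1920, 1200, 23.0),
    '30" 2560x1600':                 (2560, 1600, 30.0),
    'IBM T221 (22.2", 3840x2400)':   (3840, 2400, 22.2),
}
for name, (w, h, d) in displays.items():
    print(f"{name}: {ppi(w, h, d):.0f} ppi")
```

The E1705 comes out at about 119 ppi and every 22"-30" panel at 1920x1200 or 2560x1600 lands at or below about 103 ppi, which is exactly the bad news the replies deliver; the T221 line reproduces the ~200 dpi figure quoted later in the thread.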
 
rjn

J David Ellis said:
Currently using a Dell Inspiron E1705 that has a 1920x1200
19" (diagonal) display. I'd like to buy a 22" or 23"
flat-panel computer monitor that has no less pixel density
than the E1705. Does any manufacturer offer one?

1920 on a 19in is 119 dpi, which is well beyond the 100 dpi
threshold of "unusable" for Windows users, due to assumptions
made by Mr. Bill long ago, and hard-coded into too many
legacy apps, icons, bit-mapped system fonts, etc.

1920 is also as high as single-link DVI can go without
tricks that no one wants to use so far. Tolerable LCDs
with dual-link only hit the market two weeks ago.
See the "what's the deal at 30in 2560" thread here.
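The single-link ceiling can be sanity-checked: single-link DVI tops out at a 165 MHz pixel clock, and 1920x1200 at 60 Hz only fits once CVT reduced blanking trims the timing totals. The blanking figures below are approximate CVT numbers, not quotes from the spec:

```python
def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank, v_blank):
    """Pixel clock implied by a given timing (active + blanking)."""
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

# CVT with "normal" blanking vs CVT reduced blanking (approximate figures)
normal  = pixel_clock_mhz(1920, 1200, 60, 672, 45)   # ~194 MHz: over the cap
reduced = pixel_clock_mhz(1920, 1200, 60, 160, 35)   # ~154 MHz: fits single-link
print(f"CVT:    {normal:.0f} MHz")
print(f"CVT-RB: {reduced:.0f} MHz")
```

With conventional blanking the clock blows past 165 MHz; reduced blanking is the "trick" that squeezes 1920x1200 onto a single link.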

Consequently, the market for large >100 dpi monitors tends to be
fairly specialized, sits way above 100 dpi, and is always at least dual-link.

The legendary 22in 3840x2400 IBM T221, still available as
some obscure Toshiba model number, is 200 dpi quad-link.
It's also the price of a mid-size car.
<http://www.theinquirer.net/gb/inquirer/news/2007/11/23/toshiba-mpixel-lcd-recycling>
 
J David Ellis

Thank you Messrs. Quigley, Ruskai and Niland for your
thoughtful replies, and the bad news.

You've pushed this hi-res neophyte's education forward
considerably.

--David
 
chrisv

rjn said:
1920 on a 19in is 119 dpi, which is well beyond the 100 dpi
threshold of "unusable" for Windows users, due to assumptions
made by Mr. Bill long ago, and hard-coded into too many
legacy apps, icons, bit-mapped system fonts, etc.

1920 is also as high as single-link DVI can go without
tricks that no one wants to use so far. Tolerable LCDs
with dual-link only hit the market two weeks ago.
See the "what's the deal at 30in 2560" thread here.

Consequently, the market for large >100 dpi monitors tends to be
fairly specialized, sits way above 100 dpi, and is always at least dual-link.

It's a CRT, but monitors made with Sony's FD Trinitron tube (such as
the F500 and F520) can do 115 dpi (25.4/0.22 = 115).
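The pitch-to-dpi conversion is just 25.4 mm per inch divided by the stripe pitch; the same arithmetic also gives the figure for the 0.15 mm demo tube mentioned further down the thread:

```python
# Dot/stripe pitch (mm) to dots per inch: 25.4 mm per inch divided by the pitch
trinitron_dpi = 25.4 / 0.22   # FD Trinitron, 0.22 mm stripe pitch
demo_dpi = 25.4 / 0.15        # the 0.15 mm COMDEX demo tube discussed below
print(round(trinitron_dpi), round(demo_dpi))
```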
 
Bob Myers

It's a CRT, but monitors made with Sony's FD Trinitron tube (such as
the F500 and F520) can do 115 dpi (25.4/0.22=115)

Based solely on the dot (or in this case, stripe) pitch,
you would think that, but on a CRT the resolution capability
is generally limited by the beam (spot) size and video amplifier
bandwidth/risetime well before you reach the pitch limit.

Bob M.
 
rjn

chrisv said:
It's a CRT, but monitors made with Sony's FD Trinitron tube (such as
the F500 and F520) can do 115 dpi (25.4/0.22=115)

Sony actually built and demo'd (at COMDEX) a .15mm
dot pitch Trinitron. That works out to 169 dpi, plus or
minus the Myers derating factors. They demo'd it with
static images. Windows would have been microscopic.

Sony never released the .15, due, in large part I suspect, to
the Windows icon fixed-font problem, which was even more
severe back in the COMDEX era.

Do not go beyond 100 dpi unless you've already spent
some time there on the intended platform.

"Photo quality" is roughly considered to be 200 dpi 24-bit.
If the OS/app issues can be resolved, 200 might be an
attractive monitor resolution in some future not nearby.
 
Mike Ruskai

1920 is also as high as single-link DVI can go without
tricks that no one wants to use so far. Tolerable LCDs
with dual-link just only hit the market two weeks ago.
See the "what's the deal at 30in 2560" thread here.

Just what do you claim is intolerable about the 30" panels that have
been on the market for more than a year, some even two?
 
rjn

Mike Ruskai said:
Just what do you claim is intolerable about the 30" panels that have
been on the market for more than a year, some even two?

As I said in the referenced thread, titled:
"LCD: what's going on at 30in 2560?"

What are the issues?
1. DVI-D only - no VGA or analog TV
2. Dual link DVI only
3. No scaler, even for digital
4. No HDMI either
5. No on-board setup controls

Obviously, some people are tolerating this.
And some are not, as the Inq rant mentioned
in the thread confirms. Personally, I wouldn't
buy one, and now that Dell's 3rd try has fixed
these problems, these 30inchers with issues
are apt to vanish from the market.
 
Bob Myers

What are the issues?
1. DVI-D only - no VGA or analog TV

Ignoring the scaler question for the moment....

2560 x 1600 @ 60 Hz is at least a 270 MHz pixel rate,
given an LCD-reasonable amount of blanking time.
You REALLY don't wanna try that over a VGA...
2. Dual link DVI only

Ditto; single-link DVI tops out at 165 MHz, 8 bits/color.

Bob M.
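Bob's ballpark is easy to reproduce with rough CVT-reduced-blanking totals (the timing totals below are approximate estimates, not spec quotes):

```python
# Rough pixel-rate check for 2560x1600 @ 60 Hz with reduced blanking
h_total, v_total, refresh = 2720, 1646, 60   # approximate CVT-RB totals
clock_mhz = h_total * v_total * refresh / 1e6
print(f"{clock_mhz:.0f} MHz")     # just under Bob's "at least 270 MHz" figure
print(clock_mhz <= 165)           # fits single-link DVI? no
print(clock_mhz <= 2 * 165)       # fits dual-link DVI (2 x 165 MHz)? yes
```

It lands at roughly 269 MHz, well past single-link's 165 MHz but comfortably inside dual-link's 330 MHz, which is why these panels are dual-link only.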
 
rjn

Bob Myers said:
Ignoring the scaler question for the moment....

2560 x 1600 @ 60 Hz is at least a 270 MHz pixel rate,
given an LCD-reasonable amount of blanking time.
You REALLY don't wanna try that over a VGA...

Hitting the max res of a 2560 over VGA is not the point.
The point is being able to hook it to any arbitrary old PC,
at any res, and get something on the screen.

I presently have a Windows PC driving my 23in LCD at
1920x1200 over DVI, and a Linux PC driving the VGA
port at 1920x1200 analog. If I replaced this LCD with a
2560, I'd want something similar, even 2560 on DVI-DL
and 1920 on analog. Can't do that with the pre-Q4-07
30in 2560s.
Ditto; single-link DVI tops out at 165 MHz, 8 bits/color.

It could go higher than 1920 at lower frame rates.
Given that refresh flicker is not an issue on LCD,
a 30Hz rate would be fine for many apps (DTP, CAD),
but the graphics card makers don't seem to be inclined
to emit below 60.
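rjn's point can be quantified: holding the pixel rate to single-link's 165 MHz cap and using rough 2560x1600 timing totals, the achievable refresh comes out well above 30 Hz (the totals are illustrative estimates):

```python
# If single-link DVI caps at 165 Mpixel/s, what refresh could 2560x1600 get?
h_total, v_total = 2720, 1646          # approximate reduced-blanking totals
cap_pixels_per_s = 165e6
max_refresh = cap_pixels_per_s / (h_total * v_total)
print(f"max ~{max_refresh:.0f} Hz")    # ~37 Hz, so 30 Hz would fit
```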

And all of this is pretty much moot, now that the tech
has caught up with obvious expectations.
 
Bob Myers

rjn said:
It could go higher than 1920 at lower frame rates.
Given that refresh flicker is not an issue on LCD,
a 30Hz rate would be fine for many apps (DTP, CAD),
but the graphics card makers don't seem to be inclined
to emit below 60.

Well, actually going much below 60 doesn't really
work all that well on most LCDs. Flicker as we knew
it in CRTs isn't an issue, but there ARE some timing
concerns within the LCD that cause most manufacturers
to limit the native rates of the modules themselves. On
just about any LCD monitor in production today, if the
input frame rate is off from 60 Hz by more than a few
Hz, there's a frame-rate conversion being done. Some,
but by no means all, LCDs (and by this I mean the
panel or module itself, not the complete monitor) might
get down to 50 Hz on their own, but it's really rare to
see something that will work much below that. The one
exception I can think of off the top of my head was that
9.2 MPixel LCD that IBM did a few years back - that
one had a native rate of somewhere around 45 Hz, as I
recall, just because there was no good way to ship around
that much video at anything faster.

Bob M.
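The frame-rate conversion Bob describes amounts to repeating (or dropping) input frames so the output stays at the panel's native rate. A toy sketch of just the counting logic, using exact integer arithmetic (real scalers do this in hardware, often with interpolation rather than bare repetition):

```python
def repeat_pattern(src_hz, dst_hz, n_src_frames):
    """Which source frame index each output frame shows (src_hz -> dst_hz)."""
    out, acc = [], 0
    for i in range(n_src_frames):
        acc += dst_hz             # output-frame "credits" earned per input frame
        while acc >= src_hz:      # emit one output frame per src_hz credits
            out.append(i)
            acc -= src_hz
    return out

# 50 Hz input on a 60 Hz-native panel: every 5th input frame is shown twice
print(repeat_pattern(50, 60, 5))   # [0, 1, 2, 3, 4, 4]
```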
 
Not Gimpy Anymore

Bob Myers said:
Well, actually going much below 60 doesn't really
work all that well on most LCDs. Flicker as we knew
it in CRTs isn't an issue, but there ARE some timing
concerns within the LCD that cause most manufacturers
to limit the native rates of the modules themselves. On
just about any LCD monitor in production today, if the
input frame rate is off from 60 Hz by more than a few
Hz, there's a frame-rate conversion being done. Some,
but by no means all, LCDs (and by this I mean the
panel or module itself, not the complete monitor) might
get down to 50 Hz on their own, but it's really rare to
see something that will work much below that. The one
exception I can think of off the top of my head was that
9.2 MPixel LCD that IBM did a few years back - that
one had a native rate of somewhere around 45 Hz, as I
recall, just because there was no good way to ship around
that much video at anything faster.

Bob M.

Fully agree with Bob - The panel makers are still king, and
the rest of us, whether integrators, solution providers, resellers,
or whomever, are stuck with their decisions. To make it a bit
worse, the panel companies seem to take pride in how well they
can "drive the market", as opposed to listening to the market
needs.

NGA
 
rjn

Bob Myers said:
Well, actually going much below 60 doesn't really
work all that well on most LCDs.

The popular [mis]conception among those who have
thought about it, but not read your book :), is that LCD
panels are a form of write-only frame buffer. You send
a data set to a pixel triad, and it stays at that value
until you send another data set, whenever.
Flicker as we knew it in CRTs isn't an issue, but there
ARE some timing concerns within the LCD that cause
most manufacturers to limit the native rates of the modules
themselves. On just about any LCD monitor in production
today, if the input frame rate is off from 60 Hz by more
than a few Hz, there's a frame-rate conversion being done.

Is there usually/sometimes/never a real frame buffer
in that path? If present, it "should" completely decouple
what the panel needs from what the host wants to send.

RAM is relatively cheap, but if the monitor makers can save
a few cents by making (and enforcing) some assumptions
about the signal envelope, I suppose they do.

Which makes it less likely than ever that avoiding dual-link
will be accomplished by using low host frame rates.

The other work-around gimmick that is even less likely is
data reduction (e.g. sending only changed pixels).
 
Not Gimpy Anymore

rjn said:
Bob Myers said:
Well, actually going much below 60 doesn't really
work all that well on most LCDs.

The popular [mis]conception among those who have
thought about it, but not read your book :), is that LCD
panels are a form of write-only frame buffer. You send
a data set to a pixel triad, and it stays at that value
until you send another data set, whenever.
Flicker as we knew it in CRTs isn't an issue, but there
ARE some timing concerns within the LCD that cause
most manufacturers to limit the native rates of the modules
themselves. On just about any LCD monitor in production
today, if the input frame rate is off from 60 Hz by more
than a few Hz, there's a frame-rate conversion being done.

Is there usually/sometimes/never a real frame buffer
in that path? If present, it "should" completely decouple
what the panel needs from what the host wants to send.

Short answer is "hardly ever"... Only added if it is considered
necessary to the business to support different frame rates
(like content intended for film, at 24 Hz multiples). Us
"integrators" tend to follow instructions given by our (direct)
customers (solutions providers), and allow the solutions providers
to deal with the end user issues.... Not very pretty, but not much
choice either.
RAM is relatively cheap, but if the monitor makers can save
a few cents by making (and enforcing) some assumptions
about the signal envelope, I suppose they do.

Absolutely they do skimp - if you think PC margins are slim,
the margins on peripherals are smaller ... pennies make a
difference.
I often say that the panel makers are the only ones making
any profit at all, and even that is largely supposition, since they
do so well at keeping their financials disguised. The only concrete
clue is that they are still in business.......
As to the "enforcing", again it's really the panel companies
who do that, when it comes to how we will drive their panels...
Oh, sure, they try to be "open" by participating in standards
activities, but in the end, they rule.
Which makes it less likely than ever that avoiding dual-link
will be accomplished by using low host frame rates.

The other work-around gimmick that is even less likely is
data reduction (e.g. sending only changed pixels).

Well, as usual, in a "closed system" anything is possible,
but in open systems like PC's, we are always faced with
standards (either established, or de facto) that tend to limit
our flexibility. Even having said that, many closed systems
today still balk at the idea of adding expense if it only seems
to benefit relatively few....
But then I'm "preaching to the choir" aren't I?

NGA
 
Mike Ruskai

As I said in the referenced thread, titled:
"LCD: what's going on at 30in 2560?"

What are the issues?
1. DVI-D only - no VGA or analog TV
2. Dual link DVI only
3. No scaler, even for digital

Dual-link DVI-D is the only interface in wide use on computers that
can drive the native resolution. Complaining about that is rather
like complaining that you can't put a house fire out with a garden
hose hooked up to the big red truck.

Any card with a Dual-link DVI-D connector is going to scale
automatically to the panel's native resolution.

I will admit, however, that having scaling done in the monitor with
aspect ratio preserved would be useful to me, since nVidia's drivers
for the 8800 cards are broken (and have been for a while) with regards
to scaling while keeping the aspect ratio. The scaling done by the
card without driver software is simple stretching.
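Aspect-preserving scaling of the kind Mike wants is a small calculation: scale by the smaller of the two axis ratios and center the result. A minimal sketch (the function name is mine, not from any driver API):

```python
def fit_preserve_aspect(src_w, src_h, dst_w, dst_h):
    """Scale src to fit inside dst, keeping aspect; returns size + offsets."""
    scale = min(dst_w / src_w, dst_h / src_h)   # smaller ratio avoids cropping
    w, h = round(src_w * scale), round(src_h * scale)
    return w, h, (dst_w - w) // 2, (dst_h - h) // 2   # centered letterbox/pillarbox

# 1280x1024 (5:4) shown on a 2560x1600 (16:10) panel: pillarboxed, not stretched
print(fit_preserve_aspect(1280, 1024, 2560, 1600))   # (2000, 1600, 280, 0)
```

Simple stretching, by contrast, just maps the source to the full 2560x1600 and distorts any source with a different aspect ratio.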

In another message, you mentioned the inability to simply plug any old
PC into the monitor and see something on the screen. That's as
legitimate a complaint as one about not being able to plug an SATA
drive into any old computer to add storage. Just as you might need to
buy an SATA controller add-on card to use the newer drive type, you
may need to buy a Dual-link DVI-D card to use the newer monitor type.
4. No HDMI either

You don't have a graphics card with HDMI support unless you know you
need one and take steps to get one. So if you're already doing that,
it's no different than seeking one with Dual-link DVI-D support,
except that all modern ones have that anyway.
5. No on-board setup controls

Just what OSD functions do you need with a display that's always
digital and always at the same resolution? There may be a couple that
are useful, but I can't think of any off the top of my head.
Obviously, some people are tolerating this.
And some are not, as the Inq rant mentioned
in the thread confirms. Personally, I wouldn't
buy one, and now that Dell's 3rd try has fixed
these problems, these 30inchers with issues
are apt to vanish from the market.

I assure you, the word "tolerate" doesn't enter into the experience of
owning and using a 30" panel operating at 2560x1600. Except perhaps
wondering how one managed to tolerate a minuscule fuzzy-by-comparison
22" CRT before.
 
rjn

Mike Ruskai said:
Dual-link DVI-D is the only interface in wide
use on computers that can drive the native resolution.

No argument there. If you want 2560 on DVI, you
have to use dual-link. That's been obvious since
DVI 1.0, which originally was limited to 1600 (and
later 1920 with CVT).

I was an early adopter of 23in 1920, and being
aware of the DVI limitations, made some effort to
make sure that my card and monitor would actually
work together. Neither brand bothered to specify
that their product was single link. I also had to
figure out just how 1920 was possible on single (CVT).
Complaining about that is rather like complaining
that you can't put a house fire out with a garden
hose hooked up to the big red truck.

Except that's not the complaint. The complaint was:
"Dual link DVI only"

Ignoring the VGA thing ...

If you happened to buy a gen-1 2560 monitor, planning
to upgrade the graphic card to 2560 later, you might
expect to get SOMETHING on screen with your current,
say, Matrox Parhelia. Nope.

You might even have confused dual-port (which almost
all Matrox cards are) with dual-link (which almost no
Matrox cards are). {And Matrox doesn't go out of their
way to let you know that all except the one DL card
are not DL. I had to ask in their now-gone forum.}
Any card with a Dual-link DVI-D connector is going
to scale automatically to the panel's native resolution.

Really, in any OS?

The opportunities for unhappy customers are significant
with the gen-1 2560 LCDs. I'm surprised we haven't seen
more complaints like the Inq rant. I'm guessing that the
reason is that most people getting a 30in LCD are also
buying a whole new PC or Mac, and just happen to be
getting a DL graphics card. I presume that the monitors
include a DL cable.
I will admit, however, that having scaling done in
the monitor with aspect ratio preserved would be
useful to me, ...

I sometimes hook the 1920 23in to a DVD player via
YPbPr, and the analog-in and scaling are essential
there, because I don't yet have a player with DVI
out (but even that would not be 2560 DL - players
with scaling do 1920).
In another message, you mentioned the inability to
simply plug any old PC into the monitor and see
something on the screen. That's as legitimate a
complaint as one about being not able to plug an SATA
drive into any old computer to add storage.

Not really. A more apt analogy would be buying a
printer for an old LPT port PC, and discovering
that it's USB only. THAT bit a lot of people, and
it took years to get the market to the point where
USB-only printers were a safe sell.

The fact that Dell's 3rd try at 30in now has analog
inputs suggests that the market is not yet ready for
DVI (and/or HDMI or DisplayPort)-only monitors.
Too many people have other PCs/laptops kicking around
that they might want to hook up, if only temporarily,
to the new LCD, via DVI single or VGA.
Just what OSD functions do you need with a display
that's always digital and always at the same resolution?

Black level. White level. Color temp. Gamma.
Backlight level. On an OS where the hosted applet
controls don't run (Unix, Linux).
I assure you, the word "tolerate" doesn't enter into
the experience of owning and using a 30" panel
operating at 2560x1600.

Yeah, true. You are either in bliss or
in rage (because it doesn't work at all).
 
Mike Ruskai

Really, in any OS?

Without an OS. My computer's boot screen shows up with no graphics
driver software, scaled to 2560x1600.

I suppose it's conceivable that a card with a DL DVI-D port could lack
that functionality, but it doesn't seem plausible.
The opportunities for unhappy customers are significant
with the gen-1 2560 LCDs. I'm surprised we haven't seen
more complaints like the Inq rant. I'm guessing that the
reason is that most people getting a 30in LCD are also
buying a whole new PC or Mac, and just happen to be
getting a DL graphics card. I presume that the monitors
include a DL cable.

I suppose I'm of the opinion that if you're buying technological
products on your own, it's your responsibility to make sure they'll
work. I don't drop $1800 on a monitor without making sure I have the
equipment necessary to use it, and I doubt you would either.
Not really. A more apt analogy would be buying a
printer for an old LPT port PC, and discovering
that it's USB only. THAT bit a lot of people, and
it took years to get the market to the point where
USB-only printers were a safe sell.

That is a better analogy. For a while, printers were made with both
parallel ports and USB ports, and scanners were made with both SCSI
and USB interfaces.

But now imagine that the printer (or scanner) has a data rate which
precludes operating over the old interface, and it must have USB 2.0,
or perhaps FireWire (which is actually faster in practice).

Such a printer or scanner would certainly be expensive, and not
something bought casually. Should they build in a crippled mode and
add older interfaces, just in case someone with an old PC worth less
than $100 drops, say, $1500 on this new device, and expects it to work?
The fact that Dell's 3rd try at 30in now has analog
inputs suggests that the market is not yet ready for
DVI (and/or HDMI or DisplayPort)-only monitors.
Too many people have other PCs/laptops kicking around
that they might want to hook up, if only temporarily,
to the new LCD, via DVI single or VGA.

Sure, someone will spend that amount of money after knowing the PC
supports it, then be disappointed when the laptop doesn't. And that
person would certainly be happier with a model that supports many
different resolutions, so older machines could drive it at some level.

But I don't think such people are numerous. Honestly, how many
complaints about 30" monitors do you see, compared to the instances of
bragging?
Yeah, true. You are either in bliss or
in rage (because it doesn't work at all).

I guess I just can't build up any sympathy for people who spend $1300+
for a piece of technology without checking prerequisites. Let them
rage against their own shortsightedness.
 
