1680x1050 name?

sdfisher

Does the 1680x1050 resolution have a name? I'm trying to find out if a
card will support it, but the manufacturer is hung up on it being an
"Apple Cinema 20" display.
 
J. Clarke

sdfisher said:
Does the 1680x1050 resolution have a name? I'm trying to find out if a
card will support it, but the manufacturer is hung up on it being an
"Apple Cinema 20" display.

Doesn't have a standard name, but it's not that high a resolution these
days--most current video boards should have little problem with it,
although you may have to set it using a utility called "Powerstrip".
 
DaveW

It does NOT have a name. I believe it is the proprietary resolution used by
Apple for their monitor.
 
Esben von Buchwald

sdfisher said:
Does the 1680x1050 resolution have a name? I'm trying to find out if a
card will support it, but the manufacturer is hung up on it being an
"Apple Cinema 20" display.
Yes... WSXGA+

It's used by Dell, HP and others in both laptops and stand-alone TFT monitors.
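
For reference, a quick sketch of the usual "GA"-style marketing names and their pixel formats (these are informal industry shorthand rather than formal standard names):

```python
# Common "GA"-style marketing names and their pixel formats.
# Informal industry shorthand, not formal VESA names.
FORMATS = {
    "SXGA":   (1280, 1024),   # 5:4
    "SXGA+":  (1400, 1050),   # 4:3
    "WSXGA+": (1680, 1050),   # 16:10 -- the Apple Cinema 20" format
    "UXGA":   (1600, 1200),   # 4:3
    "WUXGA":  (1920, 1200),   # 16:10
}

for name, (w, h) in FORMATS.items():
    print(f"{name:7s} {w} x {h}   ({w / h:.2f}:1)")
```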
 
Bob Myers

sdfisher said:
Does the 1680x1050 resolution have a name? I'm trying to find out if a
card will support it, but the manufacturer is hung up on it being an
"Apple Cinema 20" display.

It doesn't really have a name as such (the industry is trying
to get away from the whole increasingly silly "GA"
naming style, anyway), but it's also really not a
"proprietary" format in any real sense as some might
think, although the Apple Cinema display was the first
to use it. Basically, 1680 x 1050 is a "widescreen"
(16:10 aspect ratio) modification of the so-called
"SXGA+" format (1400 x 1050) which first showed up
in various notebook panels. Whether or not it becomes
popular in other desktop monitors is yet to be seen.
It may, since 1920 x 1200 (the 16:10 "wide"
derivative of 1600 x 1200) may be a bit much on a
~20" diagonal screen, and the wide version of
1280 x 1024 (which would be something like
1638 x 1024, but it really doesn't work out nicely)
is just an odd set of numbers.
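
To make that arithmetic concrete, a minimal sketch deriving the 16:10 "wide" width for each of the heights mentioned above:

```python
# Derive the 16:10 "wide" width for a given panel height.
# 1050 -> 1680 and 1200 -> 1920 come out even; 1024 gives the awkward 1638.4.
def wide_width(height, aspect=(16, 10)):
    return height * aspect[0] / aspect[1]

for h in (1050, 1200, 1024):
    w = wide_width(h)
    note = "" if w.is_integer() else "  (doesn't divide evenly)"
    print(f"height {h}: 16:10 width = {w:g}{note}")
```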

Bob M.
 
sdfisher

J. Clarke said:
Doesn't have a standard name, but it's not that high a resolution these
days--most current video boards should have little problem with it,
although you may have to set it using a utility called "Powerstrip".

No, it's not all that high. I'm really not sure why getting a straight
answer out of video card companies is such a PITA. Here's what I got
out of eVGA:

"Unfortunately we have not specifically tested our products with the
Apple Cinema brand of monitors. As for the max resolution when making
DVI connections, 1600x1200 is the conservative number for the
advertised max resolution. However like every computer hardware device,
the optimal performance will depend on the compatible of each computer
devices working together as a whole."

Completely correct, and 100% useless as answers go. It depends on the
devices working as a whole? Since when? The mother board (which I
included specs for) connects to the video card, the video card connects
to the monitor (which I also included specs for). Everything except the
new video card can drive 1680x1050, because I did it before my old
video card's cooling fan snapped.

The old "You're in a helicopter!" joke sprang to mind.

MSI hasn't sent a reply at all yet. Maybe that means theirs will be
useful.

-- Steve
 
J. Clarke

sdfisher said:
No, it's not all that high. I'm really not sure why getting a straight
answer out of video card companies is such a PITA. Here's what I got
out of eVGA:

"Unfortunately we have not specifically tested our products with the
Apple Cinema brand of monitors. As for the max resolution when making
DVI connections, 1600x1200 is the conservative number for the
advertised max resolution. However like every computer hardware device,
the optimal performance will depend on the compatible of each computer
devices working together as a whole."

That's actually about as straight an answer as the lawyers will allow. It's
basically "we dunno for sure, there's no reason why it shouldn't, but we're
not going to guarantee it".

To give you a better answer, the tech would have to go scare up an
Apple Cinema display and a copy of Powerstrip, hook it up, and try it.
That's not in his job description, he doesn't have that kind of budget,
and if he's away from the phone that long he's likely to get fired.
There may be another department in the company that does that kind of
testing, but he's not part of it.

If you can get him to bump you up to third tier, you might find someone
who has the resources, or the authority to call on another department,
to have it tested, get back to you, and add it to their test suite.

sdfisher said:
Completely correct, and 100% useless as answers go. It depends on the
devices working as a whole? Since when? The mother board (which I
included specs for) connects to the video card, the video card connects
to the monitor (which I also included specs for). Everything except the
new video card can drive 1680x1050, because I did it before my old
video card's cooling fan snapped.

The old "You're in a helicopter!" joke sprang to mind.

MSI hasn't sent a reply at all yet. Maybe that means theirs will be
useful.

It will likely be about the same.
 
sdfisher

J. Clarke said:
That's actually about as straight an answer as the lawyers will allow. It's
basically "we dunno for sure, there's no reason why it shouldn't, but we're
not going to guarantee it".

What a sad, sad world we live in. But yes, you've likely nailed it.
Thanks for the insight. :)
 
Bob Myers

"Unfortunately we have not specifically tested our products with the
Apple Cinema brand of monitors. As for the max resolution when making
DVI connections, 1600x1200 is the conservative number for the
advertised max resolution. However like every computer hardware device,
the optimal performance will depend on the compatible of each computer
devices working together as a whole."

DVI (and similar digital interfaces, or even analog types, for that
matter) really doesn't have a "max. resolution" (pixel format)
per se. The limiting factor is pretty much always the pixel rate,
and as long as you can put together a valid timing for a given format
which does not exceed the rate limit of the interface in question,
the interface will be happy. For instance, a format of, say,
200 pixels horizontally by 10,000 vertically at 60 Hz refresh may
be a silly thing to do, but since the pixel rate would probably come
in well under 150 MHz it wouldn't pose much of a problem for
most interfaces.

With respect to DVI specifically, the maximum pixel rate
permissible (in the single-pixel-per-clock mode) is 165 MHz;
this is sufficient to cover the VESA standard timing for
1600 x 1200 at 60 Hz (162.000 MHz pixel clock), even at
CRT-like blanking times. Reducing the blanking raises the
supportable active pixel count quite a bit, of course.
However, whether or not a given implementation can actually
support the specified max. rate is not guaranteed; cable and
connector quality issues may limit the max. rate that can be
reliably used to something less than the spec. maximum.
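
As a rough back-of-the-envelope check on that reasoning: the pixel clock is just the total pixels per frame (active plus blanking) times the refresh rate. The 1600 x 1200 totals below are the standard VESA 60 Hz figures behind the 162.000 MHz clock quoted above; the 1680 x 1050 blanking totals are only illustrative assumptions, since the actual clock depends on which timing standard is used.

```python
DVI_SINGLE_LINK_MAX_HZ = 165e6  # single-pixel-per-clock TMDS limit

def pixel_clock(h_total, v_total, refresh_hz):
    # Pixel rate = total pixels per frame (active + blanking) x refresh rate.
    return h_total * v_total * refresh_hz

timings = [
    # (name, h_total, v_total, refresh)
    ("1600x1200 @ 60 Hz, VESA totals", 2160, 1250, 60),                # 162.0 MHz
    ("1680x1050 @ 60 Hz, assumed reduced blanking", 1840, 1080, 60),   # illustrative totals only
]

for name, ht, vt, hz in timings:
    clk = pixel_clock(ht, vt, hz)
    verdict = "fits" if clk <= DVI_SINGLE_LINK_MAX_HZ else "exceeds"
    print(f"{name}: {clk / 1e6:.1f} MHz -> {verdict} the 165 MHz single-link limit")
```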

Bob M.
 
Not Gimpy Anymore

sdfisher said:
What a sad, sad world we live in. But yes, you've likely nailed it.
Thanks for the insight. :)

Well, if you can hold on a little bit longer, Samsung is apparently
developing a 21W panel with that very pixel format and so the format
should become a bit better known, albeit still likely without a "name".
Note the latest VESA GTF standard *does* cover that format, and
several of the video card "big boys" are now thought to be working on
drivers for it, in anticipation of increasing demand.
Sorry that it doesn't help your immediate need though.

HTH
NGA
 
sdfisher

In the end, I just drove to a local computer shop and bought an ASUS
5700LE card for $150, plugged it in, and it supported 1680x1050 even
though it didn't say it would.

I seem to have taken a small performance hit, which isn't usually the
reason you buy new hardware, but oh well. At least I have a computer
again.
 
