Which ATI (or other? better?) card 1) has DVI and 2) will work with an OLDER AGP chip?


MISS CHIEVOUS

Benjamin & Frodo . . . you guys are JUST THE GREATEST!! Honestly, I
can't thank you enough for this intelligence! --and it will be an
excellent resource for anyone else who faces this issue.

I'd like to focus this discussion on the "end game": what result,
exactly, I'd like to achieve if I buy this monitor. In the
best-case scenario, two results would be possible:
- - - - - - - - - - - - - - - - - - - - - - - - -
1. I would turn this monitor 90° and the graphics card driving it
would be capable of IMMEDIATELY self-adjusting to the (10:16) 1200 x
1920 PORTRAIT MODE . . . -->>with NO SOFTWARE INTERVENTION ON MY PART.
I wouldn't need to access the video card's software (control panel) in
any way to instruct the screen to adjust; the hardware __alone__ would
be smart enough to reset automatically.

and (very important!)

2. The video card (and/or monitor itself) would -->>REMEMBER MY
SETTINGS for each of the two respective modes I would be using (16:10
and 10:16). I would not have to reset the display via the graphic
card's control panel each time I physically rotated the monitor; it
would remember the layout and display it automatically.
- - - - - - - - - - - - - - - - - - - - - - - - -

Some of this is beyond the reasonable scope of your ability to advise
(you almost need to have this monitor before you to see if these
features are available or not), so I'm not expecting an answer for this
specific monitor.

"Generally" speaking -- to the extent you know about monitors that
PIVOT (can be physically rotated) -- are these features automatic? or
are they dependent upon the video card that is driving the monitor?
Are they limited by the generation of processor on the motherboard? or
is this an issue of the OS?

As I see it, I have exactly five outcomes to consider:

--------BEST
1. Windows 2000 Pro + ATI Radeon 7200 = Monitor PIVOTS 90°
automatically
--------BETTER
2. Windows 2000 Pro + **NEW AGP GRAPHICS CARD** = Monitor PIVOTS 90°
automatically
--------GOOD
3. Windows XP + ATI Radeon 7200 = Monitor PIVOTS 90° automatically
--------I'M REBUILDING MY COMPUTER
4. Windows XP + **NEW AGP GRAPHICS CARD** = Monitor PIVOTS 90°
automatically
--------. . . and STILL this won't be automatic? = Doubtful I'll buy
this monitor
5. Windows XP + **NEW AGP GRAPHICS CARD** = Monitor PIVOTS 90° with
software intervention by User

My current ATI Radeon 7200 has the "Rotate" feature. The question is,
is it automatic?

Gentlemen?

MC
 

Benjamin Gawert

* MISS CHIEVOUS:
1. I would turn this monitor 90° and the graphics card driving it
would be capable of IMMEDIATELY self-adjusting to the (10:16) 1200 x
1920 PORTRAIT MODE . . . -->>with NO SOFTWARE INTERVENTION ON MY PART.
I wouldn't need to access the video card's software (control panel) in
any way to instruct the screen to adjust; the hardware __alone__ would
be smart enough to reset automatically.

If I remember right the 2407FPW doesn't have a twisting sensor, which
means the computer can't know whether you have rotated the display or
not. So you always have to change the setting manually.

and (very important!)

2. The video card (and/or monitor itself) would -->>REMEMBER MY
SETTINGS for each of the two respective modes I would be using (16:10
and 10:16). I would not have to reset the display via the graphic
card's control panel each time I physically rotated the monitor; it
would remember the layout and display it automatically.

If the monitor doesn't tell the computer that it has been rotated, this
simply is not possible. However, setting up pivot is only one setting to
change. You don't have to change the resolution; just select pivot mode
in the display properties or in the tray utility.
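
For anyone curious what that "one setting" boils down to, here is a
minimal C sketch (an illustration, not something from this thread) using
the standard Win32 display-settings call. It assumes Windows XP or
later, where the orientation field exists, and a graphics driver that
supports rotation; on Windows 2000 rotation is only exposed through the
vendor's own utility (e.g. the ATI tray tool).

/* rotate.c -- toggle the primary display between landscape and
 * 90-degree portrait. Build (MSVC): cl rotate.c user32.lib */
#define _WIN32_WINNT 0x0501
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Read the current mode of the primary display. */
    if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm))
        return 1;

    /* Swap width/height and flip the orientation flag. */
    DWORD tmp = dm.dmPelsWidth;
    dm.dmPelsWidth  = dm.dmPelsHeight;
    dm.dmPelsHeight = tmp;
    dm.dmDisplayOrientation =
        (dm.dmDisplayOrientation == DMDO_DEFAULT) ? DMDO_90 : DMDO_DEFAULT;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYORIENTATION;

    /* CDS_UPDATEREGISTRY makes the change persist across reboots. */
    LONG rc = ChangeDisplaySettings(&dm, CDS_UPDATEREGISTRY);
    if (rc == DISP_CHANGE_SUCCESSFUL)
        printf("rotated\n");
    else
        printf("failed (%ld)\n", rc);
    return rc == DISP_CHANGE_SUCCESSFUL ? 0 : 1;
}
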
"Generally" speaking -- to the extent you know about monitors that
PIVOT (can be physically rotated) -- are these features automatic?

On most monitors, no. There are a few models that detect the display
rotation and tell the computer to change settings, but these usually are
very expensive professional models.

or
are they dependent upon the video card that is driving the monitor?

Whether pivot is done automatically depends on the monitor. Whether
pivot can be done at all depends on the gfx card (every Radeon or
GeForce does fine).

Are they limited by the generation of processor on the motherboard?

No.

or
is this an issue of the OS?

No, but it can be an issue of the driver. Windows 2000/XP/Vista usually
is fine.

My current ATI Radeon 7200 has the "Rotate" feature. The question is,
is it automatic?

Whether it's automatic depends on the monitor, not on the gfx card.

Benjamin
 

Benjamin Gawert

* Frodo:

Just be aware that some card makers no longer key the AGP connector for
1x/2x operation, to avoid having to support these cards in old systems.

Benjamin
 

MISS CHIEVOUS

Benjamin Gawert wrote:
~~~~~CAPPED and edited for Quick Reference + Summary~~~~~
If I remember right the 2407FPW doesn't have a TWISTING SENSOR
which means the computer can't know if you have rotated the display or
not. So you always have to change the setting manually.

IF THE MONITOR DOESN'T TELL THE COMPUTER THAT IT HAS BEEN
ROTATED [it will NOT BE AUTOMATIC]. However, setting up pivot is only
one setting to change. You don't have to change the resolution; just
select pivot mode in the display properties or in the tray utility.

On most monitors [the PIVOT feature is NOT automatic]. There are a few
models that detect the display rotation and tell the computer to change
settings but these usually are EXPENSIVE PROFESSIONAL MODELS.

If pivot can be done automatically, IT IS DEPENDENT ON THE MONITOR.
Whether pivot can be done at all is dependent on the graphics card
(every Radeon or Geforce does fine).

[They are not limited by . . .]
the generation of processor on the motherboard [or]
the OS; but it can be an issue of the driver. Windows 2000/XP/Vista is
usually fine.

Whether it's automatic depends on the monitor, not on the graphics card.

Once again Benjamin, this is very helpful and I can't thank you enough.

This intelligence informs 4 significant purchasing decisions:
- - - - - - - - - - - - - - - - - - - - - - - - -
1. IT IS UNNECESSARY FOR ME TO UPGRADE MY VIDEO CARD:
The ATI Radeon "ROTATE" feature is virtually the identical feature I
would see on even the most expensive, state-of-the-art ATI or GEFORCE
card.
- - - - - - - - - - - - - - - - - - - - - - - - -
2. IT IS UNNECESSARY FOR ME TO UPGRADE MY OS:
PORTRAIT MODE capability is the identical capability I would see on an
XP (or Vista) OS interface.
- - - - - - - - - - - - - - - - - - - - - - - - -
3. IT IS UNNECESSARY FOR ME TO UPGRADE MY MOTHERBOARD:
The graphics slot on the motherboard does not interface with the
monitor to enable its "ROTATE" (or "PIVOT") mode -- only the graphics
card + monitor itself control this feature.
- - - - - - - - - - - - - - - - - - - - - - - - -
and finally,
4. Unless I want to buy a REALLY EXPENSIVE PROFESSIONAL-GRADE MONITOR
. . .
I'll have to MANUALLY CHANGE the monitor settings, through my video
card's control panel, each time I want to rotate the monitor.
- - - - - - - - - - - - - - - - - - - - - - - - -

It looks like the purchase decision is going to turn on whether I want
to right-click my ATI card each time I rotate the monitor.

Do you by any chance happen to know Benjamin: are the display settings
at least SAVED so that each time I right-click to adjust the mode I
don't __also__ have to completely reconfigure my desktop? I have
numerous shortcuts displayed on my desktop, and I am wondering whether
those icons will lose their respective positions.

Thanks again guys. This is an incredible education in new monitor
technology!

MC
 
M

MISS CHIEVOUS

Oh wait, I need to clarify one other factor:

- - - - - - - - - - - - - - - - - - - - - - - - -
5. IT IS UNNECESSARY FOR ME TO HAVE A DVI PORT ON MY VIDEO CARD:
With earlier monitors this may have been useful; but there is now no
significant difference in image quality between VGA and DVI.
- - - - - - - - - - - - - - - - - - - - - - - - -

My only question here is: Does this hold true even with my older ATI
Radeon 7200 AGP card, as well? In other words, is picture quality __in
fact__ driven by the monitor . . . and I'll have the same quality with
this new 24" Widescreen using a standard VGA cable with my older AGP
card?

Thanks!

MC
 

Ray S

MISS said:
[Benjamin's summary and earlier points snipped; quoted in full above]

Do you by any chance happen to know Benjamin: are the display settings
at least SAVED so that each time I right-click to adjust the mode I
don't __also__ have to completely reconfigure my desktop? I have
numerous shortcuts displayed on my desktop, and I am wondering if those
icons will be incapable of remembering their respective positions.

Thanks again guys. This is an incredible education in new monitor
technology!

MC

Speaking off the cuff here, but I do know that even my old Matrox G450
allows me to create custom configurations that I can switch to. Never
used them, but it might even be true that you can assign hot keys to
them. Worth checking the manual for any specific card you're thinking of
buying. You can usually download them from the manufacturer's site.
 

Frodo

What's the make and model of your current CRT (monitor)?

Picture quality on VGA is affected by the DAC chips (Digital-to-Analog
Converters) on the graphics card, which take the digital information
provided by the GPU (graphics processing unit) and convert it to an
analog signal that is sent to the VGA monitor.

Each company that makes graphics cards decides how much money it is
willing to spend on the DAC.
The more money spent, the better the DAC.
DACs are a small part of a graphics card's cost, but companies will try
to save a few dollars by using poor-quality RAMDACs.
Matrox uses really good RAMDACs; ATI-based cards are a close second.
In the past Nvidia let card manufacturers use whatever quality they
wanted, but Nvidia started pushing card makers to use better RAMDACs
starting around the time the 5200 came out.

With DVI, there is no need for a DAC; the digital signal is sent
straight to the LCD.
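
To make the conversion step concrete, here is a small illustrative C
sketch (mine, not Frodo's): the DAC maps each 8-bit digital colour value
to an analog voltage on the VGA cable, assuming the nominal 0-0.7 V
video swing of the VGA standard.

#include <stdio.h>

/* Map an 8-bit digital colour component (0-255) to the analog voltage
 * the card's DAC would drive onto one VGA colour line, assuming the
 * nominal 0.0-0.7 V video level of the VGA standard. */
static double dac_output_volts(unsigned char level)
{
    return (level / 255.0) * 0.7;
}

int main(void)
{
    unsigned char samples[] = { 0, 64, 128, 255 };
    for (int i = 0; i < 4; i++)
        printf("digital %3u -> %.3f V\n", samples[i],
               dac_output_volts(samples[i]));
    /* With DVI this step disappears: the 8-bit values travel to the
     * panel as digital TMDS data and are never turned into voltages. */
    return 0;
}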

My Hitachi CM 810 21" CRT (VGA) monitor had comparable quality to a
Viewsonic 922 19" (DVI) LCD.
 

Benjamin Gawert

* MISS CHIEVOUS:
Oh wait, I need to clarify one other factor:

- - - - - - - - - - - - - - - - - - - - - - - - -
5. IT IS UNNECESSARY FOR ME TO HAVE A DVI PORT ON MY VIDEO CARD:
With earlier monitors this may have been useful; but there is now no
significant difference in image quality between VGA and DVI.
- - - - - - - - - - - - - - - - - - - - - - - - -

This is wrong.
My only question here is: Does this hold true even with my older ATI
Radeon 7200 AGP card, as well? In other words, is picture quality __in
fact__ driven by the monitor

No. Whoever says that is telling you BS. Image quality is a product of
the gfx card, the video cable and the monitor. With VGA the gfx card has
to convert the digital image into analog signals, and the monitor then
converts the analog signals back into digital data. Besides the fact
that every conversion affects image quality, analog signals are much
more sensitive to disturbances than digital signals. How much quality is
lost with VGA depends not only on the monitor but also on the video
cable and the gfx card. Of course the quality loss is not always
visible, maybe because the conditions are good (the gfx card provides
good signal quality, the video cable is good, no disturbances, etc.),
because the monitor itself has an average image at best (especially
cheap no-name monitors), because the monitor is small (where differences
are hardly noticeable), because the resolution is low (1280x1024 and
below), and sometimes simply because the person in front of the monitor
wouldn't even notice the difference if it bit him in the arse.

. . . and I'll have the same quality with
this new 24" Widescreen using a standard VGA cable with my older AGP
card?

Let me tell you something: I have a Dell 2005FPW (20" widescreen) and
also had the 2405FPW for some time. I have several computers with gfx
cards ranging from a cheap old Geforce2MX up to two 1700USD a piece
Quadro FX4500 professional gfx boards. I can see a difference between
VGA and DVI even on the 20" monitor with all gfx cards. With some cards
the difference is hardly noticeable, with other cards it's more
noticeable. But it's noticeable - always.

With smaller monitors and/or lower resolutions the difference can be
small enough to be unnoticeable, though. But you are looking at a 24"
display with 1920x1200 resolution, don't forget that.

And besides the image quality factor, DVI also has the advantage that
there is no image centering or adjusting necessary. The image is always
centered, clear and sharp; with VGA you often have to make adjustments.

IMHO using a 24" TFT with 1920x1200 without DVI would be silly.

Benjamin
 

Benjamin Gawert

* Frodo:
Picture quality on VGA is affected by the DAC chips

Nope. "DAC chips" (correctly called "RAMDAC") have been integrated into
the graphics processors since RivaTNT/Rage Pro times, so for around a
decade now. All RAMDACs in gfx cards of the last 8 years or so have a
very high bandwidth (at least 320MHz, today 400+MHz is standard) and
provide excellent signals.

However, the difference in image quality doesn't come from the RAMDAC
but from the output filters. Some card manufacturers tend to save a few
cents by using cheap filters that allow them to fulfill EMI standards
but which also limit the bandwidth. This results in a degradation of the
signal quality and thus also the image quality.

With DVI, there is no need for DAC, the digital signal sent straight to the
LCD.

Right. That's the main reason why DVI provides a better image quality.

Benjamin
 

MISS CHIEVOUS

Benjamin said:
Let me tell you something: I have a Dell 2005FPW (20" widescreen) and
also had the 2405FPW for some time. I have several computers with gfx
cards ranging from a cheap old Geforce2MX up to two 1700USD a piece
Quadro FX4500 professional gfx boards. I can see a difference between
VGA and DVI even on the 20" monitor with all gfx cards. With some cards
the difference is hardly noticeable, with other cards it's more
noticeable. But it's noticeable - always.

With smaller monitors and/or lower resolutions the difference can be
small enough to be unnoticeable, though. But you are looking at a 24"
display with 1920x1200 resolution, don't forget that.

And besides the image quality factor, DVI also has the advantage that
there is no image centering or adjusting necessary. The image is always
centered and clear and sharp, with VGA you often have to make adjustments.

IMHO using a 24" TFT with 1920x1200 without DVI would be silly.

Ah! Okay, so I __would__ need a new graphics card with DVI. Got it.

I'll have to think about what I want to do here. Unless the card can
remember my desktop (or rather more to the point, the layout of my
shortcuts) this really is going to oblige a huge change in the way I
work. I'm so used to just having my shortcut icons right there at all
times.

hmmmm.

MC
 

MISS CHIEVOUS

What should I make of Yousuf's post, here?

Yousuf Khan wrote:
Yeah, DVI doesn't improve your picture one iota. About the only
advantage I've seen from it is that with an LCD monitor, it allows you
to scale the non-native resolutions a little better, closer to a CRT
monitor's scaling.
 

Benjamin Gawert

* MISS CHIEVOUS:
What should I make of Yousuf's post, here?

Don't know. It's up to you. He didn't even write what setup (gfx card,
monitor) he has or where he didn't notice any difference. My experience
tells me otherwise.

Benjamin
 

Frodo

Output filters, that sounds right.

Benjamin Gawert said:
* Frodo:


Nope. "DAC chips" (correctly called "RAMDAC") have been integrated into
the graphics processors since RivaTNT/Rage Pro times, so for around a
decade now. All RAMDACs in gfx cards of the last 8 years or so have a very
high bandwidth (at least 320MHz, today 400+MHz is standard) and provide
excellent signals.

However, the difference in image quality doesn't come from the RAMDAC but
from the output filters. Some card manufacturers tend to save a few cents
by using cheap filters that allow them to fulfill EMI standards but which
also limit the bandwidth. This results in a degradation of the signal
quality and thus also the image quality.


Right. That's the main reason why DVI provides a better image quality.

Benjamin
 

Bob Myers

I realize this is showing up as a reply to the wrong posting, but I missed
the
following comments when they were first posted, and want to correct a
couple of misconceptions:

Actually, "DAC" (digital to analog converter) is correct; a "RAMDAC"
was simply a DAC chip which included the color look-up tables
("color map" memory, as RAM), before BOTH functions were
integrated into the graphics chips. There's no sense in keeping the
term "RAMDAC" around at all any more, since the two are completely
separate functions.


This is a common misconception, but it IS a misconception. The
main reason that any "digital" interface provides improved image
quality with LCDs or other fixed-format displays is that such interfaces
provide an explicit pixel clock, so that the data can always be properly
mapped to the physical pixels of the screen. Analog interfaces such
as the "VGA" connector do not provide such information, and instead
the sampling clock has to be derived from other timing information
(typically the horizontal sync signal) provided by the interface.
Creating a sampling clock in this manner, though, can lead to some
errors (and it's why many analog-input LCD monitors include controls
which permit the user to fine-tune the sampling clock frequency and
phase).
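
A rough worked example (my numbers, not Bob's) of why that regeneration
is touchy, using assumed CVT-style timing for 1920x1200 at 60Hz: the
monitor only receives the ~74.5 kHz line rate and must multiply it by a
guessed "total pixels per line" figure to rebuild the sampling clock.

#include <stdio.h>

int main(void)
{
    const double hsync_hz       = 74556.0; /* assumed line rate        */
    const double true_htotal    = 2592.0;  /* 1920 active + blanking   */
    const double guessed_htotal = 2600.0;  /* monitor's guess, 8 off   */

    double true_clock  = hsync_hz * true_htotal;    /* ~193.25 MHz */
    double guess_clock = hsync_hz * guessed_htotal;

    printf("true pixel clock   : %.2f MHz\n", true_clock / 1e6);
    printf("regenerated clock  : %.2f MHz\n", guess_clock / 1e6);

    /* With the wrong clock the sampling point drifts across each line:
     * an 8-count error in the guessed line length puts the samples at
     * the right edge of the picture about 6 pixels off target, which is
     * exactly what the monitor's "clock" and "phase" controls fix. */
    printf("drift at right edge: %.1f pixels\n",
           (guessed_htotal / true_htotal - 1.0) * 1920.0);
    return 0;
}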

The notion that avoiding a digital-to-analog conversion is responsible
for whatever quality improvement occurs comes from the common
(but also mistaken) notion that LCDs are themselves somehow
"digital." Fundamentally, though, the LCD is an analog-drive
device, and a digital-to-analog conversion occurs within the LCD
panel, at the drivers. In fact, LCDs have been made which preserve
an analog video input all the way through to the pixel level - these were
sometimes used back when analog monitor interfaces were all there were.

Bob M.
 

Benjamin Gawert

* Bob Myers:
Actually, "DAC" (digital to analog converter) is correct; a "RAMDAC"
was simply a DAC chip which included the color look-up tables
("color map" memory, as RAM), before BOTH functions were
integrated into the graphics chips. There's no sense in keeping the
term "RAMDAC" around at all any more, since the two are completely
separate functions.

It makes sense because the CLUT is still there. Most modern GPUs work
with a fixed color depth (usually 24bit, some also with 32bit). The
CLUT is still needed and used when modes with lower color depth (16bit,
12bit, 8bit) are used.
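
As a side illustration (mine, not Benjamin's) of what that look-up does:
in an indexed-colour mode each pixel in the framebuffer stores only an
index, and the CLUT expands it to a full RGB triple before the value
goes to the output stage (DAC for VGA, TMDS encoder for DVI).

#include <stdio.h>

typedef struct { unsigned char r, g, b; } Rgb;

int main(void)
{
    /* 256-entry colour look-up table; fill it with an arbitrary example
     * palette (a simple grey ramp). */
    Rgb clut[256];
    for (int i = 0; i < 256; i++)
        clut[i] = (Rgb){ (unsigned char)i, (unsigned char)i,
                         (unsigned char)i };

    /* In an 8-bit indexed mode the framebuffer holds indices, not RGB. */
    unsigned char framebuffer[4] = { 0, 16, 200, 255 };

    for (int i = 0; i < 4; i++) {
        Rgb c = clut[framebuffer[i]];
        printf("index %3u -> RGB(%u, %u, %u)\n",
               framebuffer[i], c.r, c.g, c.b);
    }
    return 0;
}
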
This is a common misconception, but it IS a misconception.

No, it isn't.
The
main reason that any "digital" interface provides improved image
quality with LCDs or other fixed-format displays is that such interfaces
provide an explicit pixel clock, so that the data can always be properly
mapped to the physical pixels of the screen. Analog interfaces such
as the "VGA" connector do not provide such information, and instead
the sampling clock has to be derived from other timing information
(typically the horizontal sync signal) provided by the interface.
Creating a sampling clock in this manner, though, can lead to some
errors (and it's why many analog-input LCD monitors include controls
which permit the user to fine-tune the sampling clock frequency and
phase).

This is of course correct, but it doesn't change one iota of the fact
that the main reason for the improved image quality of digital
connections via DVI simply is the absence of A/D and D/A conversion, and
the fact that while with analog video signals the image quality is
directly proportional to the signal degradation of the transmission
line, with DVI's digital TMDS signalling the image quality remains
constant until degradation reaches a certain point.

The notion that avoiding a digital-to-analog conversion is responsible
for whatever quality improvement occurs comes from the common
(but also mistaken) notion that LCDs are themselves somehow
"digital." Fundamentally, though, the LCD is an analog-drive
device, and a digital-to-analog conversion occurs within the LCD
panel, at the drivers. In fact, LCDs have been made which preserve
an analog video input all the way through to the pixel level - these were
sometimes used back when analog monitor interfaces were all there were.

LCDs are pixel-mapped devices with fixed resolution while CRTs can be
pixel-mapped, line-mapped (i.e. TVs) or vector-mapped (i.e. the data
displays in most aircraft). It simply doesn't matter that the LCD
pixels are driven analog. The fact that the digital signals provide an
image that is "pixel-matching" while the resulting signal after
conversion into an analog signal doesn't is one of the main factors
influencing the image quality the user notices.

Benjamin
 

Bob Myers

Benjamin Gawert said:
No, it isn't.

Yes, it is. How long shall we keep this up? :)
This is of course correct, but doesn't change a yota to the fact that the
main reason for the improved image quality of digital connections via DVI
simply is the absence of A/D- and D/A conversion, and the fact while with
analog video signals the image quality is directly proportional to the
signal degradation of the transmission line with DVIs digital TMDS
signalling the image quality remains constant until degradation reaches a
certain point.

Not at all. This is far from the MAIN reason for improved
image quality, as evidenced by the large number of current
analog-interfaced LCD monitors (and other fixed-format display
devices which provide analog inputs) in which the resulting image
quality is indistinguishable from the same situation but with a
"digital" input. The impact of instantaneous noise in the analog
channel on the quality of the resulting image is generally negligible,
unless it gets REALLY noisy - owing to the fact that the display and
the eye will average out noise-induced errors over multiple frames.
The classic problem with analog-connected LCDs, etc., has
always been instability in the video data with respect to the display
panel's physical pixels - or, in other words, incorrect and/or unstable
sampling. The digital interface in this case has the distinct advantage
of providing unambiguous pixel-level timing information, and so
avoids such problems. (For a fairer comparison, try disconnecting
the clock signal in a digital interface - and then try to regenerate
THAT clock from, say, the horizontal sync signal, which is exactly
what analog inputs on LCDs are doing. Let me know how well
that works out for you...:))

LCDs are pixel-mapped devices with fixed resolution while CRTs can be
pixel-mapped, line-mapped (i.e. TVs) or vector-mapped (i.e. data displays
in most aircrafts). It simply doesn't matter that the LCD pixels are
driven analog. The fact that the digital signals provide an image that is
"pixel-matching" while the resulting signal after conversion into an anlog
signal doesn't is one of the main factors invfluencing the image quality
the user notices.

If by the above, you mean LCDs are fixed-format (i.e., possessing
a fixed physical array of pixels), that's precisely what I said above.
However, CRTs have NEVER been "pixel mapped," in that there
has never been a CRT-based display made (well, with the exception
of some extremely low-volume, niche-market designs) where the
"pixels" of the video data were in any way constrained to map to any
physical structures (the phosphor dot triads, say) of the screen.
Pixels as such simply don't exist for the CRT in the sense meant in
this discussion.


Bob M.
 
