24" full HD LCD monitor and videocard ATI Radeon x1650 Pro DDR3 or DDR2? Very urgent!.!


ulixi

I'm sorry for the urgency, but I have to make up my mind by tomorrow morning.
In your opinion, can I use an ATI Radeon X1650 Pro video card (DDR3 or DDR2)
with a 24" full HD LCD monitor? My monitor is a high-resolution 24" LCD,
a Samsung SyncMaster 244T.
Will the monitor lose image quality?
Do I have to purchase a video card with better features?
Thanks
 

Icky Thwacket

I'm sorry for the urgency, but I have to make up my mind by tomorrow morning.
In your opinion, can I use an ATI Radeon X1650 Pro video card (DDR3 or DDR2)
with a 24" full HD LCD monitor? My monitor is a high-resolution 24" LCD,
a Samsung SyncMaster 244T.
Will the monitor lose image quality?
Do I have to purchase a video card with better features?
Thanks


I have a 244T too - good choice. The monitor's native resolution is 1920x1200 @ 60 Hz,
which the 1650 supports - however, whether you will get decent frame rates at
that resolution with current games is debatable.
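
For a rough sense of why frame rates at native resolution are the sticking
point, here is a quick back-of-the-envelope sketch in Python (my own numbers,
not benchmarks) comparing the pixels the card has to render per frame at
1920x1200 versus a typical 1280x1024 panel:

    # Compare per-frame pixel counts at two resolutions.
    def pixels(width, height):
        """Pixels the GPU has to shade for one frame at this resolution."""
        return width * height

    native = pixels(1920, 1200)   # Samsung 244T native resolution
    typical = pixels(1280, 1024)  # a common 17"/19" LCD resolution

    print(f"1920x1200: {native:,} pixels per frame")          # 2,304,000
    print(f"1280x1024: {typical:,} pixels per frame")         # 1,310,720
    print(f"Extra work per frame: {native / typical:.2f}x")   # ~1.76x

Roughly 1.76x more pixels per frame, so a mid-range card that is playable at
1280x1024 can struggle badly at the 244T's native resolution.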

If you are going to upgrade, I would recommend an Nvidia 8800 GTS or GTX.

As you have a top-end, expensive display, it is worth buying a decent graphics card
to go with it.

Icky
 

kony

I have a 244T too - good choice. The monitor's native resolution is 1920x1200 @ 60 Hz,
which the 1650 supports - however, whether you will get decent frame rates at
that resolution with current games is debatable.

If you are going to upgrade, I would recommend an Nvidia 8800 GTS or GTX.

As you have a top-end, expensive display, it is worth buying a decent graphics card
to go with it.


Maybe for gaming, but not otherwise. The X1650 Pro supports resolutions
beyond that, and dual-link DVI. The main thing is:
don't use the analog output to drive that resolution.
 

ulixi

Thanks for your reply. I don't use my PC for gaming, but I do HDV video
editing, and I have to purchase the ATI card only.
 

Paul

I'm sorry for the urgency, but I have to make up my mind by tomorrow morning.
In your opinion, can I use an ATI Radeon X1650 Pro video card (DDR3 or DDR2)
with a 24" full HD LCD monitor? My monitor is a high-resolution 24" LCD,
a Samsung SyncMaster 244T.
Will the monitor lose image quality?
Do I have to purchase a video card with better features?
Thanks

The X1650 Pro has one DVI connector and it is a dual-link DVI.
That means it can handle extended resolutions.

http://ati.amd.com/products/radeonx1650/radeonx1650pro/specs.html

Some info on DVI here.

http://en.wikipedia.org/wiki/Dvi
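
As a rough sanity check of the link-type arithmetic, here is a small sketch
(my own numbers, using the published CVT reduced-blanking timing for
1920x1200 @ 60 Hz, not anything from the pages above):

    # Pixel-clock check for single-link vs. dual-link DVI.
    SINGLE_LINK_MAX_MHZ = 165.0      # single-link TMDS pixel-clock limit
    DUAL_LINK_MAX_MHZ = 2 * 165.0    # dual link roughly doubles the bandwidth

    def pixel_clock_mhz(h_total, v_total, refresh_hz):
        """Pixel clock a mode needs, counting the blanking intervals."""
        return h_total * v_total * refresh_hz / 1e6

    # 1920x1200 @ 60 Hz, CVT reduced blanking: 2080 x 1235 total pixels
    clk = pixel_clock_mhz(2080, 1235, 60)
    print(f"1920x1200@60 (reduced blanking) needs ~{clk:.0f} MHz")  # ~154 MHz
    print("fits single link:", clk <= SINGLE_LINK_MAX_MHZ)          # True
    print("fits dual link:  ", clk <= DUAL_LINK_MAX_MHZ)            # True

So 1920x1200 at 60 Hz with reduced blanking fits even within a single link;
the dual-link connector is what gives headroom for higher, "extended"
resolutions.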

According to this, your SyncMaster 244T has a 1920 by 1200 resolution.
One thing I see mentioned about the 244T is that there is roughly 50 milliseconds
of latency between moving your mouse and seeing the pointer on the
screen move. That seems to annoy some people. Otherwise, a good monitor.

http://www.hometheatermag.com/lcds/506samsung/
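
To put that latency figure in perspective, here is a rough calculation
(my own arithmetic, assuming the ~50 ms figure from the review and the
panel's 60 Hz refresh rate):

    # How many refresh cycles does ~50 ms of input lag correspond to?
    latency_ms = 50.0                    # reported 244T input-to-display lag
    refresh_hz = 60.0                    # panel refresh rate
    frame_time_ms = 1000.0 / refresh_hz

    print(f"One refresh cycle: {frame_time_ms:.1f} ms")            # 16.7 ms
    print(f"~50 ms of lag is about "
          f"{latency_ms / frame_time_ms:.1f} refresh cycles")      # ~3.0

That is roughly three frames between moving the mouse and seeing the pointer
move, which is why some people notice it.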

The 1650pro card seems to be a fine choice for video editing.

Now, one issue you might want to consider is HDCP. HDCP is an
encryption scheme for protecting commercial HD content as it is
displayed on the monitor. Since your monitor claims
to support HDCP, you should buy a video card that also supports HDCP.
In the case of the X1650 Pro, so far it looks like, to get HDCP support,
you need to buy a card with an HDMI connector on the faceplate of the
card.

http://en.wikipedia.org/wiki/HDCP

Potential manufacturers of the X1650 Pro - note that some of the partners on
this page don't know how to run a web site, which means their products
are immediately ignored:

http://support.ati.com/ics/support/default.asp?deptID=894&task=knowledge&folderID=588

This one supports HDCP, has an HDMI connector, and an HDMI to DVI dongle adapter:
http://www.gecube.com/products-detail.php?prod_cat_pid=152&prod_cat_id=192&prod_id=62378

The previous generation of GeCube card came with DVI connectors and a DVI to HDMI
dongle (black in color). The X1650 Pro from GeCube switches this around and puts
an HDMI connector on the faceplate; you then plug in an HDMI to DVI adapter to connect
a DVI cable to a DVI monitor. I guess it works.

Previous-generation GeCube X1600 Pro HDMI (GC-HV16PG2-D3). In my opinion, this
makes better sense with regard to connector selection on the faceplate. A DVI
connector with a DVI to HDMI dongle is a more rugged and secure design than the
alternative, which is HDMI on the faceplate and an HDMI to DVI dongle.

http://pclab.pl/art22801-8.html

This one supports HDCP, has an HDMI connector, and comes with an HDMI to DVI dongle adapter.
It isn't even listed on the www.visiontek.com web site, so there is no way to
verify the specs there; we have to guess.

VisionTek 900118 Radeon X1650PRO 256MB 128-bit PCI-E x16 HDCP HDMI Video Card with L-P Bracket
http://www.newegg.com/Product/Product.aspx?Item=N82E16814129076

There are a couple of extra chips near the faceplate of the card, but I cannot guess
why such chips should be needed. I thought HDCP was supported by the GPU, and
that all that was required was a flash storage device containing the keys and the like.

VisionTek 900118 front view:
http://images10.newegg.com/NeweggImage/productimage/14-129-076-03.jpg

VisionTek 900118 back view:
http://images10.newegg.com/NeweggImage/productimage/14-129-076-04.jpg

VisionTek 900118 faceplate, showing HDMI on the bottom:
http://images10.newegg.com/NeweggImage/productimage/14-129-076-02.jpg

VisionTek 900118 HDMI to DVI dongle:
http://images10.newegg.com/NeweggImage/productimage/14-129-076-05.jpg

Tell us how it works out,
Paul
 
