Graphics Card Recommendations

Jabba

I currently have the following system:

AMD Barton 2500+
512 MB RAM
NVIDIA GeForce3 Ti 200
ASUS A7V8X-MX Motherboard

... but I'm thinking about upgrading the graphics card to something a
little better, around the £100 mark - maybe a Radeon 9600XT or a GeForce
FX 5700?

Any thoughts?
 
Roy Coorne

Jabba said:
I currently have the following system:

AMD Barton 2500+
512 MB RAM
NVIDIA GeForce3 Ti 200
ASUS A7V8X-MX Motherboard

... but I'm thinking about upgrading the graphics card to something a
little better, around the £100 mark - maybe a Radeon 9600XT or a GeForce
FX 5700?

Any thoughts?


Look at the resource requirements/recommendations for your favourite
game(s) and choose...
... or save your money for the next PC generation (PCI-E, DDR2, 64-bit,
BTX...).

Roy
 
Jabba

Cuzman said:
" ...I'm think about upgrading the graphics card to something a little
better, but around the £100 mark "


Forget the 9600 XT and the FX 5700. Spend a few more pennies and get a
*Built by ATI* Radeon 9800 Pro for £136.89 http://snipurl.com/6ld6

Not really sure I need this card ... my current system runs Far Cry
relatively well. I'm still looking at the sub £100 mark.
 
kony

Not really sure I need this card ... my current system runs Far Cry
relatively well. I'm still looking at the sub £100 mark.

Keep in mind that the less you spend, the less point there is to upgrading
at all... GF3TI cards aren't THAT slow. Sure, they're quite slow compared
to good newer cards, but that first £40 spent won't get you above the bar
of the GF3, especially if you overclock the GF3... I had one that would
run slightly faster than stock speed of a GF3Ti500, which is approaching
the performance level of a Radeon 9600(SE?) except in DX9 games. Perhaps
DX9 is the key here, if you want/need DX9 support.

There are a ton of benchmarks out there. If you want the best bang for
your buck, just keep an eye out for sales - you'll need a great deal or a
special price to get a good upgrade on that budget.
 
Jabba

kony said:
Keep in mind that the less you spend, the less point there is to upgrading
at all... GF3TI cards aren't THAT slow. Sure, they're quite slow compared
to good newer cards, but that first £40 spent won't get you above the bar
of the GF3, especially if you overclock the GF3... I had one that would
run slightly faster than stock speed of a GF3Ti500, which is approaching
the performance level of a Radeon 9600(SE?) except in DX9 games. Perhaps
DX9 is the key here, if you want/need DX9 support.

I think that's what I was wondering. My card is running at 219 MHz / 477 MHz.
 
kony

I think that's what I was wondering. My card is running at 219 MHz / 477 MHz.

_IF_ your card has a reasonable heatsink on the GPU and memory, you may be
able to run the card at (roughly) 250 MHz core and 540 MHz memory... a
lesser 'sink on either component would of course limit the clock speed on
that particular component.

To start out you may be able to o'c with "coolbits" (a Google search will
find it), though if it's limiting the max o'c via the range limit in
display properties then you may need to do a registry edit or flash a
modified BIOS to the card, one that starts the card at a higher clock
speed (a speed tested stable with the coolbits tweak and benchmarks,
games, etc.); that higher starting speed should cause the coolbits tweak
to show a higher range. Not sure I'm making it clear, but the max o'c
allowed by the coolbits slider is determined by the initial clock speed
specified by the card's BIOS... the higher the initial clock speed (within
certain ranges), the higher coolbits will allow the card to be set.

Keep in mind that I guarantee nothing about adequate airflow, etc., or the
ability of your system to accommodate an o'c'd card.
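For reference, the classic CoolBits tweak was a single registry value. On
NVIDIA drivers of that era it was typically enabled with a fragment like
the one below - the key path and value are quoted from memory, so treat
them as assumptions, verify against your driver version, and back up the
registry first:

```
Windows Registry Editor Version 5.00

; Unlocks the extra clock-frequency controls in the NVIDIA display
; properties (typical for ~2003-04 Detonator/ForceWare drivers;
; key path and value are from memory - verify before merging)
[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak]
"CoolBits"=dword:00000003
```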
 
Big Mac

Jabba said:
I currently have the following system:

AMD Barton 2500+
512 MB RAM
NVIDIA GeForce3 Ti 200
ASUS A7V8X-MX Motherboard

... but I'm thinking about upgrading the graphics card to something a
little better, around the £100 mark - maybe a Radeon 9600XT or a GeForce
FX 5700?

Any thoughts?

I am just curious about this. The Barton and the Athlon are both made by
AMD. When you say AMD 2500+, is that indicative of the same speed as
something like an Athlon 2500? I need to ask, because my Athlon is a
3200+, and you'd logically think that something called a 3200 would be
running at 3200 MHz, or 3.2 GHz. At least that's what I thought before I
bought an Athlon. Of course, I soon realized it was too cheap to be a
3.2 GHz machine.

So, is your AMD running at 2.5 GHz? Or is it slower than that? If the
2500 is a misleading number like with the Athlon line, then I don't think
a higher-end video card will be able to use its capabilities very much on
your older-technology machine (assuming it is older technology).

Big Mac

P.S. Damn AMD for their misleading model numbers. I'll bet they've
caught a few people by surprise.

 
Jabba

Big Mac said:
I am just curious about this. The Barton and the Athlon are both made by
AMD. When you say AMD 2500+, is that indicative of the same speed as
something like an Athlon 2500? I need to ask, because my Athlon is a
3200+, and you'd logically think that something called a 3200 would be
running at 3200 MHz, or 3.2 GHz. At least that's what I thought before I
bought an Athlon. Of course, I soon realized it was too cheap to be a
3.2 GHz machine.

So, is your AMD running at 2.5 GHz? Or is it slower than that? If the
2500 is a misleading number like with the Athlon line, then I don't think
a higher-end video card will be able to use its capabilities very much on
your older-technology machine (assuming it is older technology).

The CPU is running at 1.833 GHz, which (according to AMD) is much the same
performance as a 3.2GHz Intel part. I wouldn't describe the AMD Barton
2500+ as old technology; no longer cutting edge, but hardly old. I've
always taken the view that you get the best value for money by going for
'6 months+' old technology and not the brand-new stuff. This is why I was
looking for a graphics card around the £100 mark.
 
kony

I am just curious about this. The Barton and the Athlon are both made by
AMD. When you say AMD 2500+, is that indicative of the same speed as
something like an Athlon 2500? I need to ask, because my Athlon is a
3200+, and you'd logically think that something called a 3200 would be
running at 3200 MHz, or 3.2 GHz. At least that's what I thought before I
bought an Athlon. Of course, I soon realized it was too cheap to be a
3.2 GHz machine.

Naw, it's cheaper because AMD doesn't charge a premium for their CPUs like
Intel does, except for the few highest speeds... the XP2500 is a median
speed, so a reasonable price. They aren't actual GHz because it was
necessary to find a way to compare to Intel's chips, which have lower
performance per MHz but higher MHz operating speeds.

So, is your AMD running at 2.5 GHz? Or is it slower than that? If the
2500 is a misleading number like with the Athlon line, then I don't think
a higher-end video card will be able to use its capabilities very much on
your older-technology machine (assuming it is older technology).

An XP2500 is fast enough to benefit a lot from a faster card. For example
a Radeon 9800 should easily be over twice as fast at DX9 games.

P.S. Damn AMD for their misleading model numbers. I'll bet they've
caught a few people by surprise.

Why damn AMD? Intel will be doing it too, soon.
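To put rough numbers on the rating scheme discussed above, here are the
standard published clocks for a few desktop Athlon XP parts. These figures
are quoted from memory, so treat them as assumptions and check AMD's spec
sheets:

```python
# Model rating -> actual core clock in MHz (standard published figures,
# quoted from memory - not from this thread)
athlon_xp_clocks = {
    1800: 1533,  # Palomino/Thoroughbred XP 1800+
    2500: 1833,  # Barton XP 2500+, 166 MHz FSB
    3200: 2200,  # Barton XP 3200+, 200 MHz FSB
}

for rating, mhz in sorted(athlon_xp_clocks.items()):
    # Show how far the model number sits above the real clock
    print(f"XP {rating}+: {mhz} MHz actual ({rating / mhz:.2f}x the real clock)")
```

This matches the thread: the XP 2500+ really runs at 1.833 GHz, and the
rating-to-clock gap widens toward the top of the range.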
 
Jabba

Jabba said:
The CPU is running at 1.833 GHz, which (according to AMD) is much the same
performance as a 3.2GHz Intel part. I wouldn't describe the AMD Barton

correction - I meant 2.5GHz(ish) not 3.2GHz
 
General Schvantzkoph

I currently have the following system:

AMD Barton 2500+
512 MB RAM
NVIDIA GeForce3 Ti 200
ASUS A7V8X-MX Motherboard

... but I'm thinking about upgrading the graphics card to something a
little better, around the £100 mark - maybe a Radeon 9600XT or a GeForce
FX 5700?

Any thoughts?

A card with an Nvidia 5900XT is what you want. I just got one for $175
which is in your £100 neighborhood. According to all the review sites the
5900XT is a lot faster than the 5700 for about the same price. According
to the review sites the reason that the 5900XT is such a good deal is that
Nvidia is dumping their remaining stock of 5900 parts because they've
introduced the new 6800 series (those are much faster but they are also
much more expensive, > $500).
 
Jabba

kony said:
_IF_ your card has a reasonable heatsink on the GPU and memory, you may be
able to run the card at (roughly) 250 MHz core and 540 MHz memory... a
lesser 'sink on either component would of course limit the clock speed on
that particular component.

To start out you may be able to o'c with "coolbits" (a Google search will
find it), though if it's limiting the max o'c via the range limit in
display properties then you may need to do a registry edit or flash a
modified BIOS to the card, one that starts the card at a higher clock
speed (a speed tested stable with the coolbits tweak and benchmarks,
games, etc.); that higher starting speed should cause the coolbits tweak
to show a higher range. Not sure I'm making it clear, but the max o'c
allowed by the coolbits slider is determined by the initial clock speed
specified by the card's BIOS... the higher the initial clock speed (within
certain ranges), the higher coolbits will allow the card to be set.

Keep in mind that I guarantee nothing about adequate airflow, etc., or the
ability of your system to accommodate an o'c'd card.

Any idea what clock speed I would reasonably be able to set the card to?
It's a Gainward GeForce3 Ti 200 'Gold Standard'.
 
Big Mac

correction - I meant 2.5GHz(ish) not 3.2GHz

Well, I got my Athlon 3200+ knowing it was a 2.2 GHz processor, and
didn't know it was supposed to be equal to a 3.2 GHz P4.

I really do not think it is equal to said P4, and it is doing its thing
at a 2.2 GHz pace. Perhaps there is a certain kind of process it excels
at. I am quite happy with it, and got a good deal on it.

Big Mac
 
Noozer

Big Mac said:
Well, I got my Athlon 3200+ knowing it was a 2.2 GHz processor, and
didn't know it was supposed to be equal to a 3.2 GHz P4.

It's not...

A 3200+ has the same performance that a 3200 MHz AMD T-Bird CPU would
have had, if they had made them that fast.
 
kony

Well, I got my Athlon 3200+ knowing it was a 2.2 GHz processor, and
didn't know it was supposed to be equal to a 3.2 GHz P4.

I really do not think it is equal to said P4, and it is doing its thing
at a 2.2 GHz pace. Perhaps there is a certain kind of process it excels
at. I am quite happy with it, and got a good deal on it.

Big Mac

When AMD started their XP rating system, it was a more accurate
representation of CPU performance relative to a P4. Things changed,
though: the P4 got more cache, hyperthreading, and a higher FSB, so its
performance remained somewhat more linear with clock-speed increases,
while AMD had started with a slower FSB, so when they increased FSB and
cache their XP numbering seemed to rise disproportionately. In other
words, an XP3200 is not 3200/1800 as fast as an XP1800.
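That last point is easy to put in numbers. The clock speeds below are the
standard published figures for these parts (quoted from memory, so treat
them as assumptions):

```python
# The model-number ratio overstates the real clock-speed ratio:
# XP 1800+ = 1533 MHz, Barton XP 3200+ = 2200 MHz (published figures,
# quoted from memory).
rating_ratio = 3200 / 1800   # ~1.78x implied by the model numbers
clock_ratio = 2200 / 1533    # ~1.43x in actual core clock
print(f"rating ratio {rating_ratio:.2f}x vs clock ratio {clock_ratio:.2f}x")
```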

As for optimizations to get the projected performance, it's somewhat the
reverse of your theory. In hard number crunching of unoptimized code, AMD
has the faster chip per XP rating, but it's when the most modern apps are
optimized for SSE2 that a P4 shows its worth. If you're not running newer
software then the odds are in the Athlon's favor, but then so many of the
more common tasks aren't even CPU-bound that a median-priced CPU combined
with a higher-end video card or hard drive may be the best combination
for many people.

The XP numbering system is at least valid to the extent that your idea of
a 2.2 GHz Athlon "doing things at a 2.2 GHz pace" is meaningless, because
it is without question much faster on average than a 2.2 GHz P4, except
"sometimes" at video editing apps optimized for SSE2. Some tasks benefit
more from one architecture than the other, but this has always been the
case.
 
