Quick nForce/GeForce question.

M

~misfit~

My g/f is upgrading from an old Celly 900/GeForce2 MX 400/64MB and is going
with an AMD Barton 2500+ on a Soltek mobo. (Either SL-75FRN2-L or
SL-75MRN-L). We're on a tight budget and will only be putting 256MB DDR400
RAM in it at the moment.

We can't afford to buy a new separate graphics card at the moment. She is
currently running the above card. For an extra NZ$33.00 she can get the mobo
with an integrated GeForce4 MX GPU (and an 8x AGP slot for later).

Is it worth going with the integrated GPU option (for now)? Will it be much
better than her GF2 MX400/64MB? Will the fact that she will only be running
256MB of system RAM make a difference, since the GF2 has its own RAM (obviously :))?
It could be a while before we can afford a separate graphics card.

Or should we save the extra $33 and continue to use the GF2? It runs the
games she plays at the moment, though only adequately.

Actually, the SL-75MRN-L says 8x AGP, her card is only 4x, I'm not even sure
it will run in this board. I can't seem to find a definitive answer on this
one.

Thank you kind people.
 
K

kony

> My g/f is upgrading from an old Celly 900/GeForce2 MX 400/64MB and is going
> with an AMD Barton 2500+ on a Soltek mobo. (Either SL-75FRN2-L or
> SL-75MRN-L). We're on a tight budget and will only be putting 256MB DDR400
> RAM in it at the moment.
>
> We can't afford to buy a new separate graphics card at the moment. She is
> currently running the above card. For an extra NZ$33.00 she can get the mobo
> with an integrated GeForce4 MX GPU (and an 8x AGP slot for later).

I don't know the market there, but I suspect you can almost buy a
GF4MX, possibly used, for that NZ$33. That's about US$20 if the
currency converter I just used is right, but it still doesn't tell us
what hardware actually costs there.

> Is it worth going with the integrated GPU option (for now)? Will it be much
> better than her GF2 MX400/64MB? Will the fact that she will only be running
> 256MB of system RAM make a difference, since the GF2 has its own RAM (obviously :))?
> It could be a while before we can afford a separate graphics card.

For small games the integrated graphics "might" be faster, but it
could also depend on how heavily the game stresses the memory bus,
since the integrated video takes away from the total memory
throughput. With medium to larger games you'll be running out of
system memory with the integrated graphics, and the GF2MX should be
faster then.
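To put rough numbers on that shared-bandwidth point (the DDR400 figure matches the RAM mentioned above; the screen mode and refresh rate are illustrative assumptions, not anything from the posts):

```python
# Back-of-envelope sketch: what fraction of a single-channel DDR400 bus
# the integrated GPU's display scan-out alone would consume.
# Assumed: 64-bit single-channel DDR400 (~3.2 GB/s peak) and a
# 1024x768, 32-bit colour desktop refreshing at 75 Hz.
peak_bandwidth = 400e6 * 8        # 400 MT/s * 8 bytes/transfer = 3.2 GB/s
scanout = 1024 * 768 * 4 * 75     # bytes/s just to repaint the screen
print(f"scan-out alone: ~{scanout / peak_bandwidth:.0%} of peak bandwidth")
```

And that is just repainting the desktop; any actual 3D work (texture fetches, frame-buffer writes) comes on top of it, out of the same 256MB and the same bus the CPU is using.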

> Or should we save the extra $33 and continue to use the GF2? It runs the
> games she plays at the moment, though only adequately.

That's what I would do; there's not much sense in paying extra for
what would only be a slight upgrade, and then only IF you had more
main system memory. Keep in mind that the GF2 should be a little
faster on the new platform too, except at higher resolutions. This is
just speculation based on past evidence; I've not upgraded a system
with a GF2MX to a Barton and then compared benchmarks. You may get an
idea from some benchmarks at Tom's Hardware; I believe there is an
article where they compare a ~1GHz Athlon to a much more modern one
(maybe an XP2700?) and show the results on many different video
cards.

> Actually, the SL-75MRN-L says 8x AGP, her card is only 4x, I'm not even sure
> it will run in this board. I can't seem to find a definitive answer on this
> one.
>
> Thank you kind people.

The board is backwards compatible; it can run any card capable of 4X
(AGP 2.0), but not the old AGP 1.0, 1X/2X cards.


Dave
 
M

~misfit~

kony said:
> I don't know the market there, but I suspect you can almost buy a
> GF4MX, possibly used, for that NZ$33. That's about US$20 if the
> currency converter I just used is right, but it still doesn't tell us
> what hardware actually costs there.



> For small games the integrated graphics "might" be faster, but it
> could also depend on how heavily the game stresses the memory bus,
> since the integrated video takes away from the total memory
> throughput. With medium to larger games you'll be running out of
> system memory with the integrated graphics, and the GF2MX should be
> faster then.


> That's what I would do; there's not much sense in paying extra for
> what would only be a slight upgrade, and then only IF you had more
> main system memory. Keep in mind that the GF2 should be a little
> faster on the new platform too, except at higher resolutions. This is
> just speculation based on past evidence; I've not upgraded a system
> with a GF2MX to a Barton and then compared benchmarks. You may get an
> idea from some benchmarks at Tom's Hardware; I believe there is an
> article where they compare a ~1GHz Athlon to a much more modern one
> (maybe an XP2700?) and show the results on many different video
> cards.



> The board is backwards compatible; it can run any card capable of 4X
> (AGP 2.0), but not the old AGP 1.0, 1X/2X cards.

Thanks for that, Dave. I'd pretty much come to that conclusion myself. I'm a
tad worried now though: her machine isn't here, but the box the graphics card
came in is. It's an Abit Siluro and it doesn't say on it that it *is* 4X; I
just had it in my head that it was. I'll have to check that out somehow. The
mobo she has it in is only 2X, so that isn't going to tell me.

Cheers.
 
K

kony

> Thanks for that, Dave. I'd pretty much come to that conclusion myself. I'm a
> tad worried now though: her machine isn't here, but the box the graphics card
> came in is. It's an Abit Siluro and it doesn't say on it that it *is* 4X; I
> just had it in my head that it was. I'll have to check that out somehow. The
> mobo she has it in is only 2X, so that isn't going to tell me.
>
> Cheers.


All nVidia cards since the TNT(2?) have been AGP 2.0, 4X cards. A few
of the earliest ones had a design flaw that kept them from running in
4X mode, but I doubt an Abit would have this problem, especially by
the GF2 era.


Dave
 
M

~misfit~

kony said:
> All nVidia cards since the TNT(2?) have been AGP 2.0, 4X cards. A few
> of the earliest ones had a design flaw that kept them from running in
> 4X mode, but I doubt an Abit would have this problem, especially by
> the GF2 era.

Cool. We only bought this card new about a year ago, so I'm hoping it's 4X.
It was probably one of the last GF2 cards made; we were broke, her
previous card had crapped out, and we needed a replacement quickly.

Thanks again.
 
