As strange as it sounds, a quick bit of preliminary research I have conducted suggests there may be cases where a Voodoo 3 can outperform the latest nVidia graphics chips!
Here's the story:
I was upgrading my system (a lowly Athlon 800, 256MB RAM...) because I was sick of Jedi Outcast crashing on me, so I decided to get a new graphics card.
I benchmarked the graphics before and after the upgrade. On Final Reality (a DirectX 5 benchmark, I believe), my old Voodoo did very well, scoring something like 5.78 overall (I can't remember exactly). On 3DMark 2001 SE (DirectX 8) it got a rather poor 1400 or so.
After the GF4 Ti4400 was in, the 3DMark score shot up to 6733 (impressive! It beat my boyfriend's 1.4 Athlon, 256MB, GF3 Ti300 system :spin: ). However, it only scored about 5.28 on Final Reality. I remember it was exactly 0.5 marks lower, which is significant given that the gap held up over 5 repeats of each benchmark, and each "point" is equal to an S3 ViRGE P150 system's rating - which were all the rage when this benchmark was young. So on a DirectX 5 test, the new card lost about 50% of the performance an S3 ViRGE P150 system would put out, and if you remember, they could do quite a lot of rendering and processing on those old systems...
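(Just to lay out the arithmetic behind that claim, assuming I'm remembering the scores right:
Voodoo 3, Final Reality: ~5.78 marks
GF4 Ti4400, Final Reality: ~5.28 marks
Difference: 5.78 - 5.28 = 0.50 marks, which works out to roughly a 9% relative drop, or half of one S3 ViRGE P150 reference system's worth of output in Final Reality's own units.)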
This is anecdotal of course, so I'd love to know if anyone has observed a similar effect.
Moral: if your system isn't super powerful and your games are built for an old DirectX version, a new graphics card that isn't optimised for that old software could actually reduce performance.
Thank goodness I don't play those games anymore