New PC Bought - further recommendations appreciated.

BobMarley

I just bought a new system online, after debating for weeks...
I'm always a step or two behind the game. I mean, I was upgrading my
4MB Matrox Millennium to a Viper TNT2 Ultra about a week before the
Viper2 came out, along with the GeForce chips...
So this is my first foray into a relatively decent gaming
machine, after getting hooked on Battlefield 1942 and getting 10
guys together for an upcoming Call of Duty game (yeah, the wife is
REAL excited about that one! hahahaha too bad!)

What other upgrades could I make to my system that I might need or have
overlooked?

- Asus P4C800-E Deluxe
- Intel 2.8GHz, 800MHz FSB
- 512MB Kingston DDR400/PC3200 RAM
- Xaser III V1420A case
- Asus V9180 video card (a little old, but for $50 I'd rather buy this
to get me started, and get a new card based on the 5950 chipset in 2-3
weeks)
- WD 80GB Special Edition HDD (ATA) (going to use this for OS only,
and add mirrored SATA 250GB WD drives in 2-3 weeks)

Not sure what else I could add. The 2.8GHz proc is the retail version...
so it's got a heatsink/fan. I could replace it with a Vantec or something?

I dunno. I appreciate anyone's recommendations.
Thanks!!
 
BobMarley

Also want to add an Audigy 2 Platinum Pro... but again, waiting to
order that one till I pay this off next month.
 
Philip Callan

BobMarley said:
What other upgrades could I make to my system that I might need or have
overlooked?
- Asus V9180 video card (a little old, but for $50 I'd rather buy this
to get me started, and get a new card based on the 5950 chipset in 2-3
weeks)

You are aware ASUS now makes ATI-chipset graphics cards as well, right?
 
BobMarley

Yeah... but I had read the 5900 beat the Radeon 9800 in some tests...
would you say Radeon is better than Nvidia, though?

The original card I bought, the V9180, is just to get me through the next
few weeks. My current Viper TNT2 Ultra just won't cut it for Call of
Duty, so this should be good enough for a few weeks...

A coworker just bought the Radeon 9800, 256MB... looks awesome.
I'll go for whichever is better.
 
Captain Sarcastic

BobMarley said:
Yeah... but I had read the 5900 beat the Radeon 9800 in some tests...
would you say Radeon is better than Nvidia, though?

The original card I bought, the V9180, is just to get me through the next
few weeks. My current Viper TNT2 Ultra just won't cut it for Call of
Duty, so this should be good enough for a few weeks...

A coworker just bought the Radeon 9800, 256MB... looks awesome.
I'll go for whichever is better.


High-end Radeons are better: better image quality, better speed, better
drivers and stability, and ATI aren't reducing image quality or using
other cheats in order to get misleading scores on popular benchmarks like
Nvidia are.

The biggest thing is DX9 games like Half-Life 2 that use pixel shaders. The
Nvidia cards are so far behind on pixel shader performance it's not funny.
 
BobMarley

Really?
Huh... I had heard Nvidia was using some weird cheats, but I wasn't sure
it was true.
I'm fine either way with ATI or Nvidia... maybe I'll just get that
Sapphire 9800 256MB... looks awesome.
 
Andrey Tarasevich

Captain said:
...
and ATI aren't reducing image quality or using other cheats
in order to get misleading scores on popular benchmarks like Nvidia are.
...

Nonsense. Both ATI and nVidia used exactly the same cheating techniques
in order "to get misleading scores on popular benchmarks".
 
Captain Sarcastic

Andrey said:
Nonsense. Both ATI and nVidia used exactly the same cheating techniques
in order "to get misleading scores on popular benchmarks".


Not true at all. Nvidia's latest drivers don't even do trilinear filtering
in D3D when the application asks for it. Nvidia have been caught inserting
static clip planes, replacing shaders with low-precision versions that
produce inferior IQ, and disabling buffer clears, amongst other things, all
in order to get better scores when running a benchmark. It's one of the
reasons Nvidia now encrypts their drivers.
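
To be concrete about what "the application asks for it" means, here is
roughly how a D3D9 app requests trilinear filtering (a minimal C++
sketch; the function name is mine, and 'dev' is assumed to be an already
initialized IDirect3DDevice9):

    #include <d3d9.h>

    // Request trilinear filtering on sampler/texture stage 0.
    void RequestTrilinear(IDirect3DDevice9 *dev)
    {
        // Linear minification and magnification give you bilinear...
        dev->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        // ...and linear blending *between* mip levels is what makes it
        // trilinear. A driver that quietly treats this as D3DTEXF_POINT
        // is only delivering bilinear, whatever the app asked for.
        dev->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
    }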

Try doing some research at www.beyond3d.com if you are interested in facts
rather than fanboyism.
 
Captain Sarcastic

BobMarley said:
Really?
Huh... I had heard Nvidia was using some weird cheats, but I wasn't sure
it was true.
I'm fine either way with ATI or Nvidia... maybe I'll just get that
Sapphire 9800 256MB... looks awesome.


The 256MB version carries a high premium because of the extra memory, and
that memory probably won't give you much of a performance advantage for
another 6-12 months (when the extra memory will become useful for storing
large numbers of textures in upcoming games). A 9800XT (128 or 256MB
versions) will still be the best if you are looking at the top end.
 
BoB

The fat lady ain't sung yet in the ATI/Nvidia shoot-out!
The bad press from Valve's blasting of Nvidia is linked to the deal cut
to include HL2 with high-end ATI cards. Nvidia's driver team was hired away
by ATI; it killed them!
Since the idiots at Valve had their systems hacked and code was stolen,
Nvidia may have time to get their 5.xx drivers ready for prime time!
Should be an interesting holiday season!
 
Roland Scheidegger

Captain said:
The 256MB version carries a high premium because of the extra memory, and
that memory probably won't give you much of a performance advantage for
another 6-12 months (when the extra memory will become useful for storing
large numbers of textures in upcoming games). A 9800XT (128 or 256MB
versions) will still be the best if you are looking at the top end.

The really high-end cards generally have a very low bang-for-the-buck
ratio. A 9800XT is 500 USD (official price, but the retail price follows
the same trend). A 9800 Pro 128MB is 400 USD and is only slightly
slower in most of today's games (the exceptions are some games run at
very high resolution (1600x1200) AND a high level of FSAA (4x, 6x) at the
same time). And a 9800 non-Pro (but don't get the SE!) is only 300 USD
and still offers about 80% of the performance of that high-end XT.
The same is true for Nvidia: there are lots of different 5900 models out,
some of them only half the price of that FX 5950 Ultra, but the
performance difference isn't that much.
Better to invest the money in some more RAM; you want to equip your board
with two modules anyway (because there is some performance loss in
single-channel mode), though another 512MB module "unfortunately" won't
cost 200 USD...
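
For reference, the arithmetic behind that dual-channel point (assuming
DDR400 on an 800MHz-FSB P4, as in the original post):

  single channel: 64 bits x 400 MT/s = 3.2 GB/s
  dual channel:   2 x 3.2 GB/s       = 6.4 GB/s

That exactly matches the 6.4 GB/s of the 800MHz quad-pumped FSB, so a
single module leaves the CPU's bus partly starved.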

Roland
 
Andrey Tarasevich

Captain said:
Not true at all. Nvidia's latest drivers don't even do trilinear filtering
in D3D when the application asks for it. Nvidia have been caught inserting
static clip planes, replacing shaders with low-precision versions that
produce inferior IQ, and disabling buffer clears, amongst other things, all
in order to get better scores when running a benchmark. It's one of the
reasons Nvidia now encrypts their drivers.

The code that performs the detection of 3DMark2001 was found in both
nVidia and ATI drivers. It is actually not hard to find it because it is
fairly obvious. And it is pretty easy to throw off the results of
Direct3D initialization sequence analysis performed by both drivers in
order to prevent them from activating their "optimizations". For
example, the differences between the real and the "optimized" version of
a scene from 3DMark2001's test 4 for both drivers can be downloaded here:

http://www.ixbt.com/video2/images/antidetect/nv25-difference.rar
http://www.ixbt.com/video2/images/antidetect/r300-difference.rar

It doesn't take a genius to realize that there's not much difference
between "optimization" techniques used by nVidia and ATI (there are some
differences though).
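
If you want a feel for what application detection looks like, here is a
deliberately crude C++ sketch. The drivers in question reportedly
fingerprint the Direct3D initialization call sequence rather than
anything this blunt; the function name and the executable-name check are
purely hypothetical, just to illustrate the general idea:

    #include <windows.h>
    #include <string.h>

    // Hypothetical: does the host process look like 3DMark?
    static BOOL LooksLikeBenchmark(void)
    {
        char path[MAX_PATH];
        GetModuleFileNameA(NULL, path, MAX_PATH);
        CharLowerA(path);
        // If it does, the driver switches on its special-cased
        // "optimizations". Renaming the exe (or, against the real
        // drivers, perturbing the init sequence) throws the check off,
        // which is exactly how these "optimizations" were exposed.
        return strstr(path, "3dmark") != NULL;
    }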
Captain said:
Try doing some research at www.beyond3d.com if you are interested in facts
rather than fanboyism.

I'd suggest you do some _real_ research, instead of reading
all this "fan fiction". This is not rocket science. If you have half a
brain in your head you can reproduce all these results yourself.
 
Roland Scheidegger

Andrey said:
The code that performs the detection of 3DMark2001 was found in both
nVidia and ATI drivers. It is actually not hard to find it because it is
fairly obvious. And it is pretty easy to throw off the results of
Direct3D initialization sequence analysis performed by both drivers in
order to prevent them from activating their "optimizations". For
example, the differences between the real and the "optimized" version of
a scene from 3DMark2001's test 4 for both drivers can be downloaded here:

http://www.ixbt.com/video2/images/antidetect/nv25-difference.rar
http://www.ixbt.com/video2/images/antidetect/r300-difference.rar

It doesn't take a genius to realize that there's not much difference
between "optimization" techniques used by nVidia and ATI (there are some
differences though).

Yes, but you're talking about 3DMark2001, which is old (that's no excuse
for cheating, but 3DMark2001 seems to be pretty much the only application
ATI cheats in). In newer titles it seems only Nvidia cheats so far (do I
really need to mention the 3DMark03 fiasco with the static clip planes?)

Roland
 
Nobody_of_Consequence

Having Valve's code is not going to solve Nvidia's non-compliance
with DX9. Well, unless they cheat.
 
Captain Sarcastic

BoB said:
The fat lady ain't sung yet in the ATI/Nvidia shoot-out!
The bad press from Valve's blasting of Nvidia is linked to the deal cut
to include HL2 with high-end ATI cards. Nvidia's driver team was hired away
by ATI; it killed them!
Since the idiots at Valve had their systems hacked and code was stolen,
Nvidia may have time to get their 5.xx drivers ready for prime time!
Should be an interesting holiday season!


You've got it the wrong way round. Both ATI and Nvidia offered $6 million
for the rights to bundle HL2. Valve *chose* ATI because the ATI tech was
better. Same reason Microsoft chose ATI for Xbox2. They were also pissed
off having spent five times as long optimising a special NV3x code path,
only to find the NV35 still runs the DX9 path at half the speed of the R3x0
cards.
 
Captain Sarcastic

Andrey said:

The code that performs the detection of 3DMark2001 was found in both
nVidia and ATI drivers. It is actually not hard to find it because it is
fairly obvious. And it is pretty easy to throw off the results of
Direct3D initialization sequence analysis performed by both drivers in
order to prevent them from activating their "optimizations". For
example, the differences between the real and the "optimized" version of
a scene from 3DMark2001's test 4 for both drivers can be downloaded here:

http://www.ixbt.com/video2/images/antidetect/nv25-difference.rar
http://www.ixbt.com/video2/images/antidetect/r300-difference.rar

It doesn't take a genius to realize that there's not much difference
between "optimization" techniques used by nVidia and ATI (there are some
differences though).

Try looking at anything newer than three years old, maybe something that
stresses a graphics card instead of the CPU. Try looking at the difference
between a real shader optimisation and hacking out the developer's code and
inserting a low-quality version that lowers the IQ, like Nvidia do. Nvidia
have been targeting benchmarks and using hacks that cannot be used in games
(such as static clip planes) in order to make their cards score better on
benchmarks than they actually run the games, all whilst lying to the
public.
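
To make "low precision versions" concrete: with the D3DX compiler you can
request exactly that kind of downgrade yourself. Here is a sketch (my own
function and variable names, "main" entry point assumed, error handling
omitted) of the same ps_2_0 shader compiled both ways:

    #include <d3dx9.h>

    void CompileBothWays(const char *src, UINT srcLen)
    {
        LPD3DXBUFFER fullPrec = NULL, lowPrec = NULL, errors = NULL;
        // The developer's shader, compiled as written.
        D3DXCompileShader(src, srcLen, NULL, NULL, "main", "ps_2_0",
                          0, &fullPrec, &errors, NULL);
        // The same shader with _pp (roughly FP16) precision forced
        // throughout: faster on NV3x, but it can visibly lower IQ.
        D3DXCompileShader(src, srcLen, NULL, NULL, "main", "ps_2_0",
                          D3DXSHADER_PARTIALPRECISION, &lowPrec,
                          &errors, NULL);
        // The complaint is that the driver performs this kind of swap
        // behind the application's back when it detects a benchmark.
    }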

As I said, you can't even get trilinear filtering in any D3D app now,
because Nvidia have thrown it away to get speed. Of course Nvidia still
pretend that you are getting trilinear (something that has been standard
for years), but instead you get only bilinear.

Andrey said:
I'd suggest you do some _real_ research, instead of reading
all this "fan fiction". This is not rocket science. If you have half a
brain in your head you can reproduce all these results yourself.

Says the guy pointing at a 3-year-old benchmark while I'm talking about
today's tech, like pixel shaders.
 
Andrey Tarasevich

Captain said:
Says the guy pointing at a 3-year-old benchmark while I'm talking about
today's tech, like pixel shaders.
...

It is completely irrelevant how old the benchmark is. It is not about
benchmarks, it is about the benchmark-detecting code that is present in
_today's_ ATI drivers. 3DMark2001 is nothing more than an example
that immediately demonstrates that ATI's drivers (just like nVidia's
drivers) perform benchmark detection and carry out some questionable
"optimizations" based on the results of the detection. Continue digging
and you'll find "optimizations" for 3DMark2003 and other modern
benchmarks.
 
BoB

Valve has delayed HL2 because of the code theft (or at least so they
claim), resulting in a breather for Nvidia. When the game finally hits,
they (Nvidia) had better have some workable DX9 drivers for all those
high-end cards! Nvidia didn't steal the code! Get real!
If you want to talk about cheats, read the details of Valve's
prerequisites for testing!
 
BoB

Granted, ATI is on top right now, but it's more a driver issue than
the capability of the hardware. It was a different story a couple of years
ago, when Nvidia wiped the floor with ATI. ATI responded, adopted a unified
driver structure (about damn time) and has started regaining market share.
What does the bundling contract have to do with anything, other than the
fact that Valve has an interest in keeping ATI on top?
My son's LAN parties and BF1942 showed me that you don't really need a
$300 vid card, just a damn good CPU and a lot of RAM!
 
Captain Sarcastic

BoB said:
Granted, ATI is on top right now, but it's more a driver issue than
the capability of the hardware. It was a different story a couple of years
ago, when Nvidia wiped the floor with ATI. ATI responded, adopted a unified
driver structure (about damn time) and has started regaining market share.

It's not a driver issue - it's hardware. If it were a driver issue, Nvidia
would have fixed it long ago, as they've been working on NV3x drivers for
over a year now.

Nvidia pretty much designed NV3x primarily for DX8.1 (i.e. Doom 3), which
is why it is not that good at DX9. It's just Nvidia's bad luck that they
thought they were big enough to dominate the direction of the market at the
same time as ATI brought out a much better DX9 part.

What does the bundling contract have to do with anything, other than the
fact that Valve has an interest in keeping ATI on top?

Nothing, so when people say "Valve only slated Nvidia because of ATI's
money" it's not true.

My son's LAN parties and BF1942 showed me that you don't really need a
$300 vid card, just a damn good CPU and a lot of RAM!

No, you can spend a lot less and get a midrange card and still have a good
experience, but we were talking about someone who was looking to buy at the
top end.
 
