When are the upgrades necessary?

Cyde Weys

So, I'm running on a sort of old computer. I have a Socket A mobo with no
onboard SATA. I've been indefinitely putting off an upgrade for over a
year now that would include faster RAM, a 64-bit Athlon processor, and a
better video card, but I've just found no way to justify the expense of
such a purchase. I can play pretty much anything. Sure, I can't touch the
highest resolutions, but that frankly seems to make so little difference in
the enjoyment I receive from games. The biggest perceived benefit would be
getting another hard drive, which is what I'm doing. 660GB just isn't
enough.

So, when do the upgrades become necessary? I remember I used to have to
upgrade a lot. Or, maybe I didn't have to, but I did. Remember back when
processor speeds were actually doubling every two years? And the latest
trend in videocards seems to be getting two of them and strapping them
together. Ughhh.

Here are the specifics of my system, in case you're interested:

AMD Athlon 2500+
ATI Radeon 9800 Pro
1GB DDR RAM @ 333MHz
K7N2 Delta2-FSR mobo
Audigy 2 ZS sound card
crapload of hard drives (some SATA, some not, some RAID, some not)
Dual DVD burners (NEC ND-1100 and NEC ND-2510A)

Besides the hard drive upgrade I guess I'll get a newer DVD burner,
something in the ND-35XX series. My ND-2510A is theoretically capable of
burning DVD-R dual-layer discs but when I try I mostly just end up making
expensive coasters (and yes, I've upgraded the firmware and drivers). DVD
burners are amazingly cheap anyway.

Has anyone else noticed this apparent downturn in the computer industry, or
is it just me? Nothing seems worth upgrading except the storage media.
 
johnS

> So, when do the upgrades become necessary? I remember I used to have to
> upgrade a lot. Or, maybe I didn't have to, but I did. Remember back when
> processor speeds were actually doubling every two years? And the latest
> trend in videocards seems to be getting two of them and strapping them
> together. Ughhh.

You usually upgrade when your PC can't run basic things you want it to
run. That usually happened in the "old days," when there were radical
upgrades to Windows and basic video standards were evolving, like the
jumps to Windows 95 and XP. Newer versions of programs you need to run
may only come out for newer Windows versions, and new peripherals may
not bother to ship drivers for older ones, etc. I also had this Epson
1250 scanner that worked under older Windows versions but, for some
reason, went bonkers no matter what I did under Windows XP.

The other big area is games.

The other reason is that you have extra money lying around and you just
feel like upgrading once in a while.

You say you can run all the games you want at the res you want, and
that's all that matters. However, I upgraded a lot over the last two
years. I used a 400MX for a long time because I didn't want to spend
that much on graphics cards, then within a year and a half went from a
9800 Pro to a 6800 AGP, an X800 XL PCI Express, and now a 7800 GT PCI
Express. I've only done that because I can sell the older cards and
just add a bit more to get the better card.

What shocked me was that in the past I swore I'd never spend more than
$200 on a card, because they lose value too fast. However, I thought
that if I ever bought a $300 card like the 7800 GT I'd probably be able
to really max everything out. Nope. You can throw everything you've got
at today's games and they can still take more. AA and other effects at
high res can really bog almost anything down.

From what I've read, they've already discontinued the 6600, 6800, and
7800 lines and will concentrate on the 7600 and 7900. The old cards
will probably still be around for a while, like the Socket A boards,
but will slowly start to disappear. I'm starting to see clearance sales
on these cards now.

An acceptable level of performance is a really subjective thing. It
depends on when you think it's too slow. The moment will probably come
when you suddenly have to play some new game and find your system can't
cut it even at the minimum res you want to play it in, or you can't get
certain effects with it, or some graphics program you have to use
doesn't run well.

I mean, you've got the usual tech obsolescence going on as we go past
summer and into next year, but then you add Windows Vista, the first
big Windows upgrade in a while, plus the usual string of new games that
will probably come out around September, in time for Christmas. On top
of that you'll have the new Intel chips and the Socket AM2 AMDs, so
you'll be several generations behind. If you don't feel you need to
upgrade now, I'd wait until the end of the year or into next year; then
you'll have a much better idea of how all these changes stand in
relation to your present system.
 
kony

> So, I'm running on a sort of old computer. I have a Socket A mobo with no
> onboard SATA. I've been indefinitely putting off an upgrade for over a
> year now that would include faster RAM, a 64-bit Athlon processor, and a
> better video card, but I've just found no way to justify the expense of
> such a purchase. I can play pretty much anything. Sure, I can't touch the
> highest resolutions, but that frankly seems to make so little difference in
> the enjoyment I receive from games. The biggest perceived benefit would be
> getting another hard drive, which is what I'm doing. 660GB just isn't
> enough.
>
> So, when do the upgrades become necessary? I remember I used to have to
> upgrade a lot. Or, maybe I didn't have to, but I did. Remember back when
> processor speeds were actually doubling every two years? And the latest
> trend in videocards seems to be getting two of them and strapping them
> together. Ughhh.
>
> Here are the specifics of my system, in case you're interested:
>
> AMD Athlon 2500+
> ATI Radeon 9800 Pro
> 1GB DDR RAM @ 333MHz
> K7N2 Delta2-FSR mobo
> Audigy 2 ZS sound card
> crapload of hard drives (some SATA, some not, some RAID, some not)
> Dual DVD burners (NEC ND-1100 and NEC ND-2510A)
>
> Besides the hard drive upgrade I guess I'll get a newer DVD burner,
> something in the ND-35XX series. My ND-2510A is theoretically capable of
> burning DVD-R dual-layer discs but when I try I mostly just end up making
> expensive coasters (and yes, I've upgraded the firmware and drivers). DVD
> burners are amazingly cheap anyway.
>
> Has anyone else noticed this apparent downturn in the computer industry or
> is it just me? Nothing just seems worth upgrading except for the storage
> media.


Yes, it takes some pretty demanding uses to exceed what the
above system is capable of. I have a few similar systems that
I don't mind using at all for most tasks. A Radeon 9800 Pro
can struggle in games at 1280x1024 (native) LCD resolution
unless the eye candy is turned down some, but I sometimes
find reviewers making mountains out of molehills when it
comes to eye candy. In some video card forums they'll want
screenshots to compare card differences, and looking at some,
so long as the texture quality was medium or high and the
FSAA was higher than 2X, it tends to look reasonable, enough
that further differences would be ignored when focusing on
the gameplay itself.

I too find storage needs increasing, as do most who keep
data over time. Some just generate fewer GB per year than
others and are lucky in that a single HDD and just a few
DVDs are enough for their needs. Others, working with video,
can eat up a few GB in a few hours.

I know a lot of people with lesser systems than yours who
have no interest in upgrading. Around the P3/Celeron 800MHz
mark they were satisfied; they just want to keep their
present system working and will replace it when a costly
enough part fails. Gamers are a different story, though:
never enough power. Even then, if LCDs hadn't come along
they might've been satisfied with less, but today's high-res
LCD panels simply demand more.

Then there are those who just like to play with newer
technology. I feel that way sometimes, but not much of
interest has come along recently except PCIe, and I'm still
waiting for a good, competitive selection of PCIe cards.
Just wait a year, though: Windows Vista is coming out, and a
lot of people will want to try it, maybe run it just to feel
uber/modern/whatever, and it'll do what all past MS operating
systems have done and require a system upgrade for the same
perceived speed in common uses. Gaming may not be hit as
badly, if at all, but games will get more demanding over
time as well.

I suppose I'd say that the real need to upgrade, for most
uses, is lower and less frequent, but you're bound to find
something eventually that you want to do faster. Voice
control and virtual reality are two big areas where system
performance will be at issue. Plus, no matter how fast a
system is, somebody will come along and say, "but WHILE
you're doing that, can you do this OTHER thing
simultaneously?" So we are near the end of an era in which
it's a matter of what a system can do, and entering one of
how many things it can do at once. That makes a bit of sense,
though, with dual cores and the inevitable reality that the
big old main PC becomes a sort of hub in the home while all
the lesser, more proprietary devices serve special purposes.
 
Noozer

> So, when do the upgrades become necessary?

When are upgrades "necessary"???

There are only two times when an upgrade is necessary... The first is if the
current computer just can't do what you need it to. The second is if the
computer breaks down and the repair requires a part that isn't available or
is very expensive.

I'd say that 80% of upgrades are just to keep up with what's current,
because of "want" and not "need" (he says, staring at his 24" LCD monitor).
 
