budget gaming PC performs well vs. high-end


Beladi Nasralla

I made a budget gaming computer -- an AMD Athlon 64 X2 3600+ and a 7600GT
(both overclocked). I ran Oblivion at playable framerates at almost
maximum settings. My colleague says that his new computer with an 8800GTS
struggles to run Oblivion at maximum settings, and he wonders why I
can run it on my computer. I run Bioshock at maximum settings, and the
game has no speed problems (that is, it runs at at least 20 fps). And
yet the reviews show that high-end cards (like the X1950 Pro and 7900GT)
struggle to run the game at full settings.

What is happening? Maybe my GPU is better than it is claimed to
be (it is a grey import from China). The other consideration is that
both the mainboard and the GPU are from the same manufacturer, MSI, and
the mainboard has an nVidia chipset (like the video card). Maybe the
GPU and the mainboard have improved compatibility, which results in
improved performance?

This seems less likely, though, because I ran the 3DMark03 and
3DMark05 tests. It turned out that, unoverclocked, my computer
performed at the bottom of the surveyed systems with
equal specifications. After overclocking, my computer scored
very close to the median of the similar systems they tested.

What gives?
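The benchmark comparison described above can be sketched numerically. A minimal Python sketch, with entirely hypothetical scores standing in for the surveyed systems:

```python
import statistics

# Hypothetical 3DMark-style scores for surveyed systems with the
# same CPU/GPU specifications (illustrative numbers, not real data).
surveyed = [2300, 2450, 2500, 2550, 2600, 2700, 2750, 2800, 2900, 3100]

my_stock = 2310        # stock clocks: near the bottom of the pack
my_overclocked = 2640  # overclocked: close to the median

median = statistics.median(surveyed)
beaten = sum(s < my_stock for s in surveyed)

print(f"median of surveyed systems: {median}")
print(f"stock score beats {beaten} of {len(surveyed)} systems")
print(f"overclocked score vs median: {my_overclocked - median:+}")
```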
 

Patrick Vervoorn

I made a budget gaming computer -- an AMD Athlon 64 X2 3600+ and a 7600GT
(both overclocked). I ran Oblivion at playable framerates at almost
maximum settings. My colleague says that his new computer with an 8800GTS
struggles to run Oblivion at maximum settings, and he wonders why I
can run it on my computer. I run Bioshock at maximum settings, and the
game has no speed problems (that is, it runs at at least 20 fps). And
yet the reviews show that high-end cards (like the X1950 Pro and 7900GT)
struggle to run the game at full settings.

You don't mention the resolution at which you are running Bioshock vs the
resolution at which your colleague is running it...?

Perhaps you also have a different view of what 'no speed problems' are?
While you may find Bioshock at 20fps to be acceptable, your colleague may
have a different view, and may find 30fps to be totally unacceptable...?
What is happening? Maybe my GPU is better than it is claimed to
be (it is a grey import from China). The other consideration is that
both the mainboard and the GPU are from the same manufacturer, MSI, and
the mainboard has an nVidia chipset (like the video card). Maybe the
GPU and the mainboard have improved compatibility, which results in
improved performance?

I don't think that changes much, if anything.
This seems less likely, though, because I ran the 3DMark03 and
3DMark05 tests. It turned out that, unoverclocked, my computer
performed at the bottom of the surveyed systems with
equal specifications. After overclocking, my computer scored
very close to the median of the similar systems they tested.

What gives?

I think you're comparing apples to oranges... Try to find out what
resolution other people (or the benchmark results you're comparing to) run
at.

I also have a P4-2400 system with a 7600GT. It was quite capable of
running something like Half Life 2 or Bioshock at an acceptable framerate
when I was still using a CRT, and used a resolution like 800x600 or
1024x768 with quite a lot of the eye-candy enabled. However, when I
switched to a TFT, with a native resolution of 1680x1050, I noticed this
system had considerable problems rendering all these pixels with the same
quality settings I used before.

Regards,

Patrick.
 

Beladi Nasralla

You don't mention the resolution at which you are running Bioshock vs the
resolution at which your colleague is running it...?

I run it at 1440x900. That is about 1.3 MP, which is also the
pixel count of a 1280x1024 screen, the long-time de facto standard
for LCD screens. When I was buying my LCD, I did not pay much
attention to the resolution, and bought what I have now; I was later
sorry that I did not insist when dealing with the salesperson, and
did not get a 1680x1050 screen. That is about 1.8 MP. But there is a
positive side to it, too. With a 1680x1050 screen, my games would run
about 27% slower than they do now with a 1440x900 screen...
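The pixel arithmetic above can be sketched in Python; the slowdown estimate assumes the game is purely fill-rate limited (framerate inversely proportional to pixel count), which gives an upper bound rather than a measurement:

```python
def megapixels(width, height):
    """Pixel count of a display mode, in millions of pixels."""
    return width * height / 1e6

mp_now = megapixels(1440, 900)    # ~1.30 MP
mp_alt = megapixels(1680, 1050)   # ~1.76 MP

# If rendering cost scaled linearly with pixel count, the larger
# screen would cut the framerate by roughly this fraction:
slowdown = 1 - mp_now / mp_alt

print(f"{mp_now:.2f} MP -> {mp_alt:.2f} MP: about {slowdown:.0%} slower")
# prints "1.30 MP -> 1.76 MP: about 27% slower"
```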
 

Patrick Vervoorn

Beladi Nasralla said:


I run it at 1440x900. That is about 1.3 MP, which is also the
pixel count of a 1280x1024 screen, the long-time de facto standard
for LCD screens. When I was buying my LCD, I did not pay much
attention to the resolution, and bought what I have now; I was later
sorry that I did not insist when dealing with the salesperson, and
did not get a 1680x1050 screen. That is about 1.8 MP. But there is a
positive side to it, too. With a 1680x1050 screen, my games would run
about 27% slower than they do now with a 1440x900 screen...

Yes, and now find out at what resolution your colleague is running, and
only then can you make a meaningful comparison between your system and
his...

Regards,

Patrick.
 

Conor

I made a budget gaming computer -- an AMD Athlon 64 X2 3600+ and a 7600GT
(both overclocked). I ran Oblivion at playable framerates at almost
maximum settings. My colleague says that his new computer with an 8800GTS
struggles to run Oblivion at maximum settings, and he wonders why I
can run it on my computer. I run Bioshock at maximum settings, and the
game has no speed problems (that is, it runs at at least 20 fps). And
yet the reviews show that high-end cards (like the X1950 Pro and 7900GT)
struggle to run the game at full settings.

What is happening?

You're running the game using DirectX 9 and he's running it under
DirectX 10. If he forced the game to run in DirectX 9, it would
trounce yours.
 

Conor

Patrick said:
Yes, and now find out at what resolution your colleague is running, and
only then can you make a meaningful comparison between your system and
his...
Irrelevant. His friend is running DirectX 10.
 

John Doe

Beladi Nasralla said:
I made a budget gaming computer -- an AMD Athlon 64 X2 3600+ and a 7600GT
(both overclocked).
... I ran the 3DMark03 and 3DMark05 tests. It turned out that,
unoverclocked, my computer performed at the bottom of the
surveyed systems with equal specifications. After
overclocking, my computer scored very close to the
median of the similar systems they tested.

What gives?

You're mistaken?
 

Beladi Nasralla

Yes, and now find out at what resolution your colleague is running, and
only then can you make a meaningful comparison between your system and
his...

He is running 800x600 on his CRT...
 

Frank McCoy

In alt.comp.hardware.pc-homebuilt Patrick Vervoorn wrote:
You don't mention the resolution at which you are running Bioshock vs the
resolution at which your colleague is running it...?

Perhaps you also have a different view of what 'no speed problems' are?
While you may find Bioshock at 20fps to be acceptable, your colleague may
have a different view, and may find 30fps to be totally unacceptable...?


I don't think that changes much, if anything.


I think you're comparing apples to oranges... Try to find out what
resolution other people (or the benchmark results you're comparing to) run
at.

I also have a P4-2400 system with a 7600GT. It was quite capable of
running something like Half Life 2 or Bioshock at an acceptable framerate
when I was still using a CRT, and used a resolution like 800x600 or
1024x768 with quite a lot of the eye-candy enabled. However, when I
switched to a TFT, with a native resolution of 1680x1050, I noticed this
system had considerable problems rendering all these pixels with the same
quality settings I used before.
Funny:
I have a similar situation to what the person you replied to has.

*My* computer has an AMD 2400+ with 1 GB (2 sticks) of PC3200 memory in it
and an ATI 2006 "All In Wonder" card running an LCD screen (using the
VGA connector) at 1680x1050 pixels. I'm running a memory speed of 200 MHz,
an FSB of 266, and a 133 bus speed.

*The kid* has a different make of (and supposedly much *faster*)
motherboard, with matched 512 MB sticks (again, 1 GB total) for 128-bit
access instead of 64-bit like mine (it *requires* matched sticks). That
PC has an AMD 2800+ CPU, and a later (supposedly faster) video board
(also from ATI) without the extra bells and whistles of the AIW card.
That system has a 21" monitor, which is normally run at 1600x1200.

So ... You'd *expect* that the kid's computer would walk all over mine.
Only instead, while playing World of Wonder, mine *screams* along at
full resolution and all settings at max; while the kid's computer has to
be backed-off in resolution and/or settings to get decent playing speed.

Go figure.

Sometime I'm going down there, taking along a copy of CPU-Z, to see if
the kid's memory settings or something are off. No way should mine walk
all over the kid's ... but it does. You'd think it would be the other
way around.
 

Frank McCoy

You're running the game using DirectX 9 and he's running the game
running DirectX 10. If he forced the game to run in DirectX 9, it'd
trounce yours.

Hmmm ... Does DirectX 10 run under Win-XP?
Both of our computers are still XP.
(Wouldn't HAVE Vista!)
If so, that might be the difference with mine as well.
The kid might have that.

So ... how do you (as you say) "force the game to run under DirectX 9"?
I thought once you had an "upgrade" to DirectX installed, you couldn't
pull it out without completely reinstalling Windows.

I also thought the whole *idea* of "upgrading" DirectX was to get
*faster* performance of video stuff, not slower.
 

Conor

Frank McCoy said:
I also thought the whole *idea* of "upgrading" DirectX was to get
*faster* performance of video stuff, not slower.
No. The whole idea of upgrading DirectX is to get more visual effects.
That translates into lower framerates, as the graphics card has far more
to draw for each frame.
 

Frank McCoy

No. The whole idea of upgrading DirectX is to get more visual effects.
That translates into lower framerates, as the graphics card has far more
to draw for each frame.

Well, if the performance degrades that badly, how do you go back?
(Or can you without reloading Windows completely?)
 

johns

I can't prove this, but what I suspect is that the DX10 cards
running under XP run in software only. That would make them
pretty slow, and also force them to cut back on graphics
quality. I have some DX10 cards coming in for testing,
and I will post the results. I will compare them to a
7950 GTS. The DX10 cards are 8600 GTS, and are
supposed to be quite a bit faster than the 7950. However,
I noticed in benchmarks on the web that the 7950 was
nearly equal or better in most tests under XP. The PC
I will use is one monster of a machine, and with a 7950
in it, AquaMark3 benched at better than 133,000. I'll
have the 8600 mark sometime late next week. If anybody
knows about possible traps in this test, let me know
in a separate post, and we can rant about it there ...
like driver issues.

johns
 

RF

johns said:
I can't prove this, but what I suspect is that the DX10 cards
running under XP run in software only. That would make them
pretty slow, and also force them to cut back on graphics
quality. I have some DX10 cards coming in for testing,
and I will post the results. I will compare them to a
7950 GTS. The DX10 cards are 8600 GTS, and are
supposed to be quite a bit faster than the 7950. However,
I noticed in benchmarks on the web that the 7950 was
nearly equal or better in most tests under XP. The PC
I will use is one monster of a machine, and with a 7950
in it, AquaMark3 benched at better than 133,000. I'll
have the 8600 mark sometime late next week. If anybody
knows about possible traps in this test, let me know
in a separate post, and we can rant about it there ...
like driver issues.

johns

No, they don't run in software mode. They run in DirectX 9 hardware mode.
There are no DX10 features, hardware OR software, supported in XP at all.
There is no 'software DX10 mode.'

And the 8600GTS is not "much faster than the 7950". The 8600s are lower
mid-range cards (even the GTS). The 7950 should trounce the 8600GTS in
almost any DirectX 9 game. Unfortunately, Tom's Hardware doesn't have the
7950GTS or the 8600GTS in their testing library, but when comparing the
7950GT vs. the 8600GT, the 7950GT ranges from 50% to 300% faster in the
1280x1024 resolution game tests I looked at.
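For reference, "N% faster" compares framerates as a ratio, so "300% faster" means four times the framerate, not three. A quick sketch (the framerate numbers here are made up, purely for illustration):

```python
def percent_faster(fps_a, fps_b):
    """How much faster card A is than card B, as a percentage."""
    return (fps_a / fps_b - 1) * 100

print(percent_faster(60, 40))   # 50.0  -> "50% faster"  = 1.5x the fps
print(percent_faster(80, 20))   # 300.0 -> "300% faster" = 4x the fps
```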

If you want a DX10 card right now, only the 8800s are worth it, in my
opinion.

Of course, in my opinion, since nVidia is releasing new cards later this
year... the best thing to do is wait, especially if you already own a
top-end DX9 card and run XP only.

RF.
 

johns

Rats. I kind of figured that. The problem is that BFG seems to
have discontinued their 7950, and the company I am
buying from recommended the 8600 as a replacement.
My bosses went with it, and did not ask me. So I'm
still going to bench the 8600, and stick the "slow down"
figures right under their know-it-all noses. I knew that
7950 was the better card. I just don't understand BFG
dropping it ...

johns
 

Les Matthew

Conor said:
You're running the game using DirectX 9 and he's running it under
DirectX 10. If he forced the game to run in DirectX 9, it would
trounce yours.

I thought Vista had both 9 and 10, so DirectX 9 games use 9 and DirectX 10 games use 10?


les...
 

Kurt Herman

That is correct. In Vista, DX10 games use DX10 if you have a DX10 card;
otherwise they fall back to DX9.

Kurt
 
