AMD X2: is the 2nd core used -- how do I know ??

  • Thread starter carrera d'olbani
  • Start date

OldDog

Marcus Redd said:
What would you Google for?

Try searching on "Performance rating for computers"

http://news.zdnet.co.uk/hardware/0,1000000091,2121176,00.htm
"We've been working with industry leaders today to propose a solution... to
come up with a better way for end users to evaluate what they're really
getting," Patrick Moorhead, vice president of consumer advocacy for AMD,
said on Wednesday -- the same day AMD introduced two new Athlon XP desktop
PC processors. Moorhead said AMD is seeking feedback from software
developers, as well as from other PC-component makers.

http://news.com.com/Will+your+PC+keep+pace+with+Vista/2100-1016_3-6050116.html
The above is Microsoft's slant on performance rating...

The "Windows Performance Rating," which can be seen in the latest test
version of the operating system, evaluates components such as the processor,
the memory, the hard drive and graphics cards to come up with an overall
score.


http://www.quepublishing.com/articles/article.asp?p=339099&rl=1

http://www.xbitlabs.com/news/cpu/display/20030625113439.html
 

Ed Cregger

John Weiss said:
No.

The AMD CPU architecture is significantly different from, and more
efficient than, the old Pentium 4 architecture. The 1.9 GHz AMD is
equivalent to an old P4 running at 3.6 GHz, according to AMD's estimates.
THAT is the source of the "3600" designation.

FWIW, the new Intel "core" (as in Core2Duo) architecture is also of a more
efficient variety, so their clock speeds have come down significantly from
the P4 as well.
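The mapping behind those model numbers is just a ratio. A minimal sketch, assuming the scale factor is simply inferred from the 3600+ example above (3.6 GHz equivalent at 1.9 GHz actual) rather than any official AMD formula:

```python
# Rough sketch of the "performance rating" idea: the model number
# suggests the clock a Pentium 4 would need for similar performance.
# The scale factor is inferred from the 3600+ example in this thread,
# not taken from any official AMD document.

def pr_rating(actual_ghz, scale=3.6 / 1.9):
    """Estimate the P4-equivalent rating for a given real clock."""
    return round(actual_ghz * scale * 10) * 100  # round to nearest 100

print(pr_rating(1.9))  # -> 3600
print(pr_rating(2.0))  # -> 3800 (matches the X2 3800+ at 2.0 GHz)
```

Interestingly, the same ratio lands on 3800 for a 2.0 GHz part, which lines up with the X2 3800 mentioned later in the thread.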



What was your old computer? What background apps was it running, compared
with the new one?

Game performance these days relies on the GPU as well as the CPU. Some
parts of it may be tied to absolute CPU clock speed, while other aspects
are tied more to GPU performance or memory bandwidth...

While the game itself is only using 1 CPU core, the OS can shift other
background tasks to the other core.
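If you want to see what the OS has to work with, a quick check with Python's standard library (the affinity call is Linux-only; `os.cpu_count()` works everywhere):

```python
# Show how many logical CPUs the OS can schedule across, and (on Linux)
# which cores the current process is allowed to run on.
import os

print("logical CPUs:", os.cpu_count())
if hasattr(os, "sched_getaffinity"):  # Linux only
    print("this process may run on cores:", sorted(os.sched_getaffinity(0)))
```

On a dual-core X2 this reports 2 logical CPUs, and Task Manager's per-core graphs show the same thing.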



The answer is "Maybe..."

The Core2Extremes are arguably the best performing machines around. OTOH,
if you're on a budget, you have to balance CPU, RAM, and GPU.

If all you want to do is play a current-generation single-CPU-aware game,
maybe a higher clock speed single-core CPU would have been better. For
general use, though, dual-core CPUs have the edge.


My 32 bit 2.8 GHz Dell computer smokes my eMachines AMD 64 3200+ machine.


Ed Cregger
 

Les Steel

Ed said:
My 32 bit 2.8 GHz Dell computer smokes my eMachines AMD 64 3200+ machine.


Ed Cregger
Smokes it at what? Also, what is the make-up of your "32 bit 2.8 GHz
Dell" and your "eMachines AMD 64 3200+"?

I guarantee my 32 bit* 2.4 GHz no-name** PC will "smoke" yours.

*32 bit XP
**no name as in homebuilt.
 

carrera d'olbani

Dual-core is *definitely* the way to go, so you made the right choice - no
question about it. ...

Most games that currently support multicores only use coarse threading -
offloading one or two functions to the 2nd core.

Well, HL2 runs fine on my machine (AMD 64 X2 3600+) with its 1.9 GHz
first core. But perhaps I am glad that I got a dual core processor in
my homebuilt computer instead of a single core. It is a novelty for
me to see that the computer can switch between applications
(e.g. a game and a word processor) smoothly. For future games which
rely on a single core processor my computer will probably be slow. I
have a GeForce 7600GT card, and it runs smoothly at the standard
resolution (1280x1024), with some grunt up its sleeve. I have a new
wide LCD monitor (LG L194WT, 1440x900) coming in the post. The only
thing I am sorry about is that I did not buy the larger (1680x1050)
LG L205WD monitor. I played HL2 and Q4 with a vertical resolution of
1024, and the human characters looked crisp and sharp. I played with
a vertical resolution of 864 pixels (as in 1152x864), and the picture
looked crap (not crisp anymore). A vertical resolution of 960 pixels
(as in 1280x960) gave a semi-crisp picture in both games. This is the
vertical resolution closest to that of my soon-to-arrive LG L194WT.
Oh crap, I cannot do anything about it now :-(
 

OldDog

My 32 bit 2.8 GHz Dell computer smokes my eMachines AMD 64 3200+ machine.


Ed Cregger

Smokes it in what?
Benchmark X?
Frames per sec in Quake IV?

Do both computers have the same amount of RAM, and what kind of RAM is
installed, are they both using the same video card, ... ?
 

Sleepy

carrera d'olbani said:
Well, HL2 runs fine on my machine (AMD 64 X2 3600+) with its 1.9 GHz
first core. But perhaps I am glad that I got a dual core processor in
my homebuilt computer instead of a single core. It is a novelty for
me to see that the computer can switch between applications
(e.g. a game and a word processor) smoothly. For future games which
rely on a single core processor my computer will probably be slow.

*any* future game should have multi-core support built in - if it
doesn't, then that's just crap programming.
I have a GeForce 7600GT card, and it runs smoothly at the standard
resolution (1280x1024), with some grunt up its sleeve. I have a new
wide LCD monitor (LG L194WT, 1440x900) coming in the post. The only
thing I am sorry about is that I did not buy the larger (1680x1050)
LG L205WD monitor. I played HL2 and Q4 with a vertical resolution of
1024, and the human characters looked crisp and sharp. I played with
a vertical resolution of 864 pixels (as in 1152x864), and the picture
looked crap (not crisp anymore). A vertical resolution of 960 pixels
(as in 1280x960) gave a semi-crisp picture in both games. This is the
vertical resolution closest to that of my soon-to-arrive LG L194WT.
Oh crap, I cannot do anything about it now :-(

High resolutions like that are asking a lot of a 7600GT because of its
128-bit memory interface. I have a 7900GS clocked to 600/700 and still I
don't play many games at the native res of my LCD (1280x1024). Day of
Defeat - I still prefer 1024x768 with 4x AA and AF.

btw - my X2 3800 overclocks easy as pie from 10x200 to 10x240. I simply
downclock the RAM from DDR400 to DDR333, set the HTT to 800 (or 4x) and
then raise the FSB. My mobo also allows me to run the PCI slots async
(locked at 33), so I only overclock the CPU, without any extra voltage or
cooling needed. You may want to try that with your 3600.
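For anyone following along, the recipe above is just multiplier arithmetic. A minimal sketch, using the numbers from this post (the RAM-divider side is simplified; real boards derive memory speed from their own tables):

```python
# Sketch of the overclocking arithmetic above: on these Athlon 64 boards
# the CPU clock is reference clock x CPU multiplier, and the
# HyperTransport link is reference clock x HT multiplier. Dropping the HT
# multiplier to 4x before raising the reference clock keeps the link
# under its stock 1000 MHz.

def clocks(ref_mhz, cpu_mult, ht_mult):
    return {
        "cpu_mhz": ref_mhz * cpu_mult,  # core clock
        "ht_mhz": ref_mhz * ht_mult,    # HyperTransport link clock
    }

stock = clocks(200, 10, 5)        # 2000 MHz CPU, 1000 MHz HT
overclocked = clocks(240, 10, 4)  # 2400 MHz CPU, 960 MHz HT
print(stock, overclocked)
```

That is why the HTT gets set to 4x first: at 240 MHz reference, 4x gives 960 MHz, still under the stock 1000, while 5x would push it to 1200.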
 

carrera d'olbani

*any* future game should have multi-core support built in - if it
doesn't, then that's just crap programming.


High resolutions like that are asking a lot of a 7600GT because of its
128-bit memory interface. I have a 7900GS clocked to 600/700 and still I
don't play many games at the native res of my LCD (1280x1024). Day of
Defeat - I still prefer 1024x768 with 4x AA and AF.

Yes, it is true that the 7600GT has only a 128 bit bus. However, it
seems to be seriously overclocked, and has many vertex units and pipes,
so it can work on par with 256 bit bus videocards such as the ATI
X1800GTO; see e.g. http://www.digital-daily.com/video/msi_nx7600gt/index02.htm
btw - my X2 3800 overclocks easy as pie from 10x200 to 10x240. I simply
downclock the RAM from DDR400 to DDR333 and set the HTT to 800 (or 4x) and
then raise the FSB. My mobo also allows me to run the PCI slots async
(locked at 33) so I only overclock the CPU and without any extra voltage or
cooling needed. You may want to try that with your 3600.

Wow. I have never done overclocking before. Such a wonderful world ahead
of me to explore. At the moment, the speed of my CPU seems OK for me.
But in the future I might overclock it as a first measure... and might
buy another CPU as a second measure.
 

Lief

carrera d'olbani said:
Wow. I have never done overclocking before. Such a wonderful world ahead
of me to explore. At the moment, the speed of my CPU seems OK for me.
But in the future I might overclock it as a first measure... and might
buy another CPU as a second measure.

Overclocking is pointless these days.
 

spodosaurus

carrera said:
And why would that be?

Perhaps because the percentage increase in performance is much, much
smaller compared to 8 or 9 years ago. Move on, it's a new millennium;
spend the extra dollars you would on cooling for a better chip in the
first place.

Cheers,

Ari

--
spammage trappage: remove the underscores to reply
Many people around the world are waiting for a marrow transplant. Please
volunteer to be a marrow donor and literally save someone's life:
http://www.abmdr.org.au/
http://www.marrow.org/
 

Sleepy

spodosaurus said:
Perhaps because the percentage increase in performance is much, much
smaller compared to 8 or 9 years ago. Move on, it's a new millennium;
spend the extra dollars you would on cooling for a better chip in the
first place.

Cheers,

Ari

As I said to Carrera in an earlier post, I overclock my X2 3800 from
2.0 GHz to 2.4 GHz with no extra cooling or cost. A 20% increase for
free - how is that pointless? I'm not even pushing my PC either - plenty
of people achieve double that increase. My graphics card is a 7900GS -
well known for its overclockability. Mine runs at a 600 MHz core instead
of the default 450 and still doesn't get much above 50°C under load.
 

spodosaurus

Sleepy said:
As I said to Carrera in an earlier post, I overclock my X2 3800 from
2.0 GHz to 2.4 GHz with no extra cooling or cost. A 20% increase for
free - how is that pointless? I'm not even pushing my PC either - plenty
of people achieve double that increase. My graphics card is a 7900GS -
well known for its overclockability. Mine runs at a 600 MHz core instead
of the default 450 and still doesn't get much above 50°C under load.

And the benchmarks showed what degree of improvement in performance? :)

Ari


 

Sleepy

spodosaurus said:
And the benchmarks showed what degree of improvement in performance? :)

Ari

About a thousand points in 3DMark06, since you ask - not that I go by
synthetic benchmarks much. I do see a marked improvement playing Oblivion
and Stalker though, and also DoD: Source, which I play a fair bit online.
The Source engine is heavily CPU dependent too.
 

spodosaurus

Sleepy said:
About a thousand points in 3DMark06, since you ask - not that I go by
synthetic benchmarks much. I do see a marked improvement playing Oblivion
and Stalker though, and also DoD: Source, which I play a fair bit online.
The Source engine is heavily CPU dependent too.

Interesting. Perhaps it's time for me to review my information on OCing
current CPUs/RAM/Graphics.

Regards,

Ari

 

Ed Medlin

spodosaurus said:
Interesting. Perhaps it's time for me to review my information on
OCing current CPUs/RAM/Graphics.

Regards,

Ari
It might be that time, Ari... :) The Core 2 Duos and dual core AMDs
overclock extremely well with a proper MB with good OC'ing options, using
just stock cooling. It is not unusual to see an Intel E6600 at 2.4 GHz
overclock to over 3.0 GHz with the stock Intel HS/fan. The architecture
of these processors makes for good overclocking without the extreme heat
produced previously. The increase in performance is easily seen in
gaming and in processor benchmarks.

Ed
 

carrera d'olbani

btw - my X2 3800 overclocks easy as pie from 10x200 to 10x240. I simply
downclock the RAM from DDR400 to DDR333 and set the HTT to 800 (or 4x) and
then raise the FSB. My mobo also allows me to run the PCI slots async
(locked at 33) so I only overclock the CPU and without any extra voltage or
cooling needed. You may want to try that with your 3600.

OK, finally today I did the overclocking of my 3600+ according to your
recipe (I found this procedure on the Internet, too). Unlike in your
case, I could not find in my MSI K9N Neo-F motherboard the setting which
would allow changing the PCI/AGP bus. But an article said that nowadays
motherboards are likely to have those frequencies locked at the nominal
values, so I decided to risk it.

Unlike your computer, my DDR was of the 667 MHz type (not 800 MHz), so
I went down to 533 MHz. My HTT turned out to be set at 800 (4x), so I
went down to 600 (3x) (in your case, you went down from 1000 to 800). I
raised the FSB from 200 MHz to 240 MHz (just like you did). The computer
now seems more responsive, and loads programs like a bullet. The CPU
temperature rose from 30 to 36°C. I overclocked my nVidia 7600GT video
card, too -- from 700 to 800 MHz. Its temperature rose from 45 to 52°C.
I am excited -- a rather substantial overclock, and the chips are
running cool!
 

Sleepy

carrera d'olbani said:
OK, finally today I did the overclocking of my 3600+ according to your
recipe (I found this procedure on the Internet, too). Unlike in your
case, I could not find in my MSI K9N Neo-F motherboard the setting which
would allow changing the PCI/AGP bus. But an article said that nowadays
motherboards are likely to have those frequencies locked at the nominal
values, so I decided to risk it.

Unlike your computer, my DDR was of the 667 MHz type (not 800 MHz), so
I went down to 533 MHz. My HTT turned out to be set at 800 (4x), so I
went down to 600 (3x) (in your case, you went down from 1000 to 800). I
raised the FSB from 200 MHz to 240 MHz (just like you did). The computer
now seems more responsive, and loads programs like a bullet. The CPU
temperature rose from 30 to 36°C. I overclocked my nVidia 7600GT video
card, too -- from 700 to 800 MHz. Its temperature rose from 45 to 52°C.
I am excited -- a rather substantial overclock, and the chips are
running cool!

Good for you - you can use a free utility called CPU-Z (Google for it)
to check your speeds - not just the CPU but also HTT and RAM. Keep an eye
on temps - those are good, but if the graphics card creeps up to 60°C you
may want to improve the cooling in your case. Give it a good workout - an
hour's worth of Oblivion or Stalker should do. Report any problems.
Paramount is to have a stable, glitch-free PC - free extra performance is
just a bonus.
 
