Got my Conroe system going!

chrisv

Dual Core Power!!!

Core2 Duo E6400 @ 2.66GHz (1333MHz FSB)
Intel 975XBX "BadAxe" mobo
2GB Crucial DDR2-667
EVGA "Superclock" (560MHz) Nvidia 7900GT 512MB

If you did a double-take at the above CPU numbers, yes, I'm
overclocking what is normally a 2.13GHz CPU. I haven't overclocked a
PC since the Celeron 300A machine that I built back in the '90s, but
with Intel supporting a 1333MHz FSB on the BadAxe, and reports that
overclocking the Core2 Duo is pretty much a slam dunk, it just seemed
too easy, and too fun 8), to resist.

Seems to be working fine so far. "Stress-testing" was done playing
FEAR. After reading about people OC'ing these things to 4GHz, I think
my mild 25% OC is very likely to continue to work fine. More bang for
the buck! 8)

In normal usage, it doesn't seem any different from my old Northwood
3.0, which was already plenty snappy for most tasks. One reason for
the upgrade was so that my kid can get the Northwood box (6600GT
video), as her P3-1GHz box was getting kind of long in the tooth.
Plus, even though I don't game very often, it's fun to do occasionally
and my new machine can now handle pretty much anything out there at
high resolutions and high quality settings. The graphics in FEAR are
pretty darn impressive...

So, even though I didn't "need" to upgrade, it seemed like the right
time to do it. We use the hell out of our computers, so it's not a
bad place to spend some money...
 
Rthoreau

chrisv said:
Dual Core Power!!!

Core2 Duo E6400 @ 2.66GHz (1333MHz FSB)
Intel 975XBX "BadAxe" mobo
2GB Crucial DDR2-667
EVGA "Superclock" (560MHz) Nvidia 7900GT 512MB

If you did a double-take at the above CPU numbers, yes, I'm
overclocking what is normally a 2.13GHz CPU. I haven't overclocked a
PC since the Celeron 300A machine that I built back in the '90s, but
with Intel supporting a 1333MHz FSB on the BadAxe, and reports that
overclocking the Core2 Duo is pretty much a slam dunk, it just seemed
too easy, and too fun 8), to resist.

Seems to be working fine so far. "Stress-testing" was done playing
FEAR. After reading about people OC'ing these things to 4GHz, I think
my mild 25% OC is very likely to continue to work fine. More bang for
the buck! 8)

In normal usage, it doesn't seem any different from my old Northwood
3.0, which was already plenty snappy for most tasks. One reason for
the upgrade was so that my kid can get the Northwood box (6600GT
video), as her P3-1GHz box was getting kind of long in the tooth.
Plus, even though I don't game very often, it's fun to do occasionally
and my new machine can now handle pretty much anything out there at
high resolutions and high quality settings. The graphics in FEAR are
pretty darn impressive...

So, even though I didn't "need" to upgrade, it seemed like the right
time to do it. We use the hell out of our computers, so it's not a
bad place to spend some money...

I thought you had an AMD system at one point, or was that just an
acknowledgement of the technology at the time? Also, you didn't
mention what OS you are using. I would love a hands-on report of how
the 64-bit extensions work in a real-world situation, as I have heard
various things in the media about some features not yet being
implemented.

I am also surprised you went with a 7900 GT; doesn't that GPU have a
history of problems? I would watch that with an eagle eye, as that
could be problematic. Also, why not go with an Nvidia chipset? And did
your motherboard have any markings that indicate who manufactured it,
such as Foxconn? What brand of caps does it use?

Rthoreau
 
chrisv

Rthoreau said:
I thought you had an AMD system at one point, or was that just an
acknowledgement of the technology at the time?

I built an AMD64 machine for my brother, and came close to getting one
for myself a couple months ago. Then Conroe came along... 8)
Also, you didn't mention what OS you are using. I would love a
hands-on report of how the 64-bit extensions work in a real-world
situation, as I have heard various things in the media about some
features not yet being implemented.

I'm going to dual-boot XP and Mepis Linux, but so far only XP is on
the HD. I hadn't given much thought to going with a 64-bit OS...
I am also surprised you went with a 7900 GT; doesn't that GPU have a
history of problems? I would watch that with an eagle eye, as that
could be problematic.

From what I understand, the issue was with the Samsung memory that
many companies were using. To their credit, EVGA acknowledges the
problems and now offers what they call their "reload" cards, which
have different memory chips and should not have any problems.

http://www.evga.com/articles/317.asp
Also, why not go with an Nvidia chipset?

I don't think there are any Nvidia-based mobos that support the Conroe
(yet). The Intel board is a good one, albeit pricey. (I got my money
back from them by buying a cheaper chip and overclocking it. 8)
And did your motherboard have any markings that indicate who
manufactured it, such as Foxconn? What brand of caps does it use?

I didn't examine it for markings, but I doubt it will say anything but
"Intel" on it... As for the caps, I'll look at them and report back,
if I don't forget.
 
Yousuf Khan

chrisv said:
Dual Core Power!!!

Core2 Duo E6400 @ 2.66GHz (1333MHz FSB)
Intel 975XBX "BadAxe" mobo
2GB Crucial DDR2-667
EVGA "Superclock" (560MHz) Nvidia 7900GT 512MB

Is that with the 2MB or 4MB cache?

Yousuf Khan
 
The Kat

Yousuf Khan said:
Is that with the 2MB or 4MB cache?

The e6300 and 6400 have 2x1 meg,
the e6600, 6700, and 6800 have the 2x2 meg cache.


Lumber Cartel (tinlc) #2063. Spam this account at your own risk.

This sig censored by the Office of Home and Land Insecurity...

Remove XYZ to email me
 
George Macdonald

chrisv wrote:
I don't think there are any Nvidia-based mobos that support the Conroe
(yet). The Intel board is a good one, albeit pricey. (I got my money
back from them by buying a cheaper chip and overclocking it. 8)

Yes, Asus has a couple of Conroe-compatible nVidia mobos: one an upgrade of
an nForce4-based board, the P5N32-SLI SE Deluxe... the other a 570-based
system. Apparently the 590-based board is, err, imminent.
 
chrisv

Yousuf said:
Is that with the 2MB or 4MB cache?

As "The Kat" said, the E6400 has "only" 2MB L2 cache, although I
object to his description of it as "2 x 1 Meg" when AFAIK it's really
one shared cache.

For a while I was thinking that the E6600, which runs at a nominal
2.4GHz and has 4MB L2, would be the one to get, but at the last minute
I changed my mind. The E6600 wasn't readily available, and the $360
price was a bit high. I read reports showing that the benefit of the
larger cache was a few percent at best, and reports of easy
overclockability, especially of the "slower" variants. So I thought,
"what the heck, get the cheaper one and OC it PAST the 2.4GHz that I
was planning on getting." Cool!

Plus, the 8x core multiplier gives a nice symmetry to things, with
the memory in my machine running dual channels at 333MHz DDR,
perfectly matching the FSB running at 333MHz QDR ("1333MHz" FSB), and
the CPU running twice that at 2.66GHz. Geeky. 8)
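
For anyone who wants to check the arithmetic, here's a quick
back-of-the-envelope sketch in Python. Every number in it comes
straight from the figures above; nothing is measured:

BASE_CLOCK_MHZ = 333        # overclocked FSB base clock (stock E6400: 266MHz)
CPU_MULTIPLIER = 8          # the E6400's 8x core multiplier

fsb_effective = BASE_CLOCK_MHZ * 4    # FSB is quad-pumped (QDR): the "1333MHz" figure
ddr2_effective = BASE_CLOCK_MHZ * 2   # DDR2-667 is 333MHz at double data rate
core_clock_ghz = BASE_CLOCK_MHZ * CPU_MULTIPLIER / 1000

print(f"FSB: {fsb_effective} MT/s, RAM: DDR2-{ddr2_effective}, "
      f"core: {core_clock_ghz:.3f} GHz")
# -> FSB: 1332 MT/s, RAM: DDR2-666, core: 2.664 GHz
# (the marketing numbers 1333 / 667 / 2.66 are just these, rounded)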

Heck, for all I know, many of the "2.13GHz" E6400s run just fine at
much higher frequencies, but had a test failure in the L2 so that they
had to switch half of it off, and then for marketing reasons (not
wanting two products, 2M and 4M, at each frequency) they just sell
those as their "low end" chips. After all, with the huge L2, the odds
are about 50-50 that any defect in the chip will be somewhere in the L2.
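
That "about 50-50" is just eyeballing it from the big L2 being roughly
half the die, which is a guess on my part, not anything from Intel. A
toy Monte Carlo of the idea in Python, assuming (purely for
illustration) a single defect landing uniformly at random on the die:

import random

L2_FRACTION = 0.5     # assumption: the 4MB L2 is ~half the die area
TRIALS = 1_000_000

salvaged = 0
for _ in range(TRIALS):
    pos = random.random()      # defect position as a fraction of die area
    if pos < L2_FRACTION:      # defect fell somewhere inside the L2...
        salvaged += 1          # ...fuse off that half: sellable as a 2MB part

print(f"~{salvaged / TRIALS:.0%} of single-defect dies salvageable as 2MB parts")
# prints ~50%, i.e. the L2's assumed share of the die, by construction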
 
willbill

chrisv said:
Dual Core Power!!!

Core2 Duo E6400 @ 2.66GHz (1333MHz FSB)
Intel 975XBX "BadAxe" mobo
2GB Crucial DDR2-667
EVGA "Superclock" (560MHz) Nvidia 7900GT 512MB

If you did a double-take at the above CPU numbers, yes, I'm
overclocking what is normally a 2.13GHz CPU.

<various snips>


how is it on output of heat?

also IIRC, that 7900GT is a current mid
to high-end video board, right?

meaning that it also puts out a lot
of heat, right?

Seems to be working fine so far. "Stress-testing"
was done playing FEAR.


how long have you been running this
new machine for? (to the closest week)


Plus, even though I don't game very often,
it's fun to do occasionally

^^^^^^^^^^^^^^^^^^^^^^^^

agreed

i'm currently running Oblivion on
XP with a single Opty 248 2.2 GHz
(socket 940, single core) with
a 7600GT/256MB

runs pretty decent, and is surprising
(to me) in how stable it is (both XP
and Oblivion), and in how, when it
(Oblivion) does fold up, i can
almost always restart without having
to reboot XP. :)

and my new machine can now handle pretty much anything out there at
high resolutions and high quality settings. The graphics in FEAR are
pretty darn impressive...


the 3D graphics are the one thing
that consistently gets better. :)

glancing at the c.s.i.p.g.action n/g,
FEAR appears to be a shooter?

how do its graphics compare to Oblivion?

So, even though I didn't "need" to upgrade,
it seemed like the right time to do it.


correct me if i'm wrong; IIRC each of Intel's past
major CPU transitions wasn't without problems

nothing like being a guinea pig with
new technology (whether it's h/w or s/w)

all ears. :)

bill
 
chrisv

willbill said:
how is it on output of heat?

The CPU seems to be running cool and quiet with the stock
heatsink/fan.
also IIRC, that 7900GT is a current mid
to high-end video board, right?

meaning that it also puts out a lot
of heat, right?

A respectable amount, I suppose, but not bad for a "high end" card.
Nvidia's new 90nm GPUs are known to be pretty reasonable on power
consumption.

On a related note, if you want a 7900GT but don't want to be able to
hear the cooling fan from the next room, plan on getting an
aftermarket cooler. I use a Zalman VF900-Cu, which is not only
virtually silent, but is a MUCH better cooler.
how long have you been running this
new machine for? (to the closest week)

1.5 weeks. 8)
(snip)

glancing at the c.s.i.p.g.action n/g,
FEAR appears to be a shooter?

Suspense/shooter, I'd call it. I like it.
how do its graphics compare to Oblivion?

I've not tried Oblivion. As far as I know, it's the one game that
even a high-end card can't do well, unless you have two cards in SLI,
and I'm not going there...

I should note that, even on my fairly respectable new rig, FEAR can
get a bit jerky during intense battles. Some of these new games are
VERY demanding.
correct me if i'm wrong; IIRC each of Intel's past
major CPU transitions wasn't without problems

nothing like being a guinea pig with
new technology (whether it's h/w or s/w)

all ears. :)

I think these last issues deserve their own thread. 8)
 
chrisv

willbill said:
correct me if i'm wrong; IIRC each of Intel's past
major CPU transitions wasn't without problems

nothing like being a guinea pig with
new technology (whether it's h/w or s/w)

Well, Intel clearly went down a technological dead-end with their
"Netburst" architecture, with it's design goal of "performance via
high clock rate". I think that everyone agrees that their more-recent
designs, from the Pentium M up to the Core 2 Duo, are designed much
more intelligently. I think there are a couple of points to be made beyond
the obvious "they were getting their butts kicked and needed to do
something".

1) Netburst just had to be the result of Intel marketing's demand to
have the highest GHz numbers. I can't believe that they thought that
it was really the optimal engineering solution, especially when power
requirements are factored in.

It seems the world is now over this "faster clock = better" nonsense.

2) In the past, a new Intel CPU architecture was expected to last
three process generations, and, because of that, it seemed that their
new architectures didn't really "hit their stride" until the second,
die-shrunk generation. This resulted in first-gen products that were
hot-running and mediocre in performance. With the Netburst CPUs, the
third generation proved to be a bust as well (which was their wake-up
call).

It seems that Intel has now accepted that, in order to be competitive,
they need to redesign more often, so that their new designs work great
right from the start and so that they are not stuck with old designs
that are past their use-by date.
 
willbill

chrisv said:
willbill wrote:

The CPU seems to be running cool and quiet with the stock
heatsink/fan.

A respectable amount, I suppose, but not bad for a "high end" card.
Nvidia's new 90nm GPUs are known to be pretty reasonable on power
consumption.

i only have a 7600GT and i'm more than a little
surprised at how much heat comes off it

have you put your hand on the outside
of the metal case where your video card is?

is it hot?

if yes, you have a hot spot

On a related note, if you want a 7900GT but don't want to be able to
hear the cooling fan from the next room, plan on getting an
aftermarket cooler. I use a Zalman VF900-Cu, which is not only
virtually silent, but is a MUCH better cooler.


i've looked at them but felt i could do
without one (whether Zalman or other)

i at 1st had a hot spot with it,
but since i keep positive pressure
in the case, i removed the slot plate
immediately below the video board and
that resolved it

1.5 weeks. 8)


that's what i thought <grin>

give it 3+ months. :)

Suspense/shooter, I'd call it. I like it.

I've not tried Oblivion. As far as I know, it's the one game that
even a high-end card can't do well, unless you have two cards in SLI,
and I'm not going there...

I should note that, even on my fairly respectable new rig, FEAR can
get a bit jerky during intense battles. Some of these new games are
VERY demanding.

I think these last issues deserve their own thread. 8)


i see the new thread title. i may well
choose to stay out of that discussion. :O)

btw, pretty good response to some mildly
aggressive questions. :)

have you had enough time yet to figure
out how much benefit you'll see with
a dual core CPU?

do another thread title if you choose

bill
 
willbill

chrisv said:
willbill wrote:

Well, Intel clearly went down a technological dead-end with their
"Netburst" architecture, with its design goal of "performance via
high clock rate". I think that everyone agrees that their more-recent
designs, from the Pentium M up to the Core 2 Duo, are designed much
more intelligently. I think there are a couple of points to be made beyond
the obvious "they were getting their butts kicked and needed to do
something".


i'm not all that sure what Netburst includes

were the Northwood CPUs Netburst?

and do you really think that Intel's Prescott (both
early and late, both of which i presume are Netburst)
is a major CPU transition?

i could be wrong but i don't see Prescott as
a major CPU transition

1) Netburst just had to be the result of Intel marketing's demand to
have the highest GHz numbers. I can't believe that they thought that
it was really the optimal engineering solution, especially when power
requirements are factored in.

It seems the world is now over this "faster clock = better" nonsense.


i rather doubt that

2) In the past, a new Intel CPU architecture was expected to last
three process generations, and, because of that, it seemed that their
new architectures didn't really "hit their stride" until the second,
die-shrunk generation. This resulted in first-gen products that were
hot-running and mediocre in performance. With the Netburst CPUs, the
third generation proved to be a bust as well (which was their wake-up
call).

It seems that Intel has now accepted that, in order to be competitive,
they need to redesign more often,


that's been true for every industry
these past two or three decades

what makes you think that Intel
hasn't seen that?

bill
 
chrisv

willbill said:
i only have a 7600GT and i'm more than a little
surprised at how much heat comes off it

have you put your hand on the outside
of the metal case where your video card is?

No, in fact the cover is not on the case yet. I put my hand on the
backside of the card itself. Yes, during gameplay it did get rather
"hot" with the stock cooler. WAY cooler with the Zalman. The stock
cooling for the memory chips was especially lame - their contact to
the heatsink was through a pad that was a good 1mm thick and looked
like foam rubber. The Zalman system uses individual memory heatsinks
that stick to the chips with a very thin layer of thermal adhesive.
i've looked at them but felt i could do
without one (whether Zalman or other)

i at 1st had a hot spot with it,
but since i keep positive pressure
in the case, i removed the slot plate
immediately below the video board and
that resolved it

My Antec Sonata II case has a funky air-duct thingy that hangs over
the CPU and video card so that heat can be directly ducted out of the
case. I'm not using it now, though, and I don't think I will.
that's what i thought <grin>

give it 3+ months. :)

Piece of cake. 8)
(snip)

have you had enough time yet to figure
out how much benefit you'll see with
a dual core CPU?

No, not really. It just seems the way of the future, with supposed
benefits in "system responsiveness". No app that I use needs two
CPUs... Future games could, but I'm not sure what the point would be
since 99% of us are GPU-limited anyway...
 
chrisv

willbill said:
i'm not all that sure what Netburst includes
http://www.sandpile.org/impl/p4.htm

were the Northwood CPUs Netburst?

Yes, the second-gen. The only good one, IMO.
and do you really think that Intel's Prescott (both
early and late, both of which i presume are Netburst)
is a major CPU transition?

i could be wrong but i don't see Prescott as
a major CPU transition

Major or minor, my point remains the same.
that's been true for every industry
these past two or three decades

what makes you think that Intel
hasn't seen that?

I don't understand the question. My point was that they do see it
now, but didn't before.
 
willbill

chrisv said:


wow, nice summary of the entire P-4 line,
including Celeron and Xeon. :)

so Intel's Pentium 4 is what you are
calling "Netburst"?

btw, near the top is: Family/Generation
80786, 7th Generation, MMX, SSE, SSE2, SSE3 (0.09 µm)
^^^^^^^^^^^^^^^^^

it is these "generation" changes that have
a greater chance of having changeover problems;
but i don't know which of them (i.e. Intel's)
have had more problems than the others

i'd certainly call the move from P-III
to P-4 a major CPU change. meaning one
that has a bigger chance of problems for
early adopters

Prescott was a minor change. they run ok.
hot, but otherwise ok. i also note
(from the list) that Prescott was 90nm
and the current 65nm CPUs are Presler

what i've never seen is an after the
fact summary of just how rough the
change over actually was, for the
various generation changes

i'd also call the move (from 486)
to Pentium a major CPU change

i'm less certain about which of the
other changes might qualify as
major CPU changes, nor which of
them was initially more problematic
(for early users)

AMD has had its own list of
major CPU changes. :)


Yes, the second-gen. The only good one, IMO.


i also liked the Northwoods.
fwiw, i bought a Northwood. :)

i built two machines at that time:
a DFI mobo with the Northwood (and
Intel's high end desktop chipset that
permitted ECC), and my old Tyan S2875
(which i'm using to type this) with an
AMD Opty 142 and AMD chipset. my sister
expressed an interest in a computer
and i gave her the choice of either
of them. of course, she went with
the "Intel" name. :)

btw, DFI makes nice boards!

anyhow, i got what i want: a server
mobo where i've got high confidence
that the ECC memory on it really works
(i.e. corrects/reports any memory errors)

i don't have that confidence with
desktop machines. something that
slowly dawned on me after i got done
building the two machines

the only other option at that time
was an even more expensive Intel Xeon,
which i did NOT like the looks of
Major or minor, my point remains the same.


my vote goes for minor. :)

just for the record, kindly state
what you think the problem(s) are
with Prescott/Presler

afaik, the main problem with
Prescott/Presler is the excessive
heat generated, which caused Intel to
fall behind in the performance race

afaik, Prescott/Presler did *not* cause
any significant changeover problems
I don't understand the question. My point was that they do see it
now, but didn't before.


for sure Intel sees it *now* :)

here's to real competition coz
we all come out ahead from it. :)

fwiw, i hope you don't run into any
problems with your new dual core
Intel Conroe CPU

but you don't know that yet

and if you do run into problems down
the line, you also don't know how much
of a nosebleed it will be

otoh, looks like you are off to a good start. :)

bill
 
chrisv

willbill said:
wow, nice summary of the entire P-4 line,
including Celeron and Xeon. :)

Sandpile is a great reference for all things x86.
so Intel's Pentium 4 is what you are
calling "Netburst"?
Yes.

just for the record, kindly state
what you think the problem(s) are
with Prescott/Presler

Bottom line, they didn't perform well. Whatever the reasons (e.g.
heat), they didn't perform well.
afaik, the main problem with
Prescott/Presler is the excessive
heat generated, which caused Intel to
fall behind in the performance race

afaik, Prescott/Presler did *not* cause
any significant changeover problems

Depends what you consider a "problem". Sure, they "work", but the
performance was lame, and the design fell short of its goals. Intel
did not add all those pipeline stages so that they could go from
3.4GHz to 3.6GHz, with a newer manufacturing process, even!

Only Intel's size and market muscle saved them from getting laughed
out of the market during the Prescott years.
fwiw, i hope you don't run into any
problems with your new dual core
Intel Conroe CPU

but you don't know that yet

and if you do run into problems down
the line, you also don't know how much
of a nosebleed it will be

otoh, looks like you are off to a good start. :)

I'm not worried at all. The 975 chipset/motherboard platform is
mature (which one could argue is more important than the "maturity" of
the CPU), and there's no reason to believe the new CPU is "defective"
in any way.
 
nobody

Only Intel's size and market muscle saved them from getting laughed
out of the market during the Prescott years.

No, IMO it was the sheer size of the market itself that saved Intel's
a$$. AMD, producing flat out, could fill only as much of the market
as their market share allowed, and there were always shortages of at least
some SKU here and there. There was no way for AMD to add more
capacity. After all, somebody had to make all the Celerons for $299
Dell boxes ;-))))))))))))))))

NNN
 
willbill

chrisv said:
willbill wrote:

No, in fact the cover is not on the case yet. I put my hand on the
backside of the card itself. Yes, during gameplay it did get rather
"hot" with the stock cooler. WAY cooler with the Zalman. The stock
cooling for the memory chips was especially lame - their contact to
the heatsink was through a pad that was a good 1mm thick and looked
like foam rubber. The Zalman system uses individual memory heatsinks
that stick to the chips with a very thin layer of thermal adhesive.


a case hot spot has little to do with
how good/bad the GPU/memory cooling is

hot spots are driven by how good/bad
your air flow is within the case

My Antec Sonata II case has a funky air-duct thingy that hangs over
the CPU and video card so that heat can be directly ducted out of the
case. I'm not using it now, though, and I don't think I will.


given how much heat comes off your 7900GT,
my hunch is that once you put the cover
on the case, that you'll have a hot spot
where the video card is

assuming i'm right, do me a favor and post
about what you do to solve it

for me, in addition to removing a rear slot
plate, i also bought a better, speed controlled,
120mm fan to push more air into the case

bill
 
