What should I look for in a Core 2 Duo for a significant speed jump over a 2.4 GHz P4?


Dave C.

I've got a P4 2.4 gig machine now running XP, largely want to improve
rendering speed for video editing, how much processor do I need in the
Core 2 Duo realm to realize a healthy jump in speed? Not too proud to
go used, can't afford the latest greatest, but would like to see a
"significant" jump in speed.

What kind of numbers should I be looking for, and any suggestions on
pieces to look for or avoid?

Thanks for all input.

OK, for any CPU-intensive task, you want the fastest clock speed (GHz)
you can afford, paired with the most cache memory you can afford. Once
you figure that out, if you have extra money then you can add more
cores. Examples:
1) Core 2 Duo 3.0GHz 6MB cache (if that beast exists)
2) Core 2 Quad 2.4GHz 8MB cache (if that beast exists)
Number one would be significantly faster due to the higher clock speed.
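
If you want a rough ballpark of the jump from your old P4, here's a quick
back-of-the-envelope sketch in Python. The work-per-clock factor for Core 2
over a Netburst-era P4 is just an illustrative assumption, not a benchmark:

# Very rough single-threaded estimate: speed ~ clock * work-per-clock (IPC).
# The IPC factor below is an illustrative guess, not a measurement, so treat
# the output as ballpark only.

P4_CLOCK = 2.4          # GHz, the CPU being replaced
CORE2_IPC_FACTOR = 1.8  # assumed per-clock advantage of Core 2 over a P4

def estimated_speedup(core2_clock_ghz, ipc_factor=CORE2_IPC_FACTOR):
    return (core2_clock_ghz * ipc_factor) / P4_CLOCK

for clock in (2.4, 2.66, 3.0):
    print(f"Core 2 @ {clock} GHz: ~{estimated_speedup(clock):.1f}x the old P4 (single thread)")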

But you asked about video editing. That is one specific application
where video card (speed), RAM (quantity) and hard drive (speed) all
play a part.

For video editing, look for:
64-bit OS with 6-8GB of RAM
7200RPM hard drive with 32MB of cache
A recent mid-range video card like an HD 4770 or similar, with 1GB of
dedicated video RAM

Your system might have other glaring weaknesses that will slow down
video rendering more than a single core 2.4GHz CPU will. For example,
if you have 2GB of RAM and are using an "integrated" (on the motherboard)
graphics adapter, then upgrading to a faster CPU will offer no
performance improvement at all. -Dave
 

Dave C.

Really? I wasn't aware the video card per se had much to do with speed
of editing. By rendering, I don't mean it in the video game sense - I
mean processing video files with something like VirtualDub or Vegas
Movie Studio: applying effects, converting to DVD, etc.

Depending on what software you are using, many video editing
applications will use the GPU about as much as they use the CPU. The
reason most people don't realize this is, the typical GPU is much more
powerful than a CPU. So while the GPU is (relatively speaking)
coasting, the CPU gets HAMMERED.
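
If you want to see for yourself which one your render is hammering, run
something like this rough monitor in one window while a render runs in
another. It's a minimal sketch: it assumes Python with the psutil package
installed, and the GPU reading assumes an NVIDIA card with the nvidia-smi
tool on the PATH (it just prints "n/a" otherwise):

# Quick-and-dirty load monitor; stop it with Ctrl+C.
# Requires: pip install psutil. The GPU query assumes an NVIDIA card with
# nvidia-smi available; on other hardware it just reports "n/a".
import subprocess
import psutil

def gpu_utilization_percent():
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True)
        return int(out.stdout.strip().splitlines()[0])
    except (OSError, subprocess.CalledProcessError, ValueError, IndexError):
        return None

while True:
    cpu = psutil.cpu_percent(interval=1.0)   # averaged over one second
    gpu = gpu_utilization_percent()
    print(f"CPU: {cpu:5.1f}%   GPU: {gpu if gpu is not None else 'n/a'}")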

If you are doing any kind of video editing, you need (at minimum) a
pretty powerful mid-range graphics card with lots of dedicated video
RAM.

On a side note, the days of the CPU are numbered. GPUs are getting so
powerful, they will soon take over all the CPU functions. -Dave
 

muzician21

I've got a P4 2.4 gig machine now running XP, largely want to improve
rendering speed for video editing, how much processor do I need in the
Core 2 Duo realm to realize a healthy jump in speed? Not too proud to
go used, can't afford the latest greatest, but would like to see a
"significant" jump in speed.

What kind of numbers should I be looking for, and any suggestions on
pieces to look for or avoid?

Thanks for all input.
 

muzician21

But you asked about video editing.  That is one specific application
where video card (speed), RAM (quantity) and hard drive (speed) all
play a part.  


Really? I wasn't aware the video card per se had much to do with speed
of editing. By rendering, I don't mean it in the video game sense - I
mean processing video files with something like VirtualDub or Vegas
Movie Studio: applying effects, converting to DVD, etc.
 

Dave C.

* SteveH:
How is getting a larger PSU a waste of energy?

How is a CPU a PSU?

Benjamin

BTW: an oversized PSU is in fact a waste of energy, as most switching
PSUs work less efficiently when loaded way below their maximum
rating.

If your PSU is junk quality, that is correct. -Dave
 

Dave C.

This really isn't true.

GPUs are very good at performing a very small number of tasks, and for
those specific tasks, no modern general purpose CPU can compete.

General purpose CPUs' days may be numbered, but it will take more than
GPUs to replace them.

What you fail to realize is that it would be trivial to design a single
chip to perform both functions. We already have quad-core chips being
the de-facto standard. How long do you think it would take AMD/ATI
(for example) to integrate a CPU into a GPU fab?

It's coming. Now you know why AMD bought ATI. They needed to. -Dave
 

Benjamin Gawert

* muzician21:
I've got a P4 2.4 gig machine now running XP, largely want to improve
rendering speed for video editing, how much processor do I need in the
Core 2 Duo realm to realize a healthy jump in speed?

Well, except for the Intel Atom and similar low-power CPUs, you will
probably have a hard time finding a somewhat newer CPU that doesn't
give you a very noticeable performance improvement.
Not too proud to
go used, can't afford the latest greatest, but would like to see a
"significant" jump in speed.

Well, a P4 2.4GHz sounds like Socket 478 to me, and honestly, everything
you can get for this socket is old and slow. However, if your mobo is
already Socket LGA775 then it depends on the mobo. Older LGA775 boards
are limited to the Pentium D (basically a dual-core Pentium 4), which will
already give you a noticeable performance boost (get a Pentium D 9xx, not
the old 8xx series, though!).

However, if your board supports it then the Core 2 Duo/Quad is the way
to go.

Benjamin
 

Benjamin Gawert

* Dave C.:
OK, for any CPU-intensive task, you want the fastest clock speed (GHz)
you can afford, paired with the most cache memory you can afford.

Well, usually good advice is to buy CPU power appropriate to the
task. Going above that is a waste of money and energy.
Once
you figure that out, if you have extra money then you can add more
cores. Examples:
1) Core 2 Duo 3.0GHz 6MB cache (if that beast exists)
2) Core 2 Quad 2.4GHz 8MB cache (if that beast exists)
Number one would be significantly faster due to the higher clock speed.

This is only true for single-threaded applications. With multithreaded
programs, #2 runs circles around #1. And figuring out how many cores
the software can use should not be secondary on the list; it should
be the first thing that is checked.
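
To put rough numbers on that, here's a small Amdahl's-law sketch in Python;
the parallel fractions below are illustrative assumptions, since the real
value depends entirely on the software:

# Amdahl's-law sketch of the two example chips above. Relative throughput is
# modeled as clock speed times the core-scaling factor; the parallel fractions
# are illustrative assumptions only.

def throughput(clock_ghz, cores, parallel_fraction):
    serial = 1.0 - parallel_fraction
    return clock_ghz / (serial + parallel_fraction / cores)

for p in (0.0, 0.5, 0.9):                # share of the job that can use extra cores
    duo = throughput(3.0, 2, p)          # example 1: Core 2 Duo 3.0GHz
    quad = throughput(2.4, 4, p)         # example 2: Core 2 Quad 2.4GHz
    winner = "duo" if duo > quad else "quad"
    print(f"parallel={p:.0%}: duo={duo:.2f}, quad={quad:.2f} -> {winner} wins")
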
But you asked about video editing. That is one specific application
where video card (speed), RAM (quantity) and hard drive (speed) all
play a part.

This is not quite right. First, if the video editing software is not a
64-bit application, then buying more than 4GB of RAM is a waste of money
for that task. Second, only very few editing programs actually support
GPGPU functionality, and those that do make only rudimentary use of it.
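
By the way, if you are not sure whether your editor is even a 64-bit build,
you can read it straight out of the .exe's PE header. A minimal Python
sketch (the path at the bottom is only an example, point it at your own
installation):

# Reads the "machine" field of a Windows executable's PE header to tell
# whether it is a 32-bit or 64-bit build. The path below is just an example.
import struct

MACHINE_NAMES = {0x014C: "32-bit (x86)", 0x8664: "64-bit (x64)"}

def exe_bitness(path):
    with open(path, "rb") as f:
        if f.read(2) != b"MZ":
            raise ValueError("not a Windows executable")
        f.seek(0x3C)                          # offset of the PE header pointer
        (pe_offset,) = struct.unpack("<I", f.read(4))
        f.seek(pe_offset)
        if f.read(4) != b"PE\x00\x00":
            raise ValueError("PE signature not found")
        (machine,) = struct.unpack("<H", f.read(2))
        return MACHINE_NAMES.get(machine, f"unknown (0x{machine:04X})")

print(exe_bitness(r"C:\Program Files\VirtualDub\VirtualDub.exe"))  # example path
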
For video editing, look for:
64-bit OS with 6-8GB of RAM

While I agree that when buying a new OS it makes sense to go for the
64-bit version, there is no reason why the OP needs to buy a new OS just
because he wants a faster system.
7200RPM hard drive with 32MB of cache
A recent mid-range video card like an HD 4770 or similar, with 1GB of
dedicated video RAM

As most video editing programs that support GPGPU still rely on NVIDIA's
CUDA, recommending a midrange ATI card isn't a good idea. Any low-end
card (Radeon 4200/4300 series, GeForce GT 120, etc.) does just as well as
a Radeon HD 4770, is cheaper, and consumes less power.
Your system might have other glaring weaknesses that will slow down
video rendering more than a single core 2.4GHz CPU will. For example,
if you have 2GB of RAM and are using an "integrated" (on the motherboard)
graphics adapter, then upgrading to a faster CPU will offer no
performance improvement at all.

This is complete nonsense. Every integrated gfx solution benefits from a
faster processor, as it offloads many things onto the CPU. In fact, for
any video editing application that does not have GPGPU support, an
integrated graphics solution does just as well as any separate graphics card.

Following your recommendations, the OP would spend a shitload of money
on something that brings very little or even no return in value.

Benjamin
 

SteveH

Benjamin said:
* Dave C.:


Well, usually good advice is to buy CPU power appropriate to the
task. Going above that is a waste of money and energy.
How is getting a larger PSU a waste of energy?
 

Benjamin Gawert

* SteveH:
How is getting a larger PSU a waste of energy?

How is a CPU a PSU?

Benjamin

BTW: an oversized PSU is in fact a waste of energy, as most switching PSUs
work less efficiently when loaded way below their maximum rating.
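
As a quick illustration, with made-up but typical-shaped efficiency figures:

# Why an oversized PSU can cost power at the wall: switching supplies tend to
# be least efficient at a small fraction of their rated load. The efficiency
# numbers below are illustrative assumptions, not measurements.

DC_LOAD_WATTS = 120  # what the PC actually pulls from the supply

def wall_draw(dc_watts, efficiency):
    return dc_watts / efficiency

scenarios = [
    ("400W unit at ~30% load",  0.84),
    ("1000W unit at ~12% load", 0.72),
]
for label, efficiency in scenarios:
    print(f"{label}: ~{wall_draw(DC_LOAD_WATTS, efficiency):.0f}W from the outlet")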
 

Benjamin Gawert

* Steve:
I don't even see it as a waste of power, with modern CPUs only
running hard enough to meet computation demands. Properly set up
CPUs throttle back when idle.

This would only be right if you completely ignored the quite noticeable
differences in power consumption between different CPUs, even when they
throttle back.

Benjamin
 

DevilsPGD

In message <[email protected]> "Dave C." wrote:
On a side note, the days of the CPU are numbered. GPUs are getting so
powerful, they will soon take over all the CPU functions

This really isn't true.

GPUs are very good at performing a very small number of tasks, and for
those specific tasks, no modern general purpose CPU can compete.

General purpose CPUs' days may be numbered, but it will take more than
GPUs to replace them.
 

Dave C.

CPU and GPU on the same die isn't particularly difficult to
accomplish, but it's not a GPU taking over for a CPU, it's a CPU+GPU
on one die.

It depends on how you look at it. If 90% or more of a single chip is
dedicated to graphics processing and the rest is performing CPU
functions while still assisting in graphics processing, do you call it
a CPU?

Within 10 years or so (probably), the CPU will be (at most) a trivial
specification buried in the microscopic print of the spec list that
describes the GPU you buy. Kind of like the Ethernet adapter built into
a mainboard. Is it important? Sure. Is it the primary thing you look for
when buying a motherboard? Well, some people do look at it, but most just
take it for granted. That's what the CPU will be soon: just a (yawn)
small part of a GPU. -Dave
 

DevilsPGD

In message <[email protected]> "Dave C." wrote:
What you fail to realize is that it would be trivial to design a single
chip to perform both functions. We already have quad-core chips being
the de-facto standard. How long do you think it would take AMD/ATI
(for example) to integrate a CPU into a GPU fab?

It's coming. Now you know why AMD bought ATI. They needed to. -Dave

CPU and GPU on the same die isn't particularly difficult to accomplish,
but it's not a GPU taking over for a CPU, it's a CPU+GPU on one die.
 

Dave C.

Maybe, maybe not. In the 90s, Intel said computers would, by the turn
of the century, use bio-chips, and touted all kinds of fantastic
speeds, but all they did was run the 90s technology down a rabbit hole.

What the experts are expecting is nothing but a merging of existing
technology. This is not vaporware we are talking about, just a logical
course for current technology to develop.

. . . so, any 'plans' or guesswork from pundits about where Intel
will go with technology is a 'yawn' until something is built and/or
sold, IMHO.

--g

Who said anything about Intel? I half expect Intel to go the way of
Cyrix, unless they buy NVIDIA. Note I have nothing against Intel; I
just don't see how they are going to become a GPU developer able to
compete with the likes of NVIDIA and AMD/ATI in time for the
marketplace that is developing.

Intel makes fantastic quality CPUs. What they need to make is
fantastic quality GPUs. Or they can buy NVIDIA. Or they can cease
to exist. -Dave
 

DevilsPGD

In message <[email protected]> "Dave C." wrote:
It depends on how you look at it. If 90% or more of a single chip is
dedicated to graphics processing and the rest is performing CPU
functions while still assisting in graphics processing, do you call it
a CPU?

If a GPU is performing primary computing functions, taking over for what
we currently call CPUs, do you still call it a GPU?
 
