Intel 'Larrabee' GPU: 16 Cores - 2GHz - 150W - Nvidia Partnership(?)


AirRaid

http://www.beyond3d.com/content/news/242

Larrabee: 16 Cores, 2GHz, 150W, and more...
Friday 01st June 2007, 06:08:45 PM, written by Arun


It is amazing how much information is out there in the wild when you
know where to look. TG Daily has just published an article partially
based on a presentation they were tipped off about, which was uploaded
on the 26th of April. It reveals a substantial amount of new
information, which we will not focus on analysing right now, so we do
encourage you to read it for yourself.

Page 1 discusses the possibility that Larrabee is a joint effort
between NVIDIA and Intel, which we find unlikely, and which is possibly
just a misinterpretation of the recently announced patent licensing
agreement between the two companies. Page 2 is much more interesting,
however, as they link to the presentation above and also uncover the
hidden Larrabee PCB diagram on slide 16.

We would tend not to agree with most of the analysis and speculation
provided by TG Daily, but it's still worth a good read along with the
presentation, which we are very glad they uncovered. Especially
interesting are slides 16, 17, 19, 24 and 31. That last one includes
some very interesting and previously unknown information on Intel's
upcoming Gesher CPU architecture (aka Sandy Bridge), which is aimed at
the 32nm node in the 2010 timeframe. Larrabee, on the other hand, will
presumably be manufactured on Intel's 45nm process but sport a larger
die size.



____________


http://www.tgdaily.com/content/view/32282/137/

Intel set to announce graphics partnership with Nvidia?

By Wolfgang Gruener, Darren Polkowski
Friday, June 01, 2007 01:26


Chicago (IL) - Intel may soon be announcing a close relationship with
Nvidia, which apparently will be contributing to the company's
Larrabee project, TG Daily has learned. Larrabee is expected to roll
out in 2009 and debut as a floating point accelerator product with a
performance of more than 1 TFlops as well as a high-end graphics card
with dual-graphics capabilities.

Rumors about Intel's Larrabee processor have been floating around for
more than a year. Since the product's official announcement at this
year's spring IDF, and with accelerating interest in floating point
accelerators, the topic and the rumors surrounding it have been
gaining traction every day.

Industry sources told TG Daily that Intel is preparing a "big"
announcement involving technologies that will be key to developing
Larrabee. And at least some of those technologies may actually be
coming from Nvidia, we hear: our sources described Larrabee as a
"joint effort" between the two companies, one that may expand over
time. A scenario in which Intel works with Nvidia to develop Intel-
tailored discrete graphics solutions is speculation, but is considered
a likely direction for the two companies down the road. Clearly, Intel
and Nvidia are thinking well beyond the cross-licensing agreements
that are in place today.

It is unclear when the collaboration will be announced; however,
details could surface as early as June 26, when the International
Supercomputing Conference 2007 will open its doors in Dresden,
Germany.

Asked about a possible announcement with Intel, Nvidia spokesperson
Ken Brown provided us with a brief statement: "We enjoy a good working
relationship with Intel and have agreements and ongoing engineering
activities as a result. This said, we cannot comment further about
items that are covered by confidentiality agreements between Intel and
Nvidia."

Intel replied to our inquiry by saying that the company does "not
comment on rumors and speculation."



The AMD-ATI and Intel-Nvidia thingy

In the light of the AMD-ATI merger, it is only to be expected that the
relationship between Intel and Nvidia is examined on an ongoing basis.
So, what does a closer relationship between Intel and Nvidia mean?

The combination with ATI enabled AMD to grow into a different class of
company. It evolved from being CPU-focused into a platform company
that not only can match some key technologies of Intel, but at least
for now has an edge in areas such as visualization capabilities. At a
recent press briefing, the company showed off some of its ideas, and
it was clear to us that general purpose GPUs in particular will pave
the way to a whole new world of enterprise and desktop computing.

Nvidia is taking a similar approach with its CUDA software interface,
which allows developers to take advantage of the (general purpose)
floating point horsepower of Geforce 8 graphics processors - more than
500 GFlops per chip. Intel's Larrabee processor is also aimed at
applications that benefit from floating point acceleration - such as
physics, enhanced AI and ray tracing.
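For a flavor of what that looks like in practice, here is a minimal
CUDA sketch (our example; the kernel, names and sizes are ours, not
from the article). Each GPU thread computes one element of
y = a*x + y, and the hardware spreads those threads across the chip's
stream processors:

```cuda
#include <cuda_runtime.h>

// Each thread handles one array element: y[i] = a * x[i] + y[i].
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main(void)
{
    const int n = 1 << 20;                 // ~1M elements
    float *x, *y;
    cudaMalloc(&x, n * sizeof(float));     // device (GPU) buffers
    cudaMalloc(&y, n * sizeof(float));
    // ... fill x and y via cudaMemcpy (omitted for brevity) ...
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);  // 4096 blocks of 256 threads
    cudaDeviceSynchronize();               // wait for the GPU to finish
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

The point of the model is that the per-element function is trivially
serial; the parallelism comes entirely from launching one lightweight
thread per element.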

While it has been speculated that Intel may be creating Larrabee
around an IA CPU architecture, we were told there may be more GPU
elements in this processor than we had previously thought. A Larrabee
card with a (general purpose) graphics processing unit would support
CPUs in applications that at least partially benefit from massively
parallel processing (as opposed to traditional sequential processing);
in gaming, for example, the Larrabee processor could be used for
physics processing.

An imminent collaboration announcement between Intel and Nvidia -
which reminds us of a recent Digitimes story claiming Nvidia was
trading technologies with Intel - of course raises the question of how
close the relationship between the two companies might be. It also
raises the question, once again, of whether Intel may actually be
interested in buying Nvidia - which could make a whole lot of sense
for Intel, but appears rather unlikely at this time. Nvidia could cost
Intel more than $15 billion, given the firm's current market cap of
$12.6 billion, and the talk in Silicon Valley indicates that Nvidia
co-founder and CEO Jen-Hsun Huang isn't really interested in selling
the company.

But a deal with Intel, involving the licensing of technologies or even
the supply of GPUs, could have a huge impact on Nvidia's bottom line
and catapult the company into a new phase of growth. However, a closer
collaboration could be important for Intel as well: AMD's acquisition
of ATI was not a measure to raise the stakes in the graphics market or
to battle Nvidia; it was a move to compete in the future CPU market -
with Intel. Having Nvidia on board provides Intel with a graphics
advantage, at least from today's point of view, and could allow the
company to more easily access advanced graphics technology down the
road.



What we know about Larrabee

Intel has recently shared more information with the public about its
intentions in the realm of general purpose GPUs (GPGPU). In a
presentation from March 7 of this year, Intel discussed its
data-parallel programming implementation called Ct. The presentation
discusses the use of flat vectors and very long instruction words
(VLIW, as utilized in ATI/AMD's R600). In essence, the Ct application
programming interface (API) bridges the gap by working with existing
legacy APIs and libraries and co-existing with current multiprocessing
APIs (Pthreads and OpenMP), while providing "extended functionality to
address irregular algorithms."
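Ct's actual syntax is not reproduced in the presentation, so the
following is only a rough sketch of the "flat vector" data-parallel
style being described; every type and function name here is ours, not
Intel's:

```c
/* Hypothetical illustration, NOT Ct's real API: in a flat-vector model
 * the programmer writes one whole-vector operation and the runtime
 * decides how to split it across cores and SIMD lanes, coexisting with
 * ordinary C code. */
typedef struct { float *data; int n; } Vec;   /* flat (non-nested) vector */

void vec_scale_add(Vec dst, Vec a, Vec b, float s)
{
    /* Logically a single operation over the whole vector; a Ct-like
     * runtime would parallelize this loop implicitly, rather than the
     * programmer spelling it out with Pthreads or OpenMP. */
    for (int i = 0; i < dst.n; i++)
        dst.data[i] = s * a.data[i] + b.data[i];
}
```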

http://www.tgdaily.com/images/stories/article_images/intel_roadmap/larrabee_board.gif


There are several things to point out from the image above, which is a
block diagram of a board utilizing Larrabee. First is the PCIe 2.0
interface with the system. Intel is currently testing PCIe 2.0 as part
of the Bearlake-X (Beachwood) chipset (commercial name: X38), which
could be coming out as part of the Wolfdale 45 nm processor rollout
late this year or early in 2008. Larrabee won't arrive until 2009, but
our sources indicate that if you buy an X38-based board, you will be
able to run a Larrabee board in such a system.

In the upper right hand corner, the power connections indicate 150
watts and 75 watts. These correspond to the 8-pin and 6-pin power
connections we have seen on the recent ATI HD 2900 XT. Intel expects
the power consumption of such a board to be higher than 150 watts.
There are video outputs to the far left, as well as video in. Larrabee
appears to have VIVO functionality as well as HDMI output, based on
the audio-in block seen at the top left.
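Those connector ratings add up neatly. As a back-of-the-envelope check
(the 75 W slot allotment is the PCIe specification's figure, not
something the slide states):

$$
150\,\mathrm{W}\ (\text{8-pin}) + 75\,\mathrm{W}\ (\text{6-pin}) + 75\,\mathrm{W}\ (\text{PCIe slot}) = 300\,\mathrm{W}\ \text{worst case,}
$$

which squares with Intel expecting the board to draw more than 150
watts.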
A set of BSI connections sits next to the audio-in connection. We are
not positive what the abbreviation stands for, but we speculate that
these are connections for running such cards in parallel, like ATI's
CrossFire or Nvidia's SLI technologies. Finally, there is the size of
the processor (package), which is over twice the size of current GPUs
- ATI's R600 is roughly 21 mm by 20 mm (420 mm²). Intel describes the
chip as a "discrete high end GPU" on a general purpose platform, using
at least 16 cores and providing a "fully programmable performance of 1
TFlops."

http://www.tgdaily.com/images/stories/article_images/intel_roadmap/larrabee1.gif



Moving on, we can see that Larrabee will be based on a multi-SIMD
configuration. From other discussions about the chip across the net,
it would seem that each core couples a scalar pipeline with a 16-wide
vector unit driven by Vec16 instructions. That would mean that, for
graphics applications, it could work on 2x2 pixel quads - four of them
at a time across the 16 lanes. These "in-order" SIMD cores are said to
support 16-bit floating point (FP16) precision as outlined by IEEE
754. Also of note is the use of a ring memory architecture. In a
presentation called "tera Tera Tera", Intel Chief Architect Ed Davis
outlines that the internal bandwidth of the bus will be 256 B/cycle,
while the external memory will have a bandwidth of 128 GB/s. This is
extremely fast, and achievable given the 1.7-2.5 GHz projections for
the core frequency. Attached to each core will be some form of
texturing unit, as well as a dynamically partitioned cache and a ring
stop on the memory ring.
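Those figures hang together arithmetically if each core retires one
16-wide fused multiply-add per cycle - our assumption, not something
the slides spell out:

$$
16\ \text{cores} \times 16\ \text{lanes} \times 2\ \tfrac{\text{flops}}{\text{lane}\cdot\text{cycle}} \times 2\,\mathrm{GHz} \approx 1\,\mathrm{TFlops}.
$$

At the same 2 GHz, a 256 B/cycle ring moves 512 GB/s internally -
comfortably ahead of the 128 GB/s quoted for external memory.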

In the final image below, you will notice that each device will have
17 GB/s of bandwidth per link. These links tie into a next-generation
Southbridge labeled "ICH-n", as the exact version is yet to be
determined. From discussions with those in the industry, it would
appear that the external memory might not be soldered onto the board
but might in fact come as plug-in modules. The slide denotes DDR3 and
GDDR, as well as FBD, or fully buffered DIMMs. It will be interesting
to see in what form this is actually implemented, but that is the fun
of speculation.

http://www.tgdaily.com/images/stories/article_images/intel_roadmap/larrabee3.jpg






The current layout of project Larrabee is a deviation from previous
Intel roadmap targets. In a 2005 whitepaper entitled "Platform 2015:
Intel Processor and Platform Evolution for the Next Decade", the
company outlined a series of XScale processors based on Explicitly
Parallel Instruction Computing, or EPIC. Intel has deviated slightly
from that roadmap since the release of the paper: it sold XScale to
Marvell last year, which makes it a rather unlikely foundation for
Larrabee - and could have opened up the discussion for other
processing units.

What is interesting is that rumors that Intel was looking for talent
for an upcoming "project" involving graphics started circulating more
than a year and a half ago. In August of last year, you could apply
for positions on CareerBuilder and Intel's own website, and a generic
job description still exists on Intel's site.



Concluding note

While this is an interesting approach to graphics, physics, and
general purpose processing, the proof will be in the final product, as
well as in its acceptance among independent software vendors (ISVs).
In our opinion, the concept of the GPGPU is the most significant
development in the computing environment in at least 15 years. The
topic has been gaining ground lately, and this new implementation from
Intel could take things to a whole new level. As for graphics
performance, only time will tell.

It will be interesting to see which role Nvidia will play in Intel's
strategy. Keep a close eye on this one.
 

AirRaid

Will Intel be teaming up with Nvidia over Larrabee?


Intel and Nvidia: grand graphics alliance?
More details of the mysterious Larrabee project emerge
Dan Grabham
05 Jun 2007 17:03

Could Intel and Nvidia be preparing to unveil a grand PC graphics
technology alliance? That's the latest rumour following the release of
further details of Intel's in-house graphics project, known as
Larrabee. But is it true?

As we reported back in April, Larrabee is an all-new processor design
from Intel that is due out in 2009 and majors on floating point power.
Larrabee processors are expected to come in several forms including a
dedicated graphics processing chip.

However, according to website TG Daily, Intel will actually be teaming
up with Californian graphics company Nvidia to produce the new chip.
Engineering work for the new processor will be shared between the two
companies.

Intel's engineers will design the floating point units used for shader
calculations while Nvidia will supply the circuitry required to
rasterise and output graphics.

Intel and Nvidia already have an existing patent cross-licensing
agreement, so closer co-operation between the two companies is
certainly plausible - especially in the context of the recent
acquisition of Canadian graphics outfit ATI by Intel's main rival AMD.

In any case, TGDaily has certainly stumbled upon some interesting new
facts regarding Larrabee. In a document recently released but not
widely publicised, Intel confirms the Larrabee project will initially
give birth to two chips.

The first is the aforementioned graphics chip composed of 16 Larrabee
cores and a slew of dedicated 3D output hardware and video memory.
This flavour of the Larrabee processor will reside on an add-in board
just like existing graphics cards. In other words, it will go head to
head against high performance video boards from Nvidia and ATI.

The second design is a so-called general purpose chip that will be
capable of running the full x86 instruction set. However, with a total
of 24 cores, it will deliver truly mind-bending floating point power -
as much as one trillion floating point operations per second.

Incredibly, that's approximately 40 times the floating point power of
an existing Intel Core 2 dual-core processor. This version of Larrabee
probably isn't suited to all-round desktop processing. Intel has
another, completely separate road map of Core 2-based chips to cater
for that market.
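The 40x figure roughly checks out against peak numbers of the day.
Assuming a 3 GHz Core 2 Duo sustaining four single-precision flops per
core per cycle through SSE (our arithmetic, not the article's):

$$
2\ \text{cores} \times 4\ \tfrac{\text{flops}}{\text{core}\cdot\text{cycle}} \times 3\,\mathrm{GHz} = 24\,\mathrm{GFlops},
\qquad
\frac{1\,\mathrm{TFlops}}{24\,\mathrm{GFlops}} \approx 40.
$$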

But for specialist applications such as medical calculations, protein
folding à la Folding at Home, and perhaps even high-end gaming with
sophisticated physics simulations, Larrabee will be very interesting
indeed.

http://tinyurl.com/2hxmct
 

AirRaid

another interesting, though speculative Larrabee article:
___________________

Intel's Larrabee: A killer blow for AMD

Could Larrabee mean another tortuous time for AMD?

tech.co.uk staff
Thursday 07 June 2007 16:19

It's a silly sounding name, Larrabee. But it must fill AMD's heart
with terror. It's the codename, of course, for a whole family of new
processors being cooked up by Intel. And it promises to add graphics
insult to AMD's existing CPU injuries.

Frankly, things are bad enough for AMD already. Since launch last
summer, the Core 2 processor has been pistol whipping AMD's Athlon
CPUs into burger meat. Meanwhile, AMD's upcoming quad-core competitor,
broadly known as Barcelona, looks like a pretty unambitious effort. It
will certainly have to be some chip to take on Intel's upcoming 45nm
die shrink of the Core 2 chip. Factor in recent reports of a launch
delay for Barcelona and I'm beginning to get the fear about AMD's
ability to compete.

Then there's the spectacular fashion in which the wheels have come off
AMD's recently acquired ATI graphics subsidiary. ATI's all-new
flagship DX10 graphics board, the Radeon HD 2900 XT, was very late,
extremely underwhelming on arrival and possibly a bit broken. The
midrange variants of the Radeon HD range don't look much healthier:
they've been sent back to the fab for a respin. Not a good sign.

In that context, the emergence of the Larrabee project from Intel is
just further proof of how far ahead of the game Intel appears to be at
the moment. For the uninitiated, Larrabee is an all-new multi-core
processor design that majors on floating point power.
The full feature set hasn't been revealed as yet, but an official
Intel document that turned up on a university website recently reveals
several fascinating new details.

Try these specs for size. Larrabee will be available in configurations
ranging from 16 to 24 cores, with clock speeds as high as 2GHz and raw
performance in the 1TFlop range. The latter figure is approximately 40
times more than an existing Intel Core 2 Duo chip. Yup, you read it
right. 40 times. And the first Larrabee chips are pencilled in for as
soon as 2009.

Of course, floating point power is just one part of the overall PC
processing equation - Intel will be retaining a conventional CPU
roadmap for general purpose duties based on the existing Core 2
family.

But Larrabee will take Intel into brand new markets. Significantly,
the document confirmed that a variant with full 3D video rendering
capability is on the cards. As we reported earlier this week, the
rumblings on the rumour mill suggest the chip could be a joint effort
with Nvidia.

Either way, the most fascinating aspect of the Larrabee GPU is the
expectation that it could be the first graphics processor to combine
both traditional raster graphics with more advanced ray-tracing
techniques.

Without getting bogged down in the details, suffice it to say that
raster graphics are a bit of a kludge when it comes to simulating
lighting. Ray-tracing is the real deal. Ask any 3D graphics
professional what they think about ray tracing on GPUs and they'll
tell you it's a matter of when rather than if.
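For the curious, the primitive at the heart of the "real deal" is
disarmingly simple - fire a ray and ask what it hits, millions of
times per frame. A minimal ray/sphere intersection test (illustrative
C; all names are ours) looks like this:

```c
#include <math.h>

/* Returns the distance along the ray to the nearest hit on the sphere,
 * or -1 if the ray misses. ro = ray origin, rd = unit ray direction,
 * c = sphere center, r = sphere radius. */
float ray_sphere(const float ro[3], const float rd[3],
                 const float c[3], float r)
{
    float oc[3] = { ro[0] - c[0], ro[1] - c[1], ro[2] - c[2] };
    float b  = oc[0]*rd[0] + oc[1]*rd[1] + oc[2]*rd[2];   /* oc . rd */
    float cc = oc[0]*oc[0] + oc[1]*oc[1] + oc[2]*oc[2] - r*r;
    float disc = b*b - cc;             /* discriminant of the quadratic */
    if (disc < 0.0f) return -1.0f;     /* no intersection */
    return -b - sqrtf(disc);           /* nearest of the two roots */
}
```

A rasterizer never asks that question directly; it projects triangles
and approximates the lighting afterwards, which is exactly the kludge
being alluded to.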

Of course, AMD and ATI will know perfectly well that ray tracing is
the future. But what must be really worrying is that it presents Intel
with the perfect inflection point to enter the graphics market. ATI
and Nvidia have refined raster graphics to the point where other
companies, including Intel, simply can't compete. But a new age of ray-
traced graphics will level the playing field and might just hand Intel
a chance for the total domination of the PC platform it so dearly
desires.

Jeremy Laird
________________

http://tinyurl.com/2znr39
 

BillL

AirRaid said:
another interesting, though speculative Larrabee article:

Intel's Larrabee: A killer blow for AMD

<snip - full article quoted above>

What would worry me is how much Intel could charge for CPUs if they
did 'kill off' AMD :(

BillL
 

A Guy Called Tyketto


In alt.comp.hardware.amd.x86-64 BillL said:
What would worry me is how much Intel could charge for CPUs if they
did 'kill off' AMD :(

BillL

Picture the days before AMD and Cyrix came out with their 486
and 586-compatible CPUs, and how much they cost back then (with
relation to cost of living, etc.). Could be headed back that way.

BL.
 

GMAN

What would worry me is how much Intel could charge for CPUs if they
did 'kill off' AMD :(

BillL
Add that to the fact that if there were no AMD, there would be no ATI,
and there would be no competition in the video market. I want the
choice of which video card I buy. I don't want Nvidia as the ONLY
choice. I own some of both brands.
 

Larry Roberts

What would worry me is how much Intel could charge for CPUs if they
did 'kill off' AMD :(

BillL

I remember the prices on computers before AMD and Cyrix were
competitors, and there is no way I'd be able to afford a system if
prices were still that high.
 

GTD

What would worry me is how much Intel could charge for CPUs if they
did 'kill off' AMD :(

BillL
That's an 'if' we'll likely never have to deal with. If Mac can
survive all these years, AMD can hang on - but unlike Mac, AMD is much
more likely to be able to take back the lead, or at least large
portions of the market. I do wish they would get their crap together,
but Nvidia would have to have something on the order of 50% more
performance per dollar over ATI to drag me away. I've never had much
luck with anything Nvidia, and little to no problems with ATI.
 

Nate Edel

In comp.sys.intel A Guy Called Tyketto said:
Picture the days before AMD and Cyrix came out with their 486
and 586-compatible CPUs, and how much they cost back then (with
relation to cost of living, etc.). Could be headed back that way.

There's always Via; for that matter, there'd be whoever bought up the
remains of AMD's IP. I wouldn't expect a whole lot of new processors
to take on Intel's latest, but if you look at how old a system can be
and still be pretty useable, current A64 X2 designs, if pushed cheaply
out of Asian high-volume fabs, could probably take Intel on in the low
end for long enough for a new competitor to emerge.
 

Ed Forsythe

AMD has been bi*ch slapping Intel for years and I was rooting for them (in
spite of being an Intel fanboy) because I'm an underdog type. I even planned
to make my next box AMD. However, I had a gnawing suspicion that Intel was
going to eventually unload their full power in salvo after salvo in an
attempt to blow AMD out of the water. Intel has countered the bi*ch slapping
with what appears to be a knockout punch! Now I have another gnawing
suspicion, and that is AMD is *not* going to roll over and die. The AMD
troops are tough and tenacious. It may be a struggle but they have been
there before. I believe it's going to be a heck of a fight and AMD will rise
again to the position of just as good *and* cheaper! That's what made them
and they will do it again. *But* my next box will be Intel because it's
going to take AMD a few years to get the job done and I'm impatient. :)
Power to the people!
 

GTD

I don't follow the stuff much, but I thought I saw a report that AMD
was actually doing well profit-wise back in the first quarter. Not
sure about Intel. Anyone follow that stuff? Apparently AMD is making
money on all the low end stuff they sell.
That may be, but Intel isn't doing so badly on the bottom end these
days, and as we all know, upgradability is a big seller. If the trend
stays as it is, people will start opting for a slightly more expensive
Intel setup if it means they are able to upgrade to a better product
later - even though, IME, I've never upgraded to the top of one form
factor, but rather to the middle of the next one up (i.e., I didn't
upgrade a P3 system to the fastest P3 out there, but went to a faster,
yet mid-level P4; and instead of upgrading my Socket A system to its
max, I went to a mid-level Socket 939 setup).
 

greenspan

Ed Forsythe said:
AMD has been bi*ch slapping Intel for years and I was rooting for them (in
spite of being an Intel fanboy) because I'm an underdog type.
<snip>

I don't follow the stuff much, but I thought I saw a report that AMD
was actually doing well profit-wise back in the first quarter. Not
sure about Intel. Anyone follow that stuff? Apparently AMD is making
money on all the low end stuff they sell.

Keep in mind the cheap computer manufacturers can't overclock their
6300s like the enthusiasts can.
 

The Kat

That may be, but Intel isn't doing so badly on the bottom end these
days, and as we all know, upgradability is a big seller. If the trend
stays as it is, people will start opting for a slightly more expensive
Intel setup if it means they are able to upgrade to a better product,

How many 'home' users do you know that need a quad core,
or even a dual core, CPU?




 

Guest

The Kat said:
How many 'home' users do you know that need a quad core,
or even a dual core, CPU?

Many. Now that I think about it, maybe even most. Between
digital photo processing, video capture and editing software etc
I think quite a few people are seeing (or could see) benefits with
dual core CPUs. Quad core is a different story at least for now.
 

Bill Davidsen

AirRaid said:
another interesting, though speculative Larrabee article:
___________________

Intel's Larrabee: A killer blow for AMD

Could Larrabee mean another tortuous time for AMD?
Just how much AMD stock did you sell short?
 

GTD

Many. Now that I think about it, maybe even most. Between
digital photo processing, video capture and editing software etc
I think quite a few people are seeing (or could see) benefits with
dual core CPUs. Quad core is a different story at least for now.

True, but NEED is irrelevant. If we only got what we NEEDED, many
would likely not have computers at all. They buy them anyway - look at
the video game console market: HUGE, but absolutely no NEED...
 

egad

Many. Now that I think about it, maybe even most. Between
digital photo processing, video capture and editing software etc
I think quite a few people are seeing (or could see) benefits with
dual core CPUs. Quad core is a different story at least for now.

LOL. Dream on. The girl next door has an Intel P4 1600 running slow
DDR memory. One friend has a 66 MHz motherboard and a 466 Celeron.
Another has a Mac. My mother won't even consider a computer.

Yes, if you want to do video stuff, you could use all the help you
can get. The real problem is Microsoft. Their operating systems
make everything slow. Go ahead and overclock to 3.6 GHz and wait for
Windows to catch its breath.
 

GTD

LOL. Dream on. The girl next door has an Intel P4 1600 running slow
DDR memory. One friend has a 66 MHz motherboard and a 466 Celeron.
Another has a Mac. My mother won't even consider a computer.

Yes, if you want to do video stuff, you could use all the help you
can get. The real problem is Microsoft. Their operating systems
make everything slow. Go ahead and overclock to 3.6 GHz and wait for
Windows to catch its breath.
I really don't believe that. A friend of mine uses both Linux and XP,
and says there is little difference in performance. Are you trying to
say a Windows-based computer won't benefit from a faster CPU? If not,
then please explain the "catch its breath" theory. Anyone using a 466
Celeron isn't really doing anything with that computer like most other
people are, and would probably be just as well off with WebTV (if that
even still exists). Even my parents are doing more with their computer
than can be conveniently done on something like a P4 1600, and would
especially benefit from a dual-core system since they do a lot at one
time. My kid has an A64 3500+ and it's a bit slow for what he does
with it. Of course 'nospam' was referring to people that use a
computer in the first place, so the fact that your mother wouldn't
benefit is irrelevant.
 

The Kat

Even my parents are doing more with their computer than
can be conveniently done on something like a P4 1600, and would
especially benefit from a dual-core system since they do a lot at one
time.

It's not what they're doing, it's whether the programs they use are
multi-CPU aware. Most programs aren't.
My kid has an A64 3500+ and it's a bit slow for what he does with
it.

It's almost guaranteed that your kids are doing MUCH more with
their computer than you are.

The VAST majority of computer users don't come close to using the
full power of the average single-core system of 3 years ago, let
alone needing a dual-core of today.




 

GTD

The Kat said:
It's not what they're doing, it's whether the programs they use are
multi-CPU aware. Most programs aren't.
Not relevant when it comes to, as I said, "doing a lot at one time".
Multi-core allows one to, say, process a video without it slowing down
all the other stuff you do AT THE SAME TIME.

It's almost guaranteed that your kids are doing MUCH more with
their computer than you are.
I assure you he's not; where do you get that idea?

The VAST majority of computer users don't come close to using the
full power of the average single-core system of 3 years ago, let
alone needing a dual-core of today.

I disagree; there are TONS of casual users processing video, playing
games, and ripping their CDs to MP3s for their iPods who max out their
CPU quite a bit. True, that's probably not happening a large portion
of the time they are using the computer. The argument seems to be that
all of that could be done with a much slower CPU, it would just take
longer. But if one doesn't have the time that a slower CPU makes a
task take, then that slower CPU really CAN'T do the job.

This is kind of a retarded argument that we've all been over before;
it's like saying we could all perfectly well make do with a 5 hp
lawnmower motor in our cars, which would only push the car to 3 MPH,
because it would still get us exactly where we are going - never mind
that the 10-minute commute to work is now 2 hours...
 
