Moore's law


bob

Why does Moore's law seem to work?

I would expect computer technology to advance at a more sporadic and
less even pace.
 

John McGaw

bob said:
Why does Moore's law seem to work?

I would expect computer technology to advance at a more sporadic and
less even pace.

The rate of improvement _is_ sporadic but it is always improving. It is
possible to pick an arbitrary 24 month period (using the 24-month
version of Moore's Law rather than the 18-month version) where the
number of transistors did not double. But it is also possible that if
you started counting a month later there might be more than a doubling.
So far it has always averaged out pretty well.

To me the amazing thing is that, despite a number of physical limits
that have been predicted to end the regular doubling over the years,
some research team always seems to find a way around that limit, setting
up for the next one. Of course, eventually it may well be a limit of
"how do you make a transistor from fewer than three atoms?" that finally
breaks the law's run of success.
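The averaging argument above can be sketched numerically. This is a toy simulation and every number in it is made up for illustration: growth arrives in uneven monthly jumps whose long-run average matches a 24-month doubling, so individual 24-month windows miss a clean 2x while the long-run doubling time still lands near 24 months.

```python
import math
import random

# Toy sketch of the averaging argument (all numbers are illustrative).
# Monthly growth is sporadic, but its long-run average matches a
# 24-month doubling.
random.seed(1)

MONTHS = 240                     # simulate 20 years
STEADY = 2 ** (1 / 24)           # steady monthly factor for a 24-month doubling

count = 1_000_000                # arbitrary starting transistor count
counts = [count]
for _ in range(MONTHS):
    # Uneven growth: the exponent varies month to month, but averages to 1.
    count *= STEADY ** random.uniform(0.0, 2.0)
    counts.append(count)

# Any single 24-month window can miss (or overshoot) a clean 2x...
ratios = [counts[i + 24] / counts[i] for i in range(MONTHS - 24)]
# ...yet the long-run doubling time still comes out near 24 months.
doubling_months = MONTHS / math.log2(counts[-1] / counts[0])
print(f"24-month window ratios: {min(ratios):.2f}x to {max(ratios):.2f}x")
print(f"long-run doubling time: {doubling_months:.1f} months")
```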
 
philo

John McGaw said:
The rate of improvement _is_ sporadic but it is always improving. It is
possible to pick an arbitrary 24 month period (using the 24-month
version of Moore's Law rather than the 18-month version) where the
number of transistors did not double. But it is also possible that if
you started counting a month later there might be more than a doubling.
So far it has always averaged out pretty well.

To me the amazing thing is that, despite a number of physical limits
that have been predicted to end the regular doubling over the years,
some research team always seems to find a way around that limit, setting
up for the next one. Of course, eventually it may well be a limit of
"how do you make a transistor from fewer than three atoms?" that finally
breaks the law's run of success.


Use sub-atomic particles, of course <G>
 
surma.anthony

bob said:
Why does Moore's law seem to work?

I would expect computer technology to advance at a more sporadic and
less even pace.

Like the previous poster said, it's an average. But to better answer
the "why" part of your question:

I'm going from memory here, but Moore's Law is based on observation
rather than pure theory. Every few years scientists expect Moore's
Law to break down, but it simply hasn't yet -- and I don't think it
will. For example, just recently we started making multiple-core
CPUs because pushing clock speeds higher simply wasn't practical
anymore. Quad-core CPUs exist now, and it won't stop there. This goes
back to the previous poster mentioning how we always seem to find a
way to keep Moore's Law holding true.

So nothing says Moore's Law must hold true; it just has, because of
Human Will, Determination, and Capitalism (add whatever reasons you
wish) ... and nothing has surfaced to oppose those reasons. Maybe
nothing ever will.

If you know exactly how much raw computational power would be needed
to brute force the creation of human level AI, then using Moore's Law
you could anticipate the latest possible date that affordable AI would
be developed.

Right now an educated guess could be made based upon 1) the raw
computational power required by currently developed AI visual
recognition systems, 2) the number of neurons in a human optic nerve,
and 3) the number of neurons in the whole brain.

Based on that, the date should be around 2050 for human level AI -- at
the price of today's home PC. :D
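That extrapolation can be sketched in a few lines. To be clear, every number below is an illustrative assumption of mine, not a figure from this thread: estimates of the brain's equivalent compute vary by many orders of magnitude.

```python
import math

# Hedged back-of-envelope version of the extrapolation above.
# Both FLOPS figures are rough illustrative assumptions.
current_flops = 1e11      # assumed: ballpark for a late-2000s home PC
brain_flops = 1e18        # assumed: one common rough estimate for the brain
doubling_months = 18      # the 18-month form of Moore's Law

doublings = math.log2(brain_flops / current_flops)
years_out = doublings * doubling_months / 12
print(f"{doublings:.1f} doublings needed -> roughly {years_out:.0f} years out")
```

With those (debatable) inputs the horizon lands a few decades out, in the same general ballpark as the 2050 guess; shift either FLOPS estimate by a factor of ~100 and the date moves by about a decade.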
 
kony

surma.anthony said:
Like the previous poster said, it's an average. But to better answer
the "why" part of your question:

I'm going from memory here, but Moore's Law is based on observation
rather than pure theory. Every few years scientists expect Moore's
Law to break down, but it simply hasn't yet -- and I don't think it
will. For example, just recently we started making multiple-core
CPUs because pushing clock speeds higher simply wasn't practical
anymore. Quad-core CPUs exist now, and it won't stop there. This goes
back to the previous poster mentioning how we always seem to find a
way to keep Moore's Law holding true.


On the other hand, we could also say that having to go to
multicore CPUs was in itself a failure to keep up with
Moore's Law. They could have made that move at any time;
it was only when they couldn't build further single-core
processors that kept up with Moore's Law (cost-effectively,
and viably for the market at large) that they had to
start using multiple cores to get significantly more
performance gain.
 
John McGaw

kony said:
On the other hand, we could also say that having to go to
multicore CPUs was in itself a failure to keep up with
Moore's Law. They could have made that move at any time;
it was only when they couldn't build further single-core
processors that kept up with Moore's Law (cost-effectively,
and viably for the market at large) that they had to
start using multiple cores to get significantly more
performance gain.

Possible. But probably not directly. Moore's Law really concerns
transistors in an integrated circuit and doesn't specify what sort of
IC. Certainly it is not limited to microprocessors only. When you look
at other sorts of circuits such as memory, GPUs, DSPs and such the
progress in packing in more transistors is really astounding. I've
always figured that going to multiple cores in general purpose
processors was more a matter of getting around inefficiencies in
real-world multi-tasking operating systems. Avoid context switching and
you avoid the overhead. Avoid cache flushes and you avoid the overhead
of reloading. Perhaps it will end when our desktop OS with 114 running
processes has a separate core and cache for each of them... <g>

But as long as PC users, or even a small subset of PC users, keep
demanding more and more, we can be pretty certain that Intel and AMD will
keep piling more circuitry onto their chips even when it gets far past
the point of diminishing returns (if it hasn't already).
 
comment

surma.anthony said:
Like the previous poster said, it's an average. But to better answer
the "why" part of your question:

I'm going from memory here, but Moore's Law is based on observation
rather than pure theory. Every few years scientists expect Moore's
Law to break down, but it simply hasn't yet -- and I don't think it
will. For example, just recently we started making multiple-core
CPUs because pushing clock speeds higher simply wasn't practical
anymore. Quad-core CPUs exist now, and it won't stop there. This goes
back to the previous poster mentioning how we always seem to find a
way to keep Moore's Law holding true.

So nothing says Moore's Law must hold true; it just has, because of
Human Will, Determination, and Capitalism (add whatever reasons you
wish) ... and nothing has surfaced to oppose those reasons. Maybe
nothing ever will.

If you know exactly how much raw computational power would be needed
to brute force the creation of human level AI, then using Moore's Law
you could anticipate the latest possible date that affordable AI would
be developed.

Right now an educated guess could be made based upon 1) the raw
computational power required by currently developed AI visual
recognition systems, 2) the number of neurons in a human optic nerve,
and 3) the number of neurons in the whole brain.

Based on that, the date should be around 2050 for human level AI -- at
the price of today's home PC. :D

Indeed, 2050 also seems to be the breakdown date for Moore's Law from
this extrapolation. See http://www.cise.ufl.edu/~mpf/lec3.html, with the
relevant extract below on max information density:
"The closest bound, which is not very fundamental, is probably the empirical
bound on entropy density for matter at normal temperatures and pressures of
about 1 bit per cubic Angstrom. As we mentioned in lecture, the Moore's law
track for transistor sizes has us reaching this scale in RAMs by about 2050.
Non-random-access memories (eg disks) are probably slated to hit this realm
even sooner."
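A crude way to sanity-check that mid-century figure is the feature-size track: if density doubles every two years, linear dimensions shrink by roughly sqrt(2) per doubling. The starting feature size below is my own assumption (a 65 nm class process, roughly current as of this thread), not a number from the lecture notes.

```python
import math

# Rough sanity check: years until linear feature sizes reach the
# atomic (~1 Angstrom) scale, assuming a 65 nm starting point and
# density doubling every 2 years (linear shrink of sqrt(2) per doubling).
feature_now_nm = 65.0     # assumed starting process node
floor_nm = 0.1            # ~1 Angstrom

shrink = feature_now_nm / floor_nm        # 650x linear shrink needed
doublings = 2 * math.log2(shrink)         # each doubling shrinks by sqrt(2)
years = doublings * 2
print(f"~{years:.0f} years until features hit the atomic scale")
```

Under those assumptions this lands a few decades out, in the same neighborhood as the extract's 2050.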
 
Ken Maltby

comment said:
Indeed, 2050 seems to be the breakdown date for Moore's law from this
extrapolation too.
see http://www.cise.ufl.edu/~mpf/lec3.html with relevant extract below on
max information density.
"The closest bound, which is not very fundamental, is probably the
empirical bound on entropy density for matter at normal temperatures and
pressures of about 1 bit per cubic Angstrom. As we mentioned in lecture,
the Moore's law track for transistor sizes has us reaching this scale in
RAMs by about 2050. Non-random-access memories (eg disks) are probably
slated to hit this realm even sooner."
----

Perhaps Moore's Law could be looked at as a reflection of
the progression of the demand/requirements, with the doubling
of transistor count, the speed increases, and the size reductions
only being the means of meeting that requirement. In that view,
other means, such as the introduction of multiple cores, would
have no effect on the progression or the "Law".

Luck;
Ken
 
surma.anthony

I don't believe that multiple cores somehow make a CPU's
performance improvements over single-core CPUs questionable.
Parallelism isn't "cheating".
 
GT

surma.anthony said:
Like the previous poster said, it's an average. But to better answer
the "why" part of your question:

I'm going from memory here, but Moore's Law is based on observation
rather than pure theory. Every few years scientists expect Moore's
Law to break down, but it simply hasn't yet -- and I don't think it
will. For example, just recently we started making multiple-core
CPUs because pushing clock speeds higher simply wasn't practical
anymore. Quad-core CPUs exist now, and it won't stop there. This goes
back to the previous poster mentioning how we always seem to find a
way to keep Moore's Law holding true.

Well, that also depends on how you measure processing power. My
single-threaded application is no faster on a quad-core processor than
on a single-core processor at the same frequency, so from my point of
view Moore's Law stalled a year or two ago, when dual core became
mainstream and the research into faster cores stopped!
 
GT

surma.anthony said:
I don't believe that multiple cores somehow make a CPU's
performance improvements over single-core CPUs questionable.
Parallelism isn't "cheating".

As a way of evaluating Moore's Law (processing power doubles every 18
months), it is cheating: the compilation that ran for 10 minutes on a
single-core 2GHz processor still runs for 10 minutes on a 2GHz dual-core
processor, because half of the transistors being counted are not being
used!

The law doesn't say 'usable' processing power, so perhaps dual core is
valid, although it gives a highly false impression of power! Buying 2
Ferraris suggests higher status than 1 Ferrari, but you can't go any
faster!
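GT's compile-time example is essentially Amdahl's law: the speedup from extra cores is capped by the fraction of the work that can actually run in parallel. A minimal sketch (the parallel fractions are illustrative):

```python
# Amdahl's law: speedup from N cores is capped by the serial fraction.
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A fully serial job gains nothing from a second core...
print(amdahl_speedup(0.0, 2))    # -> 1.0
# ...and even a 90%-parallel job falls short of 2x on two cores.
print(amdahl_speedup(0.9, 2))    # -> ~1.82
```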
 
kony

I don't believe that multiple cores for CPUs somehow makes their
performance improvements over single core CPUs somehow questionable.
Parallelism isn't "cheating".


Maybe not cheating in that they disclose the design, but
it's not as though one can claim it is a linear extension of
the past CPU performance gains.

The performance improvements are very questionable for a
large part of the PC userbase. The typical PC user has one
app in the foreground that is not well optimized for
multithreading (if it uses it at all), and no background tasks
with immediate priority escalation or performance requirements
(they just sit idle until brought into the foreground
again).

It is certainly possible to pose some hypothetical scenario
where certain uses will benefit more from multi-core CPUs,
but these multi-core CPUs are not only being marketed to
those users/uses.
 
Ken Maltby

kony said:
Maybe not cheating in that they disclose the design, but
it's not as though one can claim it is a linear extension of
the past CPU performance gains.

The performance improvements are very questionable for a
large part of the PC userbase. The typical PC user has one
app in the foreground that is not well optimized for
multithreading (if it uses it at all), and no background tasks
with immediate priority escalation or performance requirements
(they just sit idle until brought into the foreground
again).

It is certainly possible to pose some hypothetical scenario
where certain uses will benefit more from multi-core CPUs,
but these multi-core CPUs are not only being marketed to
those users/uses.

I suspect that Moore's Law will still be observable when
we are counting Muon Gates instead of transistors, and it
won't matter how they are organized.

Luck;
Ken
 
DevilsPGD

In message <[email protected]> "GT" wrote:
As a way of evaluating Moore's Law (processing power doubles every 18
months), it is cheating: the compilation that ran for 10 minutes on a
single-core 2GHz processor still runs for 10 minutes on a 2GHz dual-core
processor, because half of the transistors being counted are not being
used!

Sure. We shouldn't count those transistors used for 32-bit instructions
either, since when Moore's law was written, we didn't have any of that
newfangled 32-bit crap.

In other words, assuming software keeps up with hardware is either
fair or it isn't; you can't have it both ways.
 
