The death of non-x86 is now at hand?


Alex Johnson

Robert said:
My statement as you quoted it is still literally true, but it is also
true that not all of the energy dissipated by the processor has to
contribute to the warm leg effect.

Yes, I suppose in a non-nit way, your statement is correct or close
enough for approximation. It just didn't seem right from a basic physics
POV. Another thing occurred to me besides heat and light: radio waves!
Computers do create enough EM interference that you can hear it on a
cell or portable phone (especially 2.4 GHz frequency sets). Most of that
is probably the power supply, but I'm pretty confident the CPU also
radiates radio frequencies.
The important exception, and one that does confuse me, is that I don't
know how to account for the energy dissipated in voltage regulation
circuitry in the immediate vicinity of the CPU. I'm sure that Intel
does not include that power dissipation in its quoted numbers, I
suspect that it is a non-negligible amount of heat, it does contribute
to the warm leg effect, and I wonder what category people would put it
into if they had to make all the means of power dissipation equal the
actual power being drawn at the DC connection of the laptop.

It is probably a significant number, but I doubt it is more than a
fraction of the CPU itself (1's of watts vs 10's of watts?). Otherwise
there'd be lots more chatter about having the right battery for your CPU
so that voltage stepping losses don't eat into your battery life budget.
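The "1's of watts vs 10's of watts" guess can be sketched as a quick back-of-envelope calculation. The ~90% conversion efficiency below is an assumed, typical figure for a buck regulator, not a number from this thread:

```python
# Back-of-envelope estimate of voltage-regulator (VRM) loss for a laptop
# CPU. The 90% conversion efficiency is an assumed, typical figure.
def vrm_loss_watts(cpu_power_w, efficiency=0.90):
    """Watts dissipated in the regulator to deliver cpu_power_w."""
    input_power_w = cpu_power_w / efficiency  # power drawn from the battery rail
    return input_power_w - cpu_power_w        # the difference ends up as regulator heat

# A 30 W mobile CPU at 90% regulator efficiency:
print(round(vrm_loss_watts(30.0), 2))  # -> 3.33
```

So under that assumed efficiency, a CPU drawing tens of watts costs a few extra watts in regulation, consistent with the single-digit-watts guess.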

Alex
 

Robert Myers

Alex Johnson said:
Yes, I suppose in a non-nit way, your statement is correct or close
enough for approximation. It just didn't seem right from a basic physics
POV. Another thing occurred to me besides heat and light: radio waves!
Computers do create enough EM interference that you can hear it on a
cell or portable phone (especially 2.4 GHz frequency sets). Most of that
is probably the power supply, but I'm pretty confident the CPU also
radiates radio frequencies.

To introduce some specific terminology: to the extent that the CPU is
surrounded by a solid, all of the energy that is dissipated by the
CPU has to escape either as a signal with useful and energy-carrying
information going out through a pin, as a phonon, or as a photon.
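That three-way partition is just conservation of energy, and it can be written down as a toy budget. Every number below is an assumed, illustrative value, not a measurement:

```python
# Toy conservation check: every watt the CPU dissipates must leave the
# package as pin signals, phonons (heat), or photons (radiated EM).
# All figures are assumed, illustrative values, not measurements.
drawn_w = 30.0
pathways_w = {
    "signals out the pins": 0.5,   # energy-carrying I/O signals (assumed)
    "phonons (heat)":       29.3,  # conduction into package and heatsink
    "photons (EM)":         0.2,   # radiation at all frequencies, tiny
}
assert abs(sum(pathways_w.values()) - drawn_w) < 1e-9  # the budget must close
for path, watts in pathways_w.items():
    print(f"{path}: {watts:4.1f} W ({100 * watts / drawn_w:4.1f}%)")
```

The point of the exercise is only that the pathways must sum to the power drawn; how it actually splits is dominated overwhelmingly by heat.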

There must be whole classes of quantum states that correspond to the
signals that make it onto PC board traces, and I truly have no idea
what to call them, since some of them must correspond to correlated
pairs of electrons, one on each trace if differential signalling is
being used, and they would qualify to be called a "particle" in the
same way that the Cooper pairs of superconductivity qualify to be
called particles. If it can be imagined, surely someone has written a
paper about it.

Electromagnetic radiation at any frequency is carried by photons.
The phonons are what we normally think of as heat. Very little energy
gets out of the CPU in any fashion whatsoever without at some point
being scattered and at least temporarily converted to a phonon. If
you _really_ wanted to look at all the possibilities, Feynman diagrams
would be helpful for categorizing all the exotic combinations of
energy pathways (taxonomy: ugh!).

There is, I am sure, a finite quantum transition amplitude for
production of every conceivable quantum particle that anybody has ever
heard of, but on balance it is clear that they contribute nothing that
anybody knows how to calculate to the energy budget of the CPU.

If you've ever once actually played with Feynman diagrams, which are
almost necessary even to keep track of all the terms in the equations,
you will quickly discover that there is both more and less to all of
this than meets the eye.

Nature knows nothing (and I believe that this statement would have
stood up to scrutiny by Prof. Feynman himself) of the discrete
interactions that field theoreticians and solid state physicists draw
on blackboards.

Instead, there are an infinite number of interactions, involving an
infinite number of Feynman diagrams, going on all of the time. Some
of those interactions involve evanescent intermediate states that are
as real as the xerox toner on the printed page, but they violate
either energy or momentum conservation or both so we call them
"virtual" particles. All of this can happen without a CPU present.
In fact, all of it can happen without anything present at all. Go
google for vacuum fluctuations if you don't believe me.

Prof. Feynman might give me an argument about the following, but much
of what physicists routinely peddle as identifiable physical
phenomena (an example being virtual photons) is really nothing more
than an artifact of the way he chose to go about solving the
(non-linear) equations of QED. This is clear enough to me because
I've used the same formal apparatus in a classical context, where it
is clear that you can write the equations down, you can draw the
Feynman diagrams, and you can draw useful conclusions, but no new
physics springs into existence because you choose to solve non-linear
equations in a particular way.

Your posts reverberate with the confusions of someone who has been
bamboozled in a physics classroom. *Nobody, nobody, nobody* knows
enough physics to answer your question with absolute, final certainty.
If the "standard model" is to be believed, the most ordinary F=ma type
calculation of freshman physics involves a Higgs field that is carried
by bosons that no one has managed to capture in a measurement. That
way lies madness. Be happy if you are in computer science or
electrical engineering and don't have to sit through arguments about
these things.

Alex Johnson said:
It is probably a significant number, but I doubt it is more than a
fraction of the CPU itself (1's of watts vs 10's of watts?). Otherwise
there'd be lots more chatter about having the right battery for your CPU
so that voltage stepping losses don't eat into your battery life budget.

It is, in any case, more important than any kind of exotic particle
interaction, which you and I both understand less well.

RM
 

Alex Johnson

What do you DO for a living?? :)

Sorry, my physics is limited to:
classical mechanics, electromagnetic field theory, a couple classes on
the subatomic side of transistors and diodes (ugh!), and a personal
interest in quantum mechanics / quantum computing research read in
journals and papers

Alex
 

Robert Myers

Alex Johnson said:
What do you DO for a living?? :)

I often wonder the same thing myself. If you have a wide-ranging
knowledge of strange subjects, you wind up working for strange people.
No offense intended to any employer of my services, past, present, or
future.

Alex Johnson said:
Sorry, my physics is limited to:
classical mechanics, electromagnetic field theory, a couple classes on
the subatomic side of transistors and diodes (ugh!), and a personal
interest in quantum mechanics / quantum computing research read in
journals and papers

I did my undergraduate degree in physics, did enough course work to
get a Ph.D. in physics, and actually completed my Ph.D. in a
discipline called Theoretical and Applied Mechanics, with a
specialization in fluid mechanics. For me, moving to a classical
discipline was an inspired guess motivated by the understanding that I
would fully master the mathematics of a subject that interested me
(General relativity), without all the hocus pocus and without having
to figure out how to put a nearly useless degree to work somewhere
other than a bomb laboratory or a university. In the process, I
learned that gravitation wasn't the only subject where physicists had
managed to turn mathematics into mysticism. Every bit of field theory
(short of string theory, which I am now too old even to try to learn)
shows up in the study of turbulence, again without all the hocus
pocus.

How, then, do I wind up hanging around Usenet groups dedicated to
computer hardware? It's an occupational hazard for nearly anyone who
works with highly non-linear equations, which is why you will find
yourself bumping up against people working on turbulence (classical
fluid mechanics) and quantum chromodynamics (quantum physics).

I have worked on practically anything that flies by any conceivable
method you could imagine at Mach numbers from effectively zero to
numbers of the kind that you get by trying to reenter the atmosphere
from orbit, submarines (which also fly, albeit in a much more dense
fluid), weirdly statistical processes arising from fairly
pedestrian-looking physics, propagation of every kind of weird wave
you could imagine in every conceivable medium you could imagine, weird
lasers, weird optics, detection and signal processing algorithms for
all kinds of weird physical phenomena, and instrumentation for
detecting the most improbable of physical quantities not including
subatomic particles. And yes, I have actually designed and built
circuits both digital and analog.

In the process, I have repeatedly found myself in roomsful of people
who are the presumptive world's experts on whatever weird physics was
the subject of the day. With practice, you learn not to allow
yourself to be intimidated by people who are different mostly in that
they talk in a specialized set of acronyms. In the end, it's all just
math.

Once you've worked yourself into the corner of attacking problems that
are mathematically intractable using computers, you might as well go
the whole distance and go after the most computationally intensive
problems there are, lists of which are well known, and some of which
have enormous economic consequences.

My only regret in life is that I passed through a great center of
learning for both solid state physics and large scale computation on
my way to my Ph.D., and didn't fully capitalize on those
opportunities, because I didn't yet see how interesting the subjects
would become to me.

RM
 
