Advantages of Parallel Hz

Donald

Radium said:
However, 1Hz x 1Billion is an absurdity to show an example. How
about a MHz by a thousand? If the parallelism problem can be cracked
there are all sorts of games that can be played with all those free
transistors.


Well, "parallel Hz" is meant for problems that are serial.

"Parallel Hz" actually doesn't have anything to do with whether the
task is parallelizable or not.

The parallelism you are describing has to do with the bits being
parallel [such as in a parallel printer].

"Parallel Hz" is a different story.
Still peddling your dream, eh?
 
krw

I thought that when you made a low Vth, you always got a higher
leakage. It seems that this must be true if the gm is to remain
finite at the extreme of Vth=0.

True. Physics can't be cheated (that easily). ;-) You can operate
in the sub-threshold region though.
It is likely the leakage in the supply capacitors would be more than
in the wires. Also, if the supply has Schottky diodes in it, the
leakage in them will be high. If we are considering sub-volt Vdd then
the rectifier would likely be a MOSFET.


I see it more as an extreme case to make the argument clearer, but yes
it is absurd.

Different words...
Parallelism has been cracked for special-case problems. The very
large CPU-time users also seem to be the places where the problem is
parallel in nature. Modeling the flow of fluids, explosions and the
propagation of waves through nonuniform media are the things that
come to mind quickly.

No, parallelism hasn't been "cracked". Some problems are
"embarrassingly parallel".
 
krw

Well, "parallel Hz" is meant for problems that are serial.

Well, then you're in for a world of hurt. The two can't mix. How do
you propose to parallelize:

A=B+C
D=A+E
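
Written out as C, the dependency is plain to see (a minimal sketch; the
variable names are krw's, the values are arbitrary):

#include <stdio.h>

int main(void)
{
    int B = 2, C = 3, E = 4;

    int A = B + C;   /* step 1 */
    int D = A + E;   /* step 2: cannot start until A exists, no matter
                        how much parallel hardware is available */

    printf("A=%d D=%d\n", A, D);
    return 0;
}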
"Parallel Hz" actually doesn't have anything to do with whether the
task is parallelizable or not.

Then it's a meaningless word (no shock here).
The parallelism you are describing has to do with the bits being
parallel [such as in a parallel printer].

No, it has to do with tasks that can be done in parallel. That is,
they have no interdependence.
"Parallel Hz" is a different story.

No question. It looks like a ghost story.
 
MooseFET

Well, "parallel Hz" is meant for problems that are serial.

"Parallel Hz" actually doesn't have anything to do with whether the
task is parallelizable or not.

As has been pointed out, yes it does have to do with the task being
parallel or not. Consider the sort of program where every step depends
on the result of the previous step. If the processor takes one second
to do a step, the next step must wait that one second. Now consider
one where every step stands alone. All of the steps can be started as
soon as there is hardware free to do them, so the result comes out in
much less time.
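
A minimal C sketch of the "every step stands alone" case (the OpenMP
pragma is just one convenient way to say the iterations are
independent; compile with -fopenmp, or the pragma is ignored and the
loop simply runs serially):

#include <stdio.h>

#define N 8

int main(void)
{
    int work[N], result[N];

    for (int i = 0; i < N; i++)
        work[i] = i + 1;

    /* No iteration depends on another, so any free hardware can take one. */
    #pragma omp parallel for
    for (int i = 0; i < N; i++)
        result[i] = work[i] * work[i];

    for (int i = 0; i < N; i++)
        printf("%d ", result[i]);
    printf("\n");
    return 0;
}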

The parallelism you are describing has to do with the bits being
parallel [such as in a parallel printer].

No, the parallelism we are describing is a property of the problem, not
of the hardware. Consider these two tasks:

(1)
Replace all negative numbers with zero.

(2)
Replace the first negative number with zero.

The first sounds like more work, but the second would actually take
longer. To do the second, you must test values in turn to find the
first negative value. In the first, you only need to look at a single
value in each CPU section, so if you have enough sections, the whole
operation would happen in one stroke.
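
The two tasks in C, just to make the difference concrete (a sketch;
the function names and test values are mine):

#include <stdio.h>
#include <stddef.h>

/* Task (1): each element can be tested and cleared independently, so
   with enough hardware every element is handled at once. */
void clear_all_negatives(int *v, size_t n)
{
    for (size_t i = 0; i < n; i++)
        if (v[i] < 0)
            v[i] = 0;
}

/* Task (2): "first" is defined by position, so the values must be
   tested in turn until a negative one is found. */
void clear_first_negative(int *v, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        if (v[i] < 0) {
            v[i] = 0;
            return;
        }
    }
}

int main(void)
{
    int a[] = { 3, -1, 4, -1, 5 };
    int b[] = { 3, -1, 4, -1, 5 };

    clear_all_negatives(a, 5);
    clear_first_negative(b, 5);

    printf("all:   %d %d %d %d %d\n", a[0], a[1], a[2], a[3], a[4]);
    printf("first: %d %d %d %d %d\n", b[0], b[1], b[2], b[3], b[4]);
    return 0;
}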
 
Invalid

krw said:
[ Kilobytes of unnecessary quoted text deleted]
That is a naturally parallel problem.

****ING LEARN TO SNIP. PRAT.
I do when it interests me, dilbert. It didn't, and I didn't. Live
with it, or not.

Here is what happens to those who refuse to snip:

*PLONK*

Please note that for every plonk there are ten people who silently killfile you.
 
Radium

My dream PC is as hardware, real-time, and digital as possible. In
addition, it uses the least amount of buffering required [hopefully
none] and experiences the least amount of latency possible [again,
hopefully none].
This may be your dream but it may also be a nightmare. You also said
you wanted low power and no fan. If you want a lot of speed, you
really want a good cooling system and a whole lot of power. If you
want to make a faster system with low power, you want to make use of
things like lookup tables and hashes.

Couldn't the problem of excessive heat and large use of power be
solved [or at least mitigated] by using lower voltages while still
running things in real-time [and with the least amount of storage,
software, buffering, and latency possible] and not using fans?
 
Eli the Bearded

(stuff)

162 lines of ASCII art header, and you are plonking someone for
excessive quoting? Pot, meet kettle.

Elijah
 
MooseFET

My dream PC is as hardware, real-time, and digital as possible. In
addition, it uses the least amount of buffering required [hopefully
none] and experiences the least amount of latency possible [again,
hopefully none].
This may be your dream but it may also be a nightmare. You also said
you wanted low power and no fan. If you want a lot of speed, you
really want a good cooling system and a whole lot of power. If you
want to make a faster system with low power, you want to make use of
things like lookup tables and hashes.

Couldn't the problem of excessive heat and large use of power be
solved [or at least mitigated] by using lower voltages while still
running things in real-time [and with the least amount of storage,
software, buffering, and latency possible] and not using fans?

No. It largely can't be avoided. If you have to do your sine function
from first principles and you want speed, you need a huge number of
operations per second.
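
A rough sketch of the lookup-table alternative mentioned above: spend
some memory on a precomputed table so each sine value costs one table
read and a couple of multiplies instead of a long series expansion.
The table size and the linear interpolation are my own choices, not
anything from the thread.

#include <math.h>
#include <stdio.h>

#define TABLE_SIZE 1024
static const double TWO_PI = 6.283185307179586;
static double sine_table[TABLE_SIZE + 1];

static void init_table(void)
{
    for (int i = 0; i <= TABLE_SIZE; i++)
        sine_table[i] = sin(TWO_PI * i / TABLE_SIZE);
}

/* One table read plus linear interpolation instead of evaluating a series. */
static double fast_sin(double x)
{
    double t = x / TWO_PI;
    t -= floor(t);                 /* wrap the angle into one period */
    double pos = t * TABLE_SIZE;
    int i = (int)pos;
    double frac = pos - i;
    return sine_table[i] + frac * (sine_table[i + 1] - sine_table[i]);
}

int main(void)
{
    init_table();
    printf("fast_sin(1.0) = %f, sin(1.0) = %f\n", fast_sin(1.0), sin(1.0));
    return 0;
}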
 
Guest

My dream PC is as hardware, real-time, and digital as possible. In
addition, it uses the least amount of buffering required [hopefully
none] and experiences the least amount of latency possible [again,
hopefully none].
This may be your dream but it may also be a nightmare. You also said
you wanted low power and no fan. If you want a lot of speed, you
really want a good cooling system and a whole lot of power. If you
want to make a faster system with low power, you want to make use of
things like lookup tables and hashes.

Couldn't the problem of excessive heat and large use of power be
solved [or at least mitigated] by using lower voltages while still
running things in real-time [and with the least amount of storage,
software, buffering, and latency possible] and not using fans?

I'm just musing; please correct me where I'm right off track.

It's current that causes heat rise (I^2R). For things to work, current
must flow, and for current to flow, voltage must be applied. For
miniaturisation, smaller currents will need higher voltages.
For things to go faster, larger currents are required, and so larger
voltages are required to push these currents. If you limit to non-fan
cooling, you limit current, and therefore limit speed and capacity.

Be gentle with me, I'm trying to learn. jack
 
kony

I'm just musing, please correct where I'm right off track.

It's current that causes heat rise (I^2R).

... and voltage (accuracy of voltage control method), and
frequency.


For things to work, current
must flow, and for current to flow, voltage must be applied.

Fair enough


For
miniaturisation, smaller currents will need higher voltages.

No, with current tech, miniaturization means lower voltage is
required but losses become higher, thus the current is higher
too.

For things to go faster, larger currents are required, and so larger
voltages are required to push these currents.

No, see above. For things to get substantially faster we'd
need a shift in technology: either a different manufacturing
method and material, or what they're presently doing, just
tacking on additional cores they can fit in the space allowed
by a certain process-size shrink, up to the point where it
becomes unaffordable to make a larger core.


If you limit to non-fan
cooling,

Why this arbitrary stipulation?



you limit current, and therefore limit speed and capacity.

Yes, within any given tech, if you target a thermal design
power low enough that *reasonable* passive cooling is
possible, you will limit _voltage_, which inherently limits
current, and these two limitations, using a given tech,
inherently limit the ceiling speed that tech can sustain
stably. I don't know what you mean by capacity, except to
the extent that a given tech and process size with a given
voltage will require X amount of current per amount of
*circuitry*. In a logical sense, limiting circuitry or
stable attainable speed within the aforementioned
limitations (given the same tech) limits performance (though
it is optimizable for certain tasks the more one specializes
in specific-function CPUs instead of broader operations like
in a PC).
 
Guest

... and voltage (accuracy of voltage control method), and
frequency.

But isn't it the current that does the heating?
Fair enough

No, with current tech, miniaturization means lower voltage is
required but losses become higher, thus the current is higher
too.


No, see above. For things to get substantially faster we'd
need a shift in technology: either a different manufacturing
method and material, or what they're presently doing, just
tacking on additional cores they can fit in the space allowed
by a certain process-size shrink, up to the point where it
becomes unaffordable to make a larger core.

Why this arbitrary stipulation?

Because this is what Radium stipulated.
Yes, within any given tech, if you target a thermal design
power low enough that *reasonable* passive cooling is
possible, you will limit _voltage_, which inherently limits
current, and these two limitations, using a given tech,
inherently limit the ceiling speed that tech can sustain
stably. I don't know what you mean by capacity, except to
the extent that a given tech and process size with a given
voltage will require X amount of current per amount of
*circuitry*. In a logical sense, limiting circuitry or
stable attainable speed within the aforementioned
limitations (given the same tech) limits performance (though
it is optimizable for certain tasks the more one specializes
in specific-function CPUs instead of broader operations like
in a PC).

Capacity is processes per second.
Thanks for that, Kony, jack
 
Radium

My dream PC is as hardware, real-time, and digital as possible. In
addition, it uses the least amount of buffering required [hopefully
none] and experiences the least amount of latency possible [again,
hopefully none].
This may be your dream but it may also be a nightmare. You also said
you wanted low power and no fan. If you want a lot of speed, you
really want a good cooling system and a whole lot of power. If you
want to make a faster system with low power, you want to make use of
things like lookup tables and hashes.
Couldn't the problem of excessive heat and large use of power be
solved [or at least mitigated] by using lower voltages while still
running things in real-time [and with the least amount of storage,
software, buffering, and latency possible] and not using fans?

No. It largely can't be avoided. If you have to do your sine function
from first principles and you want speed, you need a huge number of
operations per second.

Do you think the heat generated and power requirements will decrease
when photonic chips are available?

AFAIK, photonic circuits produce less heat than electric circuits.
However, I am aware that even when photonics becomes the norm [i.e. if
it does], electricity will still be necessary for the power supply.
 
Rob Warnock

+---------------
| (e-mail address removed) wrote:
| >>>It's current that causes heat rise (I^2R).
| >>... and voltage (accuracy of voltage control method), and frequency.
| >But isn't it the current that does the heating?
|
| No, current only flows based upon the difference in
| potential voltage. It is a result not cause.
+---------------

No, it takes *both* current & voltage to get heat!!

Look, it's like any other mechanical potential -- gravity,
springs, whatever. You have potential energy -- a difference
in height, compression of a spring, voltage between two points
in a circuit -- and then you have *work* (actual energy consumed),
which is the exertion of a force across some "distance" of potential
energy. When a rock falls off a shelf, its potential energy of
position is converted first into kinetic energy (increasing speed
as it falls), and then into thermal energy when it hits the floor.
When the rock *slides* at a near-constant speed down a rough ramp,
the same total thermal energy is released -- it's just that the
conversion from potential energy to thermal energy is more continuous.

Likewise, when an electron is sitting on one side of a charged
capacitor, it's not doing any work -- its energy is all potential.
It is not until the electron flows "downhill" from the higher
(negative) potential to the lower that its potential energy
is converted to heat (in a series of very many small increases
of kinetic energy followed immediately by conversion to thermal
energy as it bounces around inside whatever conductor it's in).
"Current" is just a count of the number of electrons flowing
per unit time (Q/t), so the more current (I) is flowing across
a larger voltage (V) the greater the dissipated thermal energy (E),
that is, E = I * V * t. Or the more familiar P = I * V, if you
want power instead of total energy. But it takes *both* voltage
and current to get heat!

Frequency only comes into it if you're charging and discharging
capacitors. The number of electrons (in Coulombs) or "mass" entrained
in a capacitor is C * V, and the voltage is V, of course, so the
total available potential energy ("mass" * "height") is C * (V^2)
[or 0.5*C*V^2, I forget]. Charging the cap stores energy in the cap
[*and* dissipates energy in the wires leading to the cap]; discharging
a cap dissipates energy in the wires leading away from the cap [due
to the current (charge/second) flowing across a potential voltage].
One full charge/discharge cycle dissipates some specific amount of
energy [never mind the exact constants], which is roughly linearly
proportional to the value of the capacitance times the square of
the voltage. Since each cycle dissipates a fixed amount of energy,
the more cycles per second you have, the more *power* (energy/second)
you dissipate. Q.E.D.
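
A back-of-envelope check of that relation, P = a * C * V^2 * f for the
switching term (the capacitance, voltage, frequency and activity
numbers below are made-up illustration values, not figures from the
thread):

#include <stdio.h>

int main(void)
{
    double C = 1e-9;   /* switched capacitance, farads (assumed) */
    double V = 1.2;    /* supply voltage, volts (assumed) */
    double f = 2e9;    /* clock frequency, hertz (assumed) */
    double a = 0.2;    /* activity factor: fraction switching per cycle (assumed) */

    printf("dynamic power at 1.2 V: %.2f W\n", a * C * V * V * f);

    /* Halving Vdd cuts the V^2 term by four, which is why lower voltage
       keeps coming up as the big lever for low-power, fanless parts. */
    V = 0.6;
    printf("dynamic power at 0.6 V: %.2f W\n", a * C * V * V * f);
    return 0;
}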


-Rob
 
kony

+---------------
| (e-mail address removed) wrote:
| >>>It's current that causes heat rise (I^2R).
| >>... and voltage (accuracy of voltage control method), and frequency.
| >But isn't it the current that does the heating?
|
| No, current only flows based upon the difference in
| potential voltage. It is a result not cause.
+---------------

No, it takes *both* current & voltage to get heat!!

Do we really have to do this Electricity 101 nonsense while
you get up to speed with what you didn't understand?

Yes both current and voltage are *present* but this is a
discussion about changing parameters.

The voltage causes the current, and the current does create
heat, but "does the heating" is a vague concept that implies
an active rather than a passive role. The poster was looking
for a distinction where there was a way to change the heat
level, and that is not a scenario where we can just impose
current limiting.

There is no active role for the current; it is only following
the input voltage. Even with the inductors in typical
motherboard supply circuits, there's voltage feedback so
even then passive current control is only a means to a
target voltage.

Remember, the poster was trying to control heat and thinking
"focus on one or the other", essentially. So the entire
point was to differentiate between both voltage and current
even though both are of course related.
 
