ARM-based desktop computer ? (Hybrid computers ?: Low + High performance ;))


Skybuck Flying

Hello,

Today Apple "released" the iPhone 4.0... I believe it has something like a
1.0 GHz processor...

I find that quite impressive, 1.0 GHz in such a small package and
non-overheating ???

Maybe too good to be true ?

I wonder what the future will bring ?...

Will we see the rise of "low power/low heat/low noise desktop computers"
being powered by ARM-based processors ?

Is this the end of Windows because it doesn't work on ARM processors ?

Can Intel Atom processors compete with ARM processors ?

What's AMD's answer to Atom and ARM ?

Can an AMD/Intel single 1.0 to 2.0 GHz core be compared to an ARM 1.0 to 2.0
GHz core ? Would they both be about as fast... or would one win over the
other ?

To me 1.0 to 2.0 GHz seems to be the magical number/milestone/border/hurdle
towards a good to great desktop experience.

For 99.9% of my daily PC activity 1.0 to 2.0 GHz would be enough... this
almost includes video processing at modest resolutions 640x480 or so...
maybe 800x600, maybe even 1024x768... further enhancements/optimizations
might enable very large resolutions too but don't count on it ;)

For 1920x1200... 4.0 GHz is probably needed to run smooth and cool
(strangely enough)... Or a really cool 2.0 GHz processor ;)

Only gaming really needs stronger graphics cards and stronger CPUs to do
more...

However software/technology does advance, so maybe I could be wrong and maybe
people will need more processing power... but I don't think so...

Therefore assuming all people need more processing power is a bit
dangerous...

A good secondary strategy is to focus on low power/low heat/low noise/weaker
processors to accommodate the non-gaming and non-high-performance
tasks/crowd ;)

I do want a low heat, low noise, low power computer, but I also want a
strong, high-performance computer which can do heavy tasks.

I would love to have a computer which can be totally quiet thanks to, for
example, an ARM processor or maybe even an Atom processor.

I would also love it if the fans only spin up when really needed, like
gaming or maybe huge videos.

Thus I guess a system which can do both would be ideal for me.

My current PC is already able to do this a little bit:

AMD Dual Core Processor and NVIDIA 7900 GTX graphics card.

But these two technologies do not take it far enough.

The processor's fan still needs to spin.

The graphics card's fan still needs to spin.

The desktop still needs fans to be constantly on... <- This is the biggest
problem probably.

Therefore what is needed is:

1. A motherboard which can control the desktop fans and even shut them down.

2. Processors/Graphics cards which can do the same.

3. Special software or special hardware which can regulate this (a small
control-loop sketch follows after this list).

4. Debuggers to make sure no evil "shut fans down during heat" is in there
to kill hardware ;)

5. Temperature meters everywhere for safety...

6. Emergency shutdown in case of emergency/accidental overheat.

7. Fan spin-up failure detection.

8. Maybe even blocked air flow detection.

9. Maybe even unacceptable noise detection and throttling of hardware to
reduce noise in return for lower performance.

10. This would require microphones, which might be too privacy-paranoid ;) So
not a good idea.

11. Maybe even built-in temperature displays in/on the desktop case to show
the temperature of the hardware at different locations
in the case, to feel "safe" :)
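
Items 1 through 7 above basically describe a small closed-loop controller. A
minimal sketch of such a loop in C, assuming hypothetical helper functions
(read_temp_celsius, read_fan_rpm, set_fan_duty and emergency_shutdown are
placeholders, not a real motherboard API):

#include <stdio.h>

/* Hypothetical hardware-access helpers -- placeholders, not a real API. */
extern int  read_temp_celsius(int sensor_id);   /* board temperature sensor  */
extern int  read_fan_rpm(int fan_id);           /* tachometer reading        */
extern void set_fan_duty(int fan_id, int pct);  /* 0 = off, 100 = full speed */
extern void emergency_shutdown(void);

#define TEMP_FANS_OFF   40   /* below this, fans may stop completely     */
#define TEMP_FANS_FULL  70   /* above this, fans run at full speed       */
#define TEMP_CRITICAL   90   /* above this, emergency shutdown (item 6)  */

void fan_control_tick(int sensor_id, int fan_id)
{
    int temp = read_temp_celsius(sensor_id);

    if (temp >= TEMP_CRITICAL) {          /* item 6: accidental overheat */
        emergency_shutdown();
        return;
    }

    if (temp <= TEMP_FANS_OFF) {
        set_fan_duty(fan_id, 0);          /* items 1-2: fans fully off   */
    } else if (temp >= TEMP_FANS_FULL) {
        set_fan_duty(fan_id, 100);
    } else {
        /* linear ramp between the two thresholds */
        int pct = (temp - TEMP_FANS_OFF) * 100 / (TEMP_FANS_FULL - TEMP_FANS_OFF);
        set_fan_duty(fan_id, pct);
    }

    /* item 7: fan was commanded on but is not spinning */
    if (temp > TEMP_FANS_OFF && read_fan_rpm(fan_id) == 0)
        printf("warning: fan %d spin-up failure\n", fan_id);
}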

Ultimately HEAT is bad though... even for the high performance situation.

HEAT is unpleasant for human beings... it can become too hot in summer.

Assuming HEAT can simply be expelled from the CASE and not be a problem could
be the wrong thing to do.

HEAT also leads to bigger fans on buildings, which is bad too.

However...

In the winter HEAT can work as a heating device... and the problem is less
of an issue... it can actually be nice.

Therefore producing more HEAT in winter is more acceptable... unless melting
the polar caps is a bad idea ! ;) :)

And yup it could be bad... many countries facing flooding ! ;) :)

So maybe ultimately HEAT = BAD = EVIL.

Try to use materials and designs which give great processing power but little
to no heat ;)

New inventions are done all the time....

Are Intel/AMD/ATI/NVIDIA up to the task ?

Or will ARM take the cookie and the cake ?! ;) :)

(Just some random thoughts of mine on the 1.0 GHz in a tiny package ;) :):):)
There is even talk of 1.5 GHz in the iPhone 5.0, wow ! ;) :))

Please feel free to comment within the lines and fill in the blanks,
misconceptions, pipe-dreams, yes/no etc ;) :)

Bye,
Skybuck =D
 

Skybuck Flying

One problem which I see people mention is:

x86 software does not work on ARM...

A solution for this problem is the following (Not my idea, but some crazy
noob ?):

An x86 compiler which compiles x86 to ARM code.

It's a bit of a crazy idea perhaps...

But x86 is an instruction set/assembly language after all as well...

And languages can be ported/translated, right ? ;)

Then for example Microsoft or the users themselves could do it.

Microsoft's Windows on ARM could detect that the executable being installed
or run is actually an x86 executable...

Windows then starts the x86-to-ARM compiler, compiles the x86 binary to an
ARM binary... saves it and then runs it.
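
To make that concrete, here is a toy sketch (not a real translator, and
certainly not how Microsoft would do it) that statically maps two x86
instructions to rough ARM equivalents: "mov eax, imm32" to "MOV r0, #imm" and
"ret" to "BX lr". The eax-to-r0 register mapping and the ret-to-BX-lr
shortcut are simplifications; a real x86-to-ARM translator would also have to
handle the whole instruction set, the x86 stack-based call/return convention,
indirect jumps, self-modifying code, and so on:

#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Emit one 32-bit ARM instruction word into the output buffer. */
static size_t emit(uint32_t *out, size_t n, uint32_t insn)
{
    out[n] = insn;
    return n + 1;
}

/*
 * Translate a tiny subset of x86 machine code to ARM machine code.
 * Handles only:  B8 imm32  (mov eax, imm32)  ->  MOV r0, #imm  (imm <= 255)
 *                C3        (ret)             ->  BX lr
 * Returns the number of ARM words emitted, or -1 on an untranslatable byte.
 */
static long translate_x86_to_arm(const uint8_t *x86, size_t len, uint32_t *arm)
{
    size_t i = 0, n = 0;

    while (i < len) {
        if (x86[i] == 0xB8 && i + 5 <= len) {          /* mov eax, imm32 */
            uint32_t imm;
            memcpy(&imm, &x86[i + 1], 4);              /* assumes little-endian host */
            if (imm > 0xFF)                            /* would need MOVW/MOVT */
                return -1;
            n = emit(arm, n, 0xE3A00000 | imm);        /* MOV r0, #imm */
            i += 5;
        } else if (x86[i] == 0xC3) {                   /* ret */
            n = emit(arm, n, 0xE12FFF1E);              /* BX lr */
            i += 1;
        } else {
            return -1;                                 /* unknown opcode */
        }
    }
    return (long)n;
}

int main(void)
{
    /* mov eax, 42 ; ret */
    const uint8_t x86[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3 };
    uint32_t arm[16];
    long n = translate_x86_to_arm(x86, sizeof x86, arm);

    for (long k = 0; k < n; k++)
        printf("0x%08X\n", (unsigned)arm[k]);  /* 0xE3A0002A then 0xE12FFF1E */
    return 0;
}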

With ARM it is possible to add additional co-processors, so maybe
co-processors could handle some x86-specific tasks... for compatibility's
sake or so...

Maybe even an ARM/x86 hybrid ! LOL :)

Or how about the ultimate crazy shit:

PowerPC/ARM/x86/Motorola/ATI/Nvidia hybrid ?! LOL.

Take the best of all or so :)

Bye,
Skybuck =D
 

SteveH

Skybuck said:
11. Maybe even built-in temperature displays in/on the desktop case
to show the temperature of the hardware at different locations
in the case, to feel "safe" :)

You forgot:
12. Tosser protection. So when confronted by a complete ****ing idiot like
you, the computer refuses to start in the first place.
 

Skybuck Flying

Well, my hardware from 2006 doesn't have all the features I would like, for
example:

The motherboard has only one temperature sensor as far as I know ?

I would like to see this increased like so:

+-----------------------------------------------------------+
|                                                           |
|   +-------------+                        Sensor D         |
|   | PCI Express |                                         |
|   +-------------+                                         |
|                                                           |
|   Sensor A               +-----+                          |
|                          | CPU |                          |
|                          +-----+                          |
|                                                           |
|                        NorthBridge                        |
|                                                           |
|   Sensor B               +--------+                       |
|                          | Memory |                       |
|                          +--------+                       |
|                                                           |
|   Sensor F               Sensor C                         |
|                                                           |
+-----------------------------------------------------------+


The CPU should already have a sensor.

So I want temperature/heat sensors on all critical/heat producing parts.


Currently there is only a temperature reader inside the CPU and inside the
GPU.

I want temperature readings outside the CPU and outside the GPU, to know if
the motherboard is taking too much heat...

I would also like to be able to see which parts of the motherboard are
becoming the hottest.

This could vary from situation to situation/case to case/cooler to cooler
etc.

Software can make a nice visualization of the motherboard to help the user
understand...

Or simply display the reading and provide a user manual that explains where
the sensors are.

The sensors could also be placed more systematically, like so (a small
read-out sketch follows after the grid):

S1 S4 S7

S2 S5 S8

S3 S6 S9
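
Reading out such a grid would be trivial for software; a minimal sketch,
where read_sensor_celsius is a hypothetical driver call standing in for
whatever the motherboard actually exposes (stubbed here with made-up values
so the demo runs):

#include <stdio.h>

/* Stand-in for a real sensor driver call -- returns made-up values here. */
static int read_sensor_celsius(int i)
{
    return 30 + (i * 7) % 25;
}

int main(void)
{
    int hottest = 0, hottest_temp = -1000;

    /* Print the sensors in the same S1 S4 S7 / S2 S5 S8 / S3 S6 S9 layout. */
    for (int row = 0; row < 3; row++) {
        for (int col = 0; col < 3; col++) {
            int i = col * 3 + row;
            int t = read_sensor_celsius(i);
            printf("S%d: %3d C   ", i + 1, t);
            if (t > hottest_temp) {
                hottest_temp = t;
                hottest = i;
            }
        }
        printf("\n");
    }
    printf("hottest spot: S%d at %d C\n", hottest + 1, hottest_temp);
    return 0;
}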

Also my hardware from 2006 doesn't fully shut down the fans ;) they just spin
slowly...

Yet you say it has already been done... I doubt it... but if I am wrong...:

Does anybody know a motherboard that has 9 temperature sensors ? Maybe even
12 ? ;)

Also my AMD X2 3800+ Dual Core CPU definitely does not detect CPU fan
spin-up failure ! ;)

It was failing every day... I tried to put some oil in it... after many
weeks it suddenly started turning again at boot time...

I guess the oil finally soaked through to the shaft ! ;) ? Or maybe
it was a BIOS failure that suddenly went away ?

Also, I rather prefer not to have big clunky heatsinks... they're just
heavy... risk of breaking the motherboard... and they don't look so nice...
they might also obstruct the airflow if things need to scale up.... Big
clunky heatsinks are definitely a NO-NO for me ;) :) =D <- They are
windscreens... windscreens are evil inside a PC ;) :) I need all the wind I
can get in my PC to cool it down... unless I am in the desert or so, which I
am not (yet) lol :)

Well, you have made some claims that some or even all of this has already
been done... I highly doubt that... but please do provide links to prove me
wrong ;)

Lastly, it's amazing to see how fast Apple has launched new products.... like
4 iPhones in just 3 years ? Plus an iPad and maybe some PC-like thingies...

Doesn't sound like much... but I think it is... it requires all of this
engineering of hardware and software... quite impressive ?!? But they
probably worked very well together with others to help them out... that's
probably quite impressive too :)

I do wonder what happened to Steve Jobs though... he's so thin ?!? Did all
that WIFI give him cancer or so ?!? WOW ?! 570 wifi base stations he said
during his recent presentation ?!? Wow.... that can't be good, me thinks ?
Can it ? :) Time will tell ;) :)

Bye,
Skybuck.
 

MooseFET

One problem which I see people mention is:

x86 software does not work on ARM...

A solution for this problem is the following (Not my idea, but some crazy
noob ?):

An x86 compiler which compiles x86 to ARM code.

Such programs already exist. It is a clever trick that is used to make
fast simulations of the ARM on a PC. Doing it the other way also can
be done. It wouldn't be super fast, but if you weren't trying to run
a complete Windows OS, it could be fast enough to be used.

Since an ARM can be had as part of an FPGA, you could add extra
stuff to the standard ARM to make the process go a little faster.
 

Andrew Reilly

Such programs already exist. It is a clever trick that is used to make
fast simulations of the ARM on a PC. Doing it the other way also can be
done. It wouldn't be super fast but if you weren't trying to run a
complete Windows OS, it could be fast enough to be used.

Back in '87 or so I had an Acorn RISC "PC", which had an ARM-2, and a "PC
emulator". It simulated an 8088 and the PC's basic hardware well enough
that I was able to use it to run a "scientific" word processor to write
my undergraduate thesis. The "feel" was about as fast as an original
4.77MHz PC, but I didn't run any benchmarks. I'm fairly sure that it
would have been a straight interpreter: the machine didn't really have
enough RAM to be mucking about with JIT compilation. This on a chip with
no cache, no 16-bit memory operations, and which ran the processor clock
at 4MHz or 8MHz depending on whether the DRAM-fetch in progress at the
time was in-page or doing a row access...

I thought it was quite a spectacular achievement.

Cheers,
 

Skybuck Flying

Check out http://www.silentpcreview.com/ -- those guys are serious about
quiet computing.

Hmm... that's mustard after the meal (too little, too late)...

Computer hardware needs to be designed from the start for low heat/low noise
and so forth... :)
But seriously, yes, Apple's execution has been impressive -- and while I
don't think that much of the man personally, one has to give credit that a
large part of it is directly linked to Jobs.

He has gained some respect from me... he seems a more honest guy than I had
expected him to be... at least in his presentations.

However if the world turns into one big cancer-infected place because of all
the mobile phones and wifi's and gsm's and so forth, then nope :)

May he rot in hell then forever as well ;) :)
No, but he had a liver transplant last year. Takes the wind out of most
everyone for awhile...

What was wrong with his ex-liver ? Cancer from the wifi ? ;) :) What did he
do with his ex-liver ? Bottle it for memories ? :p***

Ain't he afraid of getting cancer from all that wifi ?

Bye,
Skybuck.
 

Torben Ægidius Mogensen

Back in '87 or so I had an Acorn RISC "PC", which had an ARM-2, and a "PC
emulator".

That must have been the Archimedes A310. The RISC PC did not come out
until some time in the 90's, and this used an ARM610.
It simulated an 8088 and the PC's basic hardware well enough
that I was able to use it to run a "scientific" word processor to write
my undergraduate thesis. The "feel" was about as fast as an original
4.77MHz PC, but I didn't run any benchmarks. I'm fairly sure that it
would have been a straight interpreter: the machine didn't really have
enough RAM to be mucking about with JIT compilation.

It was, indeed, an interpreter, but of the 80186 instruction set. File
transfers etc. were a lot faster than on a 4.77MHz PC, but a few things
were a bit slower. The overall speed was fine for running the
occasional DOS application (I used it mostly for games), but for serious
work, you would use native applications.
This on a chip with no cache, no 16-bit memory operations, and which
ran the processor clock at 4MHz or 8MHz depending on whether the
DRAM-fetch in progress at the time was in-page or doing a row
access...

I thought it was quite a spectacular achievement.

Indeed it was. Nowadays, you would use a JIT (similar to Digital's
fx!32), so the speed would be better. ARM uses arithmetic flags similar
to x86, so it is easier for ARM to emulate x86 efficiently than it is
for, say, MIPS to do so.
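
To illustrate that point with a rough sketch (in C rather than assembly): an
emulator on a flag-less host such as MIPS has to reconstruct the x86
arithmetic flags by hand after every flag-setting instruction, whereas on ARM
a single ADDS already produces N/Z/C/V bits that map closely onto x86's
SF/ZF/CF/OF.

#include <stdint.h>

struct x86_flags { int cf, zf, sf, of; };

/* Emulate a 32-bit x86 ADD and reconstruct its arithmetic flags explicitly,
 * the way a flag-less host would have to.  On ARM, one ADDS instruction
 * yields equivalent C/Z/N/V flags as a side effect of the addition. */
uint32_t emulate_add32(uint32_t a, uint32_t b, struct x86_flags *f)
{
    uint32_t r = a + b;

    f->cf = r < a;                               /* carry: unsigned overflow */
    f->zf = (r == 0);                            /* zero flag                */
    f->sf = (r >> 31) & 1;                       /* sign bit of the result   */
    f->of = ((~(a ^ b) & (a ^ r)) >> 31) & 1;    /* signed overflow          */
    return r;
}

Doing this after every ADD, SUB, CMP and so on is a big part of the overhead
that a flag-compatible host avoids (with the usual caveat that ARM's carry
after subtraction is inverted relative to x86's borrow).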

Torben
 

Robert Myers

I guarantee you that, whatever the potential health hazards posed by WiFi,
GSM, etc. may be, there are orders of magnitudes more lives saved by wireless
technology than lost due to it.

I don't know. How often do you drive around people who drive while
using a wireless gadget? I think I'd want to do some research before
making any guarantees.

Robert.
 

Skybuck Flying

Joel Koltner said:
I guarantee you that, whatever the potential health hazards posed by WiFi,
GSM, etc. may be, there are orders of magnitudes more lives saved by
wireless technology than lost due to it.

What kind of hog-wash is this ? :)

Bye,
Skybuck.
 

nik Simpson

[And, for crying out loud, Steve Jobs did *not* invent cellphones or
WiFi, and I don't know of any evidence to suggest that the
availability of the iPhone has increased cell-phone usage above what
it would have been if the iPhone had never existed. You really ought
to have a good reason to issue oaths of damnation against somebody!]

Given the well documented problems of the iPhone on AT&T's network, it
may have even reduced the number of calls, well completed calls anyway ;-)
 

Skybuck Flying

What kind of hog-wash is this ? :)

I suspect that Joel was referring to (e.g.) the number of lives which
have been saved, because somebody was able to call for help quickly on
a cellular telephone, rather than having to drive five miles down the
road to the nearest vandalized payphone. Getting help on the way one
or two minutes faster makes a big difference in the survival rates for
severe trauma, heart attacks, etc.

Ok valid points, but I prefer not to get into such situations in the first
place:

1. Accidents along the road: don't be on the road, don't be in a car, or
plane or bus etc.

2. Stay healthy, eat healthy, breathe healthy, be healthy.

Skybuck, before you go accusing WiFi and cellphones and wireless in
general of causing cancer, you really ought to do some actual
*research* on the subject, OK? Go look up the actual studies
published in the last five years, and see if there's any real
correlation between the use of these technologies, and the incidence
of cancer in their users.

I suspect that the energy in the wifi/gsm/wireless signals goes through the
human body and might trigger DNA changes in certain cells/parts of the body.

The problem is that scientists probably can't scan the entire body for these
changes ?!

If they could scan for such changes then maybe they could prove that
wireless energy is indeed causing DNA changes and therefore could increase
the risk of cancer.
I realize that actually doing research (even second-hand) would take
time away from gaming... but you might find it enlightening enough to
be worthwhile.

I don't game as much as I used to... mostly because pirates like me have
been cut off from online gaming ;) :)

And also maybe I've grown out of it a bit ;)

Of course I will play Doom 4, Quake 5, Crysis 2, Battlefield 3, Call of
Juarez 3 (maybe), Alien vs Predator 2 remake, Company of Heroes 4 (maybe
even the free online version) and Red Alert 4 ;) (Mostly/especially if they
have new graphics technology ;) and perhaps even sound technology ;))

Those games are mandatory ! =D
[And, for crying out loud, Steve Jobs did *not* invent cellphones or
WiFi, and I don't know of any evidence to suggest that the
availability of the iPhone has increased cell-phone usage above what
it would have been if the iPhone had never existed. You really ought
to have a good reason to issue oaths of damnation against somebody!]

Steve Jobs is promoting iPhones/iPads/iPods/iMacs and whatnot... and
promoting all this wireless stuff... without even blinking about it or
thinking about it.
(And of course integrating it into his products without even blinking or
thinking about it ;))

So seriously, he is to blame as well...

He would go free/to heaven if he did the opposite: warned people to watch
out for those wireless signals ! ;)

But nope... none of that... so that makes him guilty in my book ! ;) :)

Perhaps his own "inventions" will take care of him for all of us...

Perhaps he is driving a nice drive-by-wire car and soon he will face death
by wireless energy ****ing up his CAR !

Such irony would be sweet ! =D LOL.

Bye,
Skybuck ;) =D
 

Skybuck Flying

nik Simpson said:
[And, for crying out loud, Steve Jobs did *not* invent cellphones or
WiFi, and I don't know of any evidence to suggest that the
availability of the iPhone has increased cell-phone usage above what
it would have been if the iPhone had never existed. You really ought
to have a good reason to issue oaths of damnation against somebody!]

Given the well documented problems of the iPhone on AT&T's network, it may
have even reduced the number of calls, well completed calls anyway ;-)

Yeah good point as well..

Bad wireless/mobile phone service might actually cause lives to be lost...

Instead of trying to get the damn mobile phone working, which of course
fails...

One could have gone to the nearest real phone, gotten some decent service
and saved lives ! ;)

Bye,
Skybuck =D
 

Skybuck Flying

What I am worried about is that wifi/gsm/umts signals might become the
"asbestos" of the 21st century.

Asbestos is very dangerous and carcinogenic; if only they had known better
back in those days, it wouldn't have become such a major problem/plague.

It seems none of the lessons of asbestos have been learned by the
electronics industry.

And no, I do not feel guilty about downloading games which I know I would
never have bought anyway...

Bye,
Skybuck.
 

Robert Myers

Yeah, you and plenty of other people.

It's been extensively researched; the results are generally somewhere between
"it seems quite harmless" and "pretty inconclusive, really hard to say."  So
while no one would suggest it's 100% certain that such low-energy EM waves are
harmless, it does seem pretty clear that if they do create harm, it's a very,
VERY small risk in the grand scheme of things.

At some point you have to decide if the conveniences of modern technology are
worth the risk given science's best assessment of what those risks are.  None
of your great-great-great grandparents was ever killed by driving down an
interstate highway too fast... although it wouldn't have been unheard of for
them to be killed by something as simple as a relatively small cut on, say,
their foot while walking along a beach that then became infected and
eventually killed them.  But just as surely as they'd love to have had
penicillin -- thereby decreasing their risk of death -- they just as surely
would have liked automobiles, despite the well-known increase in the risk of
death from them (especially for young guys like *you*, Skybuck!).

Does it bother you to stand in front of a light bulb?  You're getting
*hundreds* of watts there at *many terahertz* after all... makes your WiFi
gear seem absolutely puny!

As a professor of chemistry at one of our lesser local institutions
pompously informed me, photons from cell phone radiation aren't strong
enough to break the relevant chemical bonds and thus lead to possible
mutations (like, gosh, I never would have known that from all my time
studying physics). What apparently didn't occur to him is that the
structural kinetics of proteins *could* be affected by the relatively
low-energy but coherent radiation from wireless devices. It would be
really premature to conclude that there is no risk. People were dying
of bovine spongiform encephalopathy (mad cow disease) in significant
numbers by the time clinicians opened their minds wide enough to
accept prions as a cause.
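
For a rough sense of the scale involved: a 2.4 GHz WiFi photon carries about

  E = h * f ~= (4.14 x 10^-15 eV s) * (2.4 x 10^9 Hz) ~= 1 x 10^-5 eV,

while breaking a typical covalent bond takes a few eV -- roughly five orders
of magnitude more -- which is exactly why any conjectured mechanism would
have to run through collective effects like protein kinetics or heating
rather than direct, single-photon bond breaking.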

I agree that the kind of wild speculation you are responding to is
unhelpful, but the history of environmental hazards to health is
littered with premature dismissals of potential risks.

Robert.
 

SteveH

Skybuck said:
I suspect that the energy in the wifi/gsm/wireless signals go through
the human body and might trigger DNA changes to certain cells/parts
of the body.
You must have been near some really high powered wifi then.
 

Jasen Betts

Is this the end of Windows because it doesn't work on ARM processors ?

WinCE.


 

Maddoctor

Skybuck Flying said:
<snip: full quote of the original post>

I'm pretty sure AMD has allowed nVIDIA to integrate its GPUs with AMD
processors, especially the family 10h core. The main cause was that Dirk
Meyer hates Intel.



 

nik Simpson

I'm pretty sure AMD has allowed nVIDIA to integrate its GPUs with AMD
processors, especially the family 10h core. The main cause was that Dirk
Meyer hates Intel.

Pretty sure you are wrong there, AMD has its own graphics business that
competes directly with NVidia, so making life easy for NVidia would come
under the "cutting off your nose to spite your face" category.
 

Maddoctor

No, AMD believes competition is good. AMD wants to secure the discrete GPU
market by allowing nVIDIA to do its own APUs.
nik Simpson said:
Pretty sure you are wrong there, AMD has its own graphics business that
competes directly with NVidia, so making life easy for NVidia would come
under the "cutting off your nose to spite your face" category.



 
