AMD planning 45nm 12-Core 'Istanbul' Processor?


Gary Colligan

krw said:
What architecture? You grossly underestimate the x86 inertia.


Intel did too, but had no interest in pushing it forward to product.

They did, but were told by MS to dump it and use the AMD64 code.
MS did not want to make two versions of Windows.



 

Scott Lurndal

Wes Newell said:
Funny, that's not how I recall it. Intel dropped their x86-64 bit plans
after trying to push it onto Microsoft, and Microsoft telling them to
shove off. I think this link will get more to the truth.

From a history perspective, the P7 circa 1996 was to be the 64-bit follow-on
to the ia32 architecture. Then Intel shifted gears and joined with HP
to merge the P7 with some stuff at HP, producing Itanium. Itanium _was_
Intel's 64-bit story (with the 32-bit x86 support in the processor). However,
Merced was late and slow and AMD did x86_64 and Intel was forced to include
it.

scott
 

Robert Myers

MSFT has been seduced down a dead-end. Sic transit gloria mundi.

From your mouth to God's ear. Vista did it, as far as I'm
concerned. I was willing to pay my Bill Gates tax, no matter how much
I resented it. No more.

**Sigh** I *will* learn more about codecs and, if necessary, pay for
legal copies for Linux before I will pay one penny for Vista.

Robert.
 

krw

Funny, that's not how I recall it. Intel dropped their x86-64 bit plans
after trying to push it onto Microsoft, and Microsoft telling them to
shove off. I think this link will get more to the truth.

Only after it was clear that AMD64 was going to happen whatever
Intel did, did Intel try to get in front of the train (to derail
it). M$ didn't see that in their interest either.
 

Wes Newell

That was later. Intel had explored 64-bit extensions to X86 years
earlier. (How could they not? CPU's are their business.)

Of course they had. But they didn't want 64-bit to come out for x86 to
compete with Itanium. And by trying to protect it, they basically screwed
themselves. Only after AMD released theirs did they try to bring their x86
64-bit code out. MS had already done a version of Windows for Itanium and
weren't going to play Intel's games at their expense.
 

Scott Lurndal

Zootal said:
PIII obsolete? Hmm...trivia of the day - what microarchitecture did Intel
base the core on? And what microarchitecture does the core have little if
anything to do with?

Former: Netburst, IIRC.
Latter: Anything other than netburst?

scott
 

Robert Redelmeier

In comp.sys.ibm.pc.hardware.chips Zootal said:
PIII obsolete? Hmm...trivia of the day - what microarchitecture
did Intel base the core on? And what microarchitecture does the
core have little if anything to do with?


The Pentium III was little more than a Pentium II with the L2
cache chips integrated on-die and running faster. The P2 was
little more than a slot repackaging of the PentiumPro which was
a completely new effort for Intel having nothing in common with
the original Pentium and PentiumMMX.

In many ways the P4 has nothing in common with the P3 or Core,
and looks much more like a dressed-up, overclocked original Pentium.

-- Robert
 

Robert Myers

ROTFL!

You got things 180 degrees reversed from reality. The reality is that
making software development harder won't make a better software product, nor
will it influence software development cycles.

This is a relatively common misconception among those who're clueless about
software development: that the length of development cycles per se has any
meaningful effect on final product quality.

You made an erroneous inference from what I wrote because you
seriously underestimated how little I think of software developers.
If software developers are *slowed down*, there will be less bad
software because there will be less software.

It didn't *have* to turn out this way, but it did, and *you* are part
of the problem because, apparently, you think you know how to write
good software using languages and tools currently in use.

I may be wrong. If you are writing in a language and using tools that
allow checking of your programs for formal correctness, and if you
actually use those tools, please accept my apologies. Otherwise, you
are just another member of the club of gunslingers that call
themselves software developers and talk big, probably because they've
spent too much time blowing people away in video games.
There is a multitude of software
systems where there are strong requirements of simultaneous high quality
and short development cycles. And those requirements are met.

rotfflmao. The extra f is intentional, as I'm sure your humor is not.
Importance of
cycle time is conditional on actual development methodology employed.

Once again, if you are using tools and methods that practically no one
actually uses, please accept my apologies. If you are relying on your
own personal brilliance and rigor, or that of your colleagues, you are
deluding yourself.

Robert.
 

Sebastian Kaliszewski

Robert said:
You made an erroneous inference from what I wrote because you
seriously underestimated how little I think of software developers.

Who cares what some clueless newsgroup poster thinks.

If software developers are *slowed down*, there will be less bad
software because there will be less software.

It didn't *have* to turn out this way, but it did, and *you* are part
of the problem because, apparently, you think you know how to write
good software using languages and tools currently in use.

I may be wrong.

You are.

If you are writing in a language and using tools that
allow checking of your programs for formal correctness, and if you
actually use those tools, please accept my apologies. Otherwise, you
are just another member of the club of gunslingers that call
themselves software developers and talk big, probably because they've
spent too much time blowing people away in video games.

You're clueless about the software development process. Contrary to you, I know
how to do formal verification of software and do it when it's *needed*.
The important point here is that it's *needed* exceptionally rarely. That's
because formal verification:
1. is very expensive
2. does not guarantee total correctness -- it only reduces the chances of error,
and that reduction is in reality directly proportional to the cost increase.

It makes sense only in life-critical systems (where the cost of any hard error
is measured in millions). There it's in fact the preferred way (and the
cheapest one, as Lockheed's experience shows). But in other situations it's
simply too expensive.

rotfflmao. The extra f is intentional, as I'm sure your humor is not.

Go buy a clue and stop making an idiot of yourself publicly.

Once again, if you are using tools and methods that practically no one
actually uses, please accept my apologies. If you are relying on your
own personal brilliance and rigor, or that of your colleagues, you are
deluding yourself.

No, I'm relying on the cost of software production vs. the cost of errors times
their expected probability of occurrence during the lifetime of the
software.
Making software immune to an error costing $1e+9, with probability greater than
1-1e-4 (that is, four nines) over a 10-year lifetime, should not cost more
than $1e+5 (that's about 1 man-year spent on improving quality). If it
costs more, it's time to cut corners and accept a higher error probability to
reduce production cost.
And even then it's pointless to run that software on hardware with a fault
probability worse than that of the software (i.e. not for home use, nor a small
company -- your standard PC will die with probability well above 0.5, not
below 0.0001, within that timeframe, even if you're doing regular
professional system maintenance and exchanging all parts on a planned schedule).
Then all of that must be immune to operator errors -- also impossible in
a home or small and mid-sized business environment.
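To make the expected-value arithmetic above concrete, here is a minimal sketch
(Python; the figures are simply the ones quoted in the post, not real project
data). The break-even point for spending on quality is the cost of a failure
times its allowed probability over the product's lifetime:

    # Minimal sketch of the expected-value argument above.
    # Figures are the ones quoted in the post, not real project data.

    cost_of_failure = 1e9    # $ lost if the error ever bites
    p_failure       = 1e-4   # allowed probability of that error over a 10-year lifetime
    expected_loss   = cost_of_failure * p_failure   # $1e5

    hardening_cost = 1e5     # roughly one man-year spent on extra quality work

    # By this argument, spending more on prevention than the expected loss
    # is not worth it; spending less is.
    if hardening_cost <= expected_loss:
        print(f"Harden: ${hardening_cost:,.0f} <= expected loss ${expected_loss:,.0f}")
    else:
        print(f"Cut corners: ${hardening_cost:,.0f} > expected loss ${expected_loss:,.0f}")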

So what you propose is a horrendous waste of time and money. It makes no
sense whatsoever.


Sebastian Kaliszewski
 

Robert Myers

Robert Myers wrote:
Who cares what some clueless newsgroup poster thinks.
In my years on this planet, and, more recently in dealing with usenet
hotheads, I've learned not to rely on my own knowledge and experience
when I don't have to.

Even so:

1. I am not clueless.
2. The people I talk to are even less clueless than I am.

1. is very expensive
2. does not guarantee total correctness -- it only reduces the chances of error

Given the choice between something with solid mathematical foundations
and something that relies exclusively on someone like you, I'd choose
the former every single time. As to the cost, when I said we'd be
better off with less software, I meant it, and I still mean it.

You don't have to rely on my "clueless" opinion. I've posted on the
subject of the costs of our current setup, with citations. I've had
public discussions with software development managers in critical
applications who lamented exactly as I did. Well he should have, a
software boner had just cost his high-profile publicly-traded company
big bucks. As it is, with an internet, *everything* is a critical
application.

Know Edsger Dijkstra? Want to call him an idiot, too? Just to be
sure, I googled your name to see if you, too, had won a Turing
Award. If so, Google missed it.
No, I'm relying on the cost of software production vs. the cost of errors times
their expected probability of occurrence during the lifetime of the
software.
Making software immune to an error costing $1e+9, with probability greater than
1-1e-4 (that is, four nines) over a 10-year lifetime, should not cost more
than $1e+5 (that's about 1 man-year spent on improving quality). If it
costs more, it's time to cut corners and accept a higher error probability to
reduce production cost.
And even then it's pointless to run that software on hardware with a fault
probability worse than that of the software (i.e. not for home use, nor a small
company -- your standard PC will die with probability well above 0.5, not
below 0.0001, within that timeframe, even if you're doing regular
professional system maintenance and exchanging all parts on a planned schedule).
Then all of that must be immune to operator errors -- also impossible in
a home or small and mid-sized business environment.
You don't need to recite to me justification for the grubby economics
of software development. Everyone lives with it. The part of the
calculation you don't want to do is the risk that's pushed off onto
someone else in the indefinite future. If you were forced to buy
insurance against that risk, you'd damn well do what I've proposed.
As it is, you can afford to let it slide, because the risk-taker is
always the end user. That's how this shameful business is set up, and
I wish you'd quit bragging about how good you are at it. There are
good loan sharks, too.

Robert.
 

Sebastian Kaliszewski

Robert said:
In my years on this planet, and, more recently in dealing with usenet
hotheads, I've learned not to rely on my own knowledge and experience
when I don't have to.

Even so:

1. I am not clueless.

You're probably not clueless about theory. You're apparently (as you
demonstrated many times here) clueless about real life practice. In theory
there is no difference between theory and practice, but in practice there
is :)
2. The people I talk to are even less clueless than I am.

No one here knows who you talk to. You only talked about Edsger Dijkstra,
who has been resting in peace (requiescat in pace) for about 6 years. I presume
you don't talk to dead people; otherwise we should move to another newsgroup
(sci.parapsychology or something).
Given the choice between something with solid mathematical foundations
and something that relies exclusively on someone like you, I'd choose
the former every single time.

Now put price tags on those choices. For example, $100,000 vs. $1,000. Then the
choice is between having anything and having nothing.
As to the cost, when I said we'd be
better off with less software, I meant it, and I still mean it.

Yeah and there is no market for more than three computers.

If you're better off with that, then throw your computer out, as there would
be no software to run on it.

To make formal verification possible, one has to have a formal specification to
begin with. And that formal specification must be right and error-free
itself. Unfortunately for you and others like you, for the majority of stuff out
there there is no formal specification and, even worse, there are no known
means to phrase one. The possibility of getting something like a more-or-less
formal specification exists in the aerospace industry and for other people
dealing with big, dangerous stuff. And that specification is possible only for
the core process, not the interface. Bad human interfaces have already cost
hundreds or even thousands of lives, yet nobody knows how to define that formally.
You don't have to rely on my "clueless" opinion. I've posted on the
subject of the costs of our current setup, with citations. I've had
public discussions with software development managers in critical
applications who lamented exactly as I did.

Critical how? Life? Business? Mission? There is no plain "critical"; one must
first quantify it.
Well he should have, a
software boner had just cost his high-profile publicly-traded company
big bucks.

Oh, terrible. Maybe even better would be to create a software police and shoot
them all. Hey, it's capitalism out there. If those boners make big money
with a crap product, go and make yours. It's going to be really good stuff, not
crap, so it should sell and drive all those idiots out of the market, right?
As it is, with an internet, *everything* is a critical
application.

Absolute nonsense.
Know Edsger Dijkstra? Want to call him an idiot, too? Just to be
sure, I googled your name to see if you, too, had won a Turing
Award. If so, Google missed it.

Neither have I a Nobel prize nor a Fields medal. But what has that to do with
the theme at hand? The practical reality?
Prof. Dijkstra quickly escaped from his brief industry adventure back to
academia, right back in the sixties. Later on he didn't even use computers for
a long time (he finally used a Mac, just for web and email). He is a perfect
example of a great theoretician, like Einstein, Gauss and others. He is one of
those greats who create theoretical foundations but leave the practical stuff
to others.

You don't need to recite to me justification for the grubby economics
of software development. Everyone lives with it. The part of the
calculation you don't want to do is the risk that's pushed off onto
someone else in the indefinite future. If you were forced to buy
insurance against that risk, you'd damn well do what I've proposed.

Nope. It so happens I develop software for those who certainly can count,
estimate risk, etc. (i.e. financial institutions). They do accept the risk
and are perfectly aware of that. And they certainly do not want formal
verification for the vast majority of stuff. They don't want it for a very
simple reason -- money; they don't want to waste it.
As it is, you can afford to let it slide, because the risk-taker is
always the end user.

Nope. It's now clear that you have no clue about reality and only spread
urban legend type misconceptions.
That's how this shameful business is set up, and
I wish you'd quit bragging about how good you are at it. There are
good loan sharks, too.

And now go, hide in your cave and live in your imagination.

Sebastian Kaliszewski
 

Robert Myers

You're probably not clueless about theory. You're apparently (as you
demonstrated many times here) clueless about real life practice. In theory
there is no difference between theory and practice, but in practice there
is :)


No one here knows who you talk to. You only talked about Edsger Dijkstra,
who has been resting in peace (requiescat in pace) for about 6 years. I presume
you don't talk to dead people; otherwise we should move to another newsgroup
(sci.parapsychology or something)
If you're interested in computational theory, you find others who are
likewise interested. The list of names is rather short, so the same
names keep coming up over and over again. Software security and
reliability is a *big* concern right now. I'll just leave it at that.
Now put price tags on those choices. For example, $100,000 vs. $1,000. Then the
choice is between having anything and having nothing.


Yeah and there is no market for more than three computers.
This business has a long history of preposterous assessments by those
in the business, and you just made another. The Internet almost
certainly would not be what it has become.
If you're better off with that, then throw your computer out, as there would
be no software to run on it.
There'd be a lot *less* software, but there wouldn't be no software.
To make formal verification possible, one has to have a formal specification to
begin with. And that formal specification must be right and error-free
itself. Unfortunately for you and others like you, for the majority of stuff out
there there is no formal specification and, even worse, there are no known
means to phrase one. The possibility of getting something like a more-or-less
formal specification exists in the aerospace industry and for other people
dealing with big, dangerous stuff. And that specification is possible only for
the core process, not the interface. Bad human interfaces have already cost
hundreds or even thousands of lives, yet nobody knows how to define that formally.
That's the way the industry has developed. It is not inevitable.
Critical how? Life? Business? Mission? There is no plain "critical"; one must
first quantify it.
It would be silly of me to try to invent disaster scenarios on Usenet
since there are already people burning big taxpayer bucks spinning
more elaborate yarns than I ever could.
Oh, terrible. Maybe even better would be to create a software police and shoot
them all. Hey, it's capitalism out there. If those boners make big money
with a crap product, go and make yours.

Those are clearly your values.
It's going to be really good stuff, not
crap, so it should sell and drive all those idiots out of the market, right?
The companies I admire the most make "really good crap." They're not
bad people, and I can only envy their competence. I just don't
approve of the risks being built into the enterprise, whether it's to
financial markets, individuals on the Internet, the military, or any
other place where the person to pay the price isn't the person who
"estimated" the risk.
Absolute nonsense.
I've already explained this.

http://www.packetstormsecurity.net/
Neither have I a Nobel prize nor a Fields medal. But what has that to do with
the theme at hand? The practical reality?
Prof. Dijkstra quickly escaped from his brief industry adventure back to
academia, right back in the sixties. Later on he didn't even use computers for
a long time (he finally used a Mac, just for web and email). He is a perfect
example of a great theoretician, like Einstein, Gauss and others. He is one of
those greats who create theoretical foundations but leave the practical stuff
to others.
Dijkstra, as you may know, was a big advocate of formal verification.
No one knows what the world of software development would be like if
the world had followed his advice. Your estimates of costs are just
numbers pulled out of the air, because no one knows what the costs
would look like if the methodologies were widely used.
Nope. It so happens I develop software for those who certainly can count,
estimate risk, etc. (i.e. financial institutions).

hahahahahahahahaahahaha.

hahahahaahahahahahahaha.

hahahahahahaahahahahaha.

Long Term Capital Management.

Practically every player in financial markets today.

It's questionable, highly questionable, if they know even how to
estimate risk in a way that prevents catastrophic events or even
limits catastrophic events to what is theoretically possible.

It's true. We'll never know what role, if any, software glitches play
in creating chaos, because there are even bigger problems.

The mafia doesn't care about software quality, either, I'm sure.

You're in the right business, that's for sure.
They do accept the risk
and are perfectly aware of that.

No they don't. The taxpayers accept the risk. If you have a
bottomless backup for your Ponzi scheme, risk is only a matter of
appearances. What you really mean to say is that financial markets
know how to keep up appearances so that it will seem, when their
mistakes become obvious, that it isn't really their fault. Too bad
the rest of us can't live that way.

And they certainly do not want formal
verification for the vast majority of stuff. They don't want it for a very
simple reason -- money; they don't want to waste it.
Time.


Nope. It's now clear that you have no clue about reality and only spread
urban legend type misconceptions.
That's so funny. IBM (International _Business_ Machines) invented the
whole idea. How could they get away with it?

They had something that others wanted so badly that others had to
accept it on the terms they offered, no matter that no other product
in the world has ever been sold that way. Microsoft picked up,
tightened up, and expanded the idea to ridiculous extremes. Now, if
your Real Player becomes a gateway for criminal activity, it's your
problem, not theirs. And that, sir, is the sense in which all
applications are critical applications.

It might be interesting to examine how this transfer of risk took
place. Doctors stumble under the cost of malpractice insurance.
Lawyers get sued by disgruntled clients. Manufacturers of real goods
get sued from here to kingdom come based sometimes on the most
preposterous theories.

Software developers? Here it is, buddy. Take it or leave it.

Robert.

 

Sebastian Kaliszewski

Robert said:
If you're interested in computational theory, you find others who are
likewise interested. The list of names is rather short, so the same
names keep coming up over and over again. Software security and
reliability is a *big* concern right now. I'll just leave it at that.

But theoretical solutions work in theory. Practical ones, while based on
theory, implement that theory in reality. And the problem here is that
theory is severely lacking in this field.

This business has a long history of preposterous assessments by those
in the business, and you just made another.

ROTFL! You didn't even get the joke... That statement is not mine, it's 60
years old...
The Internet almost
certainly would not be what it has become.

That's for sure. It would look like it did back in the eighties at best.
There would be a few universities and military institutions connected.

But funnily enough, there would be relatively few security concerns, as
there were 20 years ago.
There'd be a lot *less* software, but there wouldn't be no software.

There would be *no* software for personal computers. Period.
As there would be no operating system usable by the average non-specialist.

That's the way the industry has developed. It is not inevitable.

It *is* inevitable. For example, we don't have such basic stuff as
mathematical tools to describe important aspects of systems like
human-machine interaction.

Show me a mathematical formula describing a good user interface.

It would be silly of me to try to invent disaster scenarios on Usenet
since there are already people burning big taxpayer bucks spinning
more elaborate yarns than I ever could.

IOW you don't know what you're talking about.
Those are clearly your values.

I lived under socialism long enough to say f..k off to every idiot proposing
it. Or I'd rather send them for 1-2 years to live as a normal citizen (not
an honoured guest but a plain normal citizen) in some socialist country. It
straightens out one's view on many things very well.

The companies I admire the most make "really good crap." They're not
bad people, and I can only envy their competence. I just don't
approve of the risks being built into the enterprise, whether it's to
financial markets, individuals on the Internet, the military, or any
other place where the person to pay the price isn't the person who
"estimated" the risk.

Reality bites. It's simply, plainly impossible to make only the risk estimators
take the risk and no one else. Otherwise everyone would just be a risk-estimation
specialist, and there would be no one to make the actual product to begin with :)

It's an illogical utopia.

So what?

Funnily enough, you show that link. It so happens that the first security advisory
there says: "Luciano Bello discovered that the random number generator in
Debian's openssl package is predictable."

That fits nicely here. You want to specify a good random number generator and
implement it according to the specification. So show how to describe one
formally, as neither "good" nor "predictable" is usable.
Well, if you can, there is $1 million lying out there :) As it so happens,
a mere proof of existence of such a good pseudorandom number generator would
mean nonuniform-P is a proper subclass of (i.e. is different from) NP, and it's
a simple fact that P is a subclass of nonuniform-P (no one knows if it's proper
or not, but it does not matter here), which would mean that P <> NP, and
there is a nice round $1 million waiting for whoever proves (or disproves)
that :)

So much for formal verification of security :)

It's all based on beliefs in the nonexistence of shortcuts, the quality of PRNGs,
and such stuff. Those beliefs are supported by testing but are not
formally proven.
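To illustrate what "predictable" meant in that Debian advisory (the patched
openssl reportedly ended up with little more than the process ID as seed
material, so only about 32,768 distinct output streams were possible), here is
a minimal, hypothetical Python sketch; it uses Python's random module as a
stand-in, not OpenSSL's actual generator, and the numbers are illustrative only:

    import random

    # Hypothetical stand-in for a broken generator: the only entropy is the
    # process ID, as reported for the 2008 Debian openssl bug, so there are
    # only ~32k possible output streams.  Python's random module is used here
    # purely for illustration; it is not OpenSSL's PRNG.
    def weak_keystream(pid, nbytes=16):
        rng = random.Random(pid)          # seed space: 1..32768
        return bytes(rng.randrange(256) for _ in range(nbytes))

    # A "victim" generates key material from an unknown PID...
    secret_pid = 12345
    observed_key = weak_keystream(secret_pid)

    # ...and an attacker simply tries every possible seed.
    recovered = next(pid for pid in range(1, 32769)
                     if weak_keystream(pid) == observed_key)
    print("recovered seed:", recovered)   # 12345 -- 32k guesses, not 2**128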

Dijkstra, as you may know, was a big advocate of formal verification.

But he didn't suggest stopping the making of other software. He wanted to improve
methods as theory covering things better was developed.
No one knows what the world of software development would be like if
the world had followed his advice.

The world has no tools to follow his advice. His advice can be followed in
relatively simple, closed systems like a mechanical device or some process
control logic.
Your estimates of costs are just
numbers pulled out of the air, because no one knows what the costs
would look like if the methodologies were widely used.

Nope. The difference would be a small constant factor. Tools like SPARK
are out there, and while the specialists who have mastered them are rare and
thus their work must be well paid, this is just a small constant difference
(small meaning around 3).

It is only usable in systems where the level of trust must be so high that
testing would be too costly. And it so happens that those systems are
possible to specify well, as they are relatively simple and closed.
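For readers who have never seen contract-style specifications, a rough analogy
in Python follows (this is not SPARK, only a sketch): the precondition and
postcondition below are the kind of properties a SPARK-style toolchain tries to
prove statically for all inputs, whereas the asserts here merely check them on
the inputs you happen to run.

    # Rough analogy only: a SPARK-style tool proves such contracts statically
    # for every possible input; plain asserts merely check them at run time.
    def clamp(value: int, low: int, high: int) -> int:
        """Return value limited to the range [low, high]."""
        assert low <= high                      # precondition
        result = min(max(value, low), high)
        assert low <= result <= high            # postcondition
        return result

    print(clamp(250, 0, 100))   # 100
    print(clamp(-5, 0, 100))    # 0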
hahahahahahahahaahahaha.

hahahahaahahahahahahaha.

hahahahahahaahahahahaha.

Long Term Capital Management.

Practically every player in financial markets today.

You only show your lack of grip on reality.
It's questionable, highly questionable, if they know even how to
estimate risk in a way that prevents catastrophic events or even
limits catastrophic events to what is theoretically possible.

You compare stuff which is by its very nature unpredictable (if someone could
predict it, that prediction would influence it in an unpredictable way) with
stable (unchanging) stuff with measurable properties.
It's true.

It's nonsense. Nonsense which could only be produced by someone who does not
understand the things he is talking about.
We'll never know what role, if any, software glitches play
in creating chaos, because there are even bigger problems.

The mafia doesn't care about software quality, either, I'm sure.

You're in the right business, that's for sure.

You're talking about things you're completely clueless about, that's for
sure.

No they don't. The taxpayers accept the risk.

Nonsense.
[rest of clueless babbling snipped]


Time is a resource like the others. All resources cost money.

That's so funny. IBM (International _Business_ Machines) invented the
whole idea. How could they get away with it?
They had something that others wanted so badly that others had to
accept it on the terms they offered, no matter that no other product
in the world has ever been sold that way. Microsoft picked up,
tightened up, and expanded the idea to ridiculous extremes. Now, if
your Real Player

ROTFL! If only Real Player was a MS product...
becomes a gateway for criminal activity, it's your
problem, not theirs. And that, sir, is the sense in which all
applications are critical applications.

ROTFLMAO!
You don't know squat about what a critical application is!

Or maybe a hammer I can buy at any farmer's shop is a critical device, as it
could be used for criminal activity.

It might be interesting to examine how this transfer of risk took
place. Doctors stumble under the cost of malpractice insurance.

Don't extrapolate your local problems to the rest of the world.
That's the main reason you have to pay 6 times more per stupid dentist visit
than I do. And it's not the cost of drugs and tools at all, as those are more
expensive here. But in our hospitals you won't find an entire floor filled with
lawyers.
Lawyers get sued by disgruntled clients. Manufacturers of real goods
get sued from here to kingdom come based sometimes on the most
preposterous theories.

And who pays for all that? That's you and all the other customers.

Software developers? Here it is, buddy. Take it or leave it.

Exactly. Take it or leave it. I prefer that for other things as well.


Sebastian Kaliszewski
 

Robert Myers

Robert Myers wrote:

But theoretical solutions work in theory. Practical ones, while based on
theory, implement that theory in reality. And the problem here is that
theory is severely lacking in this field.
There's a chicken and egg problem. Almost no one pays attention to
the tools that do exist.

ROTFL! You didn't even get the joke... That statement is not mine, it's 60
years old...
Perhaps if you'd cited the "quote" correctly, I would have seen what
you thought you were about. Alas, if Thomas J. Watson ever said that
about _five_ computers, no one can find the source. It's most likely
urban legend:

http://en.wikipedia.org/wiki/Thomas_J._Watson

Nevertheless, the industry has been full of people who missed the
obvious.
That's for sure. It would look like it did back in the eighties at best.
There would be a few universities and military institutions connected.

But funnily enough, there would be relatively few security concerns, as
there were 20 years ago.
I'm not certain I know what you're saying here. I think that, in the
Internet, we've created a monster. Everyone wants a piece of that
monster, though. Given that circumstance, I think we should start
regarding Real Player and other similar applications as "critical."
For example, we don't have such basic stuff as
mathematical tools to describe important aspects of systems like
human-machine interaction.
That's a fair point. Maybe we should return to the days of the
mainframe mandarins in their glass houses, rather than having CPU's
everywhere. I've long been an advocate of thin, stateless client as
the safest computing platform for most users.
Show me a mathematical formula describing a good user interface.
The best that we can do at this point is to limit access based on
skill.

IOW you don't know what you're talking about.
You apparently aren't aware of what's going on in the US or how the US
does business in matters that affect national security. Could someone
in Western Pakistan disrupt the electrical grid, or worse? No bombs,
no airplane tickets, just smart people exploiting "non-critical"
software. Issues like that, when they exist, and they do, aren't left
to discussions on Usenet. If you want to continue thinking that I
don't know what I'm talking about, you go right ahead.
I lived under socialism long enough to say f..k off to every idiot proposing
it. Or I'd rather send them for 1-2 years to live as a normal citizen (not
an honoured guest but a plain normal citizen) in some socialist country. It
straightens out one's view on many things very well.
What has socialism got to do with this discussion? The US is a
regulated economy. I don't know of any unregulated economies
anywhere, unless it is dealing in drugs.

There are people in the US who are opposed to any regulation of any
kind. They are written off as nut cases. If I don't write you off as
a nut case, it's only because you are obviously coming from a very
different perspective. If you want to say that that's all very well
for the US, go right ahead. The US is the largest, most highly-
developed economy in the world, even if it isn't going to stay that
way.
Reality bites. It's simply, plainly impossible to make only the risk estimators
take the risk and no one else. Otherwise everyone would just be a risk-estimation
specialist, and there would be no one to make the actual product to begin with :)

It's an illogical utopia.
You're battling a straw man from the past. No one that I know of
believes in Utopia. There are many who are unhappy with the system of
tort claims in the US, but, in a bizarre sort of way, it works. Risk
is systematically being reassigned to those who actually create it.
Has it gone too far the other way? Maybe, but democracies work those
problems out in legislatures and the voting booth.

So what?

Funnily enough, you show that link. It so happens that the first security advisory
there says: "Luciano Bello discovered that the random number generator in
Debian's openssl package is predictable."

That fits nicely here. You want to specify a good random number generator and
implement it according to the specification. So show how to describe one
formally, as neither "good" nor "predictable" is usable.

I'm not an expert on random number generators and I suspect that
neither are you. It's a big problem, as are most assumptions about
the breakability of secure transmission schemes.

The point of offering the packetstorm link is to highlight the
regularity with which software vulnerabilities are discovered.
Well, if you can, there is $1 million lying out there :) As it so happens,
a mere proof of existence of such a good pseudorandom number generator would
mean nonuniform-P is a proper subclass of (i.e. is different from) NP, and it's
a simple fact that P is a subclass of nonuniform-P (no one knows if it's proper
or not, but it does not matter here), which would mean that P <> NP, and
there is a nice round $1 million waiting for whoever proves (or disproves)
that :)

So much for formal verification of security :)
It's a big problem. The potential money involved far exceeds $1
million.
It's all based on beliefs in the nonexistence of shortcuts, the quality of PRNGs,
and such stuff. Those beliefs are supported by testing but are not
formally proven.
Good enough reason not to trust them. Maybe packet-switched networks
are not a good way to transmit "secure" data or maybe there has to be
an out of band (e.g. over telephone, preferably circuit-switched)
component.

But he didn't suggest stopping the making of other software. He wanted to improve
methods as theory covering things better was developed.
I'm not suggesting stopping software development. Neither have I
suggested that formal methods are a magic bullet. I claim that the
business as it is now is reckless for historical reasons and that it
doesn't have to be that way. I think that Dijkstra would agree.
Software can and should be much more transparent and less susceptible
to error. It can be done.

Nope. The difference would be a small constant factor. Tools like SPARK
are out there, and while the specialists who have mastered them are rare and
thus their work must be well paid, this is just a small constant difference
(small meaning around 3).

It is only usable in systems where the level of trust must be so high that
testing would be too costly. And it so happens that those systems are
possible to specify well, as they are relatively simple and closed.
Speculating about what happens when you change the rules is a
fruitless enterprise. Consider what has happened with CISC vs. RISC.
Human beings can be very, very clever when pushed to it. No one has
pushed on this issue, or at least not enough.

You only show your lack of grip on reality.
That's just wild.

Do you know the history of Long Term Capital management? Do you know
what's happened with derivatives all over the world? Do you know the
current parlous state of financial markets?

When you claimed that you were working with people who know how to
estimate risk, I almost fell off my chair. The failures have been
colossal, and they can all be traced to hubris about estimating risk,
or, in the case of derivatives, making the setup so complicated that
risk is, in practice, obscured from the buyer. In any other business,
that would be called fraud, except that the buyers in this case are
presumed to know what they're doing. Who are the buyers? Financial
institutions.

Derivatives are a good analogy for the problems of estimating risk
inherent in software. One problem is that, just as with software,
human beings are inevitably involved. Suppose there are _no_ buyers
for your security for reasons that your model didn't include? What do
your estimates of risk mean then? Suppose you own securities that are
worth something (probably), but no one knows what?
You compare stuff which is by its very nature unpredictable (if someone could
predict it, that prediction would influence it in an unpredictable way) with
stable (unchanging) stuff with measurable properties.
You are caught up in the overstated claims of financial wizards.
Models of risk are where it's at, and inexpensive computers have put
everyone into a business that hasn't even existed for all that long.
Your faith in the enterprise would be touching if you didn't keep
calling me an idiot because of things that you either don't understand
or do understand but don't want to acknowledge.

Time is a resource like the others. All resources cost money.
Financial markets are a bit different, in that the game isn't so much
about intrinsic value as about what you think the other guy is going
to do. That makes time important in a way that the maxim "time is
money" doesn't capture.

ROTFL! If only Real Player was a MS product...
The shrink-wrap license was an innovation that Real Player didn't
invent. They, and every other software jockey, have simply inherited
the idea, and the courts bought it.
ROTFLMAO!
You don't know squat about what a critical application is!
Go reread the conversation between Humpty Dumpty and Alice about
words. You mean that you use "critical application" in a different
way from what I do. The distinction you want to make is a fiction
that supports your business model. I don't have to support fictions
and I don't have to support your business model.
Or maybe a hammer I can buy at any farmer's shop is a critical device, as it
could be used for criminal activity.
A four pound hammer was at one time a fairly common carpenter's tool.
Repeated use causes elbow injury. If you hand one to a workman in the
US and he uses it and injures his elbow, you're going to have all
kinds of problems.

In every other field of human activity, the law puts greater and
greater pressure on manufacturers and vendors to foresee and to limit
risk to the end user. Not so in software, and we are already paying
the price.

And who pays for all that? That's you and all the other customers.
And yet we still get drugs that are improperly tested and/or sold
without full disclosure.

You're trying to defend an anomalous business practice that arose in
the US through a combination of heavy handed lobbying, some critical
court decisions, and the industry having a much better grasp of the
future than did lawyers. There is nothing about it that is natural or
inevitable. It can't be traced to Goedel's theorem, intractable
aspects of human nature, or some kind of cosmic struggle between
capitalism and socialism. It's bad policy, that's all. You come
along, years later, far from the action, and invent your own
mythology. Fine. Live long and prosper.

Robert.
 

Sebastian Kaliszewski

Robert said:
There's a chicken and egg problem. Almost no one pays attention to
the tools that do exist.

Theorists were always the "proto-chicken", i.e. they are the ones who
develop the theory. The problem here is that even the theory is lacking.

ROTFL! You didn't even get the joke... That statement is not mine, it's
60 years old...
[...]
Nevertheless, the industry has been full of people who missed the
obvious.

Explicit explanation of the joke you didn't get:

It was a joke about how "your way" would work.

I'm not certain I know what you're saying here. I think that, in the
Internet, we've created a monster. Everyone wants a piece of that
monster, though. Given that circumstance, I think we should start
regarding Real Player and other similar applications as "critical."

You're suggesting MS-style security patching, i.e. patch the endpoints, not
the core. Real Player never was, is not, and never will be a critical
application.

That's a fair point. Maybe we should return to the days of the
mainframe mandarins in their glass houses, rather than having CPU's
everywhere. I've long been an advocate of thin, stateless client as
the safest computing platform for most users.

Maybe we should have stayed in the caves and in the trees, not inventing stuff
like weapons (which culminated in nuclear ones), tools which could be turned
into a weapon, etc. -- it "must have been safer that way". Sure.

The best that we can do at this point is to limit access based on
skill.

Oh, so we should fire all those airplane pilots, as they apparently do not
have the skill to deal with faulty user interfaces (of the plane).

An imperfect user interface is quite commonly one of the causes of air
disasters. Like the plane which crashed into a mountain in France some
years ago after the pilot erroneously set the descent rate in degrees instead
of ft/s -- while 20 ft/s was OK, 20 degrees nose down was not (3 would be about
right). While it was a human (pilot's) error, a stupid user interface was a
contributing factor -- the descent rate was set using the same knob as the nose
pitch attitude, and only a little selector switch on the side of the
knob and a tiny light indicator showed which setting was active; the value set,
whether it was in ft/s or degrees, looked the same on the display. If only
they had displayed degrees in white, for example, and ft/s in green, the pilot
would have seen that something didn't look right. As they were flying over
mountains, the ground altitude meter (which was the mode of altitude metering
during the approach) was not indicative of the trouble, as it varied wildly.

Or the faulty design in the early DC-8, where pilots trying merely to unlock
reverse thrust (just before landing) were actually turning it on while the
plane was still in the air.
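A minimal, hypothetical sketch of the kind of mode confusion described above:
the same number dialled on the same control means two very different descent
profiles depending on a mode flag that is easy to miss (the figures below are
illustrative only, not taken from any accident report):

    import math

    # Hypothetical autopilot input: one knob, one number, two modes.
    def descent_rate_ft_per_min(setting, mode, ground_speed_kt=250):
        if mode == "VERTICAL_SPEED":           # setting read as hundreds of ft/min
            return setting * 100
        elif mode == "FLIGHT_PATH_ANGLE":      # setting read as degrees nose-down
            ground_speed_ft_min = ground_speed_kt * 6076 / 60
            return ground_speed_ft_min * math.tan(math.radians(setting))
        raise ValueError(mode)

    # The crew intends a shallow path, but the unnoticed mode flag makes the
    # same digits mean a far steeper descent.
    print(descent_rate_ft_per_min(3.3, "FLIGHT_PATH_ANGLE"))   # ~1460 ft/min
    print(descent_rate_ft_per_min(33, "VERTICAL_SPEED"))       # 3300 ft/min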

You apparently aren't aware of what's going on in the US or how the US
does business in matters that affect national security. Could someone
in Western Pakistan disrupt the electrical grid, or worse? No bombs,
no airplane tickets, just smart people exploiting "non-critical"
software.

You can disrupt the national grid with a 45 ft aluminium folding ladder (which
you could buy in a farmer's store) and 1000 ft of nylon cable. Oh, and good
welders' glasses, as a multi-second 400 kV, 10 kA discharge watched from 1000 ft
is dangerous for the eyes.

Your national grid is not running on RealPlayer, and machines are at least
firewalled properly.
Issues like that, when they exist, and they do, aren't left
to discussions on Usenet. If you want to continue thinking that I
don't know what I'm talking about, you go right ahead.

To successfully crack into a computer system and do particular damage, one must
know its structure, what software it runs, etc. So one must get at least
close to the company (or to particular workers). So you need your air ticket.
And there is no guarantee you could easily do a synchronised action (and you
would need synchronised action at many points to bring down the grid). It's
much easier to buy a few sets of 45 ft ladder + 1000 ft of nylon cable and set
out to the fields.

What has socialism got to do with this discussion?

When I wrote that it's capitalism out here, you started with your values and stuff.
The US is a
regulated economy.

Of course, but not totally regulated. The market is generally free -- both
in the US as well as here in Europe.

[...]
I'm not an expert on random number generators and I suspect that
neither are you. It's a big problem, as are most assumptions about
the breakability of secure transmission schemes. [...]
It's a big problem. The potential money involved far exceeds $1
million.
It's all based on beliefs of nonexistence of shortcuts, quality of PRNGs,
and such stuff. Those beliefs are supported by testing, but are not
formally proven.
Good enough reason not to trust them. Maybe packet-switched networks
are not a good way to transmit "secure" data or maybe there has to be
an out of band (e.g. over telephone, preferably circuit-switched)
component.

So what, return to the caves?

[...]
Speculating about what happens when you change the rules is a
fruitless enterprise. Consider what has happened with CISC vs. RISC.

Once the low-hanging fruit was picked and transistor budgets got big enough, it
simply doesn't matter anymore.

That's just wild.

Do you know the history of Long Term Capital management? Do you know
what's happened with derivatives all over the world? Do you know the
current parlous state of financial markets?

If you don't see the difference between a financial market (an unpredictable
system full of positive feedback loops, driven by agents working to
completely unknown rules) and a complex but closed and static product, we
have nothing to talk about.

Derivatives are a good analogy for the problems of estimating risk
inherent in software.
Nope.

One problem is that, just as with software,
human beings are inevitably involved.

Oh. As in all human activities, humans are involved. So picking fruit is
as good an analogy.

[...]
You are caught up in the overstated claims of financial wizards.
Models of risk are where it's at, and inexpensive computers have put
everyone into a business that hasn't even existed for all that long.
Your faith in the enterprise would be touching if you didn't keep
calling me an idiot because of things that you either don't understand
or do understand but don't want to acknowledge.

That only further demonstrates your lack of grip on the reality of software
production. First of all, the risks in software are known. And there is a
well-known upper limit, as well as good estimates that problems will be
fixed in a particular time. If one module of the software was made in 2 weeks,
it can be remade in a similar time in the worst-case scenario. Such a software
module is known to at least partially work (i.e. to realise the functions on
which it was actually tried).

Besides, our company's products do have a warranty. We are obliged to fix
problems when they appear. If some function has been specified, it must work,
and if there is a problem, it must be fixed within a particular time.

[...]
The shrink-wrap license was an innovation that Real Player didn't
invent. They, and every other software jockey has simply inherited
the idea and the courts bought it.

As there was no other option at the price. You have just a licence to a
product which costs $$$$$$$. And you have that licence for $$$ or less,
because it's only a licence.
Go reread the conversation between Humpty Dumpty and Alice about
words. You mean that you use "critical application" in a different
way from what I do.

I mean it in the widely understood way.
The distinction you want to make is a fiction
that supports your business model.

Nonsense. It's reality. If my software is a solitaire game, it's not critical
at all. If it's departmental email server software, it's important but
not critical. If it's an accounting + document flow + store management + HR
management software package for a mid-size business, it's business-critical.
If it's an atomic reactor process controller, it's life-critical.
I don't have to support fictions
and I don't have to support your business model.

You don't have to support reality either. Reality doesn't care, and neither do
I.

A four pound hammer was at one time a fairly common carpenter's tool.
Repeated use causes elbow injury. If you hand one to a workman in the
US and he uses it and injures his elbow, you're going to have all
kinds of problems.

Yeah, and you propose suing the hammer producer... And that's nonsense. Even if
in the US it's currently at least advisable to label the hammer with nonsense
stickers like "not for children", "don't use it to hit your fingers", "don't
leave it when working at height", etc., it's still pure, absolute nonsense.

In every other field of human activity, the law puts greater and
greater pressure on manufacturers and vendors to foresee and to limit
risk to the end user.

Which is nonsense. A hammer is a hammer, and it is dangerous if banged
against someone's head.
Not so in software, and we are already paying
the price.

No, we get it cheaper that way.

And yet we still get drugs that are improperly tested and/or sold
without full disclosure.

So? It only shows that all sides can't be covered. Even in such stuff as
drugs, which is heavily controlled (rightly so).
You're trying to defend an anomalous business practice that arose in
the US through a combination of heavy handed lobbying, some critical
court decisions, and the industry having a much better grasp of the
future than did lawyers.

Nonsense. It could be argued that this business practice is the back-to-the-roots,
right one, and that the policy of covering everything is nonsense
and impossible to sustain (as growing prices for stuff like dentist
visits show -- you pay for the possible mistake and pay way more than the
expected value of the mistake's cost times its probability).
Fine. Live long and prosper.

Fine. Live long and prosper in your cave denying reality.


Sebastian Kaliszewski
 

krw

Robert Myers wrote:

Typical answer of leftist elitists.

When I wrote that it's capitalism out here, you started with your values and stuff.

But Robert knows what's good for everyone on the planet. *HE*
should be king!

So what, return to the caves?

That's exactly what Robert proposes, just like a good Marxist.

Once the low-hanging fruit was picked and transistor budgets got big enough, it
simply doesn't matter anymore.

That is *exactly* the point the armchair architects miss. Though
x86 is a kludge, the cruft no longer matters. Transistors are free.
If you don't see the difference between a financial market (an unpredictable
system full of positive feedback loops, driven by agents working to
completely unknown rules) and a complex but closed and static product, we
have nothing to talk about.

Not only unknown dynamics, but unknowable dynamics; Schroedinger's
cat on steroids.

Nonsense. It's reality. If my software is a solitaire game, it's not critical
at all. If it's departmental email server software, it's important but
not critical. If it's an accounting + document flow + store management + HR
management software package for a mid-size business, it's business-critical.
If it's an atomic reactor process controller, it's life-critical.

I once had the task of identifying "critical software" for our
function. Not being an IT type, I had some problems defining
"critical". The IT director made a story about the site burning to
the ground with all the payroll records. Was the payroll
"critical". Nope, they'd just go to the bank and hand out money to
anyone with some sort of proof that they were employees. They could
balance the books later. We didn't have any critical records. ;-)
You don't have to support reality either. Reality doesn't care, and neither do
I.



Yeah, and you propose suing the hammer producer... And that's nonsense. Even if
in the US it's currently at least advisable to label the hammer with nonsense
stickers like "not for children", "don't use it to hit your fingers", "don't
leave it when working at height", etc., it's still pure, absolute nonsense.

"Not for internal use"

No, we get it cheaper that way.

No, we *get* it that way.

<snip>
 

Robert Myers

Typical answer of leftist elitists.
You sound envious.
But Robert knows what's good for everyone on the planet. *HE*
should be king!
That's just beneath you. Have a drink or smoke something. You seem
even more stressed than usual.
That's exactly what Robert proposes, just like a good Marxist.
Don't presume to speak for me.
That is *exactly* the point the armchair architects miss. Though
x86 is a kludge, the cruft no longer matters. Transistors are free.
That was *exactly* my point. The proposal was that software
development would go to hell in a handbasket if more stringent
standards were applied. The CISC problem has been worked to an extent
that no one foresaw, and software verification could similarly be
worked to an extent that neither of you foresees.
Not only unknown dynamics, but unknowable dynamics; Schroedinger's
cat on steroids.
Look. The bald statement was made that financial institutions know
how to estimate risk. Given the moment that the claim is being made,
it's beyond ludicrous. It's like claiming that George Bush knows how
to run a war.
I once had the task of identifying "critical software" for our
function. Not being an IT type, I had some problems defining
"critical". The IT director made a story about the site burning to
the ground with all the payroll records. Was the payroll
"critical". Nope, they'd just go to the bank and hand out money to
anyone with some sort of proof that they were employees. They could
balance the books later. We didn't have any critical records. ;-)
I think you make my point for me.
"Not for internal use"
You can act silly in every way you want. In every field of commerce
*except* software development, it's getting harder and harder to lay
risk off onto the end user. That's the direction that *capitalism*
has taken. These issues are settled in the courts and legislatures,
not in Usenet rants. Sooner or later, laissez-faire software
development will be reined in because the accumulated risks to
society of the system we have now are unacceptable. Calling me names
will change nothing.
No, we *get* it that way.
We get it, all right. People's identities and medical records are
stolen en masse, bank accounts are pilfered, and the Internet is home
to powerful botnets with unknowable levels of capability or
maliciousness of intent.

Robert.
 

krw

You sound envious.

Of a wannabe processor architect and elitist snob? Hardly.
That's just beneath you. Have drink or smoke something. You seem
even more stressed than usual.

Of course I'm beneath you. You constantly make it clear that
everyone is. Typical leftist elitism.
Don't presume to speak for me.

You're quite transparent, Robert.
That was *exactly* my point. The proposal was that software
development would go to hell in a handbasket if more stringent
standards were applied. The CISC problem has been worked to an extent
that no one foresaw, and software verification could similarly be
worked to an extent that neither of you foresees.

No, you proposed nothing of the kind. Your "proposals" are that
only you know what people "need", certainly more than they do
themselves. Nothing but what you "know" should be allowed. Again,
typical Marxist claptrap.
Look. The bald statement was made that financial institutions know
how to estimate risk. Given the moment that the claim is being made,
it's beyond ludicrous. It's like claiming that George Bush knows how
to run a war.

Look, you're an idiot. ...nothing new.
I think you make my point for me.

Hardly. I'm not the one claiming that there is only one way to
design software and that anything else shouldn't exist.
You can act silly in every way you want. In every field of commerce
*except* software development, it's getting harder and harder to lay
risk off onto the end user. That's the direction that *capitalism*
has taken. These issues are settled in the courts and legislatures,
not in Usenet rants. Sooner or later, laissez-faire software
development will be reined in because the accumulated risks to
society of the system we have now are unacceptable. Calling me names
will change nothing.

You're simply saying that sooner or later the lawyers are going to
bankrupt everyone, why leave it to just the tobacco companies and
hospitals. ...and you think it's a good idea!
We get it, all right. People's identities and medical records are
stolen en masse, bank accounts are pilfered, and the Internet is home
to powerful botnets with unknowable levels of capability or
maliciousness of intent.

Little to do with the subject at hand. Such was happening long
before your "shoddy" software design processes.
 
