Demand That 'Microsoft Sell No Code Before Its Time'


kurttrail

Robert said:
the list of viruses on a system by no means maps to security
vulnerabilities.

I didn't say it did. I was changing what was being looked at.

Windows is obviously a more attractive target than Linux or Mac. Which
OS has to be patched the most is really meaningless in the real world.
What makes the real difference is which OS is targeted and hit the most.
And in that respect Windows is the winner. It is targeted and hit the
most. And the reason for that is the nearly total homogeneity of Windows
on Consumer Desktop PCs.

Just another way MS's proven monopolistic actions have damaged the
Computer world. As a matter of the computer security of the world, MS's
near Desktop PC Monopoly needs to be broken.
To take a trivial example, the Melissa virus (remember that?) was an
example of a virus that ran absolutely rampant and didn't exploit one
single security hole.

It took advantage of some especially stupid design choices, sure, and
no argument there, but those are not security "holes".

- fwiw i'd argue that "especially stupid design choices" are even
worse than holes that appear as a result of a mistake

I agree with you there.

--
Peace!
Kurt
Self-anointed Moderator
microscum.pubic.windowsexp.gonorrhea
http://microscum.com/mscommunity
"Trustworthy Computing" is only another example of an Oxymoron!
"Produkt-Aktivierung macht frei"
 

Don Taylor

Robert Moir said:
All Things Mopar wrote:
And what would the 100 million users say when the price of Windows changed
to reflect the costs of the new engineering standards required of it?

When quality briefly became an interest of management a decade or
two ago, many companies were astonished to find out that producing
defect free products was actually CHEAPER than the build-and-fix
model.

While sitting inside a company I watched a project spend more on
trying to patch the product to keep it from sinking than they had
on doing the entire project only a couple of years earlier.
A very interesting article, but reflect on the differences in the
engineering philosophy and the costs of applying that to a big project such
as Windows.

Personal experience on two projects being done for about the same
task by people working inside the same company under the same
schedule pressure with about the same quality software people,
a handful of differences in the "charter" of each group ended
up with close to 100x better software quality by one of the
teams. And the other one, they were the ones who spent the
money trying to keep it from sinking.
Also we have to accept that the market drives the need for commercial
shrinkwrap software. You say you want perfection, we all do for sure, and
yet millions of people buy software that isn't perfect, and being 2nd to
market with a perfect product while someone with a product that is simply
"good enough" eats your lunch can kill your company stone dead.

Tell me, brutally honestly, has version X+1 of any software product
out there REALLY given you something you couldn't have likely better
lived without? Who convinced these people that "who cares if the
crap works or not but we MUST throw in a few dozen new features and
ship a new version every year or our company will implode."
I've _never_ been charged by Apple or Microsoft for a bug fix.

As far as I know everybody else pays for each new bugfix/rewrite
of Windows and MacOS.
 

All Things Mopar

Robert Moir commented thusly:
You _are_ aware that Microsoft already do this, right?

Yes, of course I'm aware of that, and I'm aware that any large
development shop will also do that. Yet, it has been my
unfortunate experience across all categories of software, not
just Windoze, that product cycles are getting shorter and
shorter, and in their frenzy to meet real or perceived
competition, developers rush their products to market
prematurely. So, I've simply stopped buying or upgrading to
V1.0 of /anything/ unless there is a really compelling reason
to do so, such as a feature that I really want or need.

Back to my buddies in Redmond, the sheer size of the M$ KB
confirms that the folks up there are well aware of their own
shortcomings, and it is obvious that they do buy and test a
wide variety of hardware and software.

Windoze private beta testing has gone from hundreds to
hundreds of thousands, power users as well as developers and
people who just like to live life on the edge and risk major
data loss. But, the multiplicity of hardware and software in
use combined with the real economic need to maintain backward
compatibility puts a real strain on the development team and
the QA testers.

Add to that, people who hate Bill Gates much more than I do
spend enormous effort trying to find security holes to exploit
as well as hundreds of thousands of hits per week on M$'s
corporate computer systems. Every now and then, they succeed
and take down the giant for hours, sometimes for a day or
more.

At some point, whether I personally like it or not, being a
realist, I think that M$ needs to pull the plug on backwards
compatibility and say "enough is enough, here is what you need
for our new toy". Perhaps that day will come with Longhorn.
Just the need to maintain the old 8.3 DOS file names
constrains even the modern 255-character long file names to not
include any wildcard or other "special" DOS characters.

The biggie for Mom and Pop America is that they have no real
incentive to upgrade their application software if it does
what they need for the foreseeable future, but they might be
put out of business if M$ stops backwards compatibility. But
that day must eventually come. People can delay the inevitable
by just not buying a new system until either their current one
melts down or they want more performance or some new app they
want won't run on their old hardware or O/S.
 

All Things Mopar

Leythos commented thusly:
He's lost in his Car computers are better than PC computers
world, he's not wanting to see the LARGE difference between
a PC running an OS/Applications and a computer in a car
running a very limited form of an OS (and most of the
processing devices in a car don't even use an OS) and it's
very limited firmware.

You might want to get your head out of your ass and realize
that my major point is that it /is/ possible to build
extremely complex code, albeit under more controlled
circumstances, that performs near flawlessly under extreme
climatic conditions and operator abuse, lack of maintenance,
what have you.

It isn't that I think that car computers are better than PC
computers, just that they are far, far, far more reliable.

Seriously, would you put up with having to restart your car,
wait 5 minutes for the POST and restart, have to identify your
user account, and hope the crash or freeze goes away so you
can continue driving? Or maybe you should just get mercifully
put to death in a fiery car crash when your car pauses to call
home to verify your right to operate it. And when was the
last time you needed to take your car or TV or DVD or cell
phone or dishwasher or refrigerator or, or, or back to the
dealer because it crashed or froze up so often you couldn't
use it? Or, how many times for your consumer goods, cars as
well as everything else you own, have you downloaded and
installed hundreds of security patches per year for these
devices? Yeah, car thieves are interested in exploiting
security holes in car security systems, and they occasionally
do, but it is becoming more and more rare.

It all adds up to what I've been saying for days: if you
expect excellence, you will get it. But if you expect only
mediocrity, you shouldn't be surprised that /that/ is what you
get!
 

Kerry Brown

All Things Mopar said:
Leythos commented thusly:


You might want to get your head out of your ass and realize
that my major point is that it /is/ possible to build
extremely complex code, albeit under more controlled
circumstances, that performs near flawlessly under extreme
climatic conditions and operator abuse, lack of maintenance,
what have you.

It isn't that I think that car computers are better than PC
computers, just that they are far, far, far more reliable.

Seriously, would you put up with having to restart your car,
wait 5 minutes for the POST and restart, have to identify your
user account, and hope the crash or freeze goes away so you
can continue driving? Or maybe you should just get mercifully
put to death in a fiery car crash when your car pauses to call
home to verify your right to operate it. And when was the
last time you needed to take your car or TV or DVD or cell
phone or dishwasher or refrigerator or, or, or back to the
dealer because it crashed or froze up so often you couldn't
use it? Or, how many times for your consumer goods, cars as
well as everything else you own, have you downloaded and
installed hundreds of security patches per year for these
devices? Yeah, car thieves are interested in exploiting
security holes in car security systems, and they occasionally
do, but it is becoming more and more rare.

It all adds up to what I've been saying for days: if you
expect excellence, you will get it. But if you expect only
mediocrity, you shouldn't be surprised that /that/ is what you
get!

You really don't understand the difference do you? Computers in a car are in
a closed loop with known input parameters. Windows computers are in an open
loop with unknown input parameters. The methods of programming and testing
are totally different.

Kerry
 

All Things Mopar

Don Taylor commented thusly:
When quality briefly became an interest of management a
decade or two ago, many companies were astonished to find
out that producing defect free products was actually
CHEAPER than the build-and-fix model.

At last, an intelligent response! It is /always/ cheaper to
produce quality than to produce crap /if/ you must pay for
your mistakes through warranty or lost sales revenue. But
software companies have been getting away with producing crap
for decades with little or no financial penalties.

Twenty years ago, I took a week-long class in a quality
improvement process that stressed "do it right the first
time", "error proofing", and "root cause analysis" of
problems, taught by world-renowned quality guru Phil Crosby.

My car company, that a few folks are having fun making sport
of, spends /billions/ per year on warranty, customer
satisfaction fixes, and product liability lawsuits, all
stemming from piss poor quality. So do /all/ the car
companies. Yet, inexplicably, even the Asians can't completely
comprehend that quality begins during the design and
development stages and cannot be tested for later on after the
car or even the component is built. By the time you conduct QA
testing, it is already too late to economically fix the
problem(s), hence they often make it out the door and some
finance guy figures that it is cheaper to repair than to
redesign.

Finance people also regularly recommend settling lawsuits
rather than redesigning a product defect, as that would be de
facto proof of guilt. An absolutely pathological example of
that was the Ford Pinto gas tank fires 30 years ago. After he
came to Chrysler and was no longer constrained by
nondisclosure agreements as he was when president of Ford, Lee
Iacocca said that he had a fix for the Pinto gas tank that
would have moved it to the forward side of the rear axle for
just $6/car, but was prevented from doing so by the lawyers!
WTF?!

And, so the saga goes on, and on, and on...
While sitting inside a company I watched a project spend
more on trying to patch the product to keep it from sinking
than they had on doing the entire project only a couple of
years earlier.



Personal experience on two projects being done for about
the same task by people working inside the same company
under the same schedule pressure with about the same
quality software people, a handful of differences in the
"charter" of each group ended up with close to 100x better
software quality by one of the teams. And the other one,
they were the ones who spent the money trying to keep it
from sinking.


Tell me, brutally honestly, has version X+1 of any software
product out there REALLY given you something you couldn't
have likely better lived without? Who convinced these
people that "who cares if the crap works or not but we MUST
throw in a few dozen new features and ship a new version
every year or our company will implode."

Occasionally, yes I have, but not all that often. In my youth,
when I liked to play with computers, I spent more time keeping
the bleeping thing running from ill-advised upgrades than I did
using it. No more.

Can't speak to Apple, but I have been charged for a fix from
M$ simply because they didn't acknowledge a bug existed but
gave me a work-around which I had to accept.
As far as I know everybody else pays for each new
bugfix/rewrite of Windows and MacOS.

In the final analysis, customers pay for /everything/
concerned with the development of /any/ new product. Or, at
least they do until the company goes out of business.
 

Bob

| Robert Moir commented thusly:

| People by their very nature live the self-fulfilling prophecy... if
| they view themselves as productive human beings, other people
| will sense that and respect them for it.

It doesn't work that way with Carey Frisch :-(
 

Leythos

You might want to get your head out of your ass and realize
that my major point is that it /is/ possible to build
extremely complex code, albeit under more controlled
circumstances, that performs near flawlesslessly under extreme
climatic conditions and operator abuse, lack of maintenence,
what have you.

You just keep missing it entirely. You keep assuming that your car
computers are in some way complex devices, and they are not even close
to complex. Sure, they process real-time information, sure, they provide
real-time input and output processing, but, for all examples, engine,
ride, window control, they are simple computing devices designed for an
exact and very specific task, on specific devices, with no
interaction with user-added devices.

You need to stop assuming that an embedded system is in any way like a
Workstation, it's not close, not in the same processing class, not even
with anywhere near as much code.

Yes, we all agree that it's possible to turn out perfect code, but in
the real world, on the scale you find in a Workstation, there has not
been an example of perfect code since before 1975 that I've seen.

On the same note, I've also not seen perfect complex PLC controllers,
nor seen cases where all PLC controller modules were perfect (the ones
that have to manipulate data by calculation)......

You've got to get over yourself; a simple computing device in a
car/vehicle is just that - a simple computing device - with nowhere near
the same amount of code, nowhere near the same number of devices it
interfaces with, nowhere near the same number of applications running
on it.... Do you get the REAL WORLD PICTURE NOW?
 

cquirke (MVP Windows shell/user)

On Fri, 12 Aug 2005 10:07:17 -0500, All Things Mopar

You can look at things from the "needs" perspective, as in "consumers
expect software that is defect-free".

You can look at things from a "resource" perspective, as in "it is
impossible to create complex software that is defect-free".

And you can see the collision course that is coming.
Software isn't fundamentally any different than development of
any other commodity. The world's consumers have cleaned up
just about every "hard" commodity by choosing quality over
non-quality, and by choosing from cost-efficient producers
over inefficient producers.

Software is different, in a couple of ways:
- it is pure data, and can be duplicated free of cost
- it is 100% human invention
- it has 0% naturally-resilient properties

The first is the factor that everyone loves, and why industry is
scrambling over itself to be as "soft" as possible. It's the ultimate
product scalability jackpot; sell a massive number of units, get rich
quick. OTOH, the development costs are hard to recover if you sell
only a few units, which is why custom software is often costly suckage.

The second factor is why software is so riddled with defects.

When you combine the two factors, you see why poor software quality is
not "punished" by market forces as it would be in "hard"
manufacturing. A defective hardware item has to be fixed the hard way,
via a hard product recall and replacement. A vendor takes huge pain
when this happens, and if it happens a few times, the vendor will
typically not survive. But a software defect can be "fixed" simply by
making replacement code available for download.


Let's look at this from the "needs" perspective.

Initially, because software could be duplicated with no cost,
consumers tended to simply copy the programs they wanted to use,
without paying the vendor.

Software vendors trained consumers to pretend software was a durable
item with "hard" per-instance value, that should be paid for as such.
In other words, we would pretend it was impossible to copy programs
for free, and would buy an instance of the software as if it were a
durable hardware item, such as a hard drive or processor.

Software vendors went further to deliberately break programs so that
they couldn't be copied free of cost, artificially nullifying one of
the natural advantages of software, so as to compel payment.

Generally, consumers have accepted this. We pay for a single instance
of software, and having paid for it, we expect it to work and be free
of defects (or at least, free of dangerous defects).

The first change from this model came from antivirus vendors. The
nature of that industry requires ongoing development to keep up with
new malware; in fact, the value of the product lies not so much in the
original program, as the availability of regular updates.

So it's not unreasonable for av vendors to sell on a "subscription"
basis, as in: buy a copy of the av program as if it were a permanent
"hard" item, and yet only be able to use it for a year before being
required to pay again. This was the start of "software as service".

The "service" model works like this: you accept that software is not,
in fact, a "hard" item of lasting value, but an ongoing quest to find
and fix defects. You pay for this post-sale defect-chasing as a
"service" for a set period, after which your software dies.

It's what I call "rental slavery".


Let's look at this from the "resource" perspective.

Human beings have an error rate, even when doing mindless repetitive
tasks. If I ask you to write a particular sentence 1000 times, the
chances border on the inevitable that you will make errors. This post
will contain several errors; some detectable via spelling checker,
others not, e.g. often I write "but" instead of "buy", etc.

When you make a "hard" item from natural materials, you can generally
count on the behavior of those materials. Your "creative" input is
limited to what you do with those materials; there's less scope to
screw up - e.g., you can design an axe badly, so that the head flies
off the handle, but the wooden handle will inherit the predictable
characteristics of wood, and the steel head, those of steel.

Software is 100% human invention, so at the very least, you expect the
human error rate to permeate throughout software. If a flaky coder
makes one error in 100 lines, and a good coder one error in 1 000
lines, both will create bug-riddled 1 000 000 line programs.
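That arithmetic can be sketched in a few lines; the error rates here are the post's own hypothetical figures, not measured data, and the linear model is a deliberate simplification:

```python
# Naive linear defect model: expected residual defects scale with
# program size, so even a tenfold-better coder still ships a
# bug-riddled million-line program.
def expected_defects(lines_of_code: int, errors_per_line: float) -> float:
    """Expected defects = size * per-line error rate."""
    return lines_of_code * errors_per_line

flaky = expected_defects(1_000_000, 1 / 100)    # 1 error per 100 lines
good = expected_defects(1_000_000, 1 / 1_000)   # 1 error per 1,000 lines
print(flaky, good)  # 10000.0 1000.0
```

Ten thousand bugs versus one thousand: better, but both are "bug-riddled" in absolute terms, which is the point of the paragraph above.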

Now a human brain can only hold so much, and modern software far
exceeds this capability. So multiple human brains have to co-operate
when creating this software, and new opportunities for errors arise at
the interface between brains - either between peers working on the
same code, or when a "black box" created by one is used by another.

Software engineering tried to formalize the process of creating code,
and especially between peer programmers and re-usability of "black
boxes". This is a trade-off against performance; today's PCs may have
(say) 1000 times the hardware resources of the original PC, and yet
may run equivalent software only 10 to 100 times as fast. The rest of
the gains are eaten up by "bloated" coding practices that are slower
to run, but quicker to develop - and crucially, more reliable too.


By now, programming has reached the point that we accept defects will
always be present. The question is how to apply that awareness.

One way is to just carry on as if there was no problem. Pretend that
natural "hard" scopes can be replaced with artificial ones made out of
code (e.g. no more cable needed to join a LAN, tune in via WiFi and
rely on code security to wall out the bad guys). Pretend that code
will do only what it is designed to do, so that it can be assumed safe
to handle incoming material ahead of the user's intent.

The other way is to recognise that you cannot build robust machines
out of flaky materials, and stop trying. Limit the risks that code is
allowed to take. Limit unsolicited exposure to external materials.
Use natural cover (e.g. the need for cables to join a LAN) instead of
holding up a soggy cardboard shield against whatever's out there.


If software vendors can persuade consumers to accept rental slavery in
place of durable ownership, then this meets the vendors' needs for
ongoing revenue to cover ongoing bug-fixing. It also assures future
revenues, which helps if you are already fully-grown and need to shift
value from "hi-growth" to "blue-chip" status.

As it is, if consumers think a new version of the product they already
own does not add value, they can simply use the existing version and
ignore the upgrade. No sale, no revenue. But if rental slavery was
in effect, the consumer would be obliged to swallow whatever upgrades
the vendor wanted to push, and wouldn't care as much as they'd be
paying anyway. When the new versions outgrow the old hardware,
they'd be obliged to replace that too; a nice boost to the rest of IT.

Software is, in effect, a rare situation where it doesn't pay to be
too high quality. If there were no need for updates, there'd be no
need for an ongoing relationship between user and vendor. As soon as
there is an ongoing need for patches, such a relationship can be
compelled by the vendor. The worse the code, the more frequent the
updates, the tighter the vendor can make that dependency.

Currently, things are pretty tight as it is - we are expected to leave
our systems open to unsolicited code changes delivered automatically.
When we go to the vendor's site to download some patch we have to
have in order to stay safe, the vendor can require us to allow dropped
ActiveX to run, and we must jump through hoops to prove we are licensed users.

Then the code that we download (at our expense) can change settings,
re-assert UI elements, hijack file associations, import data from
other email programs and set itself up as the default email
program instead, etc. Great vendor leverage, that is made possible by
the product being flaky enough to need patching.


So, given the choice between curbing features and software ambitions,
and SEing the users into a far weaker set of consumer expectations and
rights, is it any wonder things are as we have allowed them to be?


------------------ ----- ---- --- -- - - - -
The rights you save may be your own
 

All Things Mopar

Leythos commented thusly:
You just keep missing it entirely. You keep assuming that
your car computers are in some way complex devices, and
they are not even close to complex. Sure, they process
real-time information, sure, they provide real-time input
and output processing, but, for all examples, engine,
ride, window control, they are simple computing devices
designed for an exact and very specific task, on
specific devices, with no interaction with user-added
devices.

You need to stop assuming that an embedded system is in any
way like a Workstation, it's not close, not in the same
processing class, not even with anywhere near as much code.

Yes, we all agree that it's possible to turn out perfect
code, but in the real world, on the scale you find in a
Workstation, there has not been an example of perfect code
since before 1975 that I've seen.

On the same note, I've also not seen perfect complex PLC
controllers, nor seen cases where all PLC controller
modules were perfect (the ones that have to manipulate data
by calculation)......

You've got to get over yourself; a simple computing device
in a car/vehicle is just that - a simple computing device -
with nowhere near the same amount of code, nowhere near
the same number of devices it interfaces with, nowhere
near the same number of applications running on it.... Do
you get the REAL WORLD PICTURE NOW?

No, do you? From what I've been reading from you, I don't know
if you do or don't know anything about PCs but it is very
clear you don't know anything about cars or you wouldn't make
asinine remarks like "they are simple devices for a single
purpose", or words to that effect. If you think running an
/entire/ vehicle is a single purpose run by a simple device,
then you obviously have no clue as to what you speak about.

So, I'll just sign off and let you mull all this over the next
time, and the next 10,000 times you get in your car and wonder
why it all works with nary an update and never a need for a
reboot.

Ta, ta!
 

All Things Mopar

Bob commented thusly:
| Robert Moir commented thusly:

| People by their very nature live the self-fulfilling
| prophecy... if they view themselves as productive human
| beings, other people will sense that and respect them for
| it.

It doesn't work that way with Carey Frisch :-(
He/She/It doesn't seem to fit very much what one might refer to
as human behavior, so I guess I'd have to agree with you.
 

All Things Mopar

cquirke (MVP Windows shell/user) commented thusly:

I'll stand on my major point, people generally get what they
want, so if they want crap, they will get it, but if they demand
quality, they /can/ get it - if and only if there is true
competition, /and/ they don't fall off the wagon like a
recovering alcoholic and believe they can take just one drink,
meaning, accept just one more piece of crap software.

Having been both a developer and a supporter of software through
most of my work careers, I clearly understand the difference
between "soft" things and "hard" things, but software is the
/only/ industry in the world in which users happily part with
billions and billions of whatever their currency is to buy more
and more and more crap. Customers of "hard" things aren't nearly
that easy to fool - they buy from quality producers and the
purveyors of crap simply go out of business.

Now, can somebody tell me why Darwin's survival of the fittest
fails only on software, where /everybody/ should die but nobody
does, they just get wealthier?

If you can, you're a better man than me, Gunga Din!
 

Leythos

Leythos commented thusly:


No, do you? From what I've been reading from you, I don't know
if you do or don't know anything about PCs but it is very
clear you don't know anything about cars or you wouldn't make
asinine remarks like "they are simple devices for a single
purpose", or words to that effect. If you think running an
/entire/ vehicle is a single purpose run by a simple device,
then you obviously have no clue as to what you speak about.

So, I'll just sign off and let you mull all this over the next
time, and the next 10,000 times you get in your car and wonder
why it all works with nary an update and never a need for a

Funny, I've designed many plant control systems in the last 20 years,
almost 30 if you count when I was just starting, back when we designed
the PALs and other chips and even had to make our own PCBs. You think I
don't know what it takes to make a braking control system or what it
takes to make a 5000HP motor run based on loading and surge conditions,
and it's nothing near as complex as a workstation.

What you seem to miss, and seem to happily want to remain ignorant
about, is that of all the computers in the car, the GPS system is
actually the closest to the Workstation PC, but it's not even that
close. I have had to have my Dodge Dakota Quad 4x4 updated twice since
I've owned it, and also have had a number of computer faults and sensor
faults - during that same ownership period I've not had a single fault
on my Quad P4 system or any of its components.....

Please come back when you get some experience outside of the LIMITED
SCOPE of electronics that you have.
 

Jone Doe

What you seem to miss, and seem to happily want to remain ignorant
about, is that of all the computers in the car, the GPS system is
actually the closest to the Workstation PC, but it's not even that
close. I have had to have my Dodge Dakota Quad 4x4 updated twice since
I've owned it, and also have had a number of computer faults and sensor
faults - during that same ownership period I've not had a single fault
on my Quad P4 system or any of its components.....

Please come back when you get some experience outside of the LIMITED
SCOPE of electronics that you have.

Agreed. Every time the "check engine" light comes on and you take it back
in, what do they do? Replace various sensors and REBOOT.
 

cquirke (MVP Windows shell/user)

cquirke (MVP Windows shell/user) commented thusly:

(that URL pointed to a nice article, BTW - thanks)
I'll stand on my major point, people generally get what they
want, so if they want crap, they will get it, but if they demand
quality, they /can/ get it - if and only if there is true
competition

There's a scalability thing that works against this, and that is the
fixed development cost (to create software) that tends to accelerate
the success of the successful, and kill off anything that can't get
critical mass. This applies particularly to "big" software.

For example, I could envisage, code and test a small utility on my
own, in my spare time, and offer that either for sale, or as a
freebie. If no-one pays for it, it's a bummer, but I'm still around
and can always try writing another small project.

OTOH, when it comes to "big" stuff, such as an office suite, OS, web
browser, video compression codec, antivirus scanner (with ongoing
updates), etc. it soon gets too big for one person, even if they elect
to go into such development full-time.

Holding together a team takes some magic, usually money. You can and
do see alternatives to money as the cohesive force, but it's harder to
do, especially when not all the work is "fun". Hence quite a few
large free games, rather fewer free codecs.


When it comes to OSs, it's different because the OS is not the primary
point of choice, for one reason or another - one either gets the OS
that came with the system, or one chooses applications that one wants
to use and then the OS that's required for those programs.

What makes a platform great, is not the inherent quality of the
platform, but the mass of human effort that propels it. IBM's PC
wasn't technically the best (awful TV-incompatible graphics, dumb-ass
ROMs stuck in the middle of the memory map, etc.) but its open nature
attracted massive interest and development from hardware vendors - and
that's why it outlasted almost all of the dinosaurs from the tribal
home computer era, save for the Apple OS/hardware monopoly.

And that's what propels both Windows and Linux; both offer workable
and attractive space for developers to roost in.


One of the ways around the "too big for my head" problem, is code
re-usability, which allows programmers with deep understanding of a
market but shallow insight into computer tech to strap together "black
boxes" and thus create new, unique and reliable applications.

Well - more reliable than if they had to write all layers from scratch
in raw C, especially within the same time frame. But they can only be
as reliable as the black boxes themselves, and even that is best-case
that can be undermined by imperfect understanding between whoever
wrote the black box ("surely no-one would call this function without
sanity-checking the size of the data?") and whoever's (re-)using it.
Having been both a developer and a supporter of software through
most of my work careers, I clearly understand the difference
between "soft" things and "hard" things, but software is the
/only/ industry in the world in which users happily part with
billions and billions of whatever their currency is to buy more
and more and more crap. Customers of "hard" things aren't nearly
that easy to fool - they buy from quality producers and the
purveyors of crap simply go out of business.

I agree with you; the software vendors have had far too much ability
to write their own rules. An EUL"A" is a joke; who negotiates the
user's side of the "agreement"? The software industry has always
weaseled itself out of responsibility, and today's commercial malware
EUL"A"s are only slightly exaggerated parodies of "normal" terms.
Now, can somebody tell me why Darwin's survival of the fittest
fails only on software, where /everybody/ should die but nobody
does, they just get wealthier?

Did you see Catch-22? "What's good for Milo is good for America"?

That's part of it, I'm sure. US is a dominant global economy, but as
a consumer in "the rest of the world", I ask myself: What is it I use
that is made in America? My car's local/Europe (usually local/Japan),
my computer hardware and consumer electronics are from Asia, my food
is local, my clothes are local or Asian... the only things I consume
that are made in the US are software, media, and maybe some overpriced
labels affixed to goods manufactured somewhere else.

So if US's main revenue magnet (from the rest of the world) is
software, media and brands, one can expect US government and society
to indulge predatory and abusive practices in these fields, as in "he
may be a big amoral bully, but he's *our* big amoral bully".

Here's the problem: Who speaks for the consumer of software,
especially in consumerland? Any attempt to counterbalance the big
players is invariably overrun by puppets of their competitors, and I
don't feel safer with them - e.g. if forced to choose between
Microsoft, who wrested personal computing out of the jaws of the big
mainframe priests, and the likes of Sun, Oracle etc. who would have us
using dumb terminals and paying to run software off their servers.

Recent events have conveniently stampeded citizens towards a more
big-brother style of government, and that means less consumer activism
and more acceptance of curbs on various freedoms etc. So "the
gummint" is unlikely to kick ass on consumers' behalf.


------------------ ----- ---- --- -- - - - -
The rights you save may be your own
 
C

cquirke (MVP Windows shell/user)

Leythos commented thusly:
I'd say "watch this space" on that one - car IT is approaching levels
of complexity and interconnectedness that may well see it floundering
with the same sort of problems that have beset PCs for a few decades,
and mobile phones for a year or few.
You might want to get your head out of your ass and realize
that my major point is that it /is/ possible to build
extremely complex code, albeit under more controlled
circumstances, that performs near flawlessly under extreme
climatic conditions and operator abuse, lack of maintenance,
what have you.

Well, the "climate" of software is not rain, wind, heat and sleet;
it's contact with external material (as in, "all input is evil"). In
that sense, until recently, car software operated in an unchallenging,
hermetically-sealed climate. That's changing; watch the mileage.


--------------- ----- ---- --- -- - - -
Tech Support: The guys who follow the
'Parade of New Products' with a shovel.
 
C

cquirke (MVP Windows shell/user)

On Sat, 13 Aug 2005 20:11:39 -0500, All Things Mopar
No, do you? From what I've been reading from you, I don't know
if you do or don't know anything about PCs but it is very
clear you don't know anything about cars or you wouldn't make
asinine remarks like "they are simple devices for a single
purpose", or words to that effect. If you think running an
/entire/ vehicle is a single purpose run by a simple device,
then you obviously have no clue as to what you speak about.

Interesting, comparing cars and PCs.

My PC has a hard drive that's been spinning at 7200 RPM almost
non-stop for over a year now, and I expect it to do so for another 2-4
years. No taking it to the shop for "servicing", no "check the oil",
no "replace the parts that have worn out".

I'm pretty sure that if I fired up my car's engine, even without load
(i.e. left in neutral) and put a brick on the gas pedal so it sang at
7200 RPM, it likely wouldn't still be happily running a day later.

Why can't car makers build reliable mechanicals, like the IT industry?

I guess the answer is scalability and wear. PCs scale up computing
the way car engines scale up hauling a large chunk of mass.
Being exposed to millions of other entities via the 'net will "wear"
the load-bearing software surfaces faster than those of a car's IT.

Do you think a car's inbuilt IT doesn't have poorly-validated
parameter handling, etc.? It's hard to say either way, given how
difficult it is to inject arbitrary test data.

As it is, design flaws etc. do pervade even mission-critical
engineering such as air travel, as comp.risks will show.

What's impressive there, is the rigorous way that those who
investigate air safety approach these problems. I think the IT
industry could learn a great deal from them, and should.


------------------------ ---- --- -- - - - -
Forget http://cquirke.blogspot.com and check out a
better one at http://topicdrift.blogspot.com instead!
 
