Motivation of software professionals

  • Thread starter Stefan Kiryazov

Nick Keighley

the phone line wasn't good. As well as the Alpha, Bravo, Charlie stuff
it seemed to help if you spoke like a boxing referee "one-a, two-a,
three-a"

That guy must have been pretty techie, because that contradicts my
experience. For example, one guy was not capable of just finding and
changing a parameter in a config file, and I mailed it to him.
The problem was that the config file had about 1000 lines and
he didn't know how to apply a case-insensitive search.
In my experience a lot of them don't even know what a text
editor is, let alone a debugger. So that guy was techie, for sure....

my mum had problems with instructions dictated by some sort of
technical support because her interpretation of the terms "forward-
slash" and "backward-slash" was the exact opposite of most
(technical) people's
 

Nick Keighley

Lew said:
Richard Heathfield wrote:
People aren't usually stupid, and if they're highly motivated to solve a
[problem]

I think that this is the most important factor. Customer support where
I live is not very well motivated...

Another guy who claimed to be a C/C++ expert (worked in Portugal for
the military) asked whether C has function pointers. When we
showed him a question from Stroustrup's book (Duff's device),
he didn't know what it was all about.

well it looks pretty weird the first time you see it. "can you really
do /that/ in a switch?!" was my reaction. Is Duff's device important?
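for anyone who hasn't run into it, this is roughly the shape of the
thing -- a sketch of the usual unrolled-copy form from memory, not
Duff's original posting verbatim:

#include <stddef.h>

/* Duff's device: copy `count` bytes, unrolling the loop eight-fold.
   The switch jumps *into* the body of the do-while to mop up the
   remainder -- legal C, however weird it looks. */
void duff_copy(char *to, const char *from, size_t count)
{
    size_t n;

    if (count == 0)
        return;
    n = (count + 7) / 8;
    switch (count % 8) {
    case 0: do { *to++ = *from++;
    case 7:      *to++ = *from++;
    case 6:      *to++ = *from++;
    case 5:      *to++ = *from++;
    case 4:      *to++ = *from++;
    case 3:      *to++ = *from++;
    case 2:      *to++ = *from++;
    case 1:      *to++ = *from++;
            } while (--n > 0);
    }
}

(Duff's original wrote to a memory-mapped output register, so `to`
wasn't incremented; the copying variant above is the form that
usually gets quoted.)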

Not one of fifty masters of computer
science recognized Duff's device (even a guy who
was born in '63).

'63 was a good year? Programmers are like wines?

So I concluded that if I see a university diploma saying "master of computer
science", that's a sure sign of ignorance or something similar, in the country where I live.

I didn't learn Duff's device at university.

One guy who claimed he wrote software for robots didn't know how much 2^32 is ;)

nor do I if you want the exact value. I'd look it up if I needed it (I
just use hex!)
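if you do want the exact figure, the machine will happily do the
memorizing for you -- a throwaway sketch:

#include <stdio.h>

int main(void)
{
    /* 2^32 doesn't fit in a 32-bit type, so do the shift in 64 bits */
    unsigned long long p = 1ULL << 32;

    /* prints 4294967296 (0x100000000); for a mental estimate,
       2^32 = 4 * 2^30, and 2^10 ~ 10^3, so roughly 4 * 10^9 */
    printf("2^32 = %llu (0x%llX)\n", p, p);
    return 0;
}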
I think there are very few people who know how to program computers these
days.

"The Earth is degenerating these days. Bribery and corruption abound.
Children no longer mind their parents ... and it is evident that the
end of
the world is fast approaching."
-- Assyrian stone tablet, c.2800bc

Blame the education system, because "C is not safe" and
"stay away from assembler". Soon no one will know how to program,
and the older guys will earn a lot of money, but there won't be enough of
them...

sounds good to me!
 

Nick Keighley

Nick Keighley wrote:



What is the point?
The average Joe makes memory leaks in Java, no problem...
these days...
Software gets more bloated, with more and more bugs, ...

I was noting the fixed point in the human experience. Things are
degenerating and were always better in the past.

<snip>
 

Richard Bos

Lew said:
You say that like the developers were at fault. I cannot tell you how many
times I've seen management overrule developers who wanted to make things
right. It's been the overwhelming majority, though. I recall a manager in
1982 refusing to let a team fix the Y2K bug in the project.

I've seen that - _my_ manager, in _my_ fix in _my_ program - in 1995.
Three years later he thought that it would be a good idea for me to
start paying attention to this Y2K thing he'd just heard about.

And then there's the users. Don't get me started on the users.

Richard
 

Richard Bos

Flash Gordon said:
I know there is software flying around today that is running on Z80
processors (well, the military variant of them) and the plan in the late
90s was for it to continue for another 20 years (I don't know the
details, but a customer signed off on some form of ongoing support
contract). Admittedly the software I used was not doing date processing
(apart from the test rigs, which used the date on printouts, which I
tested to "destruction" which turned out to be 2028).

Single signed byte?

Richard
 

Lew

Richard said:
I've seen that - _my_ manager, in _my_ fix in _my_ program - in 1995.
Three years later he thought that it would be a good idea for me to
start paying attention to this Y2K thing he'd just heard about.

And then there's the users. Don't get me started on the users.

Yeah. Our jobs would be so much easier if only we didn't have customers!

Don't dis the customers, man. Having a derogatory attitude toward "users"
(there are only two industries that call their customers "users") is a major
arrogance. Shame on you.
 

Lew

Branimir said:
One guy who claimed he wrote software for robots didn't know how much 2^32 is ;)

Actually, a correct answer to that is "2^32". So you gave him the answer in
the question.

Another correct answer is "100000000 base 16".

If you are disparaging the guy for simply not knowing the expansion to decimal
digits, well, Albert Einstein didn't bother memorizing his home phone number
on the basis that he could simply look it up in the phone book on those rare
occasions when he needed it.
I think there are very few people who know how to program computers these
days.

That's only a problem if those people who don't know how to program are paid
based on a claim that they do.

Unfortunately that happens a lot.
 

Lew

Nick said:
I was noting the fixed point in the human experience. Things are
degenerating and were always better in the past.

To paraphrase /Dilbert/: "Back in my day, we carved our bits out of wood."

The problem with those good old days is you had to measure memory in barqs
rather than bytes, and everyone knows that the barq is worse than the byte.
 

Lew

Branimir said:
To be honest things were always simpler in the past.

That's not honesty, that's nostalgia.

Which is simpler, dealing with rush-hour traffic or a dire wolf trying to eat
your child?

And human interactions are not observably simpler or more complex than at
any time since the evolution of /homo sapiens/.
 

BruceS

Nick Keighley wrote:


Well, a 4 GB answer should be enough, I don't know the exact figure either ;)

Maybe I'm just being overly pedantic, but that seems like a bad
answer. I don't fault IT people for not knowing the powers of 2,
though the approximation of 2^(10n) to 10^(3n) makes it easy. I do fault
people who seem overly critical for not being precise.

Now *that* I can agree with, aside from taking issue with the "these
days" part. It seems to me that for most activities, the majority of
participants are not very competent, and this certainly includes
software development.
What is the point?
The average Joe makes memory leaks in Java, no problem...
these days...
Software gets more bloated, with more and more bugs, ...

Ditto. As a member of a very small niche, it's nice to set terms (to
an extent). I get all sorts of shiny trinkets to prove my value and
further inflate my already healthy ego.
Well, actually if you spend enough time lurking on Usenet, you can
learn enough ;)

Just be sure to learn from the right folks, or you may well learn
wrong.
I don't have an objective picture since my perspective is
from this country, where the software industry is practically non-existent (btw).

Greets

My perspective is from a country where the sw industry is pretty
large, but there's still plenty wrong with it.
 

James Kanze

I guess it depends on which unixes, and which Linux. When I
went from SVR4 Unix to NetBSD, though, I had a LOT less
downtime.

I've never used NetBSD, but from what I understand, it does
seem like it would have been a lot better than Linux.

Note that the problem is more one of being new, and of not having a
decent development process; but that problem was shared by many
commercial OS's as well. Up until the late 1990's, I used SunOS 4
professionally. Early Solaris wasn't that great, either.
The version I used (nvi) was nearly rock-solid. Which is to
say, I found and reported a bug and it was fixed within a day.
And I've been using the same version of nvi that I was using
in 1994 ever since, and I have not encountered a single bug in
15 years.

The two aspects are probably connected. Stable software doesn't
change versions that often.
I said gcc, not g++. And while, certainly, it has bugs, so
has every other compiler I've used. I had less trouble with
gcc than with Sun cc. I used a commercial SVR4 which switched
to gcc because it was noticeably more reliable than the SVR4
cc.

I believe that gcc was pretty stable by then. But by the early
1990's, we'd moved on to C++. I did one of the compiler
evaluations back then, and I can assure you that g++ was a real
joke.
I do not think it is likely that implying that anyone who
disagrees with you is being dishonest will lead to productive
discussion. My experiences with free software were apparently
different from yours -- or perhaps my experiences with
commercial software were different.

My experiences with commercial software are not universally
positive. But realistically, anytime before the mid-1990's,
most of the free software was simply not acceptable. It didn't
have a good enough process to ensure stability, and was too new
for most of the bugs to have been worked out.
Whatever the cause, the net result is that by the mid-90s, I
had a strong preference for free tools and operating systems,
because they had consistently been more reliable for me.

The turning point was some time in the mid-1990's; exactly when
depended on what you were doing.
 

James Kanze

James said:
On Feb 14, 4:54 pm, Lew <[email protected]> wrote:
[...]
And I tried to use them, and they just didn't stop crashing.
Even today, Linux is only gradually approaching the level of the
Unixes back then.
I have to agree with you here. My earliest use of Linux was
1993, side by side with IRIX and SunOS. I don't remember
frequent crashing of Linux but there was no question but that
the UNIX systems were more stable, more polished and had more
capability. Granted, everyone back then was throwing Linux on
old PCs, which probably didn't help, but still...

Today, the problem is that everyone is throwing it on new
PC's :). Before the drivers for the latest cards are fully
stable. (Other than that, there still seem to be some problems
in XFree, and I've generally had to more or less hack some of
the boot scripts to get them to work.)

With the exception of the problems in XFree, however, I don't
think you can compare them with the commercial offerings.
Solaris always installed like a charm for me, but that was on a
Sun Sparc---the two were literally made for each other, and Sun
made sure that any new Sun hardware would work with Solaris.
Trying to cover generic hardware, including chips that haven't
been invented yet, is a lot more difficult.
 

James Kanze

I think it may be done occasionally. Certainly, if I had
contractual penalties for downtime, and my choices were
Windows or Linux, I'd run free software. :p

I'd use Windows XP before Linux, but frankly... I'd avoid
standard PC's completely: (modern) Solaris on a Sparc, or HP/UX
on HP's PA would be my choices. (Supposing I needed a real
general purpose system. Some of the embedded systems are
probably even more reliable.)
 

James Kanze

I've got mixed opinions on this. The real review takes place
off line. Explanation and discussion of possible solutions (I
know, a code walkthrough isn't supposed to consider solutions --
a daft idea if you ask me [1]) at a meeting.
Design meetings can work.

It's always difficult to find the right balance, because people
do vary. What I think is certain is that you do need some time
isolated, to let your ideas jell, and you need some meetings, at
least at either end: brainstorming sessions before really
starting, and code reviews after the code has been written.
Between the two, different people probably have different needs.
A lot of people claim that they're most effective in pair
programming, for example, whereas I (and some others I know)
would be much less effective if I couldn't do large parts of the
work more or less in isolation.
 

Martin Gregorie

I'd use Windows XP before Linux, but frankly... I'd avoid standard PC's
completely: (modern) Solaris on a Sparc, or HP/UX on HP's PA would be my
choices. (Supposing I needed a real general purpose system. Some of
the embedded systems are probably even more reliable.)

If uptime is the main criterion, your only options are fault-tolerant
systems. Off the shelf that means Stratus or Tandem (now HP) Guardian
NonStop systems. This is the kit you find running telcos, inter-bank
networks, ATM networks, etc.

That apart, the most reliable system I've used is Microware's OS-9/68k, a
modular real-time OS. I've been running it almost every day since 1992
and have never found a bug in the system software despite applying Y2K
patches to it. I haven't replaced any hardware since 1993 either: not
even disks.
 

Mike Schilling

Branimir said:
To be honest things were always simpler in the past.

Compare disks which are formatted for a particular record size (requiring the
inter-record gap to be figured into the record/track calculation) with disks
that have fixed sector sizes. Which is simpler?
 

Malcolm McLean

Which is simpler, dealing with rush-hour traffic or a dire wolf trying to eat
your child?

And human interactions are not observably simpler or more complex than at
any time since the evolution of /homo sapiens/.
Actually early humans seem to have had slightly bigger brains than us.
Evolution is now making brains smaller rather than larger. Welfare
Queens are better adapted to their environment than blue-stockinged
ladies with philosophy degrees.
 

Flash Gordon

Richard said:
Quite possibly. Not every problem ending in 8 is a 2038 problem. If the
test rigs had 1900 as a base date (and yes, there's still plenty of
software around that thinks 1900 was a very good year), then the single
signed byte Richard Bos mentioned would be good for representing all
years from then until 2027 (assuming 8 bits to the byte). It would fail
in 2028, quite possibly giving the year as 1772 instead.

It was indeed 2028 when it fell over. I can't remember all of the exact
details, but it was the HP Pascal system, which was based on UCSD
(IIRC). I think the data structure was either defined as being 0..99 or
0..127, and it definitely hit a problem when it rolled over to 2028, but
I can't remember the exact details and don't have access to the systems
any more (I work for a different company).

I suspect it could have been the date encoded into a 16-bit word as
7 bits - year
4 bits - month
5 bits - day

I did clearly document the date of failure when I was asked to look
into Y2K, but of course that documentation will be lost before then! I
also documented that the simple work-around would be to set the date
wrong and just write the correct date on the printouts!
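
If the packing was something like that, the failure date falls
straight out of the field widths -- a quick sketch of the arithmetic
(the field order is a guess; only the widths matter):

#include <stdio.h>

/* Guessed packing: 7 bits of year-since-1900, 4 bits of month,
   5 bits of day in one 16-bit word. */
static unsigned pack_date(int year, int month, int day)
{
    return ((((unsigned)(year - 1900) & 0x7F) << 9)
            | (((unsigned)month & 0x0F) << 5)
            | ((unsigned)day & 0x1F));
}

int main(void)
{
    /* 1900 + 127 = 2027 is the last representable year... */
    printf("%u\n", 1900 + (pack_date(2027, 12, 31) >> 9)); /* 2027 */

    /* ...and 2028 wraps silently back to 1900. A signed byte
       holding year-1900 fails the same way: 127 rolls over to
       -128, turning 2027 into 1772, as Richard described. */
    printf("%u\n", 1900 + (pack_date(2028, 1, 1) >> 9));   /* 1900 */
    return 0;
}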
 

John B. Matthews

[...]
It was indeed 2028 when it fell over. I can't remember all of the
exact details, but it was the HP Pascal system, which was based on
UCSD (IIRC). I think the data structure was either defined as being
0..99 or 0..127, and it definitely hit a problem when it rolled over
to 2028, but I can't remember the exact details and don't have access
to the systems any more (I work for a different company).

I suspect it could have been the date encoded into a 16-bit word as
7 bits - year
4 bits - month
5 bits - day

I did clearly document the date of failure when I was asked to look
into Y2K, but of course that documentation will be lost before then!
I also documented that the simple work-around would be to set the
date wrong and just write the correct date on the printouts!

For reference, UCSD Pascal I.5/II.0/III.0:

daterec = packed record
    month: 0..12;  { 0 IMPLIES DATE NOT MEANINGFUL }
    day:   0..31;  { DAY OF MONTH }
    year:  0..100  { 100 IS TEMP DISK FLAG }
end { DATEREC } ;

<http://invent.ucsd.edu/technology/cases/1995-prior/SD1991-807.shtml>
 
