So you really think Linux is better than Vista, do you?

Guest

The whole point of an environment is code libraries or API calls. So if I
write to the environment, would I be allowed to make my code public domain?
 
Stephan Rose

The whole point of an environment is code libraries or API calls. So if I
write to the environment, would I be allowed to make my code public domain?

If by "write to the environment" you mean creating an application that runs
under linux sure you could make your code public domain. Nothing keeps you
from doing that.

Most APIs these days are LGPL-licensed, which gives you the freedom to do
with your code as you please and removes the open-source restrictions of the
regular GPL license, making it suitable for any kind of development.

--
Stephan
2003 Yamaha R6

There's never a day I find myself remembering you,
because there's never been a moment I forgot you
 
cquirke (MVP Windows shell/user)

I go by intuition. Example... last year at Xmas I helped a kid get her
Creative Zen to work with XP after the driver install failed. Creative
didn't have anything for me, so I went to a 3rd-party site with a busy
forum, and from there found my way to a site where someone had put
together a script. I had no way of knowing what might happen... but here
is this guy, volunteering his time to help people out for free, getting
strokes for it, and putting up a site with helpful information. Instinct
told me not to worry, and the script worked like a charm.

I've done similar things, but I suspect we're living in a golden
moment of innocence, before search-baiting takes off big-time.

As it is, the last time I followed Google links to look for Vista
drivers for an old scanner, I got an unexpected UAC pop-up as one of
the pages started to load in IE 7. Nice work, UAC...

The initial problem is that some sites "slurp" usenet and present it
as web forums surrounded by ads, droppers, etc.

Another problem is generic search traps; y'know, the ones that say
"BUY YOUR Fred Smith HERE!!!" etc.

But a bigger future problem may be bots that spawn pages hooked into
popular or topical searches. We've started seeing the same old spam
dressed up in topical news headlines as subject; it's a natural
extension, especially with so many hackable "spaces" that serve folks
without them having to "do" HTML.
Communism has proven to be wholly dysfunctional in the real world, but
something similar works pretty well on the Internet, eh? Bricklayer
today, poet tomorrow, programmer/IT consultant the day after that.

The "I can't believe it works, maybe there's hope for humanity yet"
miracle is Wikipedia... I can't believe how good it generally is.
People seem powerfully motivated to collaborate in open source
development and help others solve their IT problems.

Yup - but if Linux becomes the majority OS target, you'll see the
other sort of folks and their bots, too.
Good question. So far I have installed a large number of upgrades with
no problems. I will upgrade to Feisty on my test rig and then decide if
it makes sense for my "serious" computer.

When a new Ubuntu comes out, how does it work? Do you:
- install it over-old, and it updates everything? (the "SP" model)
- try for an in-place version upgrade? (the "new OS" model)
- wipe and rebuild? (the pessimist's "new OS" model)

Also, do you just apply the kernel updates/upgrade, or do you
re-install/upgrade every bundled package?

Is there an element of "upgrade" logic with regards to 3rd-party
bundled stuff, such as...
- if not installed, do not install?
- if installed and same/newer, do nothing?
- if installed and older, upgrade to new bundled version?

Are existing settings preserved? YMMV per bundled app?
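To pin down what I'm asking about the bundled 3rd-party stuff, here it is as a throwaway sketch (hypothetical pseudo-logic, not apt's real resolver; real Debian version comparison is far more involved than a tuple compare):

```python
# Hypothetical sketch of the per-package "upgrade logic" asked about above.
# NOT Ubuntu's actual apt behavior; versions are simple tuples like (1, 2),
# and None means "not installed".

def upgrade_action(installed_version, bundled_version):
    """Decide what to do with one bundled 3rd-party package."""
    if installed_version is None:
        return "do not install"          # not installed -> leave it out
    if installed_version >= bundled_version:
        return "do nothing"              # same or newer -> keep as-is
    return "upgrade to bundled version"  # older -> bring it up to date
```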

In Windows, we generally find SPs OK (tho often with some impacts that
we quickly familiarize ourselves with) but new OSs are such a pain
that I generally do not apply them as upgrades, even if no licensing
issues were to apply (e.g. I have licenses as part of MSDN).

This is the crunch; when an SP level isn't supported (i.e. patched)
anymore, and one is "forced" onto the next OS - even when the next OS
doesn't like the "old" hardware spec we're currently using.

--------------- ----- ---- --- -- - - -
Never turn your back on an installer program
 
cquirke (MVP Windows shell/user)

You have put your finger squarely on the main problem.
In truth, the basic problems of desktop productivity have been solved.

Where extra computational power pays off, is in natural input. To
mimic natural input (OCR, speech, handwriting, 3D input localization)
takes orders of magnitude more power than natural output (fonts, voice
synthesis, fake script fonts, 3D theater sound).
Eventually, new technologies will emerge that require more RAM and CPU
power than today's systems can deliver. One example is 3d printing/rapid
prototyping, which is available now for commercial applications.

I can see a time when a house (or flat) has piped "stuff" (plastics)
that are consumed like print ink and paper are today; some desktop CAM
will craft solid goods to order under software control.
When such technologies arrive in the mass market, will they require the
services of a massive operating system like Vista? Or will they call for
a nimble OS that supports specialized software, perhaps embedded in a
stand-alone device?

My guess is that folks will still want to use off-the-peg hardware,
i.e. PCs or the equivalent, and then it's either "do this on your
usual system" or "dedicate a system to this alone".

If you go the latter route, you'd prefer Windows (better driver
support) if you did not provide the system, and Linux (saves the cost of
the OS) if you do provide the system and can thus pick parts with known
driver support.

But there will generally be an evolution from dedicated system, to
general system - and from stand-alone smart device to a thinner device
that is driven by hosted drivers. Think hardware vs. software modems,
as an example, or hardware vs. software MPEG playback.

So if MS is kept on the run looking for new apps requiring more
powerful general systems, Linux is also kept on the run looking for
things that need so much power that a dedicated system is needed,
before power catches up and it becomes a fat bump in a general system,
with device logic devolved to driver code on this host OS.

That model is assuming a LOT of other things don't change first ;-)
Vista represents the work of thousands working under great
pressure over a compressed time frame. Obviously that is not
a formula to "kill all the bugs."

It's a large (tho finite) time frame; perhaps even too much time,
allowing the complexity of the project to run away with the ball while
long-waiting users' expectations grew and grew.

Luxurious amounts of time, compared to patching under fire...
Well, people have been designing shells to fit over the CLI since the DOS
era. Microsoft has taken it too far, trying to make the PC as simple as
an ATM or a TV set. But computers are inherently complex, and problems
do arise, as millions of Vista early adopters are discovering to their
chagrin. The CLI doesn't over-promise, but it is a powerful tool once a
user takes some time to learn how it works.

I like to work under direct vision, so even in the DOS 3.3 and PICK
R83 era, I'd prefer file managers to command prompt scratching (tho in
PICK I generally had to write my own tools and UIs)

A UI doesn't have to be graphical to be useful - a hi-res text mode
(especially when prettied with custom characters for corners, drop
shadows, etc.) can give you most of the value of a GUI, in terms of
discoverability and getting things done.
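As a throwaway illustration (Unicode box-drawing characters here standing in for the custom glyphs of the day):

```python
# Minimal sketch: a "prettied" text-mode panel built from box-drawing
# characters - the kind of chrome a text UI used for corners and borders.

def draw_panel(title, width=20):
    """Return a bordered one-line panel as a single string."""
    inner = width - 2
    top = "\u250c" + "\u2500" * inner + "\u2510"      # like: +----+
    mid = "\u2502" + title.center(inner) + "\u2502"   # like: |text|
    bot = "\u2514" + "\u2500" * inner + "\u2518"      # like: +----+
    return "\n".join([top, mid, bot])

print(draw_panel("File Manager"))
```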


--------------- ---- --- -- - - - -
"We have captured lightning and used
it to teach sand how to think."
 
Stephan Rose

cquirke said:
When a new Ubuntu comes out, how does it work? Do you:
- install it over-old, and it updates everything? (the "SP" model)
- try for an in-place version upgrade? (the "new OS" model)
- wipe and rebuild? (the pessimist's "new OS" model)

It's updated in-place; I just did it today. You can see my new post I made
today for more details. =)
Also, do you just apply the kernel updates/upgrade, or do you
re-install/upgrade every bundled package?

Everything (that needs to be) is reinstalled automatically.
Is there an element of "upgrade" logic with regards to 3rd-party
bundled stuff, such as...
- if not installed, do not install?
- if installed and same/newer, do nothing?
- if installed and older, upgrade to new bundled version?

Are existing settings preserved? YMMV per bundled app?

Yup, settings are preserved.

--
Stephan
2003 Yamaha R6

There's never a day I find myself remembering you,
because there's never been a moment I forgot you
 
Charlie Wilkes

I've done similar things, but I suspect we're living in a golden moment
of innocence, before search-baiting takes off big-time.

I suppose. But the Internet ecosystem provides some checks and balances,
too. This particular individual could have planted a trojan in his
script... but I'll bet his script has been audited for errors or anything
else that might be wrong with it. The basic competitive nastiness that
we see all over Usenet serves a useful purpose when it takes place among
people who actually have some technical knowledge. What role is more
coveted than that of the uber-geek who discovers something everyone else
has missed?
Yup - but if Linux becomes the majority OS target, you'll see the
other sort of folks and their bots, too.

Amen. This is one area in which Microsoft is measured unfairly because
of its market share. Who is to say whether Linux or Windows is
inherently more secure? All we really know is that a thousand people are
trying to hack Windows for every one who is trying to hack Linux.
When a new Ubuntu comes out, how does it work? Do you:
- install it over-old, and it updates everything? (the "SP" model)
- try for an in-place version upgrade? (the "new OS" model)
- wipe and rebuild? (the pessimist's "new OS" model)

Also, do you just apply the kernel updates/upgrade, or do you
re-install/upgrade every bundled package?

Is there an element of "upgrade" logic with regards to 3rd-party bundled
stuff, such as...
- if not installed, do not install?
- if installed and same/newer, do nothing?
- if installed and older, upgrade to new bundled version?

Are existing settings preserved? YMMV per bundled app?

These are interesting questions I cannot answer. I did not get serious
about Linux until recently. I am a Vista refugee. I will not accept
Vista's license terms and intrusive behaviors, and it appears overall to
be a shoddy product designed by committee. So I want to prepare for a
future that doesn't rely on Microsoft.
This is the crunch; when an SP level isn't supported (i.e. patched)
anymore, and one is "forced" onto the next OS - even when the next OS
doesn't like the "old" hardware spec we're currently using.

This is an area where open-source offers an advantage, because nothing
ever seems to drop off the chart. The FreeDOS people have spent years
creating a DOS clone and they are continuing to develop it. The Arachne
DOS web browser runs on a 286.

Several versions of Linux support old hardware, including Damn Small
Linux, which is actually very good and easy to use. Minimum hardware is
a 486dx with 16 MB of RAM.

I assume Ubuntu will follow the Windows curve, gradually dropping support
for old hardware. It's reaching the point where developers have to take
responsibility, because businesses are spending serious money for support.

I won't feel as resentful about losing support for software that didn't
cost me any money in the first place. Also, if enough people want
continued support for old versions of Ubuntu, it will happen just the way
DSL is happening, because the sources are open to anyone with an interest.

Charlie
 
norm

Charlie said:
I suppose. But the Internet ecosystem provides some checks and balances,
too. This particular individual could have planted a trojan in his
script... but I'll bet his script has been audited for errors or anything
else that might be wrong with it. The basic competitive nastiness that
we see all over Usenet serves a useful purpose when it takes place among
people who actually have some technical knowledge. What role is more
coveted than that of the uber-geek who discovers something everyone else
has missed?

Amen. This is one area in which Microsoft is measured unfairly because
of its market share. Who is to say whether Linux or Windows is
inherently more secure? All we really know is that a thousand people are
trying to hack Windows for every one who is trying to hack Linux.

Why is ms being measured unfairly? A simple search should provide a
multitude of verifiable info about the inherent security offered by
both ms and linux. With apologies to Peter Parker, "With great power
comes great responsibility." Ms has the great power part down pat. The
great responsibility part still seems to be coming up short. It would
seem that there have been enough "life lessons" in the past, and
continuing in the present, for ms to realize that something is amiss in
the path they have chosen. And it really shouldn't matter if there is a
1000 to 1 (or 10000 to 1) ratio of attackers of ms to attackers of
linux. Given the time that both have been in existence, it seems
highly unlikely that not even ONE of the linux attackers would have
had ONE major, stop-the-presses success on par with the MANY successes
against ms.
 
Charlie Wilkes

Why is ms being measured unfairly? A simple search should provide a
multitude of verifiable info about the inherent security offered by
both ms and linux.

I'm skeptical of broad predictions.
Given the time that both have been in existence, it seems highly
unlikely that not even ONE of the linux attackers would have had
ONE major, stop-the-presses success on par with the MANY successes
against ms.

??? Linux vulnerabilities have been found. I have read about some of
them. They probably don't get as much attention as Windows
vulnerabilities, but that doesn't mean they don't exist.

Charlie
 
Charlie Wilkes

So if MS is kept on the run looking for new apps requiring more powerful
general systems, Linux is also kept on the run looking for things that
need so much power that a dedicated system is needed, before power
catches up and it becomes a fat bump in a general system, with device
logic devolved to driver code on this host OS.

That model is assuming a LOT of other things don't change first ;-)

Hmmm. Your comments are very interesting, obviously informed by quite a
bit of technical knowledge. Your reasoning is sound, and your analogies
seem like exactly the right ones... but still, I wonder. In home CNC
(e.g., setting up a lathe to make widgets automatically), DOS turns out
to be superior to Windows because there are fewer layers between the
software instructions and the hardware actuators. So I wonder if the fat
bump of software will really do the job with future technologies in which
IT seeks to interact with the physical world on more intimate terms.

Charlie
 
norm

Charlie said:
I'm skeptical of broad predictions.
And you had opined: "This is one area in which Microsoft is measured
unfairly because of its market share."
Hmmmm. Your opinion should not cause any skepticism, but a statement
which suggests that there may be verifiable info available is a "broad
prediction" worthy of skepticism?
http://www.omninerd.com/2007/03/26/articles/74 may or may not provide
info that YOU would deem verifiable, but I offer it as an example of
info that is available.
??? Linux vulnerabilities have been found. I have read about some of
them. They probably don't get as much attention as Windows
vulnerabilities, but that doesn't mean they don't exist.
I did not say otherwise, and please do not suggest that I did. There
have been linux vulnerabilities noted, and undoubtedly there are some
still not identified or yet to be created. However, I did say that none
of them have been exploited with any of the measured destructiveness, or
other unpleasantness, found in the ms product.
 
cquirke (MVP Windows shell/user)

I suppose. But the Internet ecosystem provides some checks and balances,
too. This particular individual could have planted a trojan in his
script... but I'll bet his script has been audited for errors or anything
else that might be wrong with it.

The time scales count against this. Here's a typical scenario:
- an issue arises, e.g. today's ANI exploit
- malware vendors leverage this for SE
- sites offering fixes pop up in the top few Google pages
- good sites may get hacked to drop the exploit as well

The response turnaround isn't days or weeks anymore, in an age where
botnets can send a brand new malware to all visible email addresses
within the first hour of release.

As a constantly-attacked platform, we've been pushed by these
circumstances into swallowing patches on release, as beamed in via the
vendor's real-time push. In effect, this gives a "blank cheque" to
the vendor, in terms of trust, which isn't something I'm comfortable
with. I don't know how a co-operative development community is going
to be able to respond as timeously without ill-advised trust
assumptions being required on the part of the user.
The basic competitive nastiness that we see all over Usenet
serves a useful purpose when it takes place among people
who actually have some technical knowledge. What role is more
coveted than that of the uber-geek who discovers something
everyone else has missed?

Yep, and that may be "good enough" for reasonably tech-savvy users
when dealing with "slow" needs, e.g. "does anyone have a driver for an
old HP LaserJet 4?". But crisis patching, which is basically the only
reason we as users (as opposed to developers) give a care about vendor
support horizons, is another matter.
All we really know is that a thousand people are trying to hack
Windows for every one who is trying to hack Linux.

Yup. And perhaps more to the point, each one of those malware
developers can release many thousands of attacks in a single
generation. When considering your "daily av update" as realtime
protection, remember that SQL Slammer (a pure clickless network worm)
went from unknown to global in a matter of a few minutes.
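Back-of-envelope, using the roughly 8.5-second population doubling time that later analyses reported for Slammer (both that figure and the 75,000-host victim pool are illustrative assumptions, not anything measured here):

```python
import math

# Back-of-envelope: how fast a clickless worm saturates its victim pool.
# 8.5 s doubling and 75,000 vulnerable hosts are illustrative assumptions
# drawn from later published analyses of SQL Slammer.

doubling_time_s = 8.5
vulnerable_hosts = 75_000

doublings = math.log2(vulnerable_hosts)   # doublings needed from 1 host
seconds = doublings * doubling_time_s
print(f"~{seconds / 60:.1f} minutes to saturation")
```

On those assumptions the whole pool is infected in a couple of minutes, which is why daily AV updates aren't "realtime" protection.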
These are interesting questions I cannot answer. I did not get serious
about Linux until recently. I am a Vista refugee.

I'm bracing myself for refugee-hood, and also far too inexperienced to
use Linux, much less support it at the level I support Windows.

In the Win9x era, I used to joke that if I had...
- 3 months off, I'd learn NT Workstation
- 6 months off, I'd learn NT Server
- a year off, I'd learn Linux
I will not accept Vista's license terms and intrusive behaviors

I'm reluctantly swallowing these, but I'm about to drop MS Office 2007
as a supportable product (IOW "if you want to buy that, I'm afraid
you'll have to get it elsewhere as I don't consider it fit to sell").

There's nothing that wrong with MS Office 2007 itself; it's just that
the way we as system builders are supposed to supply it, has become
unacceptable. Not only does it come as a disk-less "air box", but the
disk pack we have to buy separately is about to massively increase in
cost. It smells so bad, that if the next OS is distributed with
similar strings attached, then I may well leave the platform.
and it appears overall to be a shoddy product designed by
committee. So I want to prepare for a future that doesn't
rely on Microsoft.

I haven't used it enough to respond to that assertion, though what
I've seen of it looks good, albeit controversially different. It's a
gamble to break UI expectations, and thus put Open Office on the same
playing field as a migration; I see why MS did it and it looks as if
they've done it well, once you get into its "head".

I was suggesting users look at both Open Office and MS Office 2007 as
possible "nexts" after the older MS Office they were used to. I'm now
going to actively advise against MS Office 2007, because I find the
vendor lock-in to be totally unacceptable - even if it is I, the OEM,
who is the "beneficiary" of this lock-in.

Everyone can be bought, just as every atom can be broken, but I can't
be bought in this particular way. It's like offering a Rabbi a
lifetime supply of free bacon as a bribe... does not compute.
Several versions of Linux support old hardware, including Damn Small
Linux, which is actually very good and easy to use. Minimum hardware is
a 486dx with 16mb of RAM.

Hmm... how small is DSL? I'm getting tired of 700M .ISOs that break
at the 95% downloaded point after eating the month's free capacity...
It's reaching the point where developers have to take responsibility,
because businesses are spending serious money for support.

That's the crunch, isn't it? Linux still reminds me of the
"enthusiast age" of computing, which is where I grew up on the ZX
Spectrum. When I started working pro on PCs, I felt really guilty
about charging $10 to edit someone's FAT to recover data, as I felt I
wasn't doing anything the user couldn't have figured out for
themselves. But that's changed, as the gap between using PCs and
fixing them has widened and hardened, separated by the GUI.

At present, Linux devs do the things they'd like to use themselves,
and share these. It's quite different to have to do things you are
not interested in, or are utterly bored with, just because some uppity
client claims to need such things to run their business.

You know it's work, when you have to pay for it ;-)
I won't feel as resentful about losing support for software that didn't
cost me any money in the first place.

Generally I'd agree, until I have to buy something that needs the free
something else to run it, or when my livelihood depends on that free
thing working as it should. When the stakes are that high, I want to
be able to demand service, not beg for favors, and I'd expect to pay
for that commitment. But you can't buy what isn't for sale.


------------------ ----- ---- --- -- - - - -
The rights you save may be your own
 
cquirke (MVP Windows shell/user)

In home CNC (e.g., setting up a lathe to make widgets automatically),
DOS turns out to be superior to Windows because there are fewer layers
between the software instructions and the hardware actuators. So I
wonder if the fat bump of software will really do the job with future
technologies in which IT seeks to interact with the physical world
on more intimate terms.

When it comes to efficiency vs. "big design", I'd bet on the latter,
and there are three aspects to this.

1) Abstraction has earned its keep

Nowhere is this clearer than in PC gaming. Initially, DOS was better
than Windows as a gaming platform for the reasons you state; you had
direct access to the hardware and that meant you had the efficiency to
do things on hardware of the time.

As long as a sound card successfully pretended to be a Sound Blaster,
and a graphics card successfully pretended to be an IBM VGA or a VESA
SVGA, you were set.

WinG didn't do much to change our minds on this, and then DirectX
started getting traction. Even so, it was initially a matter of "does
the PC have the power to pull the load through the DirectX middleman?"
in the pre-3D era of Quake 2, Hexen 2, etc.

Once factor (2) meant more hardware intelligence, the abstraction of
the hardware let hardware developers off the leash. No longer did a
sound card have to pretend to be a Sound Blaster to work (and thus be
limited down to whatever a Sound Blaster could do) - vendors were free
to develop hardware in any direction they liked, as long as they
provided DirectX drivers to link it through the standard API.

We saw the same thing earlier, when IDE meant HDs no longer had to have
set data encoding and sectors per track. Once again, development was
free to design storage hardware any way possible, as long as it could
be seen as CHS or a linear series of sectors (LBA) - and once again,
capacities and speeds took off like a rocket.
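That CHS-to-linear mapping is a fixed formula; as a sketch (the 16-head, 63-sectors-per-track geometry below is just an illustrative fake geometry of the kind drives reported):

```python
# The classic CHS -> LBA mapping: the drive exposes a fake geometry and
# the host computes a linear sector number from it. CHS sector numbers
# are 1-based, which is why the (s - 1) term appears.

def chs_to_lba(c, h, s, heads_per_cyl=16, sectors_per_track=63):
    """Linear sector number for cylinder c, head h, sector s."""
    return (c * heads_per_cyl + h) * sectors_per_track + (s - 1)

# First sector of the disk (the boot sector) is C=0, H=0, S=1:
print(chs_to_lba(0, 0, 1))  # 0
```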

So no; I don't see much future for direct-hardware-access in general
use, though I do see firmware applications in dedicated hardware and
perhaps some real-time OSs for general systems dedicated to running
single apps that require real-time rigor.

As to DOS itself - much as I like it, it just doesn't scale up to
modern hardware that well (USB, GHz, RAM, > 137G etc.). I suspect a
lean Linux, even as lean as that which launches MemTest86 from a boot
CDR or diskette, is more likely the way to go.

2) Efficiency is a temporary advantage

Any business plan that relies on big system performance or spectacular
machine efficiency, has to pay for itself in a fairly short time
period. In contrast, a year's head start in building a new app that
barely runs on current hardware can set you up to dominate that field
for "life" (think Photoshop, Cubase, Premiere, etc.)

For example, let's say you want to be able to do CG video that's
smoother and more realistic than anyone can do at home. So you buy a
huge PC or other hardware system that's far more powerful than any
off-the-peg or self-built PC, and you start cranking out jobs.

You have a market life of maybe 2-3 years, tops, before you will have
to either renew your crushingly expensive hardware investment, or find
some other way to compete with all the PC small-fry who have caught up
with you simply by buying their PCs a couple of years later.

For another example; let's say you write a competing application to
have a performance edge by working intimately with the hardware, being
written in C or Assembly, etc. with fewer abstraction layers.

Once again, you'll be best-of-breed for a year or two, but this time,
initially at least, your edge stays in place as faster hardware lets
your more efficient software still outperform other apps.

But then, as the hardware changes and invalidates your original
assumptions, you have to re-develop much of the code, and that's a far
slower and riskier business than (say) getting library updates and
making a few top-level changes to something written in a higher-level
(read, protects you from shooting your feet off) language.

Compare word processors XyWrite, Word Perfect and Word. XyWrite was
hard to use, but beloved by news reporters because it was so much
faster when quickly banging out raw text. It had an edge, but an edge
that simply ceased to matter even before Windows took off.

Word Perfect did things "large" and made "difficult to use" a lock-in
advantage by training users in their arcane ways. Unlike XyWrite,
they relied on efficiency to support the richest available feature set
rather than simply be the fastest tricycle on the block. So as folks
expected more from word processors, they still found it in Word
Perfect... and then Windows came along, and Word Perfect made the
mistake of trying to ignore the OS and still do everything "their
way", in a quest for efficiency and richness. Oops.

3) Possible computing power is a narrow but shifting window

By that I mean the difference between the smallest and largest systems
cost-effective to produce is far smaller than the difference between the
best 3-year-old systems and the worst systems 3 years in the future.

This is, in effect, a re-statement of (2). Let's say user A bought a
cheap 200MHz Pentium with 8M RAM and 512M HD, and user B spent a bomb
on Intel's cutting-edge 333MHz PII with 32M RAM and 2G HD. They're
both boat anchors by now, and not only that; it's no longer
cost-effective to deliberately build a PC as lame as either, in 2007.

Even when you go off general PC hardware (and immediately hit far
larger hardware development costs); it's still not cheaper to build
things smaller than a certain level, unless you are after a
particularly minute form factor (and pay accordingly).

You can think of possible single-unit computing power as a range as
narrow as visible light in the full em spectrum. If you want
substantially more power right now, you generally attain this by
ganging systems together, e.g. SETI, or botnets.

You can also think of computing power over years as that narrow
visible-light band being moved from radio waves through today's
"light" into UV, gamma and cosmic ray territory, much like the pointer
of a radio moves as you turn the tuner knob. All sorts of security
technologies that are bound to power/time limits suffer short
usability lives due to this effect.
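For example, assuming attacker compute doubles every two years (a Moore's-law-style assumption, purely illustrative), a fixed brute-force margin erodes on a predictable schedule:

```python
# Illustrative: a secret that costs 2**40 operations to brute-force today
# loses one bit of effective strength each time attacker compute doubles.
# The 2-year doubling period is an assumed Moore's-law-style rate.

def effective_bits(initial_bits, years, doubling_period_years=2.0):
    """Remaining work factor, in bits, after `years` of hardware growth."""
    return initial_bits - years / doubling_period_years

print(effective_bits(40, 20))  # after 20 years, 30 bits of work remain
```

So a key length chosen with a comfortable margin today is just adequate a decade on, and trivial a decade after that.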

Things change qualitatively as well as quantitatively, and this will
have a greater impact if you've become too intimate with the hardware
in a quest for efficiency. Use the API, Luke...


The main thing is that it is harder to conceptualize something that
has never been possible to do before, than to make a known process more
efficient or re-create it on different hardware.


--------------- ---- --- -- - - - -
"We have captured lightning and used
it to teach sand how to think."
 
Stephan Rose

cquirke said:
Hmm... how small is DSL? I'm getting tired of 700M .ISOs that break
at the 95% downloaded point after eating the month's free capacity...

The ISO is 49.5 Megs, I just tried downloading it to check. =)

http://www.damnsmalllinux.org/

It can actually boot off a USB stick if your PC supports booting from USB.

--
Stephan
2003 Yamaha R6

There's never a day I find myself remembering you,
because there's never been a moment I forgot you
 
Charlie Wilkes

When it comes to efficiency vs. "big design", I'd bet on the latter, and
there are three aspects to this.

1) Abstraction has earned its keep

Hmmm. Perhaps it has, but there are a few exceptions here and there.
For example, I use Starband satellite Internet because I live in a remote
location where there is no cable or wireline phones. A few years ago,
Starband used to suck so bad they almost went out of business. They
passed out these sketchy routers that involved a complicated driver on
the host PC. People had no end of trouble getting the driver to work,
and it was for Windows only. Starband went into bankruptcy and had to
file a reorganization plan.

Nowadays, Starband is out of bankruptcy and offers a very good, reliable
service. They have replaced their routers with new ones that require no
driver whatsoever. Any system -- DOS, Windows, Mac, Linux -- with a
properly configured network card can connect instantly with no hassle.
2) Efficiency is a temporary advantage

3) Possible computing power is a narrow but shifting window

Your analysis is well thought out, and I accept your conclusions as to
what is commercially viable. But, as the window shifts, the question
becomes: What will be done with all that hardware power? I can't imagine
that software will continue to bloat out indefinitely; there must be a
practical limit. I mean, what will the desktop OS of 2020 look like? If
one compares Vista with Windows 95 and projects the curve forward, it
suggests a multi-terabyte installation that involves millions of
programmers and several hundred billion dollars of development costs.
But that seems absurd. It seems more likely that the focus will be on
drivers for complex hardware... the 3d printer, a holographic projector,
etc. How Linux will fare vs. Windows under that scenario is something
you can assess better than I can... but, it seems logical that open
source code will make it easier for developers to build devices and
embedded systems that work well together.
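That projection can be put to rough numbers. Taking a Windows 95 install as about 50 MB and a Vista install as about 10 GB (both rough assumptions) and extending the same growth rate:

```python
# Back-of-envelope projection of OS install-size growth. The 50 MB and
# 10 GB install sizes are rough assumptions, not measured figures.

mb_1995 = 50            # approx. Windows 95 install
mb_2007 = 10_000        # approx. Vista install (~10 GB)

growth_per_year = (mb_2007 / mb_1995) ** (1 / (2007 - 1995))
mb_2020 = mb_2007 * growth_per_year ** (2020 - 2007)
print(f"projected 2020 install: ~{mb_2020 / 1_000_000:.1f} TB")
```

On those assumptions the curve does land in multi-terabyte territory, which is exactly why it seems absurd to extrapolate it.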

Charlie
 
Charlie Wilkes

I'm bracing myself for refugee-hood, and also far too inexperienced to
use Linux, much less support it at the level I support Windows.

In the Win9x era, I used to joke that if I had...
- 3 months off, I'd learn NT Workstation
- 6 months off, I'd learn NT Server
- a year off, I'd learn Linux

It's a learning curve, and I'm not really very far along, but I do feel
it has some inherent advantages over Windows. For one, everything that
can be done in the GUI can be done at the command line, and the command
line makes sense once you understand it. Also, there is no registry; all
the configuration data is in plain text files.
At present, Linux devs build the things they'd like to use themselves, and
share them.
share these. It's quite different to have to do things you are not
interested in, or are utterly bored with, just because some uppity
client claims to need such things to run their business.

It seems like the interest is spread pretty wide. My new laptop has a
software modem, and I thought I'd have to fill up the one PCMCIA slot
with a hardware modem, but it turns out there is a web site
(www.linmodems.org) run by people whose pet fascination happens to be
writing Linux drivers for Winmodems. And the Ubuntu base package
includes the driver they have written for my particular device.
Generally I'd agree, until I have to buy something that needs the free
something else to run it, or when my livelihood depends on that free
thing working as it should. When the stakes are that high, I want to be
able to demand service, not beg for favors, and I'd expect to pay for
that commitment. But you can't buy what isn't for sale.

One assumes service ought to be available on demand because of the high
cost of a Windows license, but I have found that to be a false security
blanket. Last year I built a new machine and installed Windows on it...
it worked fine until one day I got a BSOD while booting, and could not
boot except in safe mode. So I went to Microsoft's web site, looked up
the problem, and they said I'd have to contact tech support for a hot
fix... ok, paid $100 for one of their people to work with me by email,
sent them buckets of data for analysis, went back and forth for a week or
two, and in the end they couldn't figure out what was wrong, and I had to
reinstall everything and hope it didn't happen again. At least I got my
$100 back.

Charlie
 
S

Stephan Rose

Charlie said:
Hmmm. Perhaps it has, but there are a few exceptions here and there.
For example, I use Starband satellite Internet because I live in a remote
location where there is no cable or wireline phones. A few years ago,
Starband used to suck so bad they almost went out of business. They
passed out these sketchy routers that involved a complicated driver on
the host PC. People had no end of trouble getting the driver to work,
and it was for Windows only. Starband went into bankruptcy and had to
file a reorganization plan.

Nowadays, Starband is out of bankruptcy and offers a very good, reliable
service. They have replaced their routers with new ones that require no
driver whatsoever. Any system -- DOS, Windows, Mac, Linux -- with a
properly configured network card can connect instantly with no hassle.

Your analysis is well thought out, and I accept your conclusions as to
what is commercially viable. But, as the window shifts, the question
becomes: What will be done with all that hardware power? I can't imagine
that software will continue to bloat out indefinitely; there must be a
practical limit. I mean, what will the desktop OS of 2020 look like? If
one compares Vista with Windows 95 and projects the curve forward, it
suggests a multi-terabyte installation that involves millions of
programmers and several hundred billion dollars of development costs.
But that seems absurd. It seems more likely that the focus will be on
drivers for complex hardware... the 3d printer, a holographic projector,
etc. How Linux will fare vs. Windows under that scenario is something
you can assess better than I can... but, it seems logical that open
source code will make it easier for developers to build devices and
embedded systems that work well together.

Well, another thing is this: Linux will catch up; it essentially already has
in most areas where it lags behind Windows, and it significantly beats
Windows in other areas already.

Hardware support also can only get better, not worse...and it is quite
excellent already.

Software support also can only get better...and it already meets any and all
average non-specialized needs quite well.

Its market share is growing, which will only accelerate the above.

In my honest opinion, I think MS is finding itself in a dead end. Office
already dead-ended years ago. The only new features it generally sees are
new user interfaces, but one can only make a UI so efficient before it
becomes pointless to modify it any further.

I already had a hard time envisioning an upgrade for Office 2003.

The same goes for the operating system. XP may have some flaws and Vista may
even fix *some* of them...but where is it going to go from there? Is the
next version after Vista going to move the start menu to the right hand
side and call it a new feature to justify everyone spending 200-400 bucks
yet again?

I, and many people I know, already don't see a point in Vista's existence.
From that standpoint, I can't even imagine a reason for a new version after
Vista beyond fixing problems MS themselves have created in their own
product.

I think operating systems such as linux have a much stronger potential for
the future, as they don't need to support the costs of a monolithic
corporation such as MS. I place more faith in the developers of linux, as
they have the *users'* interests in mind when developing, than in MS's
developers. MS's developers have managers and beancounters breathing down
their necks and have to keep the corporation's interests in mind over the
users' interests. Vista is a prime example of exactly that happening; DRM
and WGA come to mind.

I think that as operating systems eventually reach a plateau in terms of
functionality, Windows is basically doomed to failure, as people will no
longer have a reason to "upgrade" to a new version. There won't be any
reason to upgrade unless the reasons are artificially generated. We're
already seeing this with Vista: artificial software incompatibility and
DX10, for instance, both fall into that category for me, created to give
users a reason to upgrade who otherwise wouldn't have one.

--
Stephan
2003 Yamaha R6

There's never a day I find myself remembering you,
because there's never been a time I forgot you.
 
C

Charlie Wilkes

Well, another thing is this: Linux will catch up; it essentially already
has in most areas where it lags behind Windows, and it significantly beats
Windows in other areas already.

Hardware support also can only get better, not worse...and it is quite
excellent already.

It seems like it has surged in the last year or so. If I were to set up
a system for someone who knew absolutely nothing about computers, I think
I would choose Ubuntu over any flavor of Windows. It's a cleaner
interface. Getting the box set up initially involves some CLI/text-
editing operations, but so what?
I think that as operating systems eventually reach a plateau in terms of
functionality, Windows is basically doomed to failure, as people will no
longer have a reason to "upgrade" to a new version. There won't be any
reason to upgrade unless the reasons are artificially generated. We're
already seeing this with Vista: artificial software incompatibility and
DX10, for instance, both fall into that category for me, created to give
users a reason to upgrade who otherwise wouldn't have one.
You and I are seeing exactly the same thing. The desktop OS is mature.
Steve Ballmer can hire all the shills in the world, and he can throw all
the tantrums he wants, but he can't bring back the 90s. Slowly but
surely, the Microsoft hegemony is starting to break apart.

Charlie
 
S

Stephan Rose

Charlie said:
It seems like it has surged in the last year or so. If I were to set up
a system for someone who knew absolutely nothing about computers, I think
I would choose Ubuntu over any flavor of Windows. It's a cleaner
interface. Getting the box set up initially involves some CLI/text-
editing operations, but so what?

With the new version of Ubuntu coming out on the 19th, the CLI/text
editing may no longer even be necessary for many installs.

Proprietary drivers such as nVidia / ATI can be installed with a few
mouse clicks now. Same goes for video / audio codecs.
You and I are seeing exactly the same thing. The desktop OS is mature.
Steve Ballmer can hire all the shills in the world, and he can throw all
the tantrums he wants, but he can't bring back the 90s. Slowly but
surely, the Microsoft hegemony is starting to break apart.

Agreed. The next few years ought to be interesting to watch and see what
happens. =)

--
Stephan
2003 Yamaha R6

There's never a day I find myself remembering you,
because there's never been a time I forgot you.
 
Z

Zootal

Agreed. The next few years ought to be interesting to watch and see what
happens. =)

--

Yah. It's here. I'm running Slackware, XP, 2000, and even Win98 on a few
machines. But I doubt I'll be upgrading to Vista in the near future. Why? I
have no reason to upgrade. Vista will do NOTHING that I need that the OSs
I'm using won't do.
 
