So you really think Linux is better than Vista, do you?


cquirke (MVP Windows shell/user)

cquirke (MVP Windows shell/user) wrote:
I am pretty sure stuff like that can be done, though I am personally not
concerned with separating stuff to *that* high a degree. If I got my
data separate from my OS, I am generally happy.

I like to scope things out with various scenarios in mind - something
that is a big part of backup theory (i.e. preparation for anticipated
disasters and redeployment scenarios)
- scope out hardware-specifics in case hardware fails/changes
- scope out all possible malware
- exclude incoming material
- exclude infectable material
- scope in/out user-unique data and settings
- scope for application specificity and version independence

Doing so makes it easier to do abstract things like "I want to sell my
PC" (remove user-specific data), "I have to 'just' format ans
re-install because I may be infected" (safe data backups), "my
motherboard died and I have to rebuild", "I have a new version of a
crucial app; will my data still work?" etc.

In addition to this, a small core data set can more easily be automated
for more frequent backup with a greater depth of retained instances.
This is the main reason to exclude bulky pictures and off-the-peg
bulky music and movies from "data backups".
I have to say this though, I've found linux to be far more flexible
in what you can do with the file system, things I can only dream of in
XP (though *some* functionality can be added in XP via 3rd party stuff).

If "all non-trivial systems have bugs", then the only way to keep core
subsystems bug-free is to keep them trivial. For this reason, I
generally prefer a trivial file system to something more baroque and
feature-rich such as NTFS. To this day, I prefer to avoid NTFS even
though it's far more efficient where large file size and large numbers
of files per directory are concerned.

A file system is more than just the structures and basic operating
code, just as a user+PC is more than just the user and the PC. Just
as the user+PC is extended via available tech support, so a file
system must be extended to include careful repair and data recovery
tools. This is where NTFS currently still sits firmly on its ass.
Finally someone who doesn't worship the ground QuickBooks walks on. =)

It's supposed to be "easy to use" for those who are not steeped in
bookkeeping lore. I used it for a year, and it was a complete
disaster; never again, etc.

Firstly, QuickBooks drank MS's IE4-era Kool-Aid, and is massively
dependent on IE to provide its UI. So anything that re-versions,
updates, attacks, exploits etc. IE may also clobber QuickBooks.

Hmm... unavoidable richly-exploitable edge-facing IE, core
private-data-handling accounting package. Pick ONE.

Secondly, despite assurances to the contrary, QuickBooks is hard-wired
to 3-month VAT accounting periods. I need 2-monthly VAT reporting,
and QuickBooks simply doesn't do that - so everything I did in
QuickBooks, I had to re-duplicate in Excel to do VAT.

So effectively, QuickBooks did nothing to reduce my dependence on
Excel, nor reduce the data input work I had to do in Excel.

Thirdly, it is nearly impossible to get data out of QuickBooks in any
sort of readily-usable form. Like most dedicated accounting apps, it
locks its data into its own proprietary format... God forbid you
should be empowered to walk away to a competing product, and take your
data with you. "All your data belong to us!", heh.

Fourthly, many pro-IT types will tell you they loathe QuickBooks
because it can only be run with admin account rights, making it VERY
risky to deploy in large managed environments. This is part of the
reason we have to put up with UAC in Vista right now.

So... it should be really easy to find folks who dislike QuickBooks
:)
It was Office 2003; while I like the UI of 2007, the UI alone does not
justify the expense for me to get it.

Yep. I just reminded myself why I still do admin in Word and Excel; I
re-visited Access 2003 and fled screaming, then dipped into Open
Office Base (their database app) and fled screaming again.

Database is still tough - and that's from someone who grew up
programming in database-orientated PICK in the DOS 3.3 era.
I do it for internal in-house stuff so spending weeks on a well designed UI
would be a waste usually. =) The only exception to that rule is if it
goes to people in production or so and they have to use it on a daily
basis; then it does get a UI to precisely suit their needs.

What I often do, is:
- find a CLI-driven engine
- wrap it in a parameter-feeding batch file
- drive the batch file from UI shortcuts containing parameters

Parameters I don't need to change, I hard-code in the .bat
Parameters that are site-specific, I define in a Set block in the .bat
Parameters that are context-sensitive, I pass from the shortcut
On-the-fly parameters can be passed via other integrations

For example, I may write generic VirusCheck.bat and VirusKill.bat
wrappers for some CLI-driven av scanner, then integrate this into the
GUI via File Types, File Folder, SendTo and QuickLaunch icons to scan
diskettes and a "suspect" subtree, etc.

Then if I change the av engine, all I have to do is change the two
batch file wrappers; I don't have to unpick all the UI points.
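The same wrapper pattern translates to a *NIX shell script something like
this - a rough sketch only, where "avscan" and its options are placeholders
standing in for a real CLI engine:

#!/bin/sh
# viruscheck.sh - thin wrapper around a hypothetical CLI scanner "avscan"
SCAN_OPTS="--recursive --report-only"    # site-specific defaults live here
TARGET="${1:-$HOME/suspect}"             # context-sensitive target passed as $1
avscan $SCAN_OPTS "$TARGET"              # change engines by editing this one line

Shortcuts, file-manager actions or cron jobs then call viruscheck.sh with a
path; swapping engines later means editing only the wrapper.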

In this way, I can serialize a number of on-demand scanners into a
large hammer that I can easily swing at incoming material concentrated
in a "suspect" subtree, or diskettes, or arbitrary files etc.

MS hasn't begun to think in this direction; for example, there's no
namespace shell folder matching the "suspect" concept as yet.
That I honestly don't know off the top of my head. I usually just access it
via the easily accessible system menu. There probably *is* some shortcut
for it, I just don't know it.

Where's that "easily accessible system menu"?
Can it be reached if the mouse has failed?
Can it be reached if the system is "busy"?
Is the access consistent across all distros and OS versions?
Depends on your needs. You can either store them locally in the directory
you need them or store them in /usr/bin if they are to be globally
accessible from anywhere and by any user.

What batch language?
Where can I see the batch file syntax?
Can I get syntax help on the fly, e.g. "CommandName -?"?
How to invoke them?
How to have them accessible via "the Path" <- old DOS concept?
Generally...you don't unless you need to mount it. No switching between
drives like you do in windows. The fact that my home and root directories
are on separate partitions, where in windows each would have its own drive
letter, is not even visible from the file system here.

Once it's set up. But on first entry to a dual-boot installation, any
volumes in an extended partition that one might intend to use for
shared data access are simply not visible at all.

And once again, to fix this obvious Day One need, one has to grope
around with guessware CLI syntax, assuming you can find somewhere to
type in such commands. Remember, we are talking about Day One here,
i.e. things we need to do right away, not after two weeks of learning.
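For the record, the CLI incantation in question is usually just a mount; a
minimal sketch, assuming the shared-data volume happens to be /dev/sda5 (a
made-up example device) formatted as FAT32:

sudo mkdir -p /media/data                 # create a mount point
sudo mount -t vfat /dev/sda5 /media/data  # attach the volume to it
# to make it stick across reboots, add a matching line to /etc/fstab:
# /dev/sda5  /media/data  vfat  defaults  0  0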
You can map *any* directory *anywhere* to virtually *anything* and what it
is actually mapped to is invisible when accessing it.

Good and bad sides to that, but the immediate problem is that one
doesn't know how to do this. Functionality inaccessible is
functionality denied, and functionality more accessible to attackers
is a menace (hence my problem with "let's build a network client OS,
treat the Internet as just another network, drop this into
consumerland as the new Windows, and see what happens")
In my case...

/ = sdc1
/home = sdc3
/media/windows = /sda1 (WinXP drive)
/media/data = /sdb1 (2nd NTFS data drive, I may convert this to ext3 and
make it a linux drive soon)
/media/cdrom0 = My dvd drive
/cdrom = Same as /media/cdrom0

O..K.. :)
For example, I have all my favorite DVDs saved as images on the hard drive
so the original disks don't get damaged. I could then go ahead and create a
directory for each DVD by name and then map the corresponding image to the
directory. Then when I want to watch it, just point my media player at the
directory...and it plays!
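That mapping is typically a loopback mount; a minimal sketch, with the image
path and target directory as made-up examples:

mkdir -p ~/dvd/MyMovie
sudo mount -o loop ~/images/MyMovie.iso ~/dvd/MyMovie   # map the image onto the directory
# ...watch it, then release the mapping:
sudo umount ~/dvd/MyMovie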

That's cool; there are some things (Virtual CD, Daemon Tools) that do
the same for Windows, but mileage can be rocky when changing OS
versions, dealing with "protected" disks etc.
Files generally still have extensions, just like under windows. The only things
that generally don't have an extension are things that are executable, be
they binaries or scripts.

Great. So the most dangerous file types are the ones that can't be
identified, and can pass themselves off as anything else. This is as
bad as Windows is becoming, once you combine:
- .pif, .exe etc. can set their own icons
- file name extensions are hidden by duhfault
- raw code in .pif is run as if it were in an .exe
- .pif extension is never shown even if extensions set to show

Eww... I'd hoped Linux would be better there, but it seems as if it
never had the safety that Windows is steadily losing.
But other than that..images, source files, whatever else you can come up
with...have the same extensions as they do under windows (excluding
application-specific stuff of course).

OK. So if a file is called blahblah.TXT, can I trust it not to run as
raw code if I "open" or otherwise evoke it?
Thing is, malware can do extremely little damage to my linux system. It can't
access anything actually important. It could mess with stuff in my /home
directory...that's about it...and there isn't much useful there for it to
do.

IMO, the notion that user account rights are effective protection
against malware is absurd, given that even the most limited user
account will allow editing of that user's data - and thus any malware
with the same rights can steal or destroy that data.

If you don't care about user data, and just want to avoid the support
hassle of getting the OS running again, then account rights are fine
for that - but you're serving your own agenda, not ours.
I'd have to go through quite some trouble to give malware root access and
actually let it do damage.

There are two levels to that:
- how the system is designed to work
- how the system actually works

Through DOS, Win3.yuk, Win95 and into Win98, almost all (if not all)
attacks were at the first level. But in the XP era, the second level
has become so important that we no longer treat the system in a purely
deterministic manner.

For example, in the "old days", I'd have no safety problem with a
thumbnailer delving into content to create a small representation of
the file, or Windows Explorer doing this automatically when listing
files, or an indexer reading files in the background while I work.

This is no longer the case - any unsolicited handling of complex
material is a risk to be avoided.

Right now, Linux has yet to attain the combination of complexity and
exposure that has put Windows to this particular sword - and I fear
there's a lot of "it can't happen here" blindness at work.

By design, content can't escalate from XP/Vista account rights to admin
or system rights. By exploit, this can happen at any time, all over
the world, within a day, if a suitable malware is mass-released.

That suddenness and resource-swamping scale is why I've taken malware
as seriously as I have, since the last century. It's impossible to
meet SLA promises if 20 clients need 6 hours work from one tech within
the same 24 hours - and malware can demand that, out of the blue.
Honestly I am not sure on the exact release dates of Edgy and Dapper. Dapper
is version 6.06 and Edgy is the newer 6.10 version.

I think I was playing in the 5.x era of Ubuntu.
I do know though that they have a new release coming up April
19th that is in heavy beta testing right now. And unlike Vista, when it
comes out, it won't break 80% of my applications. =)

Heh... I remember reading wads of verbiage on whether ubuntu was a true
enough Debian for Debian installation techniques to be expected to
work. There are masses of per-distro details, when it comes to
predicting whether something "written for Linux" will work in your
Linux, and that's without considering compiling from source.

So I wouldn't tilt at that particular windmill, if I were you ;-)
Well that and coupled with the fact that infecting a linux system with
malware, while I won't call it impossible, is by default already
significantly more difficult than under windows.

I don't think that's proven until both have been tested equally.

It's like "Communism *will* work, it's just that it can't work unless
the whole world is Communist so we don't have to compete with more
efficient economies that wave shiny things at our citizens", but in
reverse, i.e. "Linux can be assumed to be malware-free only as long as
it remains a niche market".
Honestly, developers need to simply just ditch DX and use OpenGL and be done
with it. There isn't a thing DX can do that OpenGL can't.

OpenGL + OpenAL and games can easily be developed for *any* platform...

I am doing that precise thing with my own applications...it works very
nicely.

I'm not sure how interchangeable they are, in terms of what DirectX
was designed to be, i.e. a highly-efficient gaming platform. For
starters, DirectX spans graphics, sound, LAN, HIDs whereas OpenGL is
purely a graphics API.

But that's another debate... :)


--------------- ---- --- -- - - - -
Saws are too hard to use.
Be easier to use!
 

cquirke (MVP Windows shell/user)

On 18 Mar 2007 05:48:33 -0700, "(e-mail address removed)"
On Mar 11, 7:25 pm, "PTravel" <[email protected]> wrote:
Compile Gentoo with Beryl 2.0 and you get a fast slick 3D accelerated
desktop even using onboard Intel 64Mb video.

"Compile"? I expect the factory to do that ;-)
Now try doing that with Vista and you will have two graphic solutions
- Vista Basic and Classic.

Intel's integrated graphics do Glass from 945G upwards, with 965G
preferred. AFAIK, the 915G is the last one that can't Glass/Aero.

OTOH, if you're AGP-era 8xxG, then sorry, you're "old" :-(

Let's say it again: Vista is a NEW OS for NEW systems.


--------------- ---- --- -- - - - -
Saws are too hard to use.
Be easier to use!
 

cquirke (MVP Windows shell/user)

On 19 Mar 2007 02:19:34 -0700, "(e-mail address removed)"
How about the "magical" screen refresh rate change upon logon (anyone
got a fix so I can 'lock' the refresh rate!!)

Unlike XP, Vista may support per-user refresh rates - which is a
real-world need when you mix pre- and post-presbyopic generations on
the same PC :)

In which case, the per-user setting will be asserted after logon, as
before logon, the user account has not yet been selected.
Windows explorer yet again without dual pane.

Actually, I find Vista is better here. I'm using dual-pane Windows
Explorer in Vista (i.e. folders tree in left, folder contents in
right) and I didn't have to do the usual "set Explorer not Open as
default action for Folders" thing to do so.

That in turn means that graft-ons to native Explorer functionality
(e.g. exploring into archives) now work seamlessly, instead of
opening loose 1-pane windows as it does in XP.

For me, this is a significant improvement over XP.
What used to be done with a few mouse clicks now takes several (i.e.
personalizing your desktop).

I haven't noticed much difference there. Some important things are
way better, e.g. if you want to know or control whether you are
exposing file shares to the network, it's far easier to do now.
Even for basic file op's, vista is paranoid. Are you sure you want to
do this, son??? ;)

Well, I think real-world experience (e.g. 95% of spam is sent through
botnets) is evidence that such paranoia is appropriate.

Easy to say a few months after a major Windows revision, heh heh
Even with onboard low end Intel 64Mb cards, I get full hardware
acceleration effects:

http://www.beryl-project.org/features.php

Now try that with Vista!

I did, and the only reason I don't have Aero and Glass in the new PCs
I build is because I don't consider the benefits of Home Premium over
Home Basic to be worth the money.

When it comes to "what matters", I see Glass and Aero as so far down
the list that they border on irrelevance. I want an OS that is
effective to work through, not just look at :)




--------------- ---- --- -- - - - -
Saws are too hard to use.
Be easier to use!
 

Stephan Rose

cquirke said:
Hmm... generally I prefer to avoid per-file content digging when
listing files, for two reasons. Firstly, because I want to eyeball
as many files as I can without scrolling. Secondly, because groping
content exposes potentially exploitable surfaces to material that I
may want to avoid "opening" in any way.

Well it only reads the first few kb or so of the text file to display its
contents, it doesn't actually execute any of the content in the process. I
don't see that causing a problem even if the content was bad.
I admit I'm coming at this with expectations preconceived from the DOS
era, i.e. that there will be a non-spoofable predictor of file risk
level that is indivisibly as visible as the file name itself.

This was true in the DOS era ("don't run .exe, .com or .bat"), had
become less true in modern Windows due to the erosion of type
discipline and the data/code distinction, but may never have been true
in *NIX. How do *NIXers know in advance whether an arbitrary file
poses the low risk of "viewing data" vs. the high risk of "running code"?

Personally, both in the windows world and *nix world I stick to one simple
rule: Download and use files from reputable sources only. Period.

That has enabled me, even on windows systems, to run without *any* kind of
anti-virus or spyware protection for years.

Occasionally I will actually bother to scan the system. Worst-case scenario
when running an anti virus scan after 1 year, it *may* find an attachment
in an e-mail sitting in the "Deleted Items" waiting to be trashed that
slipped past my e-mail server's scanner...

I find that simple common sense is still the best protection regardless of
the operating system.
Ewww... wrong answer ;-)


R/test/text/ ?

Ya, text. Sorry =)
You see, this is where a battle-hardened "urban" Windows user may have
stronger street-smarts than a "village" MacOS or *NIX user :)

For us, "trust it, it's usually safe" is just soooo much the "wrong
answer", because we're used to content routinely attempting to attack
us. A file from "outside" is always treated as malware unless proven
otherwise, and I don't consider av scanning as "proof". A file from
"inside" that is infectable (code, macro-enabled "data") is also to be
considered suspect in many contexts, such as inclusion within backups
that are to be resorted to after unexplained data loss.

So I really do want to know whether a file is "data" (safe to view) or
"code" (higher risk of running scripts etc.) as soon as I see it.

Now you may say "oh, don't worry, no-one bothers to attack Linux" but
that may change if Linux gains market share. If Linux never gains
market share, many of us would see no reason to bother with it (it may
suit our personal needs, but isn't relevant if we are to support or
supply a wider market). If we are interested because we anticipate
Linux gaining significant market share, then Linux has to be able to
weather the malware attention this growth will attract.

It's asserted that Linux gets less malware attention because it is
fundamentally harder to attack (either due to tribalism breaking up
the target volume, or inherent OS strengths) and it's been
counter-asserted that Linux gets less malware attention purely because
it is too small a target to bother with.

We don't have to predict an answer to that debate, but we have to
acknowledge the answer will remain unknown (no matter how vigorously
asserted) until Linux gains enough share to be put to the test.

Personally I figure the answer is in between both extremes. I would never
say it is impossible to infect a linux system with malware, but it is
substantially harder by design.
They will have to, if they are just trying out a Linux via dual-boot,
and the Linux-imposed boot loader auto-boots Linux after a brief delay
during which they have to scramble to choose non-default Windows
instead. It's generally the first change made after setting up.


In the context of advocating Linux to Windows users, dual-boot is
likely to be the norm, unless folks try out Linux as a "live" optical
disk boot. I do like "live" Linux, but it would be unfair to
evaluate performance on that basis, so once the basic usability passes
muster, the next step is to set up a dual-booted HD installation,
unless there's a spare PC lying around.


It can be argued that Linux itself is simply not for them, either.

Dunno, I tested it with my mom who is not a very technical person. I simply
replaced her XP laptop with my ubuntu laptop for a day...absolutely *zero*
problems. I wasn't even around to answer any questions, etc.

But she still isn't the kind of person who could, or would want to, mess
around with the boot configuration in any way.
If the installation process creates the bootloader and thus imposes
unwanted bootloader settings, then it must provide a UI, and
preferably an easy GUI, for fixing these.

We bash MS for anticompetitive behavior if a tortuous or
counter-intuitive UI has to be followed to undo a setting they want to
impose and we want to change. Same should apply to Linux.

I can see where you are coming from for people wanting to try out linux as
dual-boot. In this particular scenario, I actually do agree with you. Then
again, the same could be said for windows too...last I checked its default
multi-boot compatibility is "Wipe out the dual boot and install itself
only" IIRC. =)

--
Stephan
2003 Yamaha R6

There's never a day I find myself remembering you,
because there's never been a time I forgot you.
 

Nina DiBoy

cquirke said:
On 18 Mar 2007 05:48:33 -0700, "(e-mail address removed)"


"Compile"? I expect the factory to do that ;-)


Intel's integrated graphics do Glass from 945G upwards, with 965G
preferred. AFAIK, the 915G is the last one that can't Glass/Aero.

Good information, thanks for that! Me, my boss, and my boss's boss all
have tablets with the 915G Intel graphics. This will probably motivate
them to have new ones purchased for us when it is finally upgrade time! :)
OTOH, if you're AGP-era 8xxG, then sorry, you're "old" :-(

Let's say it again: Vista is a NEW OS for NEW systems.


Saws are too hard to use.
Be easier to use!

--
Priceless quotes in m.p.w.vista.general group:
http://protectfreedom.tripod.com/kick.html

Most recent idiotic quote added to KICK (Klassic Idiotic Caption Kooks):
"DRM is not added to anything in Vista."

"Good poets borrow; great poets steal."
- T. S. Eliot
 

Stephan Rose

cquirke said:
I like to scope things out with various scenarios in mind - something
that is a big part of backup theory (i.e. preparation for anticipated
disasters and redeployment scenarios)
- scope out hardware-specifics in case hardware fails/changes
- scope out all possible malware
- exclude incoming material
- exclude infectable material
- scope in/out user-unique data and settings
- scope for application specificity and version independence

Doing so makes it easier to do abstract things like "I want to sell my
PC" (remove user-specific data), "I have to 'just' format ans
re-install because I may be infected" (safe data backups), "my
motherboard died and I have to rebuild", "I have a new version of a
crucial app; will my data still work?" etc.

In addition to this, a small core data set can more easily be automated
for more frequent backup with a greater depth of retained instances.
This is the main reason to exclude bulky pictures and off-the-peg
bulky music and movies from "data backups".

Sure but stuff like that I just separate via directories. No need to create
partitions for those. =)

Like for instance, I have 2 directories, "Software" and "Hardware"
containing my C++ projects and CAD Projects respectively. Both are already
automatically backed up to a remote server via source control but are also
easily backed up manually since they're the only two directories I really care
about.
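A manual backup of just those two trees is typically a one-liner; a sketch,
with the destination host and path as placeholders:

rsync -av --delete ~/Software ~/Hardware backupuser@backuphost:/srv/backups/
# -a preserves permissions and times, -v is verbose, --delete mirrors removals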

Fourthly, many pro-IT types will tell you they loathe QuickBooks
because it can only be run with admin account rights, making it VERY
risky to deploy in large managed environments. This is part of the
reason we have to put up with UAC in Vista right now.

So... it should be really easy to find folks who dislike QuickBooks
:)

Well so far lots of people in this NG have popped the question "Will
QuickBooks run in Linux??"...the only reason I even really think of
quickbooks is because I've read it so many times in this NG already. =)
Yep. I just reminded myself why I still do admin in Word and Excel; I
re-visited Access 2003 and fled screaming, then dipped into Open
Office Base (their database app) and fled screaming again.

Database is still tough - and that's from someone who grew up
programming in database-orientated PICK in the DOS 3.3 era.

I never liked Access and in that regard, without even ever having opened it,
I doubt Open Office Database is any better. I've seen some *horrible* stuff
done in Access...still gives me nightmares.
Where's that "easily accessible system menu"?

At least if using ubuntu, it's the menu next to "Applications" "Places"
staring you in the face on the top left ;)
Can it be reached if the mouse has failed?

Alt + F1 =)
Can it be reached if the system is "busy"?

The entire system can't really go "busy" here, the multi tasking is a little
different than under windows. An application can't hog the entire CPU and
make the OS unresponsive like it can in windows.
Is the access consistent across all distros and OS versions?

Across distros? No. There is a major difference between Ubuntu and Kubuntu
alone, not to mention other distros. Kubuntu is a lot more windows-like
(even has a control panel!) than Ubuntu.
What batch language?

There are multiple options, I think. I'd have to look it up myself.
Where can I see the batch file syntax?

As the syntax is going to depend on what you use, the answer will depend on
that.

I personally do very little scripting in general, I generally implement
everything in C/C++ and just create an executable to do the job. =)

Therefore that is really something difficult for me to answer as I've got
little experience with it.
Can I get syntax help on the fly, e.g. "CommandName -?"?

From a terminal:
man CommandName
or CommandName --help (usually)
How to invoke them?

Type in the name of the script =)
How to have them accessible via "the Path" <- old DOS concept?

I am sure there is a path-like thing somewhere that defines the default
search paths when you go try to run a file like the PATH= in DOS. Seeing as
I've not yet had the need to modify it though...I haven't really bothered
to look into it.
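For reference, the *NIX counterpart is the PATH environment variable; a
minimal sketch, assuming a bash-style shell:

echo "$PATH"                              # show the current search path
export PATH="$HOME/bin:$PATH"             # add a personal script directory for this session
echo 'export PATH="$HOME/bin:$PATH"' >> ~/.bashrc   # make it permanent for future logins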
Once it's set up. But on first entry to a dual-boot installation, any
volumes in an extended partition that one might intend to use for
shared data access are simply not visible at all.

And once again, to fix this obvious Day One need, one has to grope
around with guessware CLI syntax, assuming you can find somewhere to
type in such commands. Remember, we are talking about Day One here,
i.e. things we need to do right away, not after two weeks of learning.

No need to use the CLI to set up your partitions, not with Ubuntu anyway. Its
installer includes a GUI-based partition manager that can do everything
needed with mouse clicks. =)
Good and bad sides to that, but the immediate problem is that one
doesn't know how to do this. Functionality inaccessible is
functionality denied, and functionality more accessible to attackers
is a menace (hence my problem with "let's build a network client OS,
treat the Internet as just another network, drop this into
consumerland as the new Windows, and see what happens")


O..K.. :)

A little clarification on sda1, sdb1, sdc1/3
sda1 = 1st drive (320gig SATAII)
sdb1 = 2nd drive (300gig SATA)
sdc1 = 1st Partition on 3rd drive, 10 gigs
sdc3 = 2nd Partition on 3rd drive, remaining space minus 6 gigs for the swap
partition
That's cool; there are some things (Virtual CD, Daemon Tools) that do
the same for Windows, but mileage can be rocky when changing OS
versions, dealing with "protected" disks etc.

See that's one thing I don't have to worry about here. =) This stuff works
the same way now as it did several versions ago and it will still work the
same way several versions later.
Great. So the most dangerous file types are the ones that can't be
identified, and can pass themselves off as anything else. This is as
bad as Windows is becoming, once you combine:
- .pif, .exe etc. can set their own icons
- file name extensions are hidden by duhfault
- raw code in .pif is run as if it were in an .exe
- .pif extension is never shown even if extensions set to show

Eww... I'd hoped Linux would be better there, but it seems as if it
never had the safety that Windows is steadily losing.

File extensions are meaningless though. Anyone can take *any* file and give
it *any* extension...so what good does that do? To me it's nothing more
than part of the file's name.

That said, I did forget about this when I first wrote the response. You
actually can differentiate executables even on the commandline level.

When I get a directory listing, it is displayed the following way:

- Regular file in white
- Directory in blue
- Executable in green

So any green file, regardless of extension, is marked as executable.
OK. So if a file is called blahblah.TXT, can I trust it not to run as
raw code if I "open" or otherwise invoke it?

Yes you can. If say you downloaded that file from the net and saved it on
your desktop. It will be saved as a normal file. It will *not* be
executable under any circumstances. Even if you changed its extension to
EXE...that extension is meaningless around here.

To execute the file, you'd have to manually go to the file properties
(accessible via the right click menu just as in windows) and *mark* it as
executable yourself and then execute it.
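On the command line the same marking step is a chmod; a minimal sketch (the
file name is just an example):

ls -l downloaded-script        # fresh download: -rw-r--r--, no execute bit
chmod +x downloaded-script     # deliberately grant the execute bit
./downloaded-script            # only now will the shell run it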

This only applies to manually downloaded files though. It does not apply to
applications installed via a package (think of it similar to the MSI
installer under windows) as in those cases the package manager will do
everything necessary and you just click the icon once the app is installed.
=)

Of course making sure the package contents are not bad for your system is up
to *you* as the user to decide. There is only so much any OS can do.
IMO, the notion that user account rights are effective protection
against malware is absurd, given that even the most limited user
account will allow editing of that user's data - and thus any malware
with the same rights can steal or destroy that data.

Sure, malware could technically go and wipe out my entire home directory.
There's not much in place preventing it from doing that. But where do you
draw the line? It's got to be drawn somewhere, else you end up with UAC and
Vista.

At the very minimum though, the OS itself and all applications remain
unaffected and the system remains in a usable state.

Reasons like this though, on any OS, are why I keep all my important data
backed up on servers located in a physically different location not
anywhere near me.

Anyone relying on their data being safe on their home PC is asking for
trouble, malware or not. Hard disk failures do happen...I've had plenty.
There are two levels to that:
- how the system is designed to work
- how the system actually works

Through DOS, Win3.yuk, Win95 and into Win98, almost all (if not all)
attacks were at the first level. But in the XP era, the second level
has become so important that we no longer treat the system in a purely
deterministic manner.

For example, in the "old days", I'd have no safety problem with a
thumbnailer delving into content to create a small representation of
the file, or Windows Explorer doing this automatically when listing
files, or an indexer reading files in the background while I work.

This is no longer the case - any unsolicited handling of complex
material is a risk to be avoided.

Right now, Linux has yet to attain the combination of complexity and
exposure that has put Windows to this particular sword - and I fear
there's a lot of "it can't happen here" blindness at work.

By design, content can't escalate from XP/Vista account rights to admin
or system rights. By exploit, this can happen at any time, all over
the world, within a day, if a suitable malware is mass-released.

That suddenness and resource-swamping scale is why I've taken malware
as seriously as I have, since the last century. It's impossible to
meet SLA promises if 20 clients need 6 hours work from one tech within
the same 24 hours - and malware can demand that, out of the blue.


I think I was playing in the 5.x era of Ubuntu.

Yup definitely sounds like it. Give 6.10 a try sometime if you want, it's
quite nice. 7.04 is actually being released on the 19th of next month. =)
Heh... I remember reading wads of verbiage on whether ubuntu was a true
enough Debian for Debian installation techniques to be expected to
work. There are masses of per-distro details, when it comes to
predicting whether something "written for Linux" will work in your
Linux, and that's without considering compiling from source.

So I wouldn't tilt at that particular windmill, if I were you ;-)

I can actually accept some incompatibilities between vastly different
distributions. The main problem being that they don't necessarily
run the same kernel version, so if you have an app that relies on
functionality provided by a certain kernel version and up...it won't run on
a distro with a kernel that is a lower version.

Same concept with shared libraries, etc.

The same problems exist in Windows too though...can't run a WinXP app on
Win95...can't run a MFC 7 app in a machine only having MFC 6 installed if
it is linked dynamically.

Reasons like that are why I *always* link statically and never dynamically. I don't
care if my EXE is 1 meg larger, those functions need to be loaded into
memory anyway and I then do NOT need to worry about what version of crap
the user has installed. Stuff like that's been a support nightmare I
ditched many many years ago by static linking.
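The build-time difference is a single linker switch; a rough GCC sketch with
placeholder file names:

g++ -o myapp main.cpp            # default: dynamically linked against shared libraries
g++ -static -o myapp main.cpp    # static: the library code is baked into the executable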

In my opinion, shared libraries, in any scenario and on any OS...while a
novel idea...are way overused, abused, and it's just way too much of a
mess. The small savings gained by dynamically linking in executable size is
trivial with today's hard drive sizes and memory availability.
I don't think that's proven until both have been tested equally.

It's like "Communism *will* work, it's just that it can't work unless
the whole world is Communist so we don't have to compete with more
efficient economies that wave shiny things at our citizens", but in
reverse, i.e. "Linux can be assumed to be malware-free only as long as
it remains a niche market".


I'm not sure how interchangeable they are, in terms of what DirectX
was designed to be, i.e. a highly-efficient gaming platform. For
starters, DirectX spans graphics, sound, LAN, HIDs whereas OpenGL is
purely a graphics API.

Well that is why I threw in OpenAL for sound. =)

You are right though, many people did switch from OpenGL to DX, once it got
to a usable state anyway, and ditched that horrid immediate mode...since
its API contains everything.

However, I did have to laugh when CreativeLabs advised developers to use
OpenAL for sound since their EAX is broken with DirectX under Vista. So MS
themselves have seen to it that they created the very problem they intended
to solve with DX.

I think in the meantime EAX does work again...but only with the X-FI card I
think.

At which point in time..I *might* get such a card in maybe...10 years? I
mean seriously...my Audigy 2 can do 7.1 surround sound...I see no need to
upgrade it anytime soon for some odd features I will never see use of.

So really, if any developer wants to use EAX...they're essentially stuck to
OpenAL now unless they want to only support the X-FI.

I don't see LAN as a problem either, there are dozens of cross-platform
libraries available that work extremely well that don't limit one to
DirectX in that department.

HID I don't know off the top of my head, I'd have to ask google.

I am using wxWidgets right now which is cross platform for everything
ranging from File IO, UI, LAN, OpenGL Integrated into it, and a bunch of
other features. It works extremely well, and my app will run on Windows,
Linux and Mac, requiring nothing more than a simple compile on the target
system.

I wish more developers would use stuff like that because it would really
diminish the app compatibility problems, increase the developers' potential
market and ease the ability for people to try out something different from
windows for once.

Of course I am sure that MS doesn't particularly like that idea. =)

--
Stephan
2003 Yamaha R6

There's never a day I find myself remembering you,
because there's never been a time I forgot you.
 

cquirke (MVP Windows shell/user)

cquirke (MVP Windows shell/user) wrote:
Well it only reads the first few kb or so of the text file to display its
contents, it doesn't actually execute any of the content in the process. I
don't see that causing a problem even if the content was bad.

That would be true if you could stand on a particular abstraction
layer, i.e. that code does only what it was coded to do.

However, code defects mean that what code actually does (or can be
made to do) can be insanely different.

So one can no longer assume non-trivial code will be able to view text
without running bits of it as code. Is the code trivial enough to be
assumed safe? Maybe, in an age of ASCII, but maybe not in an age of
Unicode, RTF, HTML, sundry other "rich" formats, etc.

According to a rather interesting 6-month report from Symantec, the %
of malware using code exploits is around 20% or so; the rest use
Social Engineering or by-design functionality.

This article...

http://www.internetnews.com/security/article.php/3667201

...links to the reports themselves, in .PDF form
Personally, both in the windows world and *nix world I stick to one simple
rule: Download and use files from reputable sources only. Period.

No, that's hopeless.

Firstly, the Internet is all about interactions with strangers, so
"reputable" is often meaningless. Is Sony reputable?

Secondly, we're moving away from entering URLs into browsers, to
finding stuff via search engines, so we know even less about where
we're going than we would do otherwise.

Thirdly, reputable places can be spoofed or infected.

So there is a very clear need for at least two broad levels of content
interaction; one that is safe irrespective of source, because it is
structurally limited to be so, and one that confers sufficient power
to the material that one has to have compelling reason to trust it.

Without this, safe interaction between strangers becomes impossible,
which means that much of the Internet's value is lost.

Bearing in mind the implicit "downloading of files" that applies when
browsing the web or reading email "message text"...
That has enabled me, even on windows systems, to run without *any* kind of
anti-virus or spyware protection for years.

That worked for me through the era when resident av was too "costly"
to tolerate on systems of the time. It would prolly work for me
still, but fortunately I no longer have to put that to the test.

A core reason *why* it worked as long as it did, was because I could
tell what level of risk material would pose to the system. I could
kill active content and read any web page safely, and I could read any
email safely because the message "text" was processed purely as text.

I could see what type of files were attached to email messages, and if
I didn't like the type, I'd avoid it, no matter whether it was "from
someone I know" (a useless metric) or whether the av claimed it was
"not a virus" or not.

There's no way I'd want to work in a system where anything could be
raw code (and run as such when "opened"), or when the rich contents of
files had already been groped by the OS by the time I first saw the
file names, before indicating any intention to "open" them.
Occasionally I will actually bother to scan the system. Worst-case scenario
when running an anti virus scan after 1 year, it *may* find an attachment
in an e-mail sitting in the "Deleted Items" waiting to be trashed that
slipped past my e-mail server's scanner...

I never informally scan "the system". Specifically, I never attempt
to detect malware in active form within the infected environment.

I may do on-demand scans of subtrees in which incoming material is
concentrated, before any of it has been "opened". That makes sense...
but to scan System32 in case something's running there? Nope; if I
suspect that, then I do that "from orbit" i.e. a Bart CDR boot.
I find that simple common sense is still the best protection regardless of
the operating system.

Yep - unfortunately, in modern "rich" OSs and edge-facing apps, I'm
not the only player anymore. I dunno about Linux, but I don't trust
Windows not to wave its networking services at the Internet, and I
don't trust code that gropes material automatically.

It's for that reason I run a resident av and firewall, though I never
assume these will catch things; IOW I'll never click something
dangerous on the ASSumption the av will catch any baddie.
Personally I figure the answer is in between both extremes. I would never
say it is impossible to infect a linux system with malware, but it is
substantially harder by design.

It seems to me that a certain complexity of code will cause a certain
problem rate, including the opportunity for malware exploit. It
doesn't matter much which OS it is, especially if they're all made
from the same raw materials, such as C.

That Symantec report is interesting; it points to Oracle as the most
exploitable database, and Sun/Solaris as the OS with the longest
time-to-patch. For the first half of 2006, Red Hat was up there with
MS as the fastest to patch, then it fell back in the second half of
2006. Safari was the slowest browser to patch, even though they had
only one defect to fix; then again, 3rd-party code was involved, so
that may have slowed them down.

77% of browser exploit attempts were directed at IE, so MS gets plenty
of opportunity to build patching efficiency ;-)
Dunno, I tested it with my mom who is not a very technical person. I simply
replaced her XP laptop with my ubuntu laptop for a day...absolutely *zero*
problems. I wasn't even around to answer any questions, etc.

On a completely different PC, that's OK, but that's a luxury many of
us don't have. The rest of us would want the new OS to co-exist with
the old one, and some Linux distros are quite bad at that.

One thing I do agree on, is that "basic" users who always dive into a
few known apps from the desktop icons can do quite well on a different
OS, as long as the apps are similar. Hence the strategy of choosing
apps available across multiple platforms, as part of a more general
strategy of minimising cross-dependencies.
I can see where you are coming from for people wanting to try out linux as
dual-boot. In this particular scenario, I actually do agree with you. Then
again, the same could be said for windows too...last I checked its default
multi-boot compatibility is "Wipe out the dual boot and install itself
only" IIRC. =)

I haven't seen that, though you're right in that a Windows
installation will generally re-assert a standard MBR, thus killing any
boot manager located there. But it won't kill other partitions, as
the Ubuntu I tried would have done by duhfault.


To the tune of "Sounds of Silence"...
--------------- ----- ---- --- -- - - -
Hello DOS mode my old friend
I've come to hack with you again
 

cquirke (MVP Windows shell/user)

cquirke (MVP Windows shell/user) wrote:
Sure but stuff like that I just separate via directories. No need to create
partitions for those. =)

Oh, I'm not *that* partition-happy ;-)

I use different partitions when I want to choose different file
systems, and also to improve performance and survivability.

So I have the OS and core code in C:, along with freq-accessed stuff
like page file, temp, etc.

Then I have a small FAT16 partition for small core data, so that it's
really easy to recover; I can peel the whole volume off as raw
sectors, re-format the volume, put the PC back in the field while I
recover data from the image.
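The *NIX equivalent of that raw peel-off would be a dd image plus a loopback
mount; a sketch only, with made-up device and path names:

sudo dd if=/dev/sda2 of=/backups/coredata.img bs=1M   # copy the volume off as raw sectors
# ...re-format and redeploy, then recover files from the image at leisure:
sudo mkdir -p /mnt/recovered
sudo mount -o loop,ro /backups/coredata.img /mnt/recovered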

Then most of the rest is in a huge FAT32, with a smaller FAT32 at the
end for cold storage and backups. Because that last volume is "far
away", it's slow, but more to the point, it's hardly ever traversed by
the heads - so it tends to survive various misfortunes.
Well so far lots of people in this NG have popped the question "Will
QuickBooks run in Linux??"...the only reason I even really think of
quickbooks is because I've read it so many times in this NG already. =)

Ah, that context... OK. I think the point being made there is that
ppl use (and are dependent on) apps, not OSs, in the sense they choose
(or are compelled to continue to use) an app, and the OS "chosen" is
thus whatever the app will run on.

This is particularly the case with accounting apps:
- costly app
- app "owns" the data, hard to switch to something else
- user generally not PC-savvy, often only know the app

In addition, local regulations break up the accounting software
market, so that a huge global shrink-wrap solution (e.g. Microsoft
Office) doesn't get traction. It would seem a great opportunity for
Open Source, but accounting apps are far too boring to write for fun,
so the only folks writing them have to be paid to do so ;-)
I never liked Access and in that regard, without even ever having opened it,
I doubt Open Office Database is any better. I've seen some *horrible* stuff
done in Access...still gives me nightmares.

Yep. After the elegance of PICK/BASIC, it's really hard to get
excited about 1-dimensional, fixed-field, hard-typed databases like
dBase and Access. To just get a parent-and-child form that lets you
flip through parents (or search them) seems really hard, even if you
do get the parent (clients) and child (invoices) tables set up and
linked to each other appropriately.

And for some reason, database UIs always feel flabby and awful, like a
house made by bolting sheets of airplane wreckage together. Keyboard
navigation always seems to stick in the wrong place, etc.
At least if using ubuntu, it's the menu next to "Applications" "Places"
staring you in the face on the top left ;)


Alt + F1 =)

Ah, reminds me of Word Perfect's arbitrary keystroke UI ;-)

Does Alt+F1 work across all Linuxen, or is it Ubuntu-specific? If I
have to PLOKTA, which keystrokes are good ones to guess first?

(PLOKTA = Press Lots Of Keys To Abort)
The entire system can't really go "busy" here, the multi tasking is a little
different than under windows. An application can't hog the entire CPU and
make the OS unresponsive like it can in windows.

Cool. In Windows, I find the mouse pointer survives almost anything,
but while it can roam around quite happily, mouse clicks don't cut
through as well as keystrokes. Also, while the mouse pointer
survives, there's often nothing left for it to click :)
Across distros? No. There is a major difference between Ubuntu and Kubuntu
alone, not to mention other distros. Kubuntu is a lot more windows-like
(even has a control panel!) than Ubuntu.

Ewww... OK.
There are multiple options, I think. I'd have to look it up myself.


As the syntax is going to depend on what you use, the answer will
depend on that.

Will I get help via InterpreterName -? or man InterpreterName?
I personally do very little scripting in general, I generally implement
everything in C/C++ and just create an executable to do the job. =)

Ahh... I wish I'd learned C/C++; I bounced off it after reading Help
files for about a year. I loved the language, but hated having to use
other ppl's code libraries via 27-sequential-params calls.

So I just use .BAT and .CMD, rather than even VBScript or VScript. My
background (Spectrum, PICK) predisposes me to BASIC, but that is such
a tribal language, with the obverse of "27 sequential parameters";
user-friendly reserved "glue" words that differ between tribes!

Pick your pain, I guess...
From a terminal:
man CommandName
or CommandName --help (usually)
Cool.
I am sure there is a path-like thing somewhere that defines the default
search paths when you go try to run a file like the PATH= in DOS. Seeing as
I've not yet had the need to modify it though...I haven't really bothered
to look into it.

OK. I'm moving away from Path dependence by using .CMD's enhanced
syntax; the ugly-but-beautiful %~dp0 allows one to specify the path of
"where you are" from the batch file's perspective, and thus can be
used to point to any engines etc. in the same place.

The nice thing is, you don't have to know where that place is, and you
don't have to switch away from where you are, in terms of where the
batch file was invoked from.
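The rough *NIX analog of that %~dp0 trick is deriving the script's own
directory from $0; a minimal sketch, assuming a POSIX-ish shell ("engine" is
a placeholder for whatever tool ships alongside the script):

#!/bin/sh
SCRIPT_DIR=$(cd "$(dirname "$0")" && pwd)   # where this script itself lives
"$SCRIPT_DIR/engine" "$@"                   # run a tool kept in the same place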
See that's one thing I don't have to worry about here. =) This stuff works
the same way now as it did several versions ago and it will still work the
same way several versions later.

That's nice. The most brilliant cross-version consistency I've seen
yet, has been Eudora; you can drop a Eudora 7.1 data set into Eudora
3.xx and it will prolly work. I've done version back-stretches from
5.1 to 3.xx and from 7.1 to 5.0.x, hence the deduced assumption.

Contrast that with PoS OE 5, which will irreversibly convert OE 4 data
to OE 5 without much of a warning or "may I?". Bleh.
File extensions are meaningless though. Anyone can take *any* file and give
it *any* extension...so what good does that do? To me it's nothing more
than part of the file's name.

That's quite unacceptably dangerous, IMO. Windows is heading that
way, with dumb-ass duhfaults like "hide file name extensions" and
"open based on content, not file name", but one can still beat it into
shape. Dunno why MS is so hell-bent on destroying one of Windows'
biggest safety benefits over the competition, but there you are.

Unless there's something else that does the same job, I'd describe
this *NIX behavior as dangerously clueless.
That said, I did forget about this when I first wrote the response. You
actually can differentiate executables even on the commandline level.
When I get a directory listing, it is displayed the following way:
- Regular file in white
- Directory in blue
- Executable in green
So any green file, regardless of extension, is marked as executable.

That's nice, if a bit risk-oblivious... green, for the most dangerous
file type? Seems like *NIX doesn't "get" the risk-riddled world yet?
Yes you can. If say you downloaded that file from the net and saved it on
your desktop. It will be saved as a normal file. It will *not* be
executable under any circumstances. Even if you changed its extension to
EXE...that extension is meaningless around here.

But if it was code in a file called blahblah.TXT, what then?
To execute the file, you'd have to manually go to the file properties
(accessible via the right click menu just as in windows) and *mark* it as
executable yourself and then execute it.
This only applies to manually downloaded files though. It does not apply to
applications installed via a package (think of it similar to the MSI
installer under windows) as in those cases the package manager will do
everything necessary and you just click the icon once the app is installed.

OK, now I get it (I think) - the process of downloading the file
forces it to have non-executable attributes (alas, hidden) applied to
it, even if it was originally a code file with code attributes.

That, I like.
Of course making sure the package contents are not bad for your system is up
to *you* as the user to decide. There is only so much any OS can do.

Sure. If I download what is clearly and visibly code, and run it,
then the OS has done its job of presenting me the info I needed to
make an informed choice. It's when the info I need is hidden from me,
or bad OS design allows content to mis-represent itself, that I get
pissed off and blame the OS, to a variable extent.
Sure, malware could technically go and wipe out my entire home directory.
There's not much in place preventing it from doing that. But where do you
draw the line? It's got to be drawn somewhere, else you end up with UAC and
Vista.

If "ending up with UAC and Vista" means not having malware trash my
data, then that would be a small price to pay, but of course it's not
as simple as that. Personally, I think the whole notion of user
accounts is a bad fit for the consumer world... it's like grabbing
handfuls of pills at a hospital that are labelled according to who
made them, rather than what the active ingredients do.
At the very minimum though, the OS itself and all applications remain
unaffected and the system remains in a usable state.

Nice for vendors, shame about the users.
Anyone relying on their data being safe on their home PC is asking for
trouble, malware or not. Hard disk failures do happen...I've had plenty.

Sure, and there are various ways to hedge that, but backup is not a
cure-all. Every backup is by definition different to the live state,
and thus it may lose some wanted changes, else it would *be* the live
state and equally subject to whatever killed that.

In particular, problems arise when there's a long lag between
undesirable changes losing data, and discovering that this has
happened. The universal scope of time won't help there; you need
other scopes, and right now, Windows is still weak on scoping in data
and scoping out other unwanted stuff.
Yup definitely sounds like it. Give 6.10 a try sometime if you want, it's
quite nice. 7.04 is actually being released on the 19th of next month. =)

Hm, maybe I'll have enough cap to get it! I used up this month's cap
pulling down a "live" ISO of PCLinuxOS.
I can actually accept some incompatibilities between vastly different
distributions. The main problem being that they don't necessarily
run the same kernel version, so if you have an app that relies on
functionality provided by a certain kernel version and up...it won't run on
a distro with a kernel that is a lower version.
Ewww....

The same problems exist in Windows too though...can't run a WinXP app on
Win95...can't run a MFC 7 app in a machine only having MFC 6 installed if
it is linked dynamically.

I see that Vista's SxS is as big as its System32, so maybe MS is
unwinding the .DLL concept to localizing "shared" code on a per-app
basis. This is part of a general trend away from "ooh, let's do this,
it's really lean and efficient" to "let's do this properly, even if it
bloats up the footprint". Y2k = Exhibit A... them two "saved" bytes
ended up costing everyone plenty.
In my opinion, shared libraries, in any scenario and on any OS...while a
novel idea...are way overused, abused, and it's just way too much of a
mess. The small savings gained by dynamically linking in executable size is
trivial with today's hard drive sizes and memory availability.

Ah, I see GMTA on this one ;-)
HID I don't know off the top of my head, I'd have to ask google.

Human Interface Devices don't seem to me to be a big speed burden, if
they work at human speed <g>

It's more a matter of supporting unforeseen HIDs.


------------ ----- ---- --- -- - - - -
Our senses are our UI to reality
 

Stephan Rose

cquirke said:
Then most of the rest is in a huge FAT32, with a smaller FAT32 at the
end for cold storage and backups. Because that last volume is "far
away", it's slow, but more to the point, it's hardly ever traversed by
the heads - so it tends to survive various misfortunes.

I think you're the first person I've met that uses FAT32 these days. =)
Ah, that context... OK. I think the point being made there is that
ppl use (and are dependent on) apps, not OSs, in the sense they choose
(or are compelled to continue to use) an app, and the OS "chosen" is
thus whatever the app will run on.

This is particularly the case with accounting apps:
- costly app
- app "owns" the data, hard to switch to something else
- user generally not PC-savvy, often only know the app

In addition, local regulations break up the accounting software
market, so that a huge global shrink-wrap solution (e.g. Microsoft
Office) doesn't get traction. It would seem a great opportunity for
Open Source, but accounting apps are far too boring to write for fun,
so the only folks writing them have to be paid to do so ;-)

Maybe I'll look at it once my current project is done. =) I am still
debating with myself what to work on next after I finish my current side
project in a few months.

Though Quicken *is* among the top list of things WINE Developers are trying
to get working.
Yep. After the elegance of PICK/BASIC, it's really hard to get
excited about 1-dimensional, fixed-field, hard-typed databases like
dBase and Access. To just get a parent-and-child form that lets you
flip through parents (or search them) seems really hard, even if you
do get the parent (clients) and child (invoices) tables set up and
linked to each other appropriately.

And for some reason, database UIs always feel flabby and awful, like a
house made by bolting sheets of airplane wreckage together. Keyboard
navigation always seems to stick in the wrong place, etc.

Company I am working for now is currently using a SQL Database with an
Access front end to view and modify their data. As soon as I move and am
located by their main office, my first order of business will be putting
up a nice MySQL Database, cleaning up the current SQL mess and writing a
real front end with a real UI!
Ah, reminds me of Word Perfect's arbitrary keystroke UI ;-)

Does Alt+F1 work across all Linuxen, or is it Ubuntu-specific? If I
have to PLOKTA, which keystrokes are good ones to guess first?

Not sure on that. It is probably more desktop manager specific than distro
specific. Meaning that it probably works with any distribution running
Gnome as its UI, as Ubuntu does. As far as other desktop managers go, I
am not sure. I've used KDE before but have never tried that there.

Alt+F4 though works the same as it does in Windows both in Gnome and KDE,
and presumably other managers as well. =)
(PLOKTA = Press Lots Of Keys To Abort)



Cool. In Windows, I find the mouse pointer survives almost anything,
but while it can roam around quite happily, mouse clicks don't cut
through as well as keystrokes. Also, while the mouse pointer
survives, there's often nothing left for it to click :)

Well if things really go bad and unresponsive, but at least keyboard input
is still there, you can always use ctrl+alt+backspace to kill the UI and
get down to command prompt only level. From there then you could either
restart the UI or restart the system....kill processes if necessary..or do
anything else you want.
Ewww... OK.

Well one thing I should say is this. While there are a lot of distributions,
there actually are only very few UIs. Your main 2 being KDE and Gnome I
think, and a couple others. Usually, any distro using the same UI will
behave the same from a UI standpoint. Looks might vary though depending on
the theme that is set.
Will I get help via InterpreterName -? or Man InterpreterName ?

Depends on the interpreter but usually both ways actually.

I think bash & perl are prolly the most common.
Ahh... I wish I'd learned C/C++; I bounced off it after reading Help
files for about a year. I loved the language, but hated having to use
other ppl's code libraries via 27-sequential-params calls.

Yea I generally refrain from using too much 3rd party stuff.

The only 3rd party libraries I use these days are wxWidgets for my
cross-platform stuff, MySQL API for database and OpenGL for accelerated
graphics.

All are very well done and I guess I have not much of a choice in the last
2. =)
That's nice. The most brilliant cross-version consistency I've seen
yet has been Eudora; you can drop a Eudora 7.1 data set into Eudora
3.xx and it will prolly work. I've done version back-stretches from
5.1 to 3.xx and from 7.1 to 5.0.x, hence the deduced assumption.

Very nice, the file format for the application I am working on is also
designed to be forward compatible in a similar way. Though, it is always
possible that a user may save a feature that simply didn't exist in the
previous versions. In that event my file handler can skip the unknown data
but obviously the output in that scenario may be incorrect.

Forward compatibility is difficult. =)
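For what it's worth, the usual way to get that "skip what you don't
understand" behaviour is a tag-length-value layout; a minimal C sketch
(the chunk tags are made up, not from any real format):

#include <stdio.h>
#include <stdint.h>

/* Each chunk carries a tag and its own length, so an old reader can hop
 * over chunks written by a newer version instead of choking on them. */
int read_chunks(FILE *f)
{
    uint32_t tag, len;
    while (fread(&tag, sizeof tag, 1, f) == 1 &&
           fread(&len, sizeof len, 1, f) == 1) {
        if (tag == 0x0001 || tag == 0x0002) {
            /* known chunk: would parse its 'len' bytes here */
            if (fseek(f, (long)len, SEEK_CUR) != 0) return -1;
        } else {
            /* unknown chunk from a newer version: just skip it */
            if (fseek(f, (long)len, SEEK_CUR) != 0) return -1;
        }
    }
    return 0;
}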
Contrast that with PoS OE 5, which will irreversibly convert OE 4 data
to OE 5 without much of a warning or "may I?". Bleh.



That's quite unacceptably dangerous, IMO. Windows is heading that
way, with dumb-ass duhfaults like "hide file name extensions" and
"open based on content, not file name", but one can still beat it into
shape. Dunno why MS is so hell-bent on destroying one of Windows'
biggest safety benefits over the competition, but there you are.

I don't quite understand how extensions are in any way safer than no
extensions. Since a file can be renamed and its extension changed, be that
DOS, Windows, Linux, etc. the extension is meaningless as there is no way
from just looking at the file name to know if that is the correct
extension.

I do agree though, the default for "Hide Extensions" is stupid in windows.
Always the first thing I turn off.
Unless there's something else that does the same job, I'd describe
this *NIX behavior as dangerously clueless.





That's nice, if a bit risk-oblivious... green, for the most dangerous
file type? Seems like *NIX doesn't "get" the risk-riddled world yet?

Dunno, I don't really worry about it. I generally know what I am executing
and why regardless of file name color. =)

OK, now I get it (I think) - the process of downloading the file
forces it to have non-executable attributes (alas, hidden) applied to
it, even if it was originally a code file with code attributes.

That, I like.

Yup that's the concept. Something that is completely non-existent in Windows
file systems. Windows decides based on the extension if a file can be
executed. Here...*I* have to decide if a file can be executed. That's why
file extensions aren't that big of a deal as they have no meaning to the
operating system.
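To illustrate the point - a rough C sketch, assuming a Unix-style system
and a made-up file name - "can this run?" is a permission bit on the file,
not something read off the name:

#include <stdio.h>
#include <sys/stat.h>

int main(void)
{
    const char *path = "downloaded_thing";   /* hypothetical download */
    struct stat st;

    /* A freshly downloaded file normally has no execute bit set... */
    if (stat(path, &st) == 0 && !(st.st_mode & S_IXUSR))
        printf("%s is not marked executable, so it won't run.\n", path);

    /* ...and only a deliberate decision turns it on (chmod +x):
     * chmod(path, st.st_mode | S_IXUSR);                        */
    return 0;
}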
Sure. If I download what is clearly and visibly code, and run it,
then the OS has done its job of presenting me the info I needed to
make an informed choice. It's when the info I need is hidden from me,
or bad OS design allows content to mis-represent itself, that I get
pissed off and blame the OS, to a variable extent.



If "ending up with UAC and Vista" means not having malware trash my
data, then that would be a small price to pay, but of course it's not
as simple as that. Personally, I think the whole notion of user
accounts is a bad fit for the consumer world... it's like grabbing
handfuls of pills at a hospital that are labelled according to who
made them, rather than what the active ingredients do.

But does UAC really prevent it? I don't think so. Accidentally click "Yes"
or "Allow" or whatever once in the wrong moment....and the damage is done.
So from that perspective, UAC doesn't prevent much of anything.

Save for some exploit beyond my control, it's virtually impossible for me to
accidentally run malware from ubuntu. I'd have to download it, manually
make it executable and then I have to go and execute it myself! That's far
more difficult to do accidentally than clicking "Allow".

Nice for vendors, shame about the users.

Well even as a user I still like it as it means my system is still
functional VS both my data and my system being toast.
Sure, and there are various ways to hedge that, but backup is not a
cure-all. Every backup is by definition different to the live state,
and thus it may lose some wanted changes, else it would *be* the live
state and equally subject to whatever killed that.

Nope, not necessarily true. My backup is always 100% up to date with my live
copy. Really, the "live" copy is on the server, not on my client. But since
I *generate* the data on the client, not on the server my computer is
always in the same state as my server.

Every time I go to modify a file, I actually have to check it out...make my
changes..then check it back in. So any changes I make are committed to my
server whenever I am done with the file. That results in both datasets
always being an exact mirror of each other.

The only thing I could possibly ever lose is if my computer were to go
toast while I am editing checked out files. The server doesn't get the
changes until I check the files in. At worst though, that might mean a
couple hours lost, disregarding the time to fix the system. =)
Hm, maybe I'll have enough cap to get it! I used up this month's cap
pulling down a "live" ISO of PCLinuxOS.

A cap!?!? You actually have a cap? Wow....running on Satellite or something?

Well stuff like that is inevitable. Any OS progresses and new features are
added. If an application needs a certain feature of a newer version then it
will be difficult to run it on an older version OS that does not have that
feature.

Something like that I find perfectly acceptable. Forward compatibility is
exceedingly difficult to achieve and in many cases plain impossible.

So even in the windows world, I am perfectly OK with Win95 not running an
app that requires WinXP.

What ticks me off with windows though is that a Win95 app won't necessarily
run on XP and even less likely on Vista. The backwards compatibility is
just pathetic, especially with Vista where not even XP apps may run right.
That is one problem linux doesn't have. You can normally always run an
older app on a newer version. Matter of fact, older versions of various
libraries are even available in the package manager for that exact
reason. They're only installed though if you actually use an app that needs them.
I see that Vista's SxS is as big as its System32, so maybe MS is
unwinding the .DLL concept to localizing "shared" code on a per-app
basis. This is part of a general trend away from "ooh, let's do this,
it's really lean and efficient" to "let's do this properly, even if it
bloats up the footprint". Y2k = Exhibit A... them two "saved" bytes
ended up costing everyone plenty.

Don't even get me started on SxS....It's no different than previous DLLs
actually, except that you can no longer include them with your app. They
have to be installed on the system. But they are still dynamically linked
by default like everything has always been.

I am pissed though at that crap because when I updated Visual Studio 2005
with Service Pack 1, all my C++ apps now use SxS! I didn't even realize it
until I went to test my app on a different machine and wondered WHY the
heck it wouldn't run!

What's even more laughable is that they don't even put an installer in the
redist directory of Visual Studio. Just the raw DLLs, which are useless as
you can't just bundle them with the app. They have to be installed via the
non-existent installer. I found one installer on MS' website for it but I've
yet to actually see it work.

So now I just statically link that crap and things work...but I am pissed
beyond belief over it. I've thought about wiping Visual Studio and
reinstalling it to get rid of the SxS dependency but I will only be using
that computer for another month or so. Not worth the trouble...

After that I'll mostly be developing from ubuntu anyway with a much lower
blood pressure. I'll just dual boot windows or keep it on a separate
machine, whatever is easier, to generate the windows builds of my apps.

--
Stephan
2003 Yamaha R6

There's never a day I find myself remembering you,
because there's never been a moment I forgot you.
 
C

cquirke (MVP Windows shell/user)

Then most of the rest is in a huge FAT32, with a smaller FAT32 at the
end for cold storage and backups. Because that last volume is "far
away", it's slow, but more to the point, it's hardly ever traversed by
the heads - so it tends to survive various misfortunes.

I think you're the first person I've met that uses FAT32 these days. =)

I like both FAT16 and FAT32 for what they do best.

FAT16 is nice for where you want to store small but crucial files, as
the large clusters mean files may be recoverable even if the FAT is
totally trashed (i.e. if they fit within one cluster). The FAT is
small, so manually fixing it isn't too heavy going, and the whole
volume is small enough to evacuate in toto.
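A rough back-of-envelope (in C for fun, with the FAT16 limit approximated)
of why the clusters end up so large, and hence why a small crucial file
tends to sit in one contiguous spot:

#include <stdio.h>

int main(void)
{
    const long long volume_bytes = 2000000000LL;  /* a "2 GB" FAT16 volume */
    const long long max_clusters = 65524;         /* approx 16-bit cluster limit */
    long long cluster = 512;

    /* Grow the cluster size until the whole volume fits in the cluster count */
    while (volume_bytes / cluster > max_clusters)
        cluster *= 2;

    printf("cluster size: %lld KiB - files up to that size occupy one cluster\n",
           cluster / 1024);
    return 0;
}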

FAT32 is nice for more general use, if you don't want to abandon all
chances of data recovery. The file system structures are all
concentrated at the front of the volume (tho the root directory may
fragment across the volume space) so you can do and undo multiple
alternate FAT repairs by simply backing up the raw sectors that
include the FATs etc. Very nice.

NTFS is a more efficient file system in ways that matter in 2007+; can
hold large files, can hold huge numbers of directory entries, and the
other extra overheads are better carried by today's fatter hardware.
It's also more efficient for large volumes, as instead of listing
every cluster address as FATxx does, it stores only a
1-bit-per-cluster "used" bitmap and the start and length of each
contiguous run of clusters.
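Just to illustrate the bookkeeping difference (not the real on-disk
formats), a tiny C sketch of the two approaches:

#include <stdio.h>
#include <stdint.h>

#define CLUSTERS 1024

/* FATxx-style: a full next-cluster entry per cluster, chaining each file */
uint32_t fat_next[CLUSTERS];

/* NTFS-style allocation: one bit per cluster... */
uint8_t used_bitmap[CLUSTERS / 8];

/* ...plus (start, length) runs describing each file's contiguous extents */
struct run { uint64_t start, length; };

int main(void)
{
    printf("per-cluster table: %zu bytes vs bitmap: %zu bytes (for %d clusters)\n",
           sizeof fat_next, sizeof used_bitmap, CLUSTERS);
    return 0;
}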

Unfortunately, by the time NTFS was forced into consumerland, MS had
abandoned any pretence at providing recovery or user-controlled file
system repair tools. Combine that with the closed nature of NTFS -
subject to arbitrary change at any moment, undocumented - and the
prospects of data recovery become dim indeed.

So if you avoided disk compression due to concerns about data
recoverability, NTFS gives plenty of cause for pause.
Maybe I'll look at it once my current project is done. =) I am still
debating with myself what to work on next after I finish my current side
project in a few months.

Though Quicken *is* among the top list of things WINE Developers are trying
to get working.

That could be tough - I suspect out of all the accounting apps,
QuickBooks (dunno about Quicken) is the one with the tightest
dependency on IE. Even Windows can't run QuickBooks if IE has been
versioned or patched in ways that QB gets hissy about ;-)

I haven't seen many other accounting apps, but it's hard to imagine
how one could possibly chain an app to IE's concrete shoes any more
than QuickBooks circa 2004 had done.
Company I am working for now is currently using a SQL Database with an
Access Front end to view and modify their data. As soon as I move and am
located by their main office my first order of business will be putting up a
nice MySQL Database, clean up the current SQL Mess and write a real front
end with a real UI!

I was surprised to see (in that Symantec report) that MS SQL was the
only database engine they listed (they listed about 5, including
MySQL, with Oracle going off-scale in number of exploitable defects)
that had no exploitable defects at all, in the 6-month period.

MySQL wasn't bad, tho, especially compared to Oracle. I dunno what
the patch turn-around times were; not sure if they were reported.

Ah, a well-front-ended database engine - that would be a killer app!
Not sure on that. It is probably more desktop manager specific than distro specific.

OK, I get it; it's an X-Windows/GUI thing, not a kernel thing.
Well if things really go bad and unresponsive, but at least keyboard input
is still there, you can always use ctrl+alt+backspace to kill the UI and
get down to command prompt only level. From there then you could either
restart the UI or restart the system....kill processes if necessary..or do
anything else you want.

O..K.. no that's good. Might be nice to have a tiny ultra-lean GUI
pop up as a Task Manager equivalent at such times. Means you have
only one CLI command to memorize ;-)
Well one thing I should say is this. While there are a lot of distributions,
there actually are only very few UIs. Your main 2 being KDE and Gnome I
think, and a couple others. Usually, any distro using the same UI will
behave the same from an UI standpoint. Looks might vary though depending on
the theme that is set.

OK, fair enough, and good to know. My hunch is that the distro
differences may bite when you try to download and install 3rd-party
apps that weren't in the original bundle?
Very nice, the file format for the application I am working on is also
designed to be forward compatible in a similar way. Though, it is always
possible that a user may save a feature that simply didn't exist in the
previous versions. In that event my file handler can skip the unknown data
but obviously the output in that scenario may be incorrect.

There's been a more pervasive awareness of the need to ignore what is
not understood, through the various tagged file formats. Eudora does
seem to get that right... as there's a lot new that was added between
3.xx and 7.1... oh, BTW: Eudora's now Open Source ;-)
Forward compatibility is difficult. =)

It can be done, though Firefox may have tripped up a bit on the
settings front. I always harden Firefox a bit when installing it,
including setting it to block attempts to install plugins and
software. After upgrading Firefox to a newer version, I had
complaints that it couldn't pull down stuff even though all the new UI
settings were set to allow this.

These settings were a bit different to what was there before, and I
suspect they set different items in Firefox's .INI or equivalent - so
that there's no longer a UI for the settings set by the older
version. This is no problem if the new version's installed "fresh",
or over an old version where the setting wasn't changed and so did
not exist (falling back to default behavior), but it becomes a problem
if an old setting becomes a "UI orphan".
I don't quite understand how extensions are in any way safer than no
extensions. Since a file can be renamed and its extension changed, be that
DOS, Windows, Linux, etc. the extension is meaningless as there is no way
from just looking at the file name to know if that is the correct extension.

It depends. A safe OS would act on the same info that was presented
to you, when you took the decision to "open" it.

So if you were to put raw code in a ".TXT" file, a safe OS wouldn't
care whether the file was internally code - it would just bring it up
as text in a viewer or text editor.

However, a badly-designed OS would detect the internal (hidden) file
type and think "oh, the writer seemed to make an honest mistake,
as this code file is named as if it was text - let's help the
programmer by running it as code - no need to ask the user first,
after all, why wouldn't one want to run code hidden as text?"
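In code terms, the "safe" behaviour described above amounts to choosing the
handler from the name the user was shown and ignoring whatever lurks inside
the file - a minimal sketch, with made-up handler names:

#include <stdio.h>
#include <string.h>

const char *pick_handler(const char *filename)
{
    const char *ext = strrchr(filename, '.');
    if (ext && strcmp(ext, ".txt") == 0)
        return "text viewer";           /* presented as text: view as text */
    if (ext && strcmp(ext, ".exe") == 0)
        return "program loader (ask!)"; /* visibly code: handle as code    */
    return "text viewer";               /* unknown: safest default         */
}

int main(void)
{
    /* Even if this file secretly contains code, it was presented as text,
     * so it only ever gets viewed, never run. */
    printf("%s\n", pick_handler("honest_mistake.txt"));
    return 0;
}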
I do agree though, the default for "Hide Extensions" is stupid in windows.
Always the first thing I turn off.

Me2 - I forget it's the duhfault for the Great Unwashed. It's *still*
the duhfault in Vista, which is even harder to figure.
Yup that's the concept. Something that is completely non-existent in Windows
file systems. Windows decides based on the extension if a file can be
executed. Here...*I* have to decide if a file can be executed. That's why
file extensions aren't that big of a deal as they have no meaning to the
operating system.

I see it a bit differently - i.e. that both the OS and the user should
see and act on the same info, and be bound by that info. IOW, I
should see the same info the OS is acting on before I choose to "open"
the file, and the OS should be bound by that info, and not
automatically act on hidden info where this says something else.

It's nice if an OS is context-sensitive enough to auto-limit things,
but this may create opportunities to spoof past.
But does UAC really prevent it? I don't think so. Accidentally click "Yes"
or "Allow" or whatever once in the wrong moment....and the damage is done.
So from that perspective, UAC doesn't prevent much of anything.

It's still better than "trust me, I'm an Operating System".
Save for some exploit beyond my control, it's virtually impossible for me to
accidentally run malware from ubuntu. I'd have to download it, manually
make it executable and then I have to go and execute it myself! That's far
more difficult to do accidentally than clicking "Allow".

It may also be just too difficult for current Windows users to get
their head around. I can see it now... "download this tool to allow
downloaded files to run as code without having to be converted!"
The only thing I could possibly ever lose is if my computer were to go
toast while I am editing checked out files. The server doesn't get the
changes until I check the files in. At worst though, that might mean a
couple hours lost, disregarding the time to fix the system. =)

Yup, there you go. Or whatever mishap you wanted to avoid, may be
committed to the server along with your wanted saves.
A cap!?!? You actually have a cap? Wow....running on Satellite or something?

512k ADSL, as passed through the sphincter of our telecomms monopoly.
Caps are set at 1G, 3G etc. and I have the "big" 3G cap as well as the
"fast" 512k ADSL. Still, it's sooooo much better than dial-up hell,
especially as our telco charges per second for local calls to our ISPs.
Well stuff like that is inevitable. Any OS progresses and new features are
added. If an application needs a certain feature of a newer version then it
will be difficult to run it on an older version OS that does not have that
feature.

Something like that I find perfectly acceptable. Forward compatibility is
exceedingly difficult to achieve and in many cases plain impossible.

So even in the windows world, I am perfectly OK with Win95 not running an
app that requires WinXP.

What ticks me off with windows though is that a Win95 app won't necessarily
run on XP and even less likely on Vista. The backwards compatibility is
just pathetic, especially with Vista where not even XP apps may run right.

Hmm. What you are looking at, is a switch from feature expansion (the
old "computers are wonderful, what can we add next?" mindset) to risk
control (the new "computers are dangerous, what existing bad practices
do we have to exclude?" mindset).

Just about every minor "upgrade" these days - be it Java, Firefox, IE,
Windows, etc. - is necessitated by external events that require the
deliberate breaking of backwards compatibility (so that exploits
crafted against the older exploitable code base will no longer work).

MS gets this, big time. Sun is only just getting a clue... they used
to retain old (exploitable) JREs and silently pass through to them any
code that requested this ("hi, I'm malware and I need to exploit JRE
1.4.xx in order to work" 'OK, go ahead malware, here's 1.4.xx') until
as recently as JRE 1.5.006, and they still don't remove old JREs (at
100M+ a pop) when adding new ones. They were also the slowest OS
vendor (yep, slower than "what, me worry?" Apple) to patch exploitable
defects, according to that Symantec report I mentioned earlier.

So; no, it no longer surprises me that old sware doesn't run on new
OSs. MS have made some stunningly bad design decisions in the past, so
there are a lot of rabbits to stuff back in Pandora's Box.

But at least we're moving in the right direction, even if it is under
the whip of ITW exploits.

Imagine what Linux would be like, if everyone who wrote anything for
it, insisted on having root access. You'd have this great secure OS,
but users wouldn't be able to use that security because everything
they wanted to run would insist on dropping that protection.

This is where MS found themselves at the end of the XP era. They'd
pushed NT into the Win9x space and told devs to change the way they
wrote apps so that they'd work in NT's limited user accounts.

But devs basically told MS to sod off; because things still worked
when writing as if for Win9x, only the most perfunctory changes
were made to cater for XP. This smoothed the transition from Win9x to
XP (it may be to the point that today's transition from XP to Vista is
rougher, even though the OSs have more in common) but meant that Vista
has to face the same problems as XP.

So, this time MS are cracking the whip. If you still write apps for
Windows as if it were still "everyone loves admin" Win9x, your apps
WILL break, or they WILL piss off users with incessant UAC prompts.

If that means that the bulk of NT's security benefits finally reach
consumers, then that's prolly pain worth enduring.
Don't even get me started on SxS....It's no different than previous DLLs
actually, except that you can no longer include them with your app. They
have to be installed on the system. But they are still dynamically linked
by default like everything has always been.

OK; I've been too long out of coding to really understand or comment
on your objections, and I suspect that would be another thread :)


--------------- ----- ---- --- -- - - -
Who is General Failure and
why is he reading my disk?
 
G

Guest

IE (for downloads), OLE (for data), and the Windows shell (for files of
unregistered types) all content-sniff the data to determine type as their
first option.

For example, the extension is the first thing Explorer looks at and the
last thing OLE looks at; how IE does it is in a KB somewhere - IE relies
on server-reported type, content sniffing, and extensions.
 
S

Stephan Rose

cquirke said:
That would be true if you could stand on a particular abstraction
layer, i.e. that code does only what it was coded to do.

However, code defects mean that what code actually does (or can be
made to do) can be insanely different.

So one can no longer assume non-trivial code will be able to view text
without running bits of it as code. Is the code trivial enough to be
assumed safe? Maybe, in an age of ASCII, but maybe not in an age of
Unicode, RTF, HTML, sundry other "rich" formats, etc.

According to a rather interesting 6-month report from Symantec, the %
of malware using code exploits is around 20% or so; the rest use
Social Engineering or by-design functionality.

This article...

http://www.internetnews.com/security/article.php/3667201

...links to the reports themselves, in .PDF form

Well on the subject of content and security...you might enjoy this little
dialog box which I was presented with today. It occurred when opening a
text file on a NTFS drive.

http://www.somrek.net/Confirm.png

I didn't even know that dialog box existed until today!

Looks like this OS is going far further in terms of security than I am even
aware of.
No, that's hopeless.

Firstly, the Internet is all about interactions with strangers, so
"reputable" is often meaningless. Is Sony reputable?

The day I can no longer consider microsoft.com, sony.com and equivalent
business-run sites reputable and reasonably safe to download content from
is the day I better unplug my network cable.
Secondly, we're moving away from entering URLs into browsers, to
finding stuff via search engines, so we know even less about where
we're going than we would do otherwise.

Thirdly, reputable places can be spoofed or infected.

Sure they can. I might also even win the lottery tomorrow even though I
don't play by pure accident!

Anything can happen...but I've found that, at least in my opinion, playing
too much "what if" doesn't solve anything either. If I go down that route I
better never turn my computer on. =)
So there is a very clear need for at least two broad levels of content
interaction; one that is safe irrespective of source, because it is
structurally limited to be so, and one that confers sufficient power
to the material that one has to have compelling reason to trust it.

Without this, safe interaction between strangers becomes impossible,
which means that much of the Internet's value is lost.

Bearing in mind the implicit "downloading of files" that applies when
browsing the web or reading email "message text"...


That worked for me through the era when resident av was too "costly"
to tolerate on systems of the time. It would prolly work for me
still, but fortunately I no longer have to put that to the test.

I consider most resident AV too costly still as of this day. Norton and
McAfee come to mind...
A core reason *why* it worked as long as it did, was because I could
tell what level of risk material would pose to the system. I could
kill active content and read any web page safely, and I could read any
email safely because the message "text" was processed purely as text.

I could see what type of files were attached to email messages, and if
I didn't like the type, I'd avoid it, no matter whether it was "from
someone I know" (a useless metric) or whether the av claimed it was
"not a virus" or not.

Very true. The only times I'll touch any attachment to an e-mail is if I
both know who it is from and why they are sending it to me.

- a co-worker sending me a utility via e-mail I need is ok.
- Receiving a random attachment I've never heard of, even if from the same
person, goes straight to the trash.
I never informally scan "the system". Specifically, I never attempt
to detect malware in active form within the infected environment.

I may do on-demand scans of subtrees in which incoming material is
concentrated, before any of it has been "opened". That makes sense...
but to scan System32 in case something's running there? Nope; if I
suspect that, then I do that "from orbit" i.e. a Bart CDR boot.

Not a bad idea.
It seems to me that a certain complexity of code will cause a certain
problem rate, including the opportunity for malware exploit. It
doesn't matter much which OS it is, especially if they're all made
from the same raw materials, such as C.

Sure, the more complex code is..or rather, the larger a code-base is..the
more places there are for exploits and bugs to hide.

And the thing is that the exploits some of these people come up with I can't
blame anyone for, not even Microsoft, as they are sometimes so utterly
strange that even I wonder HOW someone found it in the first place!

I seriously can't blame programmers in those cases for not thinking of it.
That Symantec report is interesting; it points to Oracle as the most
exploitable database, and Sun/Solaris as the OS with the longest
time-to-patch. For the first half of 2006, Red Hat was up there with
MS as the fastest to patch, then it fell back in the second half of
2006. Safari was the slowest browser to patch, even though they had
only one defect to fix; then again, 3rd-party code was involved, so
that may have slowed them down.

77% of browser exploit attempts were directed at IE, so MS gets plenty
of opportunity to build patching efficiency ;-)

The thing about IE though is it also has a huge gaping security hole called
ActiveX. Even when IE introduced the permission thing for installing
ActiveX controls, malware sites simply included step-by-step instructions
on their sites as to how to install their malware disguising it as
something legitimate. Seen it many times...

The social engineering at work!
On a completely different PC, that's OK, but that's a luxury many of
us don't have. The rest of us would want the new OS to co-exist with
the old one, and some Linux distros are quite bad at that.

Very true, but if someone stays with distros that are reputable and well
known, problems should generally be kept to a minimum. There might be 350
some distros out there but only a handful are really in widespread use.

Personally my recommendations to anyone would be:

- Ubuntu / Kubuntu (based on desktop preferences)
- Redhat
- Suse

Those are prolly the only 3 I'd personally ever even really bother looking
at for normal desktop use.
One thing I do agree on, is that "basic" users who always dive into a
few known apps from the desktop icons can do quite well on a different
OS, as long as the apps are similar. Hence the strategy of choosing
apps available across multiple platforms, as part of a more general
strategy of minimising cross-dependencies.

I truly do wish more developers would produce multiple platform apps,
especially considering how easy it essentially is. It is virtually no
additional work. Actually compared to dealing with the raw Win32API or MFC,
using a well written cross platform library to abstract the UI can make
things easier.
I haven't seen that, though you're right in that a Windows
installation will generally re-assert a standard MBR, thus killing any
boot manager located there. But it won't kill other partitions, as
the Ubuntu I tried would have done by duhfault.

That is what I meant, the MBR. I didn't mean windows wiping out the other
partitions. =)

--
Stephan
2003 Yamaha R6

There's never a day I find myself remembering you,
because there's never been a moment I forgot you.
 
C

cquirke (MVP Windows shell/user)

Well on the subject of content and security...you might enjoy this little
dialog box which I was presented with today. It occurred when opening a
text file on a NTFS drive.

I didn't even know that dialog box existed until today!

"Executable text file", heh?

Yes, I really enjoyed that.




<borat> NOT!! </borat>

:)
Looks like this OS is going far further in terms of security than I am even
aware of.

I'd prefer a better explanation, e.g. "this is an executable code file
named as if it were a text file" with "More Info" button to explain
the difference and why it matters, and UI weighting to "Cancel"
The day I can no longer consider microsoft.com, sony.com and equivalent
business-run sites reputable and reasonably safe to download content from
is the day I better unplug my network cable.

I hope you did that several months ago... MS have already demonstrated
they are prepared to DoS you on an automated assumption you are
breaking license terms (product activation, WGA) and Sony were caught
dropping open-use rootkits from "audio CDs", which is THE grossest
risk mis-representation I've ever come across.

I don't care who it pretends to be from - I want an OS that slams it
down to no greater risk than the material presents itself as.

"Trust me, I'm a text file!"
' Righteo, I'll view you in Notepad '
"bbbbut, I'm really an executable! Run me!"
' Fat chance hotel, mate! '
Sure they can. I might also even win the lottery tomorrow even though I
don't play by pure accident!
Anything can happen...but I've found that, at least in my opinion, playing
too much "what if" doesn't solve anything either. If I go down that route I
better never turn my computer on. =)

Some things already happen quite often.

Really, "knowing who it's from" is a poor place to bet.

You may have the most elegant and convoluted verification code in
place, but it's meaningless if it never runs (because it's bypassed)
or because all the attacker has to do is forge a couple of pixels on
the screen to fake successfully passing verification.

IOW, maybe a 128-bit DES is needed to really prove who someone is, but
when you see a yellow key or a familiar login page, all you are
looking at is a bunch of pixels that can be spoofed.

Also, "knowing who someone is" is meaningful only if you have a
template for that identity. If I say "trust me, I'm Fred Smith from
WankBlah Incorporated", that means nothing if I've never heard of Fred
Smith or WankBlah before.

If the Internet gets shrink-wrapped around consumers' bias towards Big
Names, or reverse-xenophobic trust assumptions ("trust me, I really am
an American vendor") that would be a very sad (and stupid) thing.

I don't care who you say you are, or even who you really are. If I
say "yes, show me some text", I do NOT mean "own my PC".
I consider most resident AV too costly still as of this day. Norton and
McAfee come to mind...

....as bad choice avs which will need more muscle than most. I've been
using AVG resident for years, with little performance impact; if you
prefer more effective feeware that's still speed-efficient, try NOD32.

Having said that, I treat them as a "goalie of last resort". I have
other measures, including a "safe hex" mindset, that will hopefully
stop any attacker from getting close enough to try for goal.
- a co-worker sending me a utility via e-mail I need is ok.
- Receiving a random attachment I've never heard of, even if from the same
person, goes straight to the trash.

I insist on meaningful text that refers to each attachment, else I
don't "open" any of them. The av is there in case the user really did
mean to send the file, which has been infected unknown to the user;
something that was common in the days of rampant macro viruses.
And the thing is that the exploits some of these people come up with I can't
blame anyone for, not even Microsoft, as they are sometimes so utterly
strange that even I wonder HOW someone found it in the first place!

Yup - IKWYM. When it was "duh, maybe we should check that MIME type
matches the file name extension", or "duuuuh, maybe running scripts in
HTML email signatures isn't so smart" I'd feel like kicking MS's
backside, but these days the exploits are true code failures, rather
than superficial design stupidities, and they aren't all C-style
unchecked buffers either.
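(For anyone following along, the "C-style unchecked buffer" in miniature -
a deliberately simplified sketch:)

#include <stdio.h>
#include <string.h>

void risky(const char *input)
{
    char buf[16];
    strcpy(buf, input);                      /* no length check: overflows if
                                                input is 16 chars or longer */
    printf("risky: %s\n", buf);
}

void safer(const char *input)
{
    char buf[16];
    snprintf(buf, sizeof buf, "%s", input);  /* truncates instead of overflowing */
    printf("safer: %s\n", buf);
}

int main(void)
{
    safer("a string rather longer than sixteen characters");
    /* risky() with the same input would scribble past the buffer - don't. */
    return 0;
}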

I come away with the notion that one should never assume code behavior
to be bounded, even (or especially) when it's code you wrote yourself.

That has profound implications for design that I don't think have
really been taken on board yet. I'd expect MS to be the first to do
so; other platforms are either more sheltered due to fewer attacks
(MacOS, Linux) or are just too clueless (Sun) to grok it.

It's really hard to be humble, as a coder... I remember my mindset at
that time; "don't tell me anything, I'll figure it from first
principles, I'll write it myself thanks, of course it works!" :)
I seriously can't blame programmers in those cases for not thinking of it.

No. The tough lesson is that doing your very best just isn't enough
to ensure code safety, and that even after doing your best, you should
still prepare to bulkhead or amputate your code, and keep it away from
stuff that it doesn't really need to grope.

It's like a variation of "need to know". You do only what you need
to, when dealing with anything "external", and you operate under
limits that are made visible to the user.

I'd like to see a behavior rating system for programs, and an OS that
limits that app's behavior to those limits. For example, if I
download a screensaver, I'd expect it to stay offline and out of my
data set. If it were required to state upfront that id groped my
files or called home, I could reject it before use. If it got caught
claiming it didn't do those things when it did, then there'd be far
less burden of proof required to kick class-action ass.
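The shape of that idea in code might look something like this - a
hypothetical manifest, not any real Windows or Linux mechanism:

#include <stdio.h>
#include <string.h>

/* What the program declares up front... */
struct manifest {
    int wants_network;
    int wants_user_files;
};

/* ...and what the OS would let it do: anything undeclared is denied. */
int action_allowed(const struct manifest *m, const char *action)
{
    if (strcmp(action, "network") == 0)    return m->wants_network;
    if (strcmp(action, "user-files") == 0) return m->wants_user_files;
    return 0;
}

int main(void)
{
    struct manifest screensaver = { 0, 0 };   /* declares neither */
    printf("screensaver may use network? %s\n",
           action_allowed(&screensaver, "network") ? "yes" : "no");
    return 0;
}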

The old "anything that happens during an user account logon session
can do anything the user can" model should be considered defunct, or
at least as useful in consumerland as water-wings on a bus.
The thing about IE though is it also has a huge gaping security hole called
ActiveX. Even when IE introduced the permission thing for installing
ActiveX controls, malware sites simply included step-by-step instructions
on their sites as to how to install their malware disguising it as
something legitimate. Seen it many times...

The problem with ActiveX warnings has always been that they tell you
nothing about WHAT control it is (I don't give a raving spit who it's
"signed" by) or what it is going to do... see this post's sig :)

Beyond that, there's maybe not that much difference between ActiveX
and Firefox plugins, save that malicious Firefox plugins are currently
rare. What's more annoying is when you get the same type of alerts
for all sorts of binary behaviors, with little detailed explanation.
The social engineering at work!

SE goes with poor UI information. Whenever risk is poorly defined,
risk-relevant info is hidden, or material is allowed to mis-represent
its level of risk, "SE" thrives.

True SE would be phishing text, pure exploit would be (say) Lovesan.
Everything between is a blend of the two.
Personally my reccomendations to anyone would be:

- Ubuntu / Kubuntu (based on desktop preferences)
- Redhat
- Suse

What about what used to be Mandrake?
What's it called now?

What about PCLinuxOS?
I truly do wish more developers would produce multiple platform apps,
especially considering how easy it essentially is.

The bummer is that often such apps have UI oddities due to different
"platform traditions", or lowest-common-denominator feature sets.

For commercial-grade development, you'd need to embrace the norms of
the target platform, such as they may be - for example, familiar
consistency with keyboard / UI control navigation, the selection and
creation of installation paths, uninstallability, etc.
It is virtually no additional work.

Wow - are you a workaholic? ;-)
Actually compared to dealing with the raw Win32API or MFC,
using a well written cross platform library to abstract the UI can
make things easier.

Ahhh... OK, that makes sense. Open Office and Firefox are generally
pretty good, but both have lapses - OOo doesn't draw unit defaults
from regional settings details, only the gross region, and needs about
12 different bloody settings changes to switch out of Kings' Toenails,
and Firefox was an unpredictable bitch if you ever tried to enter a
full path for its installation (it's a bit better now).
That is what I meant, the MBR. I didn't mean windows wiping out the other
partitions. =)

Oh, OK. Funny thing, the MBR - because it's supposed to be OS-neutral
system territory, many OSs seem to assume they can write the standard
system code there. Plays hell with DDOs, but then again, it's a
self-defence pre-emptive strike, because resident DDOs (or malware
that work in the same way) can play hell with the OS ;-)

IMO, anything you want that lives in the MBR needs to be accompanied
by a user-controlled bootable re-assertion disk <g>


------------ ----- --- -- - - - -
Drugs are usually safe. Inject? (Y/n)
 
C

cquirke (MVP Windows shell/user)

IE (for downloads), OLE (for data), and the Windows shell (for files of
unregistered types) all content-sniff the data to determine type as their
first option.

A risk-aware OS (as any 21st century OS should be) should check all
three levels for consistency. One strike, and you're out on appeal
(user is alerted) and where it's clearly a safety risk, you're out,
period. Anyone wrapping code as "content-type text" or .TXT cannot be
assumed to have made an "honest mistake".

Do airport security assume you make an "honest mistake" when you carry
a gun onto an airliner? Not bloody likely, eh?

Clues need to be got.


---------- ----- ---- --- -- - - - -
On the 'net, *everyone* can hear you scream
 
S

Stephan Rose

Well on the subject of content and security...you might enjoy this little
dialog box which I was presented with today. It occurred when opening a
text file on a NTFS drive.

I didn't even know that dialog box existed until today!

"Executable text file", heh?

Yes, I really enjoyed that.

<borat> NOT!! </borat>

Hahaha! I still have to watch that actually...=)
:)


I'd prefer a better explanation, e.g. "this is an executable code file
named as if it were a text file" with "More Info" button to explain
the difference and why it matters, and UI weighting to "Cancel"

Well the funny thing is, seeing how it is a file I created in windows XP, I
know it is actually just a plain text file.

I tested a few other text files as well, and they all came up with the same
executable warning. This does not happen when opening text files on the
native Ext3 drive though.

So I am assuming the execute part is something Windows is marking in the
NTFS file system. I wasn't even aware that there was such a thing in NTFS.

The UI *is* weighting to Cancel btw. Hence the darker black border around it
compared to the other buttons. =)

I agree though, a "More Info" button would prolly be useful...once. =) A
good newbie thing I suppose for inexperienced users.
I hope you did that several months ago... MS have already demonstrated
they are prepared to DoS you on an automated assumption you are
breaking license terms (product activation, WGA) and Sony were caught
dropping open-use rootkits from "audio CDs", which is THE grossest
risk mis-representation I've ever come across.

Well that is why:

- I don't run Vista. Their built in DoS makes that operating system 100%
completely not viable for me to use.

And you know what really literally scares me about this? MS, despite all
their so claimed security, painted a HUGE bullseye onto their OS brightly
flashing in multiple colors that can be seen from another galaxy away with
the naked eye.

I mean, it used to be with all previous versions of Windows, as far as I know
anyway, that to do any DoS attack on the system a virus / malware had to
directly attack windows and damage it in some way.

Not anymore. Now any device driver, any piece of hardware, *anything* 3rd
party that is in any way hardware related is a target. Anything that
windows monitors to determine if it is genuine is a target. Most of these
targets being monitored are from 3rd party vendors MS cannot directly
control.

All a virus or malware has to do is compromise just *one* of the MANY
components monitored by Vista and it executes the built in DoS! WONDERFUL!!

It doesn't even need to bother to try to bring windows down, windows will do
that to itself! Worst part about it being is that even if the user calls
and gets it re-activated...unless the user removes the offending piece of
malware it is just bound to happen again!

I wonder if MS will still reactivate 10 calls later?

I am just *waiting* for a wave of such attacks to occur on Vista. Not
because I dislike Vista, simply because I view it as inevitable.

As far as the Audio CD thing goes, I don't even know what open-use rootkits
are. Can you elaborate on that please?
I don't care who it pretends to be from - I want an OS that slams it
down to no greater risk than the material presents itself as.

"Trust me, I'm a text file!"
' Righteo, I'll view you in Notepad '
"bbbbut, I'm really an executable! Run me!"
' Fat chance hotel, mate! '

Well essentially...linux does just that.

Even if I have a text file that is marked executable, such as a script...if
I open to view it I open to view it. Period. The content has no say in it
whatsoever.
Some things already happen quite often.

Really, "knowing who it's from" is a poor place to bet.

You may have the most elegant and convoluted verification code in
place, but it's meaningless if it never runs (because it's bypassed),
or if all the attacker has to do is forge a couple of pixels on
the screen to fake successfully passing verification.

IOW, maybe 128-bit encryption is needed to really prove who someone is, but
when you see a yellow key or a familiar login page, all you are
looking at is a bunch of pixels that can be spoofed.

Also, "knowing who someone is" is meaningful only if you have a
template for that identity. If I say "trust me, I'm Fred Smith from
WankBlah Incorporated", that means nothing if I've never heard of Fred
Smith or WankBlah before.
True.


...are a bad choice of AVs, which will need more muscle than most. I've been
using AVG resident for years, with little performance impact; if you
prefer more effective feeware that's still speed-efficient, try NOD32.

Having said that, I treat them as a "goalie of last resort". I have
other measures, including a "safe hex" mindset, that will hopefully
stop any attacker from getting close enough to try for goal.

Same here, that's why I don't really bother running them. I do an occasional
scan every few months...years....and it has a habit of turning up clean. I
figure I must be doing something right.

And if some attack does wipe out half the systems on the planet one night,
no antivirus software will be any good anyway, as they won't even know
about it yet.
Yup - IKWYM. When it was "duh, maybe we should check that MIME type
matches the file name extension", or "duuuuh, maybe running scripts in
HTML email signatures isn't so smart" I'd feel like kicking MS's
backside, but these days the exploits are true code failures, rather
than superficial design stupidities, and they aren't all C-style
unchecked buffers either.

I come away with the notion that one should never assume code behavior
to be bounded, even (or especially) when it's code you wrote yourself.

That has profound implications for design that I don't think have
really been taken on board yet. I'd expect MS to be the first to do
so; other platforms are either more sheltered due to fewer attacks
(MacOS, Linux) or are just too clueless (Sun) to grok it.

It's really hard to be humble, as a coder... I remember my mindset at
that time; "don't tell me anything, I'll figure it from first
principles, I'll write it myself thanks, of course it works!" :)

Hu...hu...m...l...e? How do you spell that again? ;)
Yup...most certainly have a healthy ego! haha
Tend to be able to back it up with actions, not just words, too though. =)
No. The tough lesson is that doing your very best just isn't enough
to ensure code safety, and that even after doing your best, you should
still prepare to bulkhead or amputate your code, and keep it away from
stuff that it doesn't really need to grope.

It's like a variation of "need to know". You do only what you need
to, when dealing with anything "external", and you operate under
limits that are made visible to the user.

I'd like to see a behavior rating system for programs, and an OS that
constrains each app to its declared behaviors. For example, if I
download a screensaver, I'd expect it to stay offline and out of my
data set. If it were required to state upfront that it groped my
files or called home, I could reject it before use. If it got caught
claiming it didn't do those things when it did, then there'd be far
less burden of proof required to kick class-action ass.

Not a bad idea, I like it!
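
(A purely hypothetical sketch of the behavior-rating idea above - every
name below is invented for illustration, and no real OS API works like
this: the app declares its intended behaviors up front, and anything it
didn't declare gets refused.)

// Hypothetical sketch of a declared-behavior ("behavior rating") check.
// All names are invented for illustration; this is not a real OS facility.
#include <iostream>
#include <set>
#include <string>

struct BehaviorManifest {
    std::set<std::string> declared;   // e.g. "draw-screen", "network", "read-user-data"
};

bool Allowed(const BehaviorManifest& m, const std::string& action) {
    return m.declared.count(action) > 0;   // anything undeclared is refused
}

int main() {
    // A screensaver that claims it only draws to the screen - so it should
    // stay offline and out of the user's data set.
    BehaviorManifest screensaver{{"draw-screen"}};

    const char* actions[] = {"draw-screen", "network", "read-user-data"};
    for (const char* action : actions) {
        std::cout << action << ": "
                  << (Allowed(screensaver, action) ? "permitted" : "blocked (not declared)")
                  << "\n";
    }
    return 0;
}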
The old "anything that happens during an user account logon session
can do anything the user can" model should be considered defunct, or
at least as useful in consumerland as water-wings on a bus.


The problem with ActiveX warnings has always been that they tell you
nothing about WHAT control it is (I don't give a raving spit who it's
"signed" by) or what it is going to do... see this post's sig :)

Beyond that, there's maybe not that much difference between ActiveX
and Firefox plugins, save that malicious Firefox plugins are currently
rare. What's more annoying is when you get the same type of alerts
for all sorts of binary behaviors, with little detailed explanation.


SE (social engineering) goes with poor UI information. Whenever risk is
poorly defined, risk-relevant info is hidden, or material is allowed to
mis-represent its level of risk, "SE" thrives.

True SE would be phishing text, pure exploit would be (say) Lovesan.
Everything between is a blend of the two.


What about what used to be Mandrake?
What's it called now?

Mandriva, also a popular choice I think.
What about PCLinuxOS?

Forgot about that one, again..also not a bad choice from what I am hearing.

To be honest, I personally just use Ubuntu and am happy with it. I suppose
one of these days I might play with one of the other Distributions if I
find some spare time.

I have heard good things about all of them though.
The bummer is that often such apps have UI oddities due to different
"platform traditions", or lowest-common-denominator feature sets.

For commercial-grade development, you'd need to embrace the norms of
the target platform, such as they may be - for example, familiar
consistency with keyboard / UI control navigation, the selection and
creation of installation paths, uninstallability, etc.

Actually wxWidgets does do a very nice job. It gives an excellent set of
functionality as far as UI is concerned, and I haven't found a single thing
missing that I used to use in the .Net Framework. It actually has a couple
of features the .Net Framework *doesn't*, such as its very nice docking
capability allowing the user to really customize the UI the way they want
it.

I don't quite believe in MS' way of saying "This is how we want the UI to
be...deal with it."

Another really awesome feature of wxWidgets, and one that *greatly*
reduces if not virtually eliminates UI oddities, is that you can use it
without absolute coordinates.

Usually when designing say a dialog box, you place all your controls at
predefined locations until it looks good. But that creates a problem:

Different languages, themes or platforms can seriously screw that up
resulting in various problems as you so rightly mentioned.

The solution wxWidgets has to that is absolutely perfect. It has a
constraint-based system that is coordinate-less (in addition to absolute
coordinates, if someone still wants to use those).

So basically all I do is tell it "These are my controls, and this is how I
want them arranged," and its layout manager worries about actually placing
them based on the constraints I have given. You can also have constraints
within constraints, allowing you to really fine-tune things.

I don't even specify a dialog box size!

The results are quite nice. No matter what the language, theme or platform,
my control layouts are always 100% as they should be, with no visual
oddities. Well... in some cases, I find users' themes to be visual oddities,
but that isn't my fault. =)
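
(For readers who haven't seen it, this is roughly what that coordinate-less
layout looks like in wxWidgets code - a minimal sketch using a plain
wxBoxSizer on a recent wxWidgets 3.x; the sizer, not the programmer, works
out positions and the dialog size.)

// Minimal wxWidgets sketch of coordinate-less layout with sizers.
// No control positions and no dialog size are specified anywhere.
#include <wx/wx.h>

class DemoDialog : public wxDialog {
public:
    DemoDialog() : wxDialog(nullptr, wxID_ANY, "Sizer demo") {
        auto* top = new wxBoxSizer(wxVERTICAL);

        // The text control gets all the leftover space (proportion 1, wxEXPAND).
        top->Add(new wxTextCtrl(this, wxID_ANY, "", wxDefaultPosition,
                                wxDefaultSize, wxTE_MULTILINE),
                 1, wxEXPAND | wxALL, 8);

        // Standard OK/Cancel row, placed per the platform's own conventions.
        top->Add(CreateButtonSizer(wxOK | wxCANCEL), 0, wxEXPAND | wxALL, 8);

        // Fit the dialog to whatever size the constraints work out to.
        SetSizerAndFit(top);
    }
};

class DemoApp : public wxApp {
public:
    bool OnInit() override {
        DemoDialog dlg;
        dlg.ShowModal();
        return false;   // no main window - exit after the dialog closes
    }
};

wxIMPLEMENT_APP(DemoApp);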
Wow - are you a workaholic? ;-)

I try to avoid work actually. =) Somehow though..it always chases me down!

--
Stephan
2003 Yamaha R6

The reason there's no day when I find myself remembering you
is that there's never been a time I forgot you
 
S

Stephan Rose

cquirke said:
It's still better than "trust me, I'm an Operating System".

Maybe...marginally...it is only as good as the user input it receives.
It may also be just too difficult for current Windows users to get
their head around. I can see it now... "download this tool to allow
downloaded files to run as code without having to be converted!"

Well, that's perfectly fine and reasonable to me. If a user wishes to
deliberately install software that compromises their security, then by all
means, go for it! That same user, though, should not then whine when
security *does* get compromised. =)
Yup, there you go. Or whatever mishap you wanted to avoid may be
committed to the server along with your wanted saves.

Not an issue. The server maintains a history of *any* and all changes.

If I accidentally commit a bad change, I can revert to the previous
state with a few mouse clicks.
512k ADSL, as passed through the sphincter of our telecomms monopoly.
Caps are set at 1G, 3G etc., and I have the "big" 3G cap as well as the
"fast" 512k ADSL. Still, it's sooooo much better than dial-up hell,
especially as our telco charges per second for local calls to our ISPs.

3G as in gigs??

Man...I don't know how you do it.

On some days I'd smack into that within hours...

So, this time MS are cracking the whip. If you still write apps for
Windows as if it were still "everyone loves admin" Win9x, your apps
WILL break, or they WILL piss off users with incessant UAC prompts.

I actually do give MS credit for that one and luckily I don't ever do that.
=)

I tend to write my apps compact and lightweight. I don't understand
applications that ship with 200 friggin DLLs! Pure insanity to me..

I rather prefer to create 1 exe capable of running from anywhere with no
need of any administrative permissions for anything.
OK; I've been too long out of coding to really understand or comment
on your objections, and I suspect that would be another thread :)

It basically just boils down to one simple thing:

- Any app I compile won't run (with default VS settings) on a non-Vista
computer unless the user has installed a redistributable package (which I
can never get to work).

--
Stephan
2003 Yamaha R6

The reason there's no day when I find myself remembering you
is that there's never been a time I forgot you
 
D

Dave R.

Stephan Rose said:
Hahaha! I still have to watch that actually...=)


Well the funny thing is, seeing how it is a file I created in Windows XP, I
know it is actually just a plain text file.

I tested a few other text files as well, and they all came up with the same
executable warning. This does not happen when opening text files on the
native Ext3 drive though.

So I am assuming the execute part is something Windows is marking in the
NTFS file system. I wasn't even aware that there was such a thing in NTFS.
Actually, there isn't. The problem is likely the way Linux is
interpreting the Windows file attributes (hidden, system, archive and
read-only). For example, using Samba to copy or look at a file from a
Windows machine, if it has its Archive bit set it will have the owner
executable bit set on the Linux side. A Windows System file would have
the group execute turned on, and finally the Hidden attribute can be
represented by the "other" execute bit ("can be" because System and
Hidden mapping is often turned off in Samba). I've run into this
behavior on many occasions, and scratched my head until a Google search
set me right.
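
(A quick illustration of the mapping described above - it's what Samba's
"map archive", "map system" and "map hidden" options do when enabled: DOS
attribute bits get represented by re-using the three Unix execute bits.
Just a sketch of the idea, not Samba's actual code.)

// Sketch of mapping DOS/FAT attribute bits onto Unix execute bits,
// in the style described above. The attribute values are the standard
// DOS ones; the mapping mirrors the convention, not a real implementation.
#include <cstdio>
#include <sys/stat.h>   // S_IXUSR, S_IXGRP, S_IXOTH

constexpr unsigned kHidden  = 0x02;
constexpr unsigned kSystem  = 0x04;
constexpr unsigned kArchive = 0x20;

mode_t ExecBitsFromDosAttributes(unsigned attrs) {
    mode_t mode = 0;
    if (attrs & kArchive) mode |= S_IXUSR;  // Archive -> owner execute
    if (attrs & kSystem)  mode |= S_IXGRP;  // System  -> group execute
    if (attrs & kHidden)  mode |= S_IXOTH;  // Hidden  -> "other" execute
    return mode;
}

int main() {
    // A file freshly written on Windows typically has Archive set, which is
    // why it shows up as owner-executable when viewed from the Linux side.
    std::printf("mapped execute bits: %03o\n",
                static_cast<unsigned>(ExecBitsFromDosAttributes(kArchive)));
    return 0;
}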

Regards,

Dave
 
C

cquirke (MVP Windows shell/user)

On Tue, 27 Mar 2007 01:24:43 +0200, Stephan Rose
So I am assuming the execute part is something Windows is marking in the
NTFS file system. I wasn't even aware that there was such a thing in NTFS.

I found interesting stuff here:

http://en.wikipedia.org/wiki/File_Allocation_Table

Scroll PgDn PgDn PgDn PgDn PgDn a bit, until you see this:

"(execute permissions are only used by FlexOS)"

(or just Find that in the page...)

Maybe Linux is aware of those 3rd-party attributes, and the MS OS is
not specifying them meaningfully?

A bit above that, is this...

<paste>

If a filename contains only lowercase letters, or is a combination of
a lowercase basename with an uppercase extension, or vice-versa; and
has no special characters, and fits within the 8.3 limits, a VFAT
entry is not created on Windows NT and later versions such as XP.
Instead, two bits in byte 0x0c of the directory entry are used to
indicate that the filename should be considered as entirely or
partially lowercase. Specifically, bit 4 means lowercase extension and
bit 3 lowercase basename, which allows for combinations such as
"example.TXT" or "HELLO.txt" but not "Mixed.txt". Few other operating
systems support this. This creates a backwards-compatibility problem
with older Windows versions (95, 98, ME) that see all-uppercase
filenames if this extension has been used, and therefore can change
the name of a file when it is transported, such as on a USB flash
drive. Current 2.6.x versions of Linux will recognize this extension
when reading (source: kernel 2.6.18 /fs/fat/dir.c and
fs/vfat/namei.c); the mount option shortname determines whether this
feature is used when writing

</paste>

....which I didn't know, either. Yep, this is FATxx, but may apply
also to legacy directory entry fields in NTFS, and/or the logic that
3rd-party code (e.g. Linux) may apply to these.
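
(For the curious, those case flags are simple to decode from a raw short
directory entry. A quick sketch based only on the quoted description above:
bit 3 (0x08) marks a lowercase basename, bit 4 (0x10) a lowercase extension,
both in the byte at offset 0x0C.)

// Decode the VFAT "lowercase" hints from a raw 32-byte FAT short
// directory entry, per the quoted description above.
#include <cstdint>
#include <cstdio>

constexpr uint8_t kLowerBase = 0x08;  // bit 3: 8.3 basename stored lowercase
constexpr uint8_t kLowerExt  = 0x10;  // bit 4: 8.3 extension stored lowercase

void DescribeCaseFlags(const uint8_t entry[32]) {
    const uint8_t flags = entry[0x0C];
    std::printf("basename: %s, extension: %s\n",
                (flags & kLowerBase) ? "lowercase" : "uppercase",
                (flags & kLowerExt)  ? "lowercase" : "uppercase");
}

int main() {
    uint8_t entry[32] = {};   // pretend entry for a name like "example.TXT"
    entry[0x0C] = kLowerBase; // lowercase basename, uppercase extension
    DescribeCaseFlags(entry);
    return 0;
}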
The UI *is* weighted toward Cancel, btw - hence the darker border around it
compared to the other buttons. =)
Guuud...


Well that is why:

- I don't run Vista. Their built-in DoS makes that operating system 100%
not viable for me to use.

And you know what really scares me about this? MS, despite all their
claimed security, painted a HUGE bullseye onto their OS, brightly flashing
in multiple colors, visible from another galaxy away with the naked eye.

I mean, it used to be that with all previous versions of Windows, as far as
I know anyway, to pull off any DoS attack on the system a virus or piece of
malware had to directly attack Windows and damage it in some way.

Not anymore. Now any device driver, any piece of hardware, *anything* 3rd
party that is in any way hardware-related is a target. Anything that
Windows monitors to determine if it is genuine is a target. Most of these
monitored targets come from 3rd-party vendors MS cannot directly control.

All a virus or malware has to do is compromise just *one* of the MANY
components monitored by Vista and it triggers the built-in DoS! WONDERFUL!!

Yup. You have to wonder if an OS vendor is on your side when they
code a destructive payload into the OS like this.
I am just *waiting* for a wave of such attacks to occur on Vista. Not
because I dislike Vista, simply because I view it as inevitable.

It's funny how no-one lit this fuse in XP, when it comes to malware.
As far as the Audio CD thing goes, I don't even know what open-use rootkits
are. Can you elaborate on that please?

A rootkit (in Windows-speak - you prolly have a clear idea of what it
means from the *NIX tradition) is malware that censors your view of
the system, e.g. so that a Dir /A doesn't show "protected" files,
Ctrl+Alt+Del doesn't show "protected" processes, etc.

Any malware can incorporate this functionality, so it's as senseless
to speak of "rootkits" as it is to speak of "viruses" or "worms", the
underlying fallacy being that one malware can't do any combination of
these behaviors and more.

However, the malware industry being as mature as it is, often you get
self-contained rootkits that are used to hide the malware vendor's
real code. It's like buying a reusable code library in "normal"
development, I guess.

By "open-use", I mean that Sony did not even try to ensure that only
their DRM commercial malware would be hidden by the rootkit. Nope,
anything the a wildcard match, e.g. BLAH*.*, would also be hidden - so
it wasn't long before traditional malware coders started to use this
to hide under Sony's "protection". Due dilligence? What's that?

If a private individual did that, they'd be jailed or at the very
least they'd be legally excluded from PC coding (a la Mitnick).

The courts didn't do that to "trust me, I'm a vendor" Sony, but we
can. I will not buy or resell any Sony goods, and I will be reluctant
to support these even on a pay-per-hour basis ("It's a Sony MP3
player. Call me back when you get something that doesn't suck")
Well essentially...linux does just that.

But I need to see the info that Linux is acting on, right up there
with the filename - not in a "details" view, or "properties" click.

Yup - but MS doesn't seem to "get" this.
Same here, that's why I don't really bother running them. I do an occasional
scan every few months...years....and it has a habit of turning up clean. I
figure I must be doing something right.
And if some attack does wipe out half the systems on the planet one night,
no antivirus software will be any good anyway, as they won't even know
about it yet.

That's why malware's important IMO - it can blow out your capacity to
serve a client base via sudden peaks of demand.
Not a bad idea, I like it!

I'll punt it around... been a while since I last blogged.
Mandriva, also a popular choice I think.

That's it. I'm slightly less useless in Mandrake 10 than others ;-)
Forgot about that one, again..also not a bad choice from what I am hearing.

Actually wxWidgets does do a very nice job. It gives an excellent set of
functionality as far as UI is concerned, and I haven't found a single thing
missing that I used to use in the .Net Framework. It actually has a couple
of features the .Net Framework *doesn't*, such as its very nice docking
capability allowing the user to really customize the UI the way they want
it.

Ah! Can you "lock UI items"? Eudora can't, and it drives users
insane...

"I'm going back to Outbreak, Eudora's too hard to use"
' Whaaat? Hard to use? '

Then I go over, and the damn thing's lost the mailboxes, toolbar, and
the least useful UI element is huge and glued to everything else
while other things you want to use are bouncing around like loose
teeth in a skull. What a mess... of course it's "hard to use"!

There's no "safe" grey UI space to click anymore - every smudgy
attempt to just select something breaks it off or glues it onto
something else. What I call the "leprocy UI". Don't like.
Another really awesome feature of wxWidgets, and one that *greatly*
reduces if not virtually eliminates UI oddities, is that you can use it
without absolute coordinates.
Usually when designing say a dialog box, you place all your controls at
predefined locations until it looks good. But that creates a problem:
Different languages, themes or platforms can seriously screw that up
resulting in various problems as you so rightly mentioned.

Not to mention low res for the optically challenged, and non-standard
large font sizes on LCDs, where anything other than the carved-in-stone
res gets blurry due to "text smoothing" effects.
I don't even specify a dialog box size!

You mean they can be *resized*? MS's UI folks are still struggling
with this new-fangled Win3.yuk feature :-(


--------------- ---- --- -- - - - -
Saws are too hard to use.
Be easier to use!
 
C

cquirke (MVP Windows shell/user)

cquirke (MVP Windows shell/user) wrote:

Maybe...marginally...it is only as good as the user input it receives.

Yep - there's a long-running debate on this in the security forums I
hang out in, to the effect that an end-user should never be asked to
make a security decision.

Of course, most of the folks there are pro-IT admins, and in that
context, I'd agree; the admin should pre-empt such matters and
pre-decide on the response, according to that org's policy.

But whether competent or not, end-users in consumerland are
responsible for their own admin (and no, I don't believe MS should
fill those shoes... it's a "user rights" issue for me).

So yup, all you can do is provide clear info, and then bind the system
to act within the user's dictates.

I take the same line on passwords and "security", i.e. that unless the
user expresses a clear interest and intention to use such things, they
should NOT be relied on to mask off dangerous things.
Not an issue. The server maintains a history of *any* and all changes.

Ah, the journalling file system ;-)
If I accidentally commit a bad change, I can revert to the previous
state with a few mouse clicks.

How do you hedge against server-side messups?
3G as in gigs??

Yep. OTOH, a typical ISP user allowance (for both email and web site)
is a similar quantity of Megs (e.g., 2M).

My whole web site fits on one diskette ;-)
Man...I don't know how you do it.

See this post's tag :)

I'm just so happy to be out of the hell of dial-up, which is charged
by the telco per second even if ISP is flat rate.
I actually do give MS credit for that one and luckily I don't ever do that.
I rather prefer to create 1 exe capable of running from anywhere with no
need of any administrative permissions for anything.

We should talk about Bart development - sounds like you could fit
It basically just boils to one simple thing:

- Any app I compile won't run (with default VS settings) on a non-Vista
computer unless the user has installed a redistributable package (which I
can never get to work).

Gutsy - and in the long-term, a good strategy IMO. You're starting
where vendors may be in 2-3 years' time, and in 2-3 years' time, the
other vendors will wish they'd taken your decision in 2007.




--------------- ----- ---- --- -- - - -
If you're happy and you know it, clunk your chains.
 
S

Stephan Rose

cquirke (MVP Windows shell/user) wrote:

Ah, the journalling file system ;-)
How do you hedge against server-side messups?

Well, the machine I am doing my work on always has an identical copy of what
is on the server. So if the server goes down, my local copy is still there.

The only thing, without backups, that I could lose is the change history in
the event of the server going dead. The current data will always be
preserved as long as I maintain my working copy on my local machine.

Once I move in a few weeks, I will switch source control systems though. I
have used CVS in the past, and am using SourceGear Vault now. I much prefer
it over CVS, but one thing I don't like about it is the fact that it uses a
combination of physical file storage on the hard drive and a SQL database.
That kind of makes it a pain to back up, as I can't just copy the directory
it stores the files in to preserve everything.

It has the ability to backup the SQL Data of course, but that's then just
another step I have to do.

Another thing, which didn't use to matter to me but matters now, is that I
can't integrate SourceGear Vault with Linux all that well. Their server
software is Windows only; they do have a Linux command-line client
but no GUI client. And of course, none of the Linux IDEs support it.

I think I might give Subversion a try next. It fixes a lot of the problems
that CVS has and does not need a SQL database, making it a whole lot easier
to back up.

Also, I can integrate Subversion equally easily into Visual Studio and my
Linux environments.
Yep. OTOH, a typical ISP user allowance (for both email and web site)
is a similar quantity of Megs (e.g., 2M).

My whole web site fits on one diskette ;-)

Nice one! =)
See this post's tag :)

I like!
I'm just so happy to be out of the hell of dial-up, which is charged
by the telco per second even if ISP is flat rate.

Oh I know - back when I was in FL, I lived in a spot for a few years with
dialup only. It was horrible. Eventually we got satellite, which was "ok"
but essentially no better than dialup, since it only gave high speeds for a
relatively short time until it would throttle the connection down to dialup
speeds for 12 hours or worse.
We should talk about Bart development - sounds like you could fit

Tell me more. =)
Gutsy - and in the long-term, a good strategy IMO. You're starting
where vendors may be in 2-3 years' time, and in 2-3 years' time, the
other vendors will wish they'd taken your decision in 2007.

Referring to my cross-platform development, or the Visual Studio thing above?

As far as VS goes, I don't run with its default settings. I statically link
MS' stuff so that I avoid the problem of needing that redistributable on
non-Vista systems.

--
Stephan
2003 Yamaha R6

The reason there's no day when I find myself remembering you
is that there's never been a time I forgot you
 
