I finally trashed my 14-year-old Pentium II computer--it ran Windows 2000 fine, but never could run L


The Natural Philosopher

RayLopez99 said:
OK I will and thanks again. So then it stands to reason the
'snapshots', binary data that can only be read by VMs, must be huge if
you have, for example, Office, Visual Studio and some music files, say
from an iPod, stored in your VM--I'm guessing your snapshot would be at
least 10 GB and probably double that.

RL
But you wouldn't be using Office if you had a windows machine on a linux
host.

Nor would you be storing your music or video files inside the windows
partition.

Or indeed any data at all, beyond that insisted upon by windows programs.

My snapshots are about 3GB each, 19GB in total (8 of them). The main C:
drive takes up 3.4GB.

But it's no big deal to back that up either.

Its VIRTUAL size is 28GB, but that's only a limit beyond which it can't
grow.

You miss the whole point of desktop virtualisation: it's there not to
run a windows distro as you would normally, but to run as little windows
as is necessary to launch the programs that need windows to work.

Everything else is on a linux drive mapped into the windows space as a
'networked drive' and that's where all the DATA lives. So windows can
crash away and it's safe, and only the minimal amount of windows is there
to launch the very few windows programs you can't do without.

In my case windows is ONLY there to run Graphics and CAD. And
occasionally IE6 if I need to check that a web page renders with the
worst browser ever made.

Since all data is shared, and when it is running windows is simply a
mouse click away, there is no overhead in e.g. saving a file in windows
- say a graphics image - and importing it straight into a linux
application. Even cut and paste works reasonably well.
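
For anyone wanting to reproduce that setup, a minimal sketch, assuming a
VirtualBox guest named "winxp" and a host folder /home/user/data (both
names made up here). On the Linux host:

    VBoxManage sharedfolder add "winxp" --name data --hostpath /home/user/data --automount

and inside the Windows guest, with the Guest Additions installed, map it
to a drive letter:

    net use Z: \\vboxsvr\data

Anything Windows then saves to Z: actually lives on the Linux filesystem.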

I don't expect you to understand the mentality that says 'I don't want
windows, but I need it for three or four things' and thereby constructs
a system whereby it is easy to use that way.

YOUR mentality is 'windows needs to be superior in every way, let's
construct ways to demonstrate linux is a crock of shit by picking
something Linux can't do (as well)'.
 

RayLopez99

The Natural Philosopher said:
But you wouldn't be using Office if you had a windows machine on a linux
host.

Nor would you be storing your music or video files inside the windows
partition.

Or indeed any data at all, beyond that insisted upon by windows programs.

My snapshots are about 3GB each, 19GB in total (8 of them). The main C:
drive takes up 3.4GB.

But it's no big deal to back that up either.

Its VIRTUAL size is 28GB, but that's only a limit beyond which it can't
grow.

You miss the whole point of desktop virtualisation: it's there not to
run a windows distro as you would normally, but to run as little windows
as is necessary to launch the programs that need windows to work.

Everything else is on a linux drive mapped into the windows space as a
'networked drive' and that's where all the DATA lives. So windows can
crash away and it's safe, and only the minimal amount of windows is there
to launch the very few windows programs you can't do without.

In my case windows is ONLY there to run Graphics and CAD. And
occasionally IE6 if I need to check that a web page renders with the
worst browser ever made.

Since all data is shared, and when it is running windows is simply a
mouse click away, there is no overhead in e.g. saving a file in windows
- say a graphics image - and importing it straight into a linux
application. Even cut and paste works reasonably well.

I don't expect you to understand the mentality that says 'I don't want
windows, but I need it for three or four things' and thereby constructs
a system whereby it is easy to use that way.

YOUR mentality is 'windows needs to be superior in every way, let's
construct ways to demonstrate linux is a crock of shit by picking
something Linux can't do (as well)'.

I was actually learning something from your post, then I noticed it was
from you. So it must be wrong.

But if what you say is true there must be some conversion of data
from Linux format to Windows format--I think Linux uses a different
file format entirely. Not big-endian either, which is at the byte
level, but at the hard drive level.

RL
 

RayLopez99

David Brown said:
On 21/06/2011 14:33, RayLopez99 wrote:


I'm not convinced you are capable of boiling an egg, never mind figuring
out which way up it should go.

I believe the issue that you are confused about is line-endings in text
files.  Unix has always used LF (character 10) as a line ending, while
Macs have always used CR (character 13).  DOS, for some incomprehensible
reason, used two characters - CR + LF.  This absurdity has continued,
and is the "standard" in Windows.

However, any decent program (on Linux, Macs or Windows) that deals with
text files will happily work with files with any choice of line endings,
or at least convert them when reading in a file.  Of course, notepad on
Windows does not count as "decent".

The line endings used by convention for text files are irrelevant for
non-text files, which are identical on all systems.
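
A quick way to see what those three conventions look like on disk, as a
minimal Python sketch (not part of the original post):

    # Write "hello" once with each line-ending convention, then look at the raw
    # bytes and at what Python's text mode (universal newlines) gives back.
    conventions = {"unix": b"hello\n", "mac": b"hello\r", "dos": b"hello\r\n"}

    for name, raw in conventions.items():
        with open(name + ".txt", "wb") as f:   # binary write: bytes stored as-is
            f.write(raw)

    for name in conventions:
        with open(name + ".txt", "rb") as f:   # raw bytes differ per convention
            print(name, f.read())
        with open(name + ".txt") as f:         # text mode reads all three as 'hello\n'
            print(name, repr(f.read()))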

And of course this has nothing to do with storing files from a Windows
system on a Linux filesystem, either via the VirtualBox mapped drive
described here, or using a samba file server under Linux.  There you are
talking about a windows program storing files on a Linux filesystem - of
course Linux will return exactly the same bytes when the file is read as
it got when the file was written.  It wouldn't matter if it stored the
file in hieroglyphics on a stone tablet, as long as the data returned is
the same as the data received.

Well thanks for the information, not the insults. Indeed line endings
are different on different OSes, and the fact that in your opinion
Notepad, a workhorse ASCII file program that probably had a team of
programmers work on it, cannot properly handle the line
endings is proof that it is not trivial to convert between the two
systems. I also note you skirt around the issue of how the "data [is]
returned" from the "stone tablet". That is also a ton of programming
to get it done right. I have a better solution: with virtual
machines, stick with the same version of OS as your actual, real OS.
For example, if you run Windows 7, your VM should be limited to
Windows XP and other flavors of Windows. That will eliminate any
potential problems.

RL

 

Ezekiel

David Brown said:
I'm not convinced you are capable of boiling an egg, never mind figuring
out which way up it should go.


I believe the issue that you are confused about is line-endings in text
files. Unix has always used LF (character 10) as a line ending, while
Macs have always used CR (character 13). DOS, for some incomprehensible
reason, used two characters - CR + LF. This absurdity has continued, and
is the "standard" in Windows.

This "absurdity" is also the standard line ending for DEC TOPS-10, RT-11,
CP/M, MP/M, DOS, Atari TOS, OS/2, Microsoft Windows, Symbian OS and Palm OS.
Not to mention that CR+LF is also the standard delimiter for *most* internet
protocols including mail, NNTP, HTTP, etc, etc.

However, any decent program (on Linux, Macs or Windows) that deals with
text files will happily work with files with any choice of line endings,
or at least convert them when reading in a file. Of course, notepad on
Windows does not count as "decent".

Most text editors handle it just fine. Notepad isn't really a text editor.
It's a Windows 'edit control' that's placed into a window with a menu.

The line endings used by convention for text files are irrelevant for
non-text files, which are identical on all systems.


And of course this has nothing to do with storing files from a Windows
system on a Linux filesystem, either via the VirtualBox mapped drive
described here, or using a samba file server under Linux. There you are
talking about a windows program storing files on a Linux filesystem - of
course Linux will return exactly the same bytes when the file is read as
it got when the file was written. It wouldn't matter if it stored the
file in hieroglyphics on a stone tablet, as long as the data returned is
the same as the data received.

Which is basically what you said: you will read out exactly what was written into the
file. Nothing more, nothing less.
 

The Natural Philosopher

RayLopez99 said:
I was actually learning something from your post, then I noticed it was
from you. So it must be wrong.

But if what you say is true there must be some conversion of data
from Linux format to Windows format--I think Linux uses a different
file format entirely. Not big-endian either, which is at the byte
level, but at the hard drive level.

Shows how little you know.

Of course the files are readable by both. Or you would not be able to
receive an image in a windows browser off an apache server, which is what
most web servers are.

File formats are defined by standards, not by operating systems, unless
it's windows, but windows APPLICATIONS have to work cross-platform, so
no worries there.
 

The Natural Philosopher

RayLopez99 said:
On 21/06/2011 14:33, RayLopez99 wrote:
I'm not convinced you are capable of boiling an egg, never mind figuring
out which way up it should go.

I believe the issue that you are confused about is line-endings in text
files. Unix has always used LF (character 10) as a line ending, while
Macs have always used CR (character 13). DOS, for some incomprehensible
reason, used two characters - CR + LF. This absurdity has continued,
and is the "standard" in Windows.

However, any decent program (on Linux, Macs or Windows) that deals with
text files will happily work with files with any choice of line endings,
or at least convert them when reading in a file. Of course, notepad on
Windows does not count as "decent".

The line endings used by convention for text files are irrelevant for
non-text files, which are identical on all systems.

And of course this has nothing to do with storing files from a Windows
system on a Linux filesystem, either via the VirtualBox mapped drive
described here, or using a samba file server under Linux. There you are
talking about a windows program storing files on a Linux filesystem - of
course Linux will return exactly the same bytes when the file is read as
it got when the file was written. It wouldn't matter if it stored the
file in hieroglyphics on a stone tablet, as long as the data returned is
the same as the data received.

Well thanks for the information, not the insults. Indeed line endings
are different on different OSes, and the fact that in your opinion
Notepad, a workhorse ASCII file program that probably had a team of
programmers work on it, cannot properly handle the line
endings is proof that it is not trivial to convert between the two
systems. I also note you skirt around the issue of how the "data [is]
returned" from the "stone tablet". That is also a ton of programming
to get it done right. I have a better solution: with virtual
machines, stick with the same version of OS as your actual, real OS.
For example, if you run Windows 7, your VM should be limited to
Windows XP and other flavors of Windows. That will eliminate any
potential problems.

What on earth would be the point of that?
 

RayLopez99

I thought you were a millionaire programmer.  Surely you can see that it
is trivial to support alternative line endings in a simple text editor?

Not trivial at all. You don't code, do you? You'd know that extended
ASCII support is not trivial. You need to reference the correct
library and library function in your assembly, but it's not a five-minute
operation unless you've done it before.
Notepad is not a "workhorse ASCII program", and not even MS would need a
significant "team of programmers" to work on it.  An almost identical
program could be written in an afternoon by any reasonably competent
Windows programmer - including support for different line endings.  The
only reason Notepad does not support Unix and Mac line endings is
because MS prefers it to be hard to work with anything that does not
live entirely in the Windows and MS world.

Ironically, the one advanced feature that Notepad does have compared to
most editor programs is support for UTF-16 character encoding - which
is /much/ more effort than implementing alternative line endings.

It's the same thing, actually. You're just too dumb to see it. The
same family of problems, even though newline and CR are in standard
ASCII.

I also note you skirt around the issue of how the "data [is]
returned" from the "stone tablet".  That is also a ton of programming
to get it done right.

There has been plenty of programming work in the development of Linux, yes.

I have a better solution: with virtual
machines, stick with the same version of OS as your actual, real OS.
For example, if you run Windows 7, your VM should be limited to
Windows XP and other flavors of Windows.  That will eliminate any
potential problems.

There is /no/ problem - potential or real.  It is only in your
imagination, and your determination to find some fault whenever Linux is
mentioned.

Look, the world runs on Linux servers - they are found everywhere.  If
there were problems storing files from a Windows desktop on a Linux
machine, someone would have noticed by now.

Nope. Read this article and when you learn something get back to me:
http://www.linux.com/archive/feature/141378

And keep in mind they are talking about characters, not binary data.
Binary data and reading/writing from the stream in Linux and Windows
has its own problems.

RL

Linux tools to convert file formats
By Federico Kereki on July 22, 2008 (4:00:00 PM)

Life would be a lot easier if we could live in a Linux-only world and
if applications never required data from other sources. However, the
need to get data from Windows, MS-DOS, or old Macintosh systems is all
too common. This kind of import process requires some conversions to
solve file format differences; otherwise, it would be impossible to
share data, or file contents would be imported incorrectly. The
easiest way to transfer data between systems is by using plain text
files or common formats like comma-separated value (CSV) files.
However, converting such files from Windows or Mac OS results in
formatting differences for the newline characters and character
encoding. This article explains why we have these problems and shows
ways to solve them.
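
For the newline half of that conversion, a rough Python sketch of what
tools like dos2unix or recode do; it assumes the file is small enough to
read in one go:

    # Convert DOS/Windows CR+LF line endings to Unix LF, in place.
    import sys

    path = sys.argv[1]
    with open(path, "rb") as f:
        data = f.read()
    with open(path, "wb") as f:
        f.write(data.replace(b"\r\n", b"\n"))

The character-encoding half is a separate job, which is what recode and
iconv are for.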
 

The Natural Philosopher

RayLopez99 said:
Not trivial at all. You don't code, do you?

yes.

You'd know that extended
ASCII support is not trivial. You need to reference the correct
library and library function in your assembly, but it's not a five-minute
operation unless you've done it before.

er, no.

Linux does all that for me :)

Notepad is not a "workhorse ASCII program", and not even MS would need a
significant "team of programmers" to work on it. An almost identical
program could be written in an afternoon by any reasonably competent
Windows programmer - including support for different line endings. The
only reason Notepad does not support Unix and Mac line endings is
because MS prefers it to be hard to work with anything that does not
live entirely in the Windows and MS world.

Ironically, the one advanced feature that Notepad does have compared to
most editor programs is support for UTF-16 character encoding - which
is /much/ more effort than implementing alternative line endings.

It's the same thing, actually. You're just too dumb to see it. The
same family of problems, even though newline and CR are in standard
ASCII.

I also note you skirt around the issue of how the "data [is]
returned" from the "stone tablet". That is also a ton of programming
to get it done right.

There has been plenty of programming work in the development of Linux, yes.

I have a better solution: with virtual
machines, stick with the same version of OS as your actual, real OS.
For example, if you run Windows 7, your VM should be limited to
Windows XP and other flavors of Windows. That will eliminate any
potential problems.

There is /no/ problem - potential or real. It is only in your
imagination, and your determination to find some fault whenever Linux is
mentioned.

Look, the world runs on Linux servers - they are found everywhere. If
there were problems storing files from a Windows desktop on a Linux
machine, someone would have noticed by now.

Nope. Read this article and when you learn something get back to me:
http://www.linux.com/archive/feature/141378

All I learnt from that is that you must be pretty darn stupid to think
that's been a problem in the last 15 years.

And, to be strictly honest I knew that already.

And keep in mind they are talking about characters, not binary data.
Binary data and reading/writing from the stream in Linux and Windows
has its own problems.

Characters are binary data.
 

RayLopez99

Again, you have no concept of what you are saying yourself, never mind
what anyone else is saying.

Nope. Personal attack noted.

As I said, there are no problems saving files from Windows to a Linux
system and reading them back again.  Nothing in that article indicates
anything different.  Any time you save a file on a system and read it
back, you will get the same file with the same line endings and same
encodings as you started with.

Nope. If you convert a file from Linux to Windows and back again 10M
times, I bet you'd have problems somewhere. That was actually
demonstrated on Excel by a professor a decade or more ago, and I'm
sure a high number of I/O conversions would yield a similar error.

If you have a file saved with one line ending, and try to open it with a
program that only works with a different line ending, then you will have
problems.  This is totally independent of the OS and applies equally to
Linux and Windows programs.  

Nope. We're talking file systems. Handled by the OS at a low level.

Similarly, if you have a file with one type
of character encoding, and you view it using a different character
encoding, you will have problems.  Again, this is independent of the OS.

Nope. Embedded in the OS. Try harder.

You can demonstrate this by writing some text in Notepad and saving it
in utf-16 format.  Then open a command prompt and "type" that same file
- it will look a mess.  All from within Windows.

Nope. You are confusing ASCII with extended ASCII.

And all of which has nothing to do with line endings, and nothing to do
with the fact that you can store Windows files on a Linux system.

Nope, see above.

Mind you, I should thank you for that link - I was not aware of the
recode utility.

You are ignorant of many things, young grasshopper.

I have rarely needed to do such character encoding
conversion, and have normally just used a text editor (/not/ notepad).
But it's nice to know of a command-line utility that does the job too.

Nope.


There are no such problems outside of your fantasies.

Nope.


Keep in mind that people do this stuff all the time.  Like many
companies, my company use Linux servers to store all their data, and
mostly Windows on the desktop (with some Linux desktops too).  I also
use Windows and Linux virtual machines on both Windows and Linux hosts.
  I believe I would have noticed if there were problems transporting
either text or binary files between the systems.

Nope. You are pretty ignorant, a low-level grunt who is simply an
expendable (once Azure gains market share) IT hack.

RL
 

The Natural Philosopher

It's been such fun... watching the unfolding of a mind so steeped in
ignorance it almost beggars belief that it is capable of actually
typing a Usenet post.

And it led to the thought... what sort of minimum intelligence level is
needed to actually run a computer?


I have this sort of basic tenet...

People with money and no sense buy apple.
People with less money and no sense get sold windows.
People with a lot of sense install Linux.

BUT how long will that last in times of global recession and cutbacks in
public spending on employing people with no sense?

Plus we have 'cloud computing' whereby the complete turkey can buy some
'thing' with a screen, splodge his pudgy fingers over it, press buttons
and download 'apps' and get his account debited (assuming there is any
money in it, which seems increasingly unlikely) without being aware of
who, what or why any of it works, or what operating systems the component
parts are in fact running (mostly some form of *nix, but who cares?).

This might be the end of the personal computer as we know it, and
Windows too...

...so where does that leave the corporate desktop? How long before a
bank of hypervised Linux servers is running the corporate 'cloud' and
the equivalent of a Wyse 50 or VT100 moderately thick client, with no
disk at all, is actually doing the job of displaying the pretty pictures
and accepting the keystrokes, with any software it needs held centrally
as a downloadable Java 'app'?

It's an attractive thought.

That pushes the traditional PC - a machine with RAM and disks, and
general-purpose capability - into a niche... the sort of niche where
performance is really significant, like perhaps specialist typesetting
or graphic design, only.

We can but hope so, because it may finally and forever put an end to the
Lopez 'walks through ignorance' as it becomes apparent even to him that
he was a footsoldier in a war that was already over long before he even
learned what it was about.
 

RayLopez99

It's not an attack - it's a statement of fact (assuming, of course, that
you are actually as ignorant as you appear to be, rather than just
pretending).

Given the large number of ad hominem, inappropriate, and incorrect
insults you have made against me (and most others here, including those
that try to help you), I assume that this is just the way you like to
converse.


Two wrongs don't make a right, or two Wongs don't make a White, to
paraphrase the Abercrombie & Fitch T-shirt from about 10 years ago.

Again, you are talking from complete and total ignorance.  There is no
such thing as "convert a file from Linux to Windows" (or "I/O
conversions").  And even if there were, then the "conversion" would be
either correct or incorrect - repeating the conversion 10M times would
not change that.

Perhaps what this mythical professor was doing is converting an Excel
file to an OpenOffice file (or StarOffice or gnumeric file, if it was a
decade ago).  In the conversion of a file format like that, there may be
some artefacts of the conversion process that build up as you convert
back and forth.  As a simple (but made-up) example, suppose that when
converting the expression "A1 + A2" from one format to the other, an
extra pair of brackets is added "(A1 + A2)".  The conversion would still
be correct, but if repeated 10M times you would get "((...(A1 +
A2)...))" - eventually reaching the limit of the number of nested
parentheses the program (Excel, StarOffice, gnumeric, etc.) can work with.
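
A throwaway Python sketch of that made-up round trip, just to show how a
harmless-looking artefact accumulates (convert() here is purely
hypothetical):

    # A "correct" but sloppy converter that wraps the expression in one extra
    # pair of brackets on every A -> B -> A round trip.
    def convert(expr):
        return "(" + expr + ")"

    expr = "A1 + A2"
    for _ in range(5):
        expr = convert(expr)
    print(expr)   # (((((A1 + A2)))))  -- after 10M round trips, 10M levels deep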

But, as usual, this mythical professor with his mythical problem is
totally unrelated to your imaginary issues with storing Windows files on
Linux.

Nope. The conversion was a hardware defect with a 386 math co-
processor--so perhaps it was more like 20 years ago. Anyway, the
point being that mistakes happen.

It's hard to know what /you/ are talking about, since you don't
understand about files, file systems, or operating systems.

Filesystems store files.  You give them a file - a bunch of bytes - and
later on, when you ask for it back, you get that same bunch of bytes.
It's quite simple.  The OS doesn't look inside the file or make any
changes to it underway.  It doesn't care if it is "text" or "binary", or
what character encoding was used, or what line ending was used, or what
program in what OS made the file.  It's just a bunch of bytes.
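
A minimal Python sketch of that round trip; the directory could just as
well be a VirtualBox shared folder or a samba mount, the filesystem
underneath makes no difference:

    import os

    # Any bunch of bytes: "text" in a non-trivial encoding plus some raw binary.
    payload = "Hello, world\r\n".encode("utf-16") + bytes([0, 1, 2, 255])

    with open("roundtrip.bin", "wb") as f:
        f.write(payload)

    with open("roundtrip.bin", "rb") as f:
        # Exactly the bytes given, nothing more, nothing less.
        assert f.read() == payload

    os.remove("roundtrip.bin")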

Nope. There's a difference between binary and text, even though both
are stored as bytes. If you were a programmer you'd know about this
(supported in the library files).

/What/ is "embedded in the OS"?  Line-endings?  Character encodings?

No time to teach you--I'm not an instructor.

I use text files with different line endings and different character
encodings on both Windows and Linux - it is not "embedded in the OS".
By default, Unix programs use LF line endings and Windows programs use
CR+LF.  That also applies to files used by the OS - Windows wants CR+LF
line endings for .bat and .ini files, for example, and Linux wants LF
line endings for configuration files.  But either system will happily
store files with either line ending, and programs such as text editors
will happily work with either on either OS.



I am not confusing anything, but it turns out that "type" also supports
little-endian utf-16 format.  That was a bad guess from me - I didn't
have any windows systems available conveniently for testing.  Now, try
again from Notepad but save the files in UTF-8 or "Unicode big-endian"
(which is utf-16 big endian) format.  You don't even have to use
non-ASCII characters - "Hello world" is enough.
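
The same point without opening Notepad at all - a small Python sketch
using the encodings mentioned above:

    text = "Hello world"

    # The same eleven characters come out as different bytes per encoding.
    for enc in ("utf-8", "utf-16-le", "utf-16-be"):
        print(enc, text.encode(enc))

    # Decode UTF-16 bytes as if they were a one-byte-per-character encoding:
    # this is the NUL-riddled "mess" a naive console dump shows you.
    print(repr(text.encode("utf-16-be").decode("latin-1")))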


Your admission that you were wrong ("bad guess") is gratefully
acknowledged. You are a bigger man than I thought.
It's /really/ difficult trying to break through your wilful ignorance.

Nope. Not difficult at all.

I googled "Azure market share".  Here are some top links:

<http://www.vcritical.com/2010/07/microsoft-seeks-to-stem-azure-exodus...>

<http://www.zdnet.com/blog/microsoft/microsofts-weakest-cloud-link-the...>

<http://searchitchannel.techtarget.com/news/1350979/Microsoft-Azure-fa...>

If you've got a link that shows Azure having a significant and/or
growing market share, you could post it here.

The first link was the most relevant. It is a year old, but
interesting. It's speculation, but not unsound. I repeat the salient
paragraph below. Question for you, young grasshopper, from me,
the master: is the resistance to Azure from IT professionals
afraid of adopting it and losing their jobs?

RL

Customers leaving Azure in droves?

I recently acquired an email from Microsoft, desperately seeking to
address an apparent exodus of customers from Windows Azure:

My team is working to understand why some of our valued customers
have stopped using their Windows Azure platform subscription(s). I am
emailing today to ask you to complete a short survey on why you have
stopped using our service.

We will use this information to improve our platform and address
issues that may have led you to stop using your subscription. We take
your feedback seriously and it will lead to direct action.

Whatever the reason for this sudden shift may be, the most succinct
take on the announcement goes to Om Malik, who concluded in this
GigaOM article:

Microsoft, it seems, is merely following what is en vogue these
days.

Interesting strategy shift: If customers won’t come to your
proprietary platform, see if you can trap them inside a box right in
their own datacenter. Cloud computing at its finest.
 

RayLopez99

I /am/ a programmer - though you don't need to be a programmer to
understand that files are a bunch of bytes, and that to the file system,
there is no difference between a text file and a binary file.

You need to study the differences between stream adapters like
StreamReader/Writer, BinaryReader/Writer and XMLReader/Writer,
then Backing Store Streams like FileStream, MemoryStream and
NetworkStream in C#. Though it's true they all ultimately deal with
raw bytes (and in that respect you are correct), it's not true that
'text is binary' as you so naively imply.
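
A minimal sketch of that text/binary distinction - in Python rather than
the C# classes named above, since the thread contains no actual code: a
text-mode reader decodes bytes and translates line endings, a binary-mode
reader hands back exactly the stored bytes.

    # Text mode on write: '\n' in the string is translated to the CR+LF asked for.
    with open("sample.txt", "w", encoding="utf-8", newline="\r\n") as f:
        f.write("line one\nline two\n")

    # Text mode on read: decoded to str, CR+LF translated back to '\n'.
    with open("sample.txt", "r", encoding="utf-8") as f:
        print(repr(f.read()))      # 'line one\nline two\n'

    # Binary mode: the raw stored bytes, line endings and all.
    with open("sample.txt", "rb") as f:
        print(f.read())            # b'line one\r\nline two\r\n'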
You seem to think I am an "IT grunt" - I am also the entire IT
department for two companies (about 80 people altogether).  One of the
reasons I use Linux on the servers is that it all just works, so that I
don't have to spend time on it - the "IT" side of my job takes only
about 10-15% of my time.

And if programming takes 10-15% of your time, what do you spend the
other 70-80% of your time on? Flaming here? You're fired.

And your complete inability to understand the rest of what I wrote, as
well as your unwillingness to try a quick experiment yourself, is also
acknowledged.  You are an even smaller man than I thought.

Nice symmetry. You must have done well in English composition.

Have you learned anything at all through all your posts in these
newsgroups?  I have seen no evidence of it.

You are the evidence, troll bait.

The only people interested in Azure are the most ardent MS sycophants,
and there are fewer of these every day.

No. I think the truth is the Azure pricing scheme is a little hard to
figure out, but it's a concept that will work with time. Resistance
is probably from IT professionals who feel threatened by Azure. I
don't think the Amazon API, which is nothing but a poorly documented
series of web service calls, can compete with the Azure platform.

RL
 
