arachnid said:
Only if you need 3D graphics. The graphics card has to be emulated in
software and that makes 3D "acceleration" extremely slow. For normal
2D desktop use, and provided you install the vmware tools, it's nearly as
fast as native.
I know 3D graphics will be slow in the virtualized OS no matter what the
host OS is. I'm not complaining about that. VMware just runs plain slow
with Windows as the host OS compared to the same hardware running Debian.
RPM works well under Fedora but it does eventually lead to a dead-end when
you want to work with source.
Or you come across an RPM with a file-based dependency (WTF? Just depend on
the package, for chrissake!), or you come across an RPM that gratuitously
uses a different package naming convention. (I remember trying Red Hat 5.2
back in the day and banging my head on the wall because some packages
depended on glibc5, others depended on libc5, yet others depended on
gnulibc5, and more still depended on individual files. All are the same
package, but does RPM recognize them as such? NO!) RPM's brain damage
largely comes from the developers of those RPMs, but the situation has only
worsened over time, now that we have RPMs for each RPM-based distribution
that tend to be mutually incompatible, even though they're all RPM. Even
Debian and every fork of Debian that doesn't run straight from the CD can
agree on package names and version numbering, and the braindead idea of
file dependencies is entirely absent. My hypothesis is that Debian, having
learned from RPM's mistakes, deliberately designed dpkg and apt to make it
much harder to create a package that sends users to dependency hell.
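To make the file-dependency gripe concrete, here's a toy Python sketch.
The Requires list is hardcoded for illustration (on a real system you'd
pull it from rpm -qR <package>); the point is that RPM happily records a
dependency on a literal file path instead of a package name:

# Hardcoded sample of an RPM Requires list, for illustration only.
requires = [
    "glibc >= 2.3",     # package-name dependency: resolvable by name
    "/bin/sh",          # file-based dependency: satisfied by whatever
    "/usr/bin/perl",    #   package happens to own that path
    "libc.so.6",        # soname dependency: yet another non-package form
]

for dep in requires:
    kind = "file path" if dep.startswith("/") else "package or soname"
    print(f"{dep:20s} -> {kind}")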
I like Apt a whole lot more. But a beginner's not going to notice the
difference, and by the time he does he'll have developed the skills to
understand the issues and switch to whatever distro best suits his needs.
I was a beginner when I learned the difference between dpkg and RPM in the
first month of using Linux. The former works reliably, the latter works
approximately whenever the hell it feels like.
FreeBSD's packaging system uses both source and precompiled binaries.
So does Gentoo, but that doesn't stop portage from being a waste of time for
most people. It's great for hobbyists, and great for professionals who
need to make some obscure tweak that wouldn't be appropriate to include in
a more general distribution, but that's about it. Gentoo and the BSDs aim
to fill a niche market, and most people aren't in it.
On a new installation I install the small, quickly-compiled stuff from
source and install the big things like KDE from binaries. That gets me up
and running ASAP, then later when I don't need the machine for a day I
compile KDE from source. Compiling for my specific hardware makes it
noticeably faster than the precompiled binaries.
I'd like to see that quantified. When Gentoo folks actually measure their
performance gains, the gains turn out to be non-existent except for
multimedia stuff and the kernel. And for the multimedia stuff and the
kernel, most distros already ship sub-arch-specific binaries for each
architecture they support.
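If anyone actually wants to quantify it, the honest way is to time the
same workload under the distro's generic build and the locally compiled
one. A crude Python sketch; the encoder paths here are hypothetical
stand-ins for whatever you rebuilt:

import subprocess
import time

def wall_time(cmd, runs=5):
    # Best-of-N wall-clock time for an external command.
    best = float("inf")
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, check=True, stdout=subprocess.DEVNULL)
        best = min(best, time.perf_counter() - start)
    return best

# Hypothetical paths: distro binary vs. local rebuild of the same tool.
generic = wall_time(["/usr/bin/some-encoder", "input.dat"])
native = wall_time(["/usr/local/bin/some-encoder", "input.dat"])
print(f"generic: {generic:.2f}s   native: {native:.2f}s")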
Keep in mind that when binaries are distributed, they have to be compiled
for the lowest-common-denominator machine rather than taking full
advantage of modern CPU designs.
Depends on the distro. Most RPM-based distros require a Pentium or PII
because they compile for the i586 or i686 sub-arches. Debian compiles for
i386 (which really is the LCD), and compiles sub-arch-specific binaries
for the packages where there's an actual, quantifiable performance gain.
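For what it's worth, the i586/i686 cutoff comes down to specific
instructions (i686 builds assume CMOV, for instance). A rough, Linux-only
sketch of checking whether a box can run i686 binaries, going by the
flags the kernel reports in /proc/cpuinfo:

def cpu_flags(path="/proc/cpuinfo"):
    # Parse the kernel's "flags" line into a set of feature names.
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
print("i686 binaries OK" if "cmov" in flags else "stick to i386/i586 builds")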
I don't just compile for speed. I do it to confirm that I have working
source code for everything on my system, that I have the toolchains to
build everything from source, and because government backdoors are a whole
lot less likely to exist in publicly published source code than in
precompiled binaries.
Well, you're not going to find backdoors by compiling code; you need to
comb through it line by line before compiling if that's your reason for
going from source. Otherwise the backdoor could still be in the source,
but you wouldn't ever know about it without auditing the entire codebase
yourself, *and* you've wasted the time compiling it yourself.
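Short of a full audit, about the most that building from source buys you
is confirming that the tarball you're holding matches what the project
published. A minimal Python sketch; the file name and digest are
placeholders:

import hashlib

def sha256sum(path, chunk=1 << 20):
    # Hash the file in chunks so big tarballs don't eat RAM.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

# Placeholder name and digest; in practice the digest comes from the
# project's signed release announcement or checksums file.
published = "paste-the-published-digest-here"
if sha256sum("some-package-1.0.tar.bz2") == published:
    print("tarball matches what the project published")
else:
    print("mismatch: don't build this")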
You have absolutely no basis for making that statement. I know because of
this:
If you had ever actually used Ubuntu and knew the slightest thing about
it, you'd know that its repositories are comparable to, and even
interchangeable with, Debian's.
I have used Ubuntu recently at my last job; I'm aware of what Ubuntu is and
does. My point is that if you want to stick with a straight-up Ubuntu
system (since neither the Debian lists nor Canonical are generally willing
to help you if you mix and match), your software selection is limited.
As for which is better for beginners, take a look at the popularity chart
on distrowatch.com. Ubuntu is downloaded 3.5 times more often than Debian.
If you add up all the Ubuntu variations (just differences in window
managers), it's nearly 5 times more popular than Debian.
So? MySpace's sheer existence shows that 75 million people on this planet
can be wrong, and before that, AOL proved that 30 million Americans can be
wrong.
Ubuntu is stealing the show because it's *significantly* more
user-friendly than Debian, which has traditionally turned a cold shoulder
to beginners with no desire to become tech gurus.
No, Ubuntu is stealing the show because Canonical has an advertising budget.