Defragmenting

  • Thread starter: Steve
Edwin said:
"performance gain can be significant"
Obviously.


you clearly have different experiences with this... I have never
(scientifically) seen any gain...

I don't believe you.
but bottom line, the tool is there, I just hope people who use it will have
a good backup strategy as well

A backup is always a computer user's best friend.

Alias
 
Charlie

Fair enough, point taken!

I was sharing my personal experiences in my posts.
"Isn't that what all of us do here?"

;-)
 
"performance gain can be significant"

you clearly have different experiences with this... I have never
(scientifically) seen any gain...

I manage some 2000 workstations and while we don't defrag them on a
regular basis, I can assure you that many of them benefit from a defrag
- in fact, users have emailed us asking what we did to their machines to
make them "so much faster".

Each person's use will vary, as will their usage of the drive and their
fragmentation levels, but even if you cannot measure it yourself, the
concept alone, once you understand it, should make the potential
performance gain clear.

On a fragmented system with large files, or with many small files mixed
with some large files, you will see an increase in performance in anything
related to drive functions; even loading applications will benefit if they
were installed while the drive was seriously fragmented.
but bottom line, the tool is there, I just hope people who use it will have
a good backup strategy as well

I would suggest that ANYONE doing MAINTENANCE should do it under good
conditions. I've seen many systems crash (power loss) during a defrag with no
issue at all - they rebooted just fine - but I've also seen a few fail.

Before you start telling people that there is no benefit, you might want
to listen to the people that have experience in this area - people that
have been working with computers since the 70's or 80's.
 
Leythos and all,

It was not my intention to upset anyone.

Clearly my posts are interpreted differently than I intended; I should have
worded them differently.

For that I apologise

Rgds,
Edwin.
 
Leythos and all,

It was not my intention to upset anyone.

Clearly my posts are interpreted differently than I intended; I should have
worded them differently.

For that I apologise

No one appears to be upset, but I write in a blunt fashion, since you
can't really worry about emotions in plain text on Usenet.

I've read articles where PC Mag and others tested and found no benefit
to defrag, but they didn't do real-world testing, their methods were
flawed, and they just plain don't really have a clue.

Any busy machine will benefit from a defrag; how much and how often differs
in almost every case.
 
Leythos,

I really thought you were upset; maybe that is because of your use of
capitals, and your reference to the 1970s/80s.

Personally I am extremely interested in the performance gain from defragmenting.
I am looking for a scientific way to prove performance gain, and to execute
these tests in my labs.

If anyone has a good and proven method to qualify and quantify "performance
gain" please let me know.

I understand that many have "good results" and "no complaints" and even
"compliments from users" after defragging, but that is not what I am
interested in, nor is it something I dispute.
I am sure users "are happy" after defragging; I am interested in an actual
number, how to get to that number, from what baseline measurement, and
with what tools.

Rgds,
Edwin.
 
I certainly wasn't offended, Edwin; I was simply pointing out that a normally
careful user will probably not see serious fragmentation for a long time.
Another issue with fragmentation is that it's not all "Bad". Much depends on
what is being read and / or written. Sometimes fragmentation on the
graphical display most of these programs have looks bad but actually is
unavoidable and really doesn't produce any symptoms. You are certainly
correct that you can run a defragmenter for an hour on something like this
and no difference will be seen. Not only that but this kind of fragmentation
will soon be back, therefore your "What's the point" question is a perfectly
valid one. You can blame things like email files, temporary internet files
and other miscellaneous stuff for this. It makes no difference to speed
because the operations that produce it are slow anyway. When the disk is
getting full and has been used for both reading and writing, with (say) many
small files being deleted and replaced with bigger files, it will start to
make a difference. I'd say that most people probably defragment too soon to
ever notice a difference, or else they don't defragment at all and just
never see the difference in the opposite sense :)

There's a big area where it doesn't matter so there are two ways to look at
it. Defragment often and run for a short time (As Diskeeper can do
automatically) or defragment rarely taking several hours. In the end the
time taken will be the same so you have to decide which suits you best - you
do not have control over power outages so from the risk point of view it is
a coin toss. Personally I have switched from the first option to the second
because it just suits me at this time to do that.

Charlie



Edwin vMierlo said:
Leythos and all,

It was not my intention to upset anyone.

Clearly my posts are interpreted differently than I intended; I should have
worded them differently.

For that I apologise

Rgds,
Edwin.
 
Edwin said:
Leythos,

I really thought you were upset; maybe that is because of your use of
capitals, and your reference to the 1970s/80s.

Personally I am extremely interested in the performance gain from defragmenting.
I am looking for a scientific way to prove performance gain, and to execute
these tests in my labs.

If anyone has a good and proven method to qualify and quantify "performance
gain" please let me know.

I understand that many have "good results" and "no complaints" and even
"compliments from users" after defragging, but that is not what I am
interested in, nor is it something I dispute.
I am sure users "are happy" after defragging; I am interested in an actual
number, how to get to that number, from what baseline measurement, and
with what tools.

Rgds,
Edwin.

Why waste your time with a test? Just do it and SFU.

Heh.

Alias
 
Also, if you take into account how NTFS "maps" its filesystem onto the
physical disk geometry, with the differences between logical blocks and
physical sectors, and then take into account read vs. write IO and
sequential vs. random IO, it becomes largely an academic exercise to work
out how your seek/read/write times are influenced by a defrag.

In an ideal situation, with one specific condition, you can predict the effects
of a defrag. For example:

a large, fragmented file
sequential read IO (no writes at all)

versus

a large, contiguous file
sequential read IO (no writes at all)

Here the defrag would probably be a huge benefit;
unfortunately this is far from reality.

I do find it an interesting topic, and will keep experimenting with this
(don't worry - it's a test machine, and even this one... I have a backup).
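
To make that ideal sequential-read comparison concrete, here is a minimal
sketch of the kind of measurement I mean (Python; "bigfile.dat" is only a
hypothetical test file, and you would want a reboot between runs so the OS
file cache is cold). Run it against the same large file before and after a
defrag and compare the numbers.

# Sketch: time a sequential read of one large file in fixed-size chunks.
# Assumption: "bigfile.dat" is a hypothetical multi-GB test file, and the
# machine is rebooted between runs so the file cache does not skew results.
import time

CHUNK = 1024 * 1024  # read in 1 MB chunks

def time_sequential_read(path):
    start = time.perf_counter()
    total = 0
    with open(path, "rb") as f:
        while True:
            data = f.read(CHUNK)
            if not data:
                break
            total += len(data)
    return total, time.perf_counter() - start

if __name__ == "__main__":
    size, secs = time_sequential_read("bigfile.dat")
    print(f"Read {size / 1e6:.0f} MB in {secs:.2f} s ({size / 1e6 / secs:.1f} MB/s)")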



Charlie Tame said:
I certainly wasn't offended, Edwin; I was simply pointing out that a normally
careful user will probably not see serious fragmentation for a long time.
Another issue with fragmentation is that it's not all "Bad". Much depends on
what is being read and / or written. Sometimes fragmentation on the
graphical display most of these programs have looks bad but actually is
unavoidable and really doesn't produce any symptoms. You are certainly
correct that you can run a defragmenter for an hour on something like this
and no difference will be seen. Not only that but this kind of fragmentation
will soon be back, therefore your "What's the point" question is a perfectly
valid one. You can blame things like email files, temporary internet files
and other miscellaneous stuff for this. It makes no difference to speed
because the operations that produce it are slow anyway. When the disk is
getting full and has been used for both reading and writing, with (say) many
small files being deleted and replaced with bigger files, it will start to
make a difference. I'd say that most people probably defragment too soon to
ever notice a difference, or else they don't defragment at all and just
never see the difference in the opposite sense :)

There's a big area where it doesn't matter so there are two ways to look at
it. Defragment often and run for a short time (As Diskeeper can do
automatically) or defragment rarely taking several hours. In the end the
time taken will be the same so you have to decide which suits you best - you
do not have control over power outages so from the risk point of view it is
a coin toss. Personally I have switched from the first option to the second
because it just suits me at this time to do that.

Charlie
 
I understand that many have "good results" and "no complaints" and even
"compliments from users" after defragging, but that is not what I am
interested in, nor is it something I dispute.
I am sure users "are happy" after defragging; I am interested in an actual
number, how to get to that number, from what baseline measurement, and
with what tools.

Take a real, fragmented workstation, ghost an image of it, then
test/defrag it; you can then restore the image to a test drive and it will
be exactly the same as your initial test drive.

For defrag performance testing we look at servers, where we can run SQL
queries and other workloads that have known testing methods, and then
repeat the same tests... I've seen as much as a 50% decrease in
transaction times after only doing a defrag.

If you want to test, making an image that you can restore exactly
is all that's needed - so that you can duplicate your testing results.
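
To make that repeatable in practice, here is a minimal sketch of the
"run the same measured workload against the restored image, before and
after the defrag" idea (Python; sqlite3 is only a stand-in for whatever
SQL Server queries you actually test, and "test.db" and the query are
hypothetical placeholders):

# Sketch: run the same query N times and report timing statistics.
# Assumption: sqlite3 stands in for the real database under test;
# "test.db" and the query below are hypothetical placeholders.
import sqlite3
import statistics
import time

DB_PATH = "test.db"
QUERY = "SELECT COUNT(*) FROM orders"   # placeholder workload
RUNS = 20

def timed_runs(db_path, query, runs):
    times = []
    conn = sqlite3.connect(db_path)
    try:
        for _ in range(runs):
            start = time.perf_counter()
            conn.execute(query).fetchall()
            times.append(time.perf_counter() - start)
    finally:
        conn.close()
    return times

if __name__ == "__main__":
    t = timed_runs(DB_PATH, QUERY, RUNS)
    print(f"min {min(t):.4f}s  median {statistics.median(t):.4f}s  max {max(t):.4f}s")

Restore the same ghost image, run this once before the defrag and once
after, and the difference between the two sets of numbers is your
measured gain.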
 
Please don't backwards quote.
http://ursine.ca/Top_Posting

Edwin said:
but bottom line, the tool is there, I just hope people who use it will
have a good backup strategy as well

Though in general, if you don't have a working backup strategy that you have
tested to make sure you can restore from it, your data is in danger whether
or not you defrag.
 
Leythos said:
Take a fragmented real workstation, ghost an image of it, then
test/defrag it, then you can restore it to a test drive and it will be
exactly the same as your initial test drive.

Have you taken a look to make sure ghost actually restores a bit-for-bit
copy of the drive and not just the files? I know imaging the drive with dd
under Linux will make a perfect image of the drive for sure and would
result in the same fragmentation on the disk when restored.
 
Please don't backwards quote.
http://ursine.ca/Top_Posting

Charlie said:
I certainly wasn't offended Edwin, I was simply pointing out that a
normally careful user will probably not see serious fragmentation for a
long time.

That raises the question: with the amount of spam and trojan horses the
average Windows user picks up, are users normally careful? I would hazard
a guess that the answer is "no," and I think a lot of programmers have the
same opinion, given the hacker¹ maxim "Never attribute to malice what can
be readily attributed to stupidity."
Another issue with fragmentation is that it's not all "Bad".
Much depends on what is being read and / or written. Sometimes
fragmentation on the graphical display most of these programs have looks
bad but actually is unavoidable and really doesn't produce any symptoms.

Not only that, but modern filesystems will automatically pick the optimal
location for files based on average access time to keep fragmentation from
being an issue to start with (which is why defragging is only something to
worry about for systems that still use NTFS, VFAT or Apple HFS filesystems
as opposed to one developed in the last 10-15 years). The idea that not
all fragmentation is bad fragmentation was something of a surprise to me
when I first tried Linux 10 years ago this month.
There's a big area where it doesn't matter so there are two ways to look
at it. Defragment often and run for a short time (As Diskeeper can do
automatically) or defragment rarely taking several hours. In the end the
time taken will be the same so you have to decide which suits you best -
you do not have control over power outages so from the risk point of view
it is a coin toss. Personally I have switched from the first option to the
second because it just suits me at this time to do that.

On my laptop, I used to let it fly when I went to sleep about once a month.
When I'm out consulting or contracting and a client wants me to try out new
software that's going to be deployed on desktops, I have them give me a
license so I can try it out myself as well, so now I have Diskeeper running
on it to defrag when the screen saver kicks on (just a blank screen saver
with the same interval as the monitor shut-off, and Diskeeper automatically
stops if the screen saver ends or the laptop switches to battery power)...

...These days, though, most of my work is with Linux, so now about the only
reason I even keep Windows on my laptop is for Auran Trainz² (and it'll
probably go to a disk image on my backup drive in favor of Linux once
there's a Linux port for Trainz).


¹ http://ursine.ca/Hacker
² http://en.wikipedia.org/wiki/Trainz (Wikipedia's entry for this game is
actually better than Auran's official site for it...)
 
If you wish to test, just run some of your standard everyday functions on a
few of your computers (including boot-up) and keep timer records for each
specific function. Then defrag those machines, exactly repeat the previous
functions (down to the keystroke and mouse click if possible), and time them
again. Compare the two sets of timer records to identify any speed
(efficiency) increase. Time saved is money earned!! All the other
big words you could use are just blab.
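
For keeping those timer records consistent, a simple logging stopwatch is
enough; this is only a sketch (Python; "defrag_timings.csv" is a hypothetical
log file name) that appends a labelled timing each run, so the before- and
after-defrag numbers end up side by side in one file.

# Sketch: log how long a labelled manual step took, for before/after comparison.
# Usage: run it, press Enter, perform the step (e.g. launch an application),
# press Enter again. "defrag_timings.csv" is a hypothetical log file name.
import csv
import time
from datetime import datetime

LOG_FILE = "defrag_timings.csv"

def record(label):
    input(f"Press Enter to START timing '{label}'...")
    start = time.perf_counter()
    input("Press Enter when the step has FINISHED...")
    elapsed = time.perf_counter() - start
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow([datetime.now().isoformat(), label, f"{elapsed:.2f}"])
    print(f"{label}: {elapsed:.2f} s logged to {LOG_FILE}")

if __name__ == "__main__":
    record(input("Step being timed (e.g. 'open Word document'): "))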
Gene K
 
