Which defragmenter to use?

  • Thread starter: Frank Martin
RAID!!! said:
Yea, this is the stupidest "feature" I have seen in a defragmenter yet.
What's the point of defragging the HDD to increase disk performance if you
are going to slow down your system with a resource-hungry, stupid feature
like this? Turn that "feature" off pronto. Oh, and expect to get spammed by
Diskeeper frequently from now on with their BS claims of how it will turn
your PC into a speed demon.

Still, it is faster than before, but I will test it for 2 weeks before
deciding. Access 2003 works faster now.
 
Tell me your performance test method and I will go test it out myself. If I
am wrong I will eat my shorts.

Three simple tests:

1) Copy a 10 GB file from one drive to another, from one partition to
another, or within the same partition, on a badly fragmented drive.

2) On a fragmented drive, where the SQL database files are fragmented,
as is typical on a workstation (and server), perform a series of
extensive queries.

3) On an MS Access file, perform a set of queries.

Now, defragment the files, run the test again, and see if there is a
difference.

Restore the machine from a Ghost image, so that it has the same level of
fragmentation.

Now use Diskeeper, test again, and look at times/results per second over
the same long periods.
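
For what it's worth, test 1 is easy to script so the timing is repeatable.
A minimal sketch in Python, assuming a pre-existing large test file; the
paths are placeholders (not anything from this thread), and back-to-back
runs will be skewed by the OS file cache:

# Rough timing harness for test 1 (large-file copy). Paths are hypothetical;
# point SRC at a big file on the fragmented drive and DST at the target.
import os
import shutil
import time

SRC = r"D:\testdata\big.bin"    # hypothetical source on the fragmented drive
DST = r"E:\testdata\big.copy"   # hypothetical destination drive/partition

def timed_copy(src, dst):
    """Copy src to dst and return elapsed seconds."""
    start = time.perf_counter()
    shutil.copyfile(src, dst)
    return time.perf_counter() - start

if __name__ == "__main__":
    elapsed = timed_copy(SRC, DST)
    size_mb = os.path.getsize(SRC) / (1024 * 1024)
    print("Copied %.0f MB in %.1f s (%.1f MB/s)"
          % (size_mb, elapsed, size_mb / elapsed))
    # Run on the fragmented drive, defragment (or restore the ghost image
    # and use the other tool), then run again and compare the numbers.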
 
Yea, this is the stupidest "feature" I have seen in a defragmenter yet.
What's the point of defragging the HDD to increase disk performance if you
are going to slow down your system with a resource-hungry, stupid feature
like this? Turn that "feature" off pronto. Oh, and expect to get spammed by
Diskeeper frequently from now on with their BS claims of how it will turn
your PC into a speed demon.

And you don't really appear to understand the product, as anyone that's
used it knows that it throttles back to inactivity when the system is in
use, only doing the defrag during slack times.
 
Raxco (maker of Perfect Disk) has a set of tools for testing
defragmentation. The first two are called Scrambler and FragGen.
They are used either to generate new fragmented files or to
fragment the existing content on a partition/volume. Then you
use a tool called FileAccessTimer to "accurately" measure the
time to read file(s). You then use the defrag tool of your choice
and retest with FileAccessTimer.

I've used Perfect Disk for a long time (including on VAX/VMS).

OK, thanks I will look into it.
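
For reference, the core of what a FileAccessTimer-style measurement does
(time how long it takes to read a file end to end) is easy to approximate.
This is only a rough stand-in, not Raxco's tool; the chunk size and the
cache caveat are assumptions:

# Approximate a "file access timer": read a file sequentially in fixed-size
# chunks and report elapsed time and throughput. The OS file cache will make
# repeat runs artificially fast, so reboot or use a file larger than RAM.
import sys
import time

CHUNK = 1024 * 1024  # 1 MiB per read; an arbitrary choice

def time_read(path):
    """Return (seconds, bytes_read) for a sequential read of path."""
    total = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while True:
            buf = f.read(CHUNK)
            if not buf:
                break
            total += len(buf)
    return time.perf_counter() - start, total

if __name__ == "__main__":
    elapsed, nbytes = time_read(sys.argv[1])
    mb = nbytes / (1024 * 1024)
    print("%.0f MB read in %.2f s (%.1f MB/s)" % (mb, elapsed, mb / elapsed))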
 
Three simple tests:

1) Copy a 10 GB file from one drive to another, from one partition to
another, or within the same partition, on a badly fragmented drive.

2) On a fragmented drive, where the SQL database files are fragmented,
as is typical on a workstation (and server), perform a series of
extensive queries.

3) On an MS Access file, perform a set of queries.

Now, defragment the files, run the test again, and see if there is a
difference.

Restore the machine from a Ghost image, so that it has the same level of
fragmentation.

Now use Diskeeper, test again, and look at times/results per second over
the same long periods.

Thanks, but those Raxco tools look like a simpler test, so I will try that
method out.
 
And you don't really appear to understand the product, as anyone that's
used it knows that it throttles back to inactivity when the system is in
use, only doing the defrag during slack times.

I've used it and it slowed my PC down. I don't autoload any unneeded shite
on my PC. It's a useless feature. If you think your HDD needs to be
defragmented continuously all the time to perform well then you are out to
lunch. I defrag maybe once every two weeks or once a month, anything beyond
that is ridiculous and you need to stop playing program manager with your
PC and use it as the tool it was designed to be.
 
Raxco (maker of Perfect Disk) has a set of tools for testing
defragmentation.


Hey, I checked the Raxco website and can't find those tools there. Where do
I get them?
 
I just checked and the tools aren't listed. I sent an email to a
contact at Raxco about their availability. When I get additional info
I'll post the answer.
 
Thanks, but those Raxco tools look like a simpler test, so I will try that
method out.

But what you have to understand is that they "simulate" a condition, and
while it may be a good simulation of a badly fragmented drive, my
experience comes from MANY real-world situations, where workstations and
servers have been run for 6+ months under heavy use, or in some cases for
years, without being defragged, or where they've been defragged only with
the Windows defrag tools.

We actually had a web/database application that took so long to return
results that the pages expired before the data came back. A Windows defrag
permitted the results to be displayed, and a DK defrag provided a 7 second
improvement over the basic Windows defrag - having done nothing other than
adding DK and then letting it do a boot-time defrag on the database server.
 
Leythos said:
If I am wrong I will eat my shorts.

Bwahahahahaha, I would love to see that! Leythos pull a Bart Simpson?!?
LOL! If you only said that every time you were wrong before...
 
Bwahahahahaha, I would love to see that! Leythos pull a Bart Simpson?!?
LOL! If you only said that every time you were wrong before...

Dude, get a real Usenet reader; that was RAID who said he would eat his
shorts, not me.

Properly quoted with the part that RAID wrote, my text removed:
 
But what you have to understand is that they "simulate" a condition, and
while it may be a good simulation of a badly fragmented drive, my
experience comes from MANY real-world situations, where workstations and
servers have been run for 6+ months under heavy use, or in some cases for
years, without being defragged, or where they've been defragged only with
the Windows defrag tools.

We actually had a web/database application that took so long to return
results that the pages expired before the data came back. A Windows defrag
permitted the results to be displayed, and a DK defrag provided a 7 second
improvement over the basic Windows defrag - having done nothing other than
adding DK and then letting it do a boot-time defrag on the database server.

I asked about defragging at HardOCP, and the reply below basically mirrors
what I have been told in the storage group. The bottom line is that
defragging is highly overrated.

"That depends entirely what your application is trying to do. If you're
making large sequential accesses to files, it's entirely reasonable to
expect (from a programmer's point of view) that those accesses happen
sequentially on disk, and you don't pay the price for seeking. But if you
touch two files, you have to immediately give up on the possibility that
they're next to each other on disk. So if you're doing large sequential
transfers and the disk is fragmented, you'll get worse performance than you
otherwise could. But if you're dealing with lots of small files, it doesn't
matter all that much whether the disk is fragmented or not - you're going
to have to seek anyways.

Some defragmenters will put all the files in a given directory next to each
other; this does help. The question is whether it's worth the time you
spend defragmenting to save time in your application. If your application
is a game and load times are important to you, then it might be. But for
many things, spending many hours defragging isn't worth the improvement of
minutes per run. So I agree with him - but with conditions."
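
A toy way to see the distinction the quote is drawing is to time one large
sequential read against the same amount of data spread over many small
files. This is only a sketch: the sizes and scratch directory are made up,
and OS caching or an SSD will blur the seek cost, so treat the numbers as
illustrative.

# Toy comparison: one large sequential file vs. the same bytes spread across
# many small files. Sizes and the scratch directory are arbitrary; drop the
# cache between runs (or reboot) for meaningful numbers on a spinning disk.
import os
import time

BASE = "fragtest"                    # hypothetical scratch directory
BIG = os.path.join(BASE, "big.bin")
SMALL_DIR = os.path.join(BASE, "small")
SMALL_COUNT = 512
SMALL_SIZE = 256 * 1024              # 256 KiB each -> 128 MiB per side
CHUNK = 1024 * 1024

def setup():
    """Create the large file and the small files if they don't exist yet."""
    os.makedirs(SMALL_DIR, exist_ok=True)
    if not os.path.exists(BIG):
        with open(BIG, "wb") as f:
            for _ in range(SMALL_COUNT):
                f.write(os.urandom(SMALL_SIZE))
    for i in range(SMALL_COUNT):
        path = os.path.join(SMALL_DIR, "%05d.bin" % i)
        if not os.path.exists(path):
            with open(path, "wb") as f:
                f.write(os.urandom(SMALL_SIZE))

def read_file(path):
    """Read a file sequentially in fixed-size chunks."""
    with open(path, "rb") as f:
        while f.read(CHUNK):
            pass

def timed(func):
    """Return how long func() takes, in seconds."""
    start = time.perf_counter()
    func()
    return time.perf_counter() - start

if __name__ == "__main__":
    setup()
    print("one large sequential file: %.2f s" % timed(lambda: read_file(BIG)))
    print("many small files:          %.2f s" %
          timed(lambda: [read_file(os.path.join(SMALL_DIR, n))
                         for n in sorted(os.listdir(SMALL_DIR))]))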
 
I just checked and the tools aren't listed. I sent an email to a
contact at Raxco about their availability. When I get additional info
I'll post the answer.
OK, thanks.
 
I asked about defragging at HardOCP, and the reply below basically mirrors
what I have been told in the storage group. The bottom line is that
defragging is highly overrated.

"That depends entirely what your application is trying to do. If you're
making large sequential accesses to files, it's entirely reasonable to
expect (from a programmer's point of view) that those accesses happen
sequentially on disk, and you don't pay the price for seeking. But if you
touch two files, you have to immediately give up on the possibility that
they're next to each other on disk. So if you're doing large sequential
transfers and the disk is fragmented, you'll get worse performance than you
otherwise could. But if you're dealing with lots of small files, it doesn't
matter all that much whether the disk is fragmented or not - you're going
to have to seek anyways.

Some defragmenters will put all the files in a given directory next to each
other; this does help. The question is whether it's worth the time you
spend defragmenting to save time in your application. If your application
is a game and load times are important to you, then it might be. But for
many things, spending many hours defragging isn't worth the improvement of
minutes per run. So I agree with him - but with conditions."

The problem is that most people don't deal with just one size of file;
they deal with many sizes of files. Sure, if you could confine the test to
a single group of files we could make it show the results we want, but
take a server where users share an array, where there is fragmentation,
and all users benefit from a full defrag.

Take a workstation with a single user, one that downloads MP3s and videos,
plays online games, etc. Many different sizes of files. Let them use the
machine for a year, then defrag it fully and pack the drive, and I bet
they'll tell you they notice a difference; everyone we've done this for
has noticed and commented (even when we didn't tell them we were doing it).

So, without conditions, generally, everyone benefits from a full defrag
with pack; few people won't benefit from it.
 
The problem is that most people don't deal with just one size of file;
they deal with many sizes of files. Sure, if you could confine the test to
a single group of files we could make it show the results we want, but
take a server where users share an array, where there is fragmentation,
and all users benefit from a full defrag.

Take a workstation with a single user, one that downloads MP3s and videos,
plays online games, etc. Many different sizes of files. Let them use the
machine for a year, then defrag it fully and pack the drive, and I bet
they'll tell you they notice a difference; everyone we've done this for
has noticed and commented (even when we didn't tell them we were doing it).

So, without conditions, generally, everyone benefits from a full defrag
with pack; few people won't benefit from it.

OK, if it's so beneficial then why doesn't Linux have a defragmenting tool?
Surely fragmentation would occur on any file system?
 