Defraggers for XP


Leythos

this placement WILL affect the drive long after the process, and DOES
have value beyond the defragment, mostly because the free space is
consolidated, ...that process will affect the condition of fragments
in the future

Actually, that I completely agree with - using third-party tools to
arrange objects is a great thing. I love being able to pack a drive, to
place directories at the front of the disk, etc... It's something that can
make a big difference on a highly used system.

The comment about users not having control was in the light of normal
computer use, not using third-party defraggers or drive management tools.
 

Guest

I agree, and I would add only that if I recall correctly, the 15% number
itself is a default value that can be changed through a registry edit.

An even better solution, for people who are running out of HD space, is to
get another hard drive. They seem to be getting both bigger and cheaper.

And again, I can go both ways on this debate. I have used Diskeeper and
PerfectDisk as well as the built-in XP defragger. I like all of them, and
have never had any problem of any kind to speak of. Maybe one is better than
the other, but I cannot tell the difference in terms of what actually matters
to me -- speed of access to files on the hard drive. Even this factor is
less important to me than it used to be back when I had older computers with
much less RAM.
 

Greg Hayes/Raxco Software

PD's file placement is designed to slow down the rate of re-fragmentation
and speed up future defrag passes.

PD places the $MFT where MS recommends to achieve an additional 5-10%
performance improvement with NTFS.

PD works in conjunction with XP's "prefetching" ability. Approximately every
3 days, Windows XP will perform a "partial" defrag on the files indicated in
layout.ini. The attempt is to ensure that these files are contiguous - as
MS realizes that fragmentation is THE most important thing that affects
Windows XP performance -
http://www.microsoft.com/technet/prodtechnol/winxppro/evaluate/xpperf.mspx#ECAA.
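
For reference, the prefetch behavior is controlled by the EnablePrefetcher value under HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management\PrefetchParameters, and the ordered file list lives in %WINDIR%\Prefetch\Layout.ini. A small sketch for inspecting both (Python 3 on Windows assumed; the UTF-16 encoding of Layout.ini is how it ships on XP, but verify on your own machine):

import os
import winreg

key_path = (r"SYSTEM\CurrentControlSet\Control\Session Manager"
            r"\Memory Management\PrefetchParameters")
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as k:
    mode, _ = winreg.QueryValueEx(k, "EnablePrefetcher")
# 0 = disabled, 1 = application launch only, 2 = boot only, 3 = both (XP default)
print("EnablePrefetcher =", mode)

layout = os.path.join(os.environ["WINDIR"], "Prefetch", "Layout.ini")
if os.path.exists(layout):
    with open(layout, encoding="utf-16") as f:   # Layout.ini is Unicode on XP
        paths = [line.strip() for line in f if line.strip()]
    print(len(paths), "entries; first few:", paths[:3])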

This "partial" defrag will only occur if there is contiguous free space to
"move" all of these files into. Wherever the first largest piece of free
space that will fit these files is used - regardless of where it exists on
the drive. PD simply takes over this task. In order to speed up future
defrag passes, PD places these files at the beginning of the drive.
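
Incidentally, you do not have to wait the roughly 3 days: the documented benchmarking trick "rundll32 advapi32.dll,ProcessIdleTasks" asks XP to run its pending idle-time tasks, which include this layout pass. A one-line sketch (Python assumed; the machine must actually sit idle afterwards for the work to finish, which can take several minutes):

import subprocess

# Ask XP to run its queued idle tasks (layout/boot optimization among them).
subprocess.run(["rundll32.exe", "advapi32.dll,ProcessIdleTasks"], check=True)
print("Idle tasks requested; leave the machine alone while they run.")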

An interesting "read" for people might be the study regarding file
system/drive performance in relation to free space consolidation written by
one of the original developers of NTFS -
http://www.raxco.com/products/perfectdisk2k/whitepapers/FreeSpace_WhitePaper.pdf


- Greg/Raxco Software
Microsoft MVP - Windows File System

Disclaimer: I work for Raxco Software, the maker of PerfectDisk - a
commercial defrag utility, as a systems engineer in the support department.

Want to email me? Delete ntloader.

Ken Gardner said:
This type of post is exactly what prompted my original question in this
thread. Yes, PerfectDisk moves files around (e.g. pagefile, MFT zone) in
order to place them in what it terms the "optimal" position on the disk.
That's great, but what is the tangible, transparent, objectively verifiable
or measurable result for the end user like me? Does it really make my
machine go faster, and if so by how much? If I am talking about a 5-10
minute defrag (or longer) in order to win at most a few more nanoseconds in
file opening time that a human being will never notice, what's the point?

Again, I'm not criticizing PerfectDisk here, or Diskeeper for that matter.
I have used both programs for years, and as third party utility software
goes, they are top notch.

R. McCarty said:
Perfect Disk places the Pagefile in the optimal location on a drive or
partition. The only time I've seen it move it again is when an Off-line
defrag is done where the check box for "System Files" is enabled, &
MFT or other locked files require re-arrangement. So I don't agree
with your recommendation to Exclude Pagefile.Sys from the defrag.
You don't advocate setting a minimum Pagefile size below the physical
memory size and then recommend locking its location on the drive.

[...]
 

Guest

Greg Hayes/Raxco Software said:
PD's file placement is designed to slow down the rate of re-fragmentation
and speed up future defrag passes.

Which it seems to do better and better over the years. Your latest version
is the fastest yet. Good job!
PD places the $MFT where MS recommends to achieve an additional 5-10%
performance improvement with NTFS.

Question: how is this speed gain measured, and how much real time is actually
being saved? For example, if we are talking about a reduction in file access
time from 10 nanoseconds to 9 nanoseconds, a human being isn't going to
notice the difference.
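
To make the question concrete: a measurement would look something like the sketch below - time the reading of the same large files before and after a defrag pass. The file paths are hypothetical, and the big caveat is the Windows file cache: once a file has been read, re-reads measure RAM rather than the disk, so each timed run needs a cold cache (in practice, a reboot between runs).

import time

TEST_FILES = [r"C:\test\big1.dat", r"C:\test\big2.dat"]  # hypothetical paths

def timed_read(paths, chunk=1024 * 1024):
    """Read every file sequentially; return (bytes read, seconds elapsed)."""
    start = time.perf_counter()
    total = 0
    for p in paths:
        with open(p, "rb") as f:
            while True:
                data = f.read(chunk)
                if not data:
                    break
                total += len(data)
    return total, time.perf_counter() - start

total, elapsed = timed_read(TEST_FILES)
print(f"{total / 1e6:.1f} MB in {elapsed:.3f} s "
      f"({total / 1e6 / elapsed:.1f} MB/s)")
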
PD works in conjunction with XP's "prefetching" ability. Approximately every
3 days, Windows XP will perform a "partial" defrag on the files indicated in
layout.ini. The attempt is to ensure that these files are contiguous - as
MS realizes that fragmentation is THE most important thing that affects
Windows XP performance -

CPU and RAM are also very important, but defragmentation is the one thing
that the user can control.
This "partial" defrag will only occur if there is contiguous free space to
"move" all of these files into. Wherever the first largest piece of free
space that will fit these files is used - regardless of where it exists on
the drive. PD simply takes over this task. In order to speed up future
defrag passes, PD places these files at the beginning of the drive.

Again, what are the measurable speed differences, if any -- both in terms of
percentage and, even more important, in terms of actual time saved -- of
placing the layout.ini files at the beginning of the drive instead of
elsewhere? Are these time savings realized only when defragging the drive,
or also when the computer accesses files on the disk?
An interesting "read" for people might be the study regarding file
system/drive performance in relation to free space consolidation written by
one of the original developers of NTFS -

http://www.raxco.com/products/perfectdisk2k/whitepapers/FreeSpace_WhitePaper.pdf

No question that free space consolidation is a good thing. However, doesn't
the built-in defragger consolidate free space each time it is operated
(albeit perhaps not as well as PD, which does it on a single pass)?

Ken
 

Greg Hayes/Raxco Software

Ken,

"Question: how is this speed gain measured, and how real time is actually
being saved?"

Please see
http://www.microsoft.com/whdc/system/winpreinst/ntfs-preinstall.mspx


"Again, what are the measurable speed differences, if any -- both in terms
of percentage and, even more important, in terms of actual time saved -- of
placing the layout.ini files at the beginning of the drive instead of
elsewhere?"

PD's file placement - including placing of layout files at the beginning of
the drive - isn't necessarily performed to place files at a particular place
on the logical partition where it might be "fastest". PD's file placement
is primarily designed to slow down the rate of re-fragmentation of the file
system and to speed up future defrag passes.


"doesn't the built-in defragger consolidate free space each time it is
operated"

It tries :) But, just like most other defragmenters, it doesn't try hard
enough. Most defragmenters - including the built-in defragmenter -
concentrate on defragmentation of files, NOT consolidation of free space.
In addition, as there are files that the built-in defragmenter will never be
able to defragment, effective free space consolidation will never occur (as
the clusters occupied by these non-defragmentable files may be scattered all
over the place). The built-in defragmenter also has problems if you get
into a low free space condition. Why do most defragmenters
recommend/require that you have at least 10-20% free space? Since they
don't consolidate existing free space very well, they require a higher total
amount of free space. With free space consolidation, it is possible to only
need about 5% free space. You may or may not get better free space
consolidation the more times that you defragment with the built-in
defragmenter - assuming that you can afford to spend the time running defrag
over and over and over and over again...
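
Free-space fragmentation is actually measurable directly: NTFS exposes the volume's cluster bitmap through the FSCTL_GET_VOLUME_BITMAP ioctl (one bit per cluster, 1 = in use), so counting the runs of zero bits counts the free-space fragments. Below is a rough sketch in Python via ctypes - Administrator rights required, and it only reads the first chunk of the bitmap rather than looping on ERROR_MORE_DATA the way a real tool would:

import ctypes
import ctypes.wintypes as wt
import struct

FSCTL_GET_VOLUME_BITMAP = 0x0009006F   # defrag-API ioctl: cluster bitmap
GENERIC_READ = 0x80000000
FILE_SHARE_READ_WRITE = 0x00000003
OPEN_EXISTING = 3
ERROR_MORE_DATA = 234                  # expected: we only ask for a chunk

k32 = ctypes.WinDLL("kernel32", use_last_error=True)
k32.CreateFileW.restype = wt.HANDLE
k32.CreateFileW.argtypes = [wt.LPCWSTR, wt.DWORD, wt.DWORD, wt.LPVOID,
                            wt.DWORD, wt.DWORD, wt.HANDLE]
k32.DeviceIoControl.restype = wt.BOOL
k32.DeviceIoControl.argtypes = [wt.HANDLE, wt.DWORD, wt.LPVOID, wt.DWORD,
                                wt.LPVOID, wt.DWORD,
                                ctypes.POINTER(wt.DWORD), wt.LPVOID]
k32.CloseHandle.argtypes = [wt.HANDLE]

handle = k32.CreateFileW(r"\\.\C:", GENERIC_READ, FILE_SHARE_READ_WRITE,
                         None, OPEN_EXISTING, 0, None)
if handle == ctypes.c_void_p(-1).value:          # INVALID_HANDLE_VALUE
    raise ctypes.WinError(ctypes.get_last_error())

start = struct.pack("<q", 0)                     # STARTING_LCN_INPUT_BUFFER
outbuf = ctypes.create_string_buffer(64 * 1024)  # ~512K clusters of bitmap
returned = wt.DWORD(0)
ok = k32.DeviceIoControl(handle, FSCTL_GET_VOLUME_BITMAP, start, len(start),
                         outbuf, len(outbuf), ctypes.byref(returned), None)
if not ok and ctypes.get_last_error() != ERROR_MORE_DATA:
    raise ctypes.WinError(ctypes.get_last_error())

# VOLUME_BITMAP_BUFFER: StartingLcn (8 bytes), BitmapSize (8 bytes), bits.
_, bitmap_size = struct.unpack_from("<qq", outbuf.raw, 0)
bits = min(bitmap_size, (returned.value - 16) * 8)
free_runs, prev_bit = 0, 1
for i in range(bits):
    bit = (outbuf.raw[16 + (i >> 3)] >> (i & 7)) & 1
    if bit == 0 and prev_bit == 1:               # a new free run starts here
        free_runs += 1
    prev_bit = bit
print(free_runs, "free-space runs in the first", bits, "clusters of C:")
k32.CloseHandle(handle)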

Granted, the built-in defragmenter is better than having nothing at all :)
Will running the built-in defragmenter help the file system to perform
better than running nothing at all - yes. Does it allow the file system to
perform to the best of its ability - no.

- Greg/Raxco Software
Microsoft MVP - Windows File System

Disclaimer: I work for Raxco Software, the maker of PerfectDisk - a
commercial defrag utility, as a systems engineer in the support department.

Want to email me? Delete ntloader.
 

Guest

Greg:

Thanks for your response.

[...]
"Again, what are the measurable speed differences, if any -- both in terms
of percentage and, even more important, in terms of actual time saved -- of
placing the layout.ini files at the beginning of the drive instead of
elsewhere?"
PD's file placement - including placing of layout files at the beginning of
the drive - isn't necessarily performed to place files at a particular place
on the logical partition where it might be "fastest". PD's file placement
is primarily designed to slow down the rate of re-fragmentation of the file
system and to speed up future defrag passes.

Okay. This makes sense, especially in light of PD's overall file placement
strategy. Follow-up question: isn't it also the case with the built-in
defragger that, at least with repeated use (instead of the one-pass approach
of PD), the more rarely modified files will eventually end up nearer to the
beginning of the disk? In other words, if you have files A, B, C, D, and E
in sequential order and file A becomes fragmented while the others do not,
doesn't the built-in defragger move file A out of its spot and then move
files B-E closer to the beginning of the disk -- only not in a single pass
the way PD does?
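
That A-to-E picture can be sanity-checked with a toy model (pure Python, no relation to any real defragmenter's internals): if new files are allocated first-fit into whatever free clusters exist, the number of free-space runs puts a floor under how fragmented a new file arrives, no matter how tidy the existing files are.

import random

def allocate(disk, size):
    """First-fit: mark `size` free clusters used; return fragments created."""
    frags, in_run = 0, False
    for i, used in enumerate(disk):
        if size == 0:
            break
        if not used:
            disk[i] = True
            size -= 1
            if not in_run:
                frags += 1        # a new extent of this file begins here
            in_run = True
        else:
            in_run = False
    return frags

random.seed(1)
# The same volume two ways: 30% free space, scattered vs consolidated.
scattered = [random.random() < 0.7 for _ in range(10_000)]
consolidated = sorted(scattered)  # False (free) sorts first: one big free run

print("new 500-cluster file lands in",
      allocate(scattered, 500), "fragments on scattered free space,",
      allocate(consolidated, 500), "on consolidated free space")
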
"doesn't the built-in defragger consolidate free space each time it is
operated"
It tries :) But, just like most other defragmenters, it doesn't try hard
enough. Most defragmenters - including the built-in defragmenter -
concentrate on defragmentation of files, NOT consolidation of free space.
In addition, as there are files that the built-in defragmenter will never be
able to defragment, effective free space consolidation will never occur (as
the clusters occupied by these non-defragmentable files may be scattered all
over the place).

I can see where this state of affairs can lead in theory to a slight
performance hit, but are we talking about a hit that will be long enough for
a human being to notice or at least be able to measure?
The built-in defragmenter also has problems if you get
into a low free space condition.

This was a much bigger problem a few years ago, when HDs were
smaller and more expensive. I just bought a new computer with a 250GB hard
drive, and I already had a backup external USB 80GB hard drive. Free space
is never going to be a problem for me. :)

[...]
Granted, the built-in defragmenter is better than having nothing at all :)
Will running the built-in defragmenter help the file system to perform
better than running nothing at all - yes. Does it allow the file system to
perform to the best of its ability - no.

I agree, and I have used PD for years with great results. I would highly
recommend it to anyone. However, my questions go more to how much
improvement in performance the average user can actually expect -- and, most
important, whether it is transparent and measurable. Nanoseconds are neither
transparent nor measurable by the average user, but seconds or even tenths of
seconds are.

Ken
 

Greg Hayes/Raxco Software

Ken,

The built-in defragmenter does NO file placement - other than the "partial"
defrag that is attempted approximately every 3 days. During a normal defrag
run, the layout files remain where they currently exist. The built-in
defragmenter will NOT slowly "migrate" rarely modified files to the
beginning of the partition.

As David G showed in his study on free space consolidation, lack of free
space consolidation actually results in wasted seeks - which is a
physical/mechanical activity - the slowest part of the hard drive. Will the
aggregate improvement in performance from doing a complete job of
defragmenting (files and free space) be noticeable to an end user? Will the
reduction in wasted seeks be noticeable to the end user? It really depends
on each person's unique environment. A 100% improvement can sound really
impressive. Put it into the context of reducing access time from 2
seconds to 1 second (a 100% improvement) - and then it really doesn't seem
like much :) However, can you think of anything else that you can EASILY do
to your computer to result in a 100% improvement in, for example, CPU time?

I guess what it all comes down to is perception and "pain". For the majority
users, the built-in defragmenter simply doesn't do a good enough job and/or
isn't easy enough to use - which is why 3rd party tools exist.

- Greg/Raxco Software
Microsoft MVP - Windows File System

Disclaimer: I work for Raxco Software, the maker of PerfectDisk - a
commercial defrag utility, as a systems engineer in the support department.

Want to email me? Delete ntloader.



 

Guest

Greg:

Again, thanks for your very interesting and informative responses, which I
have found to be typical of Raxco over the years.
The built-in defragmenter does NO file placement - other than the "partial"
defrag that is attempted approximately every 3 days. During a normal defrag
run, the layout files remain where they currently exist. The built-in
defragmenter will NOT slowly "migrate" rarely modified files to the
beginning of the partition.

Okay. I had assumed otherwise. My bad. :)
As David G showed in his study on free space consolidation, lack of free
space consolidation actually results in wasted seeks - which is a
physical/mechanical activity - the slowest part of the hard drive.

Exactly. Incidentally, would you recommend that a PD user enable aggressive
free space consolidation, on top of what PD already does in "Smart Placement"
mode?
I do know it takes considerably longer, at least on the first
defragmentation run.
Will the aggregate improvement in performance from doing a complete job of
defragmenting (files and free space) be noticeable to an end user? Will the
reduction in wasted seeks be noticeable to the end user? It really depends
on each person's unique environment. A 100% improvement can sound really
impressive. Put it into the context of reducing access time from 2
seconds to 1 second (a 100% improvement) - and then it really doesn't seem
like much :) However, can you think of anything else that you can EASILY do
to your computer to result in a 100% improvement in, for example, CPU time?

I would regard a second's improvement as sufficiently transparent to be
worth using the better defragger, especially if it is a file that I use
regularly. Patience was never one of my virtues. :) My rule of thumb is
that if I can quantify it and notice it, it's transparent enough to be worth
my while. I cannot count a nanosecond, but I can certainly count a full
second.
I guess what it all comes down to is perception and "pain". For the majority
users, the built-in defragmenter simply doesn't do a good enough job and/or
isn't easy enough to use - which is why 3rd party tools exist.

I find the built-in defragger easy to use (no harder than PD, certainly),
but I'm not sure that it does a good enough job. A separate question is
whether PD and your competitor Diskeeper do a sufficiently better job to
justify the expense (which actually isn't very much as good third party
software goes). I think they probably do, but I prefer your product.
Diskeeper essentially does what the built-in defragger does, but
automatically rather than manually (plus you can do offline defrags that the
built-in one doesn't do). Yours can also run on a schedule
(although I usually run it manually), but I think you guys present a much
stronger case than Diskeeper on why your defragger will actually leave you
with a more responsive system -- especially the free space consolidation
issue.

Ken
 

Curmudgeon

"Being how actual defragmentation is going to result in a very limited
amount
of performance boost to begin with, the degree to which any third party
tool
is going to result in faster performance than the XP tool is going to
be so
small as to be effectively naught -- in any case, too small to be
measurable. "

AGREED.

I've had XP installed since it was initially released (over 3 years),
and never tried a 3rd party defragger until I read this thread tonight.

Installed the trial version of PD... it found 1% fragmentation.

BIG deal.

Just un-installed it.
 

N_Salicini

May I intervene with my small problem? Well, small at least compared to all
the serious stuff being discussed here. I just ran the XP defragmenter on my
notebook's drive... and everything went nuts as a result: the audio became
distorted, the mouse stuttered, and everything slowed down dramatically! Has
anyone encountered anything like this? Would extra defragmenting help?
 

Gerry

It is doubtful that your problem is connected with running Disk
Defragmenter.

Please post copies of all Error and Warning Reports appearing in
the System and Application logs in Event Viewer for the last boot. No
Information Reports or Duplicates please. Indicate which also appear in
a previous boot.

You can access Event Viewer by selecting Start, Control Panel,
Administrative Tools, and Event Viewer.

A tip for posting copies of Error Reports! Run Event Viewer and double
click on the error you want to copy. In the window which appears there is a
button resembling two pages. Click the button and close Event
Viewer. Now start your message (email) and do a paste into the body of
the message. Make sure this is the first paste after exiting from
Event Viewer.
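
For anyone who would rather collect those entries programmatically than via the copy button, here is a rough sketch using the pywin32 package's win32evtlog module (an assumption - pywin32 is a separate install, not part of Windows) that prints the ten most recent Errors/Warnings from each log:

import win32evtlog

ERROR = win32evtlog.EVENTLOG_ERROR_TYPE
WARNING = win32evtlog.EVENTLOG_WARNING_TYPE
flags = (win32evtlog.EVENTLOG_BACKWARDS_READ |      # newest first
         win32evtlog.EVENTLOG_SEQUENTIAL_READ)

for log in ("System", "Application"):
    h = win32evtlog.OpenEventLog(None, log)         # None = local machine
    shown = 0
    while shown < 10:
        events = win32evtlog.ReadEventLog(h, flags, 0)
        if not events:
            break
        for ev in events:
            if ev.EventType in (ERROR, WARNING):
                # Mask off severity bits to get the familiar event ID.
                print(log, ev.TimeGenerated, ev.SourceName,
                      ev.EventID & 0xFFFF)
                shown += 1
                if shown >= 10:
                    break
    win32evtlog.CloseEventLog(h)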

Are there any yellow question marks in Device Manager? Right click on
the My Computer icon on your Desktop and select Properties, Hardware,
Device Manager. If yes, what is the Device Error code?

--

Hope this helps.

Gerry
~~~~
FCA
Stourport, England
Enquire, plan and execute
~~~~~~~~~~~~~~~~~~~
 
