Slowness countermeasures - Is it bogus to defrag one's paging file? MFT?

John B

Consider a system that is a couple of years old. I've read of unhappy W2K
and XP users whose systems have slowed down gradually over the months and
years.

Diskeeper argues that it can do what Win2K's native defragger cannot do:
defrag the paging file and the master file table. I can see virtue in
defragging any such MFT.

However, is the paging file a red herring?
That is, does the OS use the paging file that was built long ago?
What if the computer was rebooted? Does the OS have any awareness of the
paging file contents from previous boots?

Now in the case of a Win2K Pro computer that is left on for weeks at a time,
I suppose there is a stronger argument for defragging the paging file.

It probably also helps to make the paging file much larger, so fragmentation
isn't so critical.

http://www.execsoft.co.uk/html/diskeeper/dkv.htm

What happens if I just blow the paging file away, and reconstruct it? I
suppose this is easier said than done, if one's computer has only one OS
installed.

Thanks for any advice,
John
 
Alex Nichol

John said:
Consider a system that is a couple of years old. I've read of unhappy W2K
and XP users whose systems have slowed down gradually over the months and
years.

Diskeeper argues that it can do what Win2K's native defragger cannot do:
defrag the paging file and the master file table. I can see virtue in
defragging any such MFT.

However, is the paging file a red herring?
That is, does the OS use the paging file that was built long ago?
What if the computer was rebooted? Does the OS have any awareness of the
paging file contents from previous boots?

The page file is restarted every boot. And with modern amounts of RAM there
is really not much traffic on it - what there is is essentially random,
except when dumping a new item into it, so fragmentation of it is not
that much of a consideration. One thing, though, that the inbuilt one
(and Diskeeper) does *not* do is consolidate free space, thus leaving
that in fragments and making it much more likely that new files will
start off fragmented. Executive Software say that this is deliberate
and that doing it is not needed. I disagree, which is why I have come to
prefer the PerfectDisk package from Raxco instead (having paid for
both).


What is a much bigger cause of slowing is a gradual accretion of things
running in the background - just about everything I install seems to
want its 'control' or 'instant launch'; not to mention spyware coming
in. Do a regular scan for spyware, and check in msconfig.exe that the
things starting in Startup are things you really *need*.
 
Ricardo M. Urbano - W2K/NT4 MVP

John said:
Consider a system that is a couple of years old. I've read of unhappy W2K
and XP users whose systems have slowed down gradually over the months and
years.

Diskeeper argues that it can do what Win2K's native defragger cannot do:
defrag the paging file and the master file table. I can see virtue in
defragging any such MFT.

However, is the paging file a red herring?
That is, does the OS use the paging file that was built long ago?
What if the computer was rebooted? Does the OS have any awareness of the
paging file contents from previous boots?

Now in the case of a Win2K Pro computer that is left on for weeks at a time,
I suppose there is a stronger argument for defragging the paging file.

It probably also helps to make the paging file much larger, so fragmentation
isn't so critical.

http://www.execsoft.co.uk/html/diskeeper/dkv.htm

What happens if I just blow the paging file away, and reconstruct it? I
suppose this is easier said than done, if one's computer has only one OS
installed.

Thanks for any advice,
John

The page file can become a problem if the minimum and maximum sizes
aren't set to the same value. As Windows adjusts the pagefile up and
down, it will fragment more and more. A trick that we used to use in
the NT4 days to create a minimally fragmented pagefile was to fully
defrag everything else, then set the pagefile min and max to 0 (which
may be difficult at best, and disastrous at worst, depending on how old
your installation is and what loads during bootup), reboot, then set the
min and max to the same desired value and reboot again.
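
For anyone who would rather script the min/max change than click through the
System applet: the sizes are stored in the PagingFiles value under the Memory
Management registry key. Here's a minimal C sketch that just reads the value
back - the key and value names are the standard NT ones, but take the code as
illustrative rather than gospel:

    /* Read the PagingFiles value that the System applet edits.
       Format of each entry: "<path> <min MB> <max MB>",
       e.g. "C:\pagefile.sys 1536 1536".
       Compile as plain ANSI C against the Win32 SDK. */
    #include <windows.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        HKEY key;
        char buf[1024];
        DWORD size = sizeof(buf), type;
        const char *p;

        if (RegOpenKeyEx(HKEY_LOCAL_MACHINE,
                "SYSTEM\\CurrentControlSet\\Control\\"
                "Session Manager\\Memory Management",
                0, KEY_READ, &key) != ERROR_SUCCESS)
            return 1;
        if (RegQueryValueEx(key, "PagingFiles", NULL, &type,
                (LPBYTE)buf, &size) == ERROR_SUCCESS
                && type == REG_MULTI_SZ)
            for (p = buf; *p; p += strlen(p) + 1)  /* double-NUL list */
                printf("%s\n", p);
        RegCloseKey(key);
        return 0;
    }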

Either way, if the installation is old and you are seeing degraded
performance, the MFT and pagefile probably have more impact on
performance than anything else and you'd get more than your money's
worth by installing a real defragger that can defrag these 2 things.

hth
 
John B

To Ricardo M. Urbano - Microsoft Windows 2000/NT MVP

Many thanks!

I've never heard of setting the pagefile minimum and maximum to be equal,
let alone zero, however. I'll have to think about that.
I read long ago that if a computer has two hard drives, then it is
advantageous to place the OS on one drive, and the paging file on another.
I intend to implement that approach; alternatively, I might just leave
everything on the new drive that I will install today, as it will be
considerably faster than the old one...

I'll probably benchmark them both and decide.

Again, thank you for the affirmation regarding using the superior
defragmentation tool on the MFT and pagefile. Diskeeper purports to do this
automatically, periodically, if I understood the product's description
correctly. I hope to benchmark, both with a quantitative tool and by "seat
of the pants." Will report anything interesting back here.


John
 
Alex Nichol

Ricardo said:
The page file can become a problem if the minimum and maximum sizes
aren't set to the same value. As Windows adjusts the pagefile up and
down, it will fragment more and more.

WRONG

That advice became out of date with Win98. Windows will increase
the file should there be an exceptional condition, but will not reduce
it (unless it can do so by simply dropping off empty space at the top).
Provided the initial size is enough to cover normal need, there is no
need for more. But you need headroom for exceptional conditions, and
for other reasons - see my page at www.aumha.org/win5/a/xpvm.htm
 
R. McCarty

This is another case of "Urban Legends" as they apply to Windows.
Just yesterday, I was called out on the "Clean the Prefetch folder" tweak.
Somebody needs to create a Web site that debunks these carryover
myths about Windows - specifically those "tweaks" for 9X that just
don't work with Windows XP. It's one thing to offer advice in
postings, but getting a reply back of "Sorry, you're wrong" makes
you wish you hadn't taken the time to reply.
 
Greg Hayes/Raxco Software

John,

Due to how Windows accesses the pagefile, unless the pagefile is highly
fragmented, you may not notice any performance degradation. As Ricardo
mentioned, the best thing is to set the min/max to the same value.

Regarding the $MFT, if it is severely fragmented, you will notice an issue.
With mild fragmentation, you likely won't. What defragmentation of the
pagefile, hibernate file and NTFS metadata (which includes the $MFT - only
one defragmenter handles all of the metadata) DOES provide is the ability
for a defragmenter to do a better job of free space consolidation -
something that provides more of a benefit, both short term and long term.

If you place the $MFT about 1/3 of the way into the drive, you can also gain
an additional 5-10% performance improvement with NTFS (according to
Microsoft). This also is unique to one defragmenter, and I'll let you guess
which one it is :)

- Greg/Raxco Software
Microsoft MVP - Windows File System

Disclaimer: I work for Raxco Software, the maker of PerfectDisk - a
commercial defrag utility, as a systems engineer in the support department.

Want to email me? Delete ntloader.
 
Vaughn McMillan - Executive Software

Again, thank you for the affirmation regarding using the superior
defragmentation tool on the MFT and pagefile. Diskeeper purports to do this
automatically, periodically, if I understood the product's description
correctly. I hope to benchmark, both with a quantitative tool and by "seat
of the pants." Will report anything interesting back here.

Just to clarify, Diskeeper doesn't automatically run the boot-time
process to defrag the paging file and MFT. The boot-time operation is
typically only necessary once in a while, and since a reboot is
necessary, we opted to not make it happen automatically. You can
specify when you want it to happen, but it doesn't run unless you
specifically tell it to.

The defragmentation of the rest of your files *is* handled
automatically if you choose for it to be.

I hope this helps -

Vaughn McMillan
Executive Software
 
Alex Nichol

John said:
I've never heard of setting the pagefile minimum and maximum to be equal,
let alone zero, however. I'll have to think about that.
I read long ago that if a computer has two hard drives, then it is
advantageous to place the OS on one drive, and the paging file on another.
I intend to implement that approach; alternatively, I might just leave
everything on the new drive that I will install today, as it will be
considerably faster than the old one...

Setting 'no page file' on the drive concerned, rebooting to defrag, then
starting up again is a way to ensure that the PF starts off on a clean
partition, and so is not fragmented. But you must use a defragmenter
that consolidates free space (neither the inbuilt one nor Diskeeper does
this), and be sure to put the page file back in action afterwards. Better
to move it to some second partition just while you do the defrag, always
leaving a minimal one (initial 2 MB, max 50 MB) on C:, which will not then
actually come into use.

Putting it on a second physical drive is in principle a good idea; putting
it on a second partition of the only drive is not. Read more at
www.aumha.org/win5/a/xpvm.htm
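
In registry terms (sizes purely illustrative, not a recommendation), the
PagingFiles value would then carry one entry per volume, something like:

    C:\pagefile.sys 2 50
    D:\pagefile.sys 1024 2048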
 
Alex Nichol

Greg said:
Due to how Windows accesses the pagefile, unless the pagefile is highly
fragmented, you may not notice any performance degradation. As Ricardo
mentioned, the best thing is to set the min/max to the same value.

PLEASE don't, unless you have free disk space going unused, way in
excess of the normal need for the file. For the reasons see my page at
www.aumha.org/win5/a/xpvm.htm
 
Greg Hayes/Raxco Software

John,

Are you running Diskeeper V8 or an earlier version? Earlier versions have
Frag Guard - which can automatically reboot and perform a boot-time defrag
(if configured to - it is disabled by default). V8 no longer appears to
have Frag Guard.

- Greg/Raxco Software
Microsoft MVP - Windows File System

Disclaimer: I work for Raxco Software, the maker of PerfectDisk - a
commercial defrag utility, as a systems engineer in the support department.

Want to email me? Delete ntloader.
 
Daniel L. Belton

Alex said:
Setting 'no page file' on the drive concerned, rebooting to defrag, then
starting up again is a way to ensure that the PF starts off on a clean
partition, and so is not fragmented. But you must use a defragmenter
that consolidates free space (neither the inbuilt one nor Diskeeper does
this), and be sure to put the page file back in action afterwards. Better
to move it to some second partition just while you do the defrag, always
leaving a minimal one (initial 2 MB, max 50 MB) on C:, which will not then
actually come into use.

Putting it on a second physical drive is in principle a good idea; putting
it on a second partition of the only drive is not. Read more at
www.aumha.org/win5/a/xpvm.htm
Actually, Diskeeper Pro version 8 will consolidate the free space in its
"set it and forget it" mode.

It's easiest to set Windows XP to delete the pagefile at shutdown, and then
it will re-create it when the system boots. This will defragment the
pagefile very nicely.
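
The setting involved is the ClearPageFileAtShutdown value under the same
Memory Management registry key - strictly speaking it zero-fills
pagefile.sys at shutdown rather than deleting it. A minimal C sketch of
turning it on (needs admin rights; illustrative only):

    /* Enable "clear pagefile at shutdown".  Note: this wipes the file's
       contents at shutdown; it does not by itself remove the file. */
    #include <windows.h>

    int main(void)
    {
        HKEY key;
        DWORD on = 1;
        if (RegOpenKeyEx(HKEY_LOCAL_MACHINE,
                "SYSTEM\\CurrentControlSet\\Control\\"
                "Session Manager\\Memory Management",
                0, KEY_SET_VALUE, &key) != ERROR_SUCCESS)
            return 1;
        RegSetValueEx(key, "ClearPageFileAtShutdown", 0, REG_DWORD,
                (const BYTE *)&on, sizeof(on));
        RegCloseKey(key);
        return 0;
    }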
 
John B

Thanks for your reply.
How does one establish the location of $MFT, in Windows 2000 Pro?
I entered "locate $MFT" as an exact phrase, in Google, and got a link to the
IRS! No help there!
 
John B

Thanks for your reply. Per my promise, I am posting what I can about the
results of my endeavor to speed up one W2K Pro computer, which had slowed
considerably over three years of use. The computer is at a remote location
that I visit every few months.
I was not able to install my benchmarking software, XMark, because the
computer has only a dial-up link to the Internet, and XMark requires a big
download in order to function. So no quantitative measurements were
possible.
I used Diskeeper to defrag everything, even though analysis indicated only
modest fragmentation. I had used the native defragger in September, when I
last visited the site.
I replaced a 20 GB ATA-100 hard drive with an 80 GB ATA-100 hard drive. The
old drive wasn't anywhere near full; it had used perhaps 4 GB of its space.
Both drives are Western Digital, my favorite brand. Why did I change
drives, then, you ask?
The new drive has an 8 MB buffer, to the old one's 2 MB, for starters.
The new drive is warranted for three years, starting now; most drives are
warranted for only a year, and the old drive is three years old. Though the
computer is a Dell with an excellent warranty that is about to expire, it
seemed prudent to install a highly reliable hard drive into this workhorse
computer now.
I set the paging file to be huge, though I didn't set min = max: I set
min = 3 GB and max to about 5 GB.
I used the Western Digital-supplied utility to transfer the contents of the
old hard drive to the new hard drive. The process took perhaps an hour, but
executed flawlessly and effortlessly.
I removed the old hard drive from the system entirely.

The system seemed to operate briskly upon completion. My customer reports
great satisfaction with the speed of her system now. I can't argue with
success, even if I can't "quantify" it!
I intend to register Diskeeper at this site within a few weeks. The
concept of scheduled defragmentation appeals to me. I'll probably set
min = max on the paging file on my next visit to the site; it slipped my
mind to do so on this last visit.
Cheers,
John
 
Alex Nichol

John said:
How does one establish the location of $MFT, in Windows 2000 Pro?
I entered "locate $MFT" as an exact phrase, in Google, and got a link to the
IRS! No help there!

If you run the defragmenter in question (it has a 'try out' version you can
download), its analysis of a drive will show you.
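
If you'd rather not install anything just to take a look, NTFS itself will
report where the MFT sits: the FSCTL_GET_NTFS_VOLUME_DATA ioctl returns,
among other things, the MFT's starting logical cluster. A minimal C sketch
(needs admin rights to open the volume; error handling kept to a minimum):

    /* Ask NTFS where the MFT starts on C:.  MftStartLcn is a logical
       cluster number; multiply by BytesPerCluster for a byte offset. */
    #include <windows.h>
    #include <winioctl.h>
    #include <stdio.h>

    int main(void)
    {
        NTFS_VOLUME_DATA_BUFFER v;
        DWORD got;
        HANDLE h = CreateFile("\\\\.\\C:", GENERIC_READ,
                FILE_SHARE_READ | FILE_SHARE_WRITE, NULL,
                OPEN_EXISTING, 0, NULL);
        if (h == INVALID_HANDLE_VALUE)
            return 1;
        if (DeviceIoControl(h, FSCTL_GET_NTFS_VOLUME_DATA, NULL, 0,
                &v, sizeof(v), &got, NULL))
            printf("MFT starts at cluster %I64d of %I64d total\n",
                   v.MftStartLcn.QuadPart, v.TotalClusters.QuadPart);
        CloseHandle(h);
        return 0;
    }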
 
Greg Hayes/Raxco Software

Daniel said:
Actually, Diskeeper Pro version 8 will consolidate the free space in its
"set it and forget it" mode.

Possibly. But how effective/efficient is it? If you actually look at how
Diskeeper's "Improved Free Space" mode works, you will notice that it only
occurs slowly, over a period of time. Each time it runs, it may reduce
the number of free space holes by a few. Unfortunately, you have no way of
knowing how many times Diskeeper will have to run in order to consolidate
the free space - and it isn't something that you can run manually to "force"
it to happen.

- Greg/Raxco Software
Microsoft MVP - Windows File System

Disclaimer: I work for Raxco Software, the maker of PerfectDisk - a
commercial defrag utility, as a systems engineer in the support department.

Want to email me? Delete ntloader.
 
