On Mon, 13 Dec 2004 19:05:46 -0000, "Gerry Cornell" wrote:
> Chris
> Hi!
> You can place files which constantly fragment in their own partitions.
> The page file and temporary internet files are examples.
Yes, but because these are so often accessed, I take a reverse
approach; I keep them on C: in the interests of less head travel, but
then I get most everything off C: and keep C: small, so that no matter
how bad it gets, max head travel is still a small % of HD's "length".
Also makes for faster C: defrags.
> In practice I never defragment the page file
In Win9x, I left the swap file to be managed automatically by the OS. If
I felt there was a need to stop swap from fragmenting or resizing, I'd
set a minimum that was above the max I thought I'd ever need, but
leave the maximum "open" in case I'd thought wrong.
In XP, I see the OS defaults to setting a page file maximum size, and
stupidly chooses these sizes based on RAM size. As if a 128M PC was
going to get by with less swap space than a 512M PC - yeah, right.
So my practice on XP is to set min=max=512M, which freezes whatever
fragmentation you already have into aspic forever. To reduce that
frag, I set the pagefile to None, and defrag until free space is
consolidated, then create the 512M pagefile and re-defrag to check that
it's not stuck at the far end of the volume for some reason.
I do this early in the build process, so pagefile becomes embedded
midway between the first-installed code files and the new-file edge.
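For reference, the min=max setting the Virtual Memory dialog writes also lives in the registry; a sketch of the value (a REG_MULTI_SZ, one "path initial-size max-size" line per pagefile, sizes in MB - the 512s match the figure above):

```
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management
  "PagingFiles" = "C:\pagefile.sys 512 512"
```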
Because the whole volume is only 8G (< 10% of HD size), no matter how
wrong I get this, it will be better than one big C: partition.
> temporary internet files.
Best thing to do with TIF is set a sensible capacity limit, e.g. 20M.
Be prepared to chase after newly-created accounts to fix this, as new
accounts always start with MS duhfaults and each has its own TIF.
> Another point I would make is that System Restore creates unnecessarily
> large files daily, so their regular removal using the disk cleanup option
> is good practice to ensure that a healthy amount of free space is
> maintained in the system partition.
Because I have core code on C: only, I disable SR on all other
volumes. That alone reduces the SR bulk factor; I reduce that further
by setting a lower max capacity, say 433M.
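If you'd rather set that cap outside the GUI slider, there's a per-machine registry value for it - location and default as best I recall them on XP, and the 2 here is just an illustrative percentage, roughly what 433M works out to on an 8G volume:

```
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\SystemRestore
  "DiskPercent" = 2    (REG_DWORD; default is 12)
```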
> If you use the My Documents folder you can change the default to a data
> folder to remove constantly changing files from the default location,
> which is where the system files are located.
I want the "My Docs" object to be small and clean, so I can back it up
easily, and restore it without fear of embedded malware.
So I override the default shell folder locations, as follows:
- "My Docs" to D:
- "My Pics", "My Music", "My Videos" out of "My Docs" to E:
- all incoming material OUT of "My Docs" to E:\Incoming
- all desktops to E:\Incoming\Desktops
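Those overrides live per-user in the registry (TweakUI can set the same things through a GUI); a sketch, with value names as I recall them on XP and drive-letter paths just mirroring the list above:

```
HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders
  "Personal"    = "D:\My Documents"
  "My Pictures" = "E:\My Pictures"
  "My Music"    = "E:\My Music"
  "My Video"    = "E:\My Videos"
  "Desktop"     = "E:\Incoming\Desktops"
```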
Part of keeping incoming material, and thus malware, out of the data
and backup sets is to avoid apps that mix your data with other people's
incoming junk. No Outlook .PST, no OE mailboxes - instead, I use
Eudora, keep the mail data in "My Docs", and attachments in E:\Incoming.
And if folks dump on the desktop, it doesn't matter anymore, because
they are no longer stinking up C: - and you can bet E:\Incoming is
permanently under the cross-hairs of my antivirus apps.
Now it's easy; a Task runs a .bat that automates an archiver to crunch
"My Docs" into a .ZIP that fits easily on USB stick or CDRW.
Trouble is, every time someone creates a new user account, that
account will screw this up completely until it's manually fixed.
Faced with a choice between risk management and the safety benefits
of limited user accounts, I chuck user accounts overboard first.
---------- ----- ---- --- -- - - - -
On the 'net, *everyone* can hear you scream