Is there a better XP defrag?


Andrew

Is there a better defrag for XP Home?

I liked the old one, where you could see it moving the clusters around and thus be able to guess how long it would take.

Is it possible to defrag just certain files or folders? I bet that's a silly question. Do you just move a file from one drive to the other and back?
 

Carey Frisch [MVP]

PerfectDisk Version 7.0
http://www.raxco.com/products/perfectdisk2k/

Try out PerfectDisk free for 30 days:
http://www.raxco.com/products/downloadit/perfectdisk2000_download.cfm

--
Carey Frisch
Microsoft MVP
Windows XP - Shell/User
Microsoft Newsgroups

Get Windows XP Service Pack 2 with Advanced Security Technologies:
http://www.microsoft.com/athome/security/protect/windowsxp/choose.mspx

 

Plato

Andrew said:
Is there a better defrag for XP Home?

I liked the old one, where you could see it moving the clusters around and thus be able to guess how long it would take.

In other words, you'd be willing to pay for bells and whistles.
 

Edward W. Thompson

Andrew said:
Is there a better defrag for XP Home?

As a direct answer to your question: the defrag utility included with Windows XP will defrag your system effectively, just more slowly than third-party software. If speed is of the essence, purchase PerfectDisk or Diskeeper, but they don't defrag any better, only more quickly, and they have a prettier interface, all at a cost.
 

Leythos

Kelly said:
Hi Andrew,

Nothing more is needed. Fact is, you don't need to run it at all in XP.

Kelly, you are wrong. While many people who never run it don't realize they needed to, it is something that can increase drive-related performance on a cluttered drive. Defragmenting files means the R/W heads don't have to seek to another sector to finish loading a file, which means there is less dead time during a file read.

If you don't believe it, consider what FILE fragmentation really is and how a drive reads files.

While many people now have large drives that are only 30% used, there are a great many people with small drives that are more than 80% full, and the same goes for people with large drives and limited free space.

In either case, the number of fragments a file is broken into directly impacts read performance.

The stripped-down XP defrag program will defragment files, but it's not that good at "packing" the drive. Programs like O&O Defrag or Diskeeper are specifically designed to move files to the beginning of the drive, to defragment files, to defragment white space, and to offer scheduling, in addition to the ability to avoid fragmenting files in the first place when possible.

I personally have seen many machines that were crawling (performance-wise) restored to a much higher performance state by using a quality defrag tool (the XP one was never tried, since Diskeeper or O&O had been purchased).
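
To put rough numbers on the dead-time argument, here is a minimal back-of-the-envelope sketch in Python. It is an illustration, not something from this thread; the 9 ms average seek and 50 MB/s transfer rate are assumed figures for a typical IDE drive of the period.

# Rough estimate of how fragment count adds seek overhead to one file read.
# The 9 ms seek and 50 MB/s transfer figures are assumptions, not measurements.

def read_time_seconds(file_mb, fragments, seek_ms=9.0, transfer_mb_s=50.0):
    transfer = file_mb / transfer_mb_s        # time spent actually moving data
    seeks = fragments * (seek_ms / 1000.0)    # one head repositioning per fragment
    return transfer + seeks

for frags in (1, 100, 1000):
    print(frags, "fragments:", round(read_time_seconds(500, frags), 1), "s")

# Roughly: 1 fragment ~10.0 s (pure transfer), 100 fragments ~10.9 s,
# 1000 fragments ~19.0 s -- the seeks nearly double the read time.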
 

Plato

Kelly said:
Nothing more is needed. Fact is, you don't need to run it at all in XP.

Well, one doesn't _need_ another defragger. But it's still good to run the built-in one every now and then, assuming you delete temp/tmp/Internet cache files in advance.
 

Leythos

I'm not going to argue with you; however, I will stand by my previous statement.

I was just hoping you would understand the technical aspect of what file fragmentation means for read performance: a head that is moving but not reading is wasted time.

Maybe I can explain it better with a simple file copy/sort.

Say you have one hard drive and one raw text file, 500 MB in size, that you want to sort and then write out to another file. The operation (from the drive's point of view) is something like this:

1) Position heads for the next read segment of data
2) Read some data
3) Sort routine acts on it
4) Position heads for the next write segment of data
5) Write output file data
6) Repeat 1-5 until complete

In this example, the heads have to move to reach the output file's area each time there is a write, which slows down the operation.

In a two-drive version of the above:

1) Position D1 heads for the initial read of data
2) Position D2 heads for the initial write of data
3) D1: read some data
4) Sort routine acts on it
5) D2: write some data
6) Repeat 3-5 until complete

The key here is that the heads don't have to reposition on either drive; they just keep moving sequentially to the next sector, saving LOTS of time.

This is overly simplified and doesn't account for multiple processes, but you get the idea: it's about head movement during which no data is being read, due to a gap in the file.

Once you understand how that gap impacts drive performance, there is nothing to argue about; it's simply a matter of read versus can't-read, and the time wasted getting to the next part of the fragmented file.
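
To illustrate the head-movement argument above, here is a minimal sketch (again my own illustration; the chunk count and the 9 ms seek figure are assumptions) that simply counts the long head repositionings in the one-drive and two-drive cases:

# Count long head repositionings for the copy/sort example above.
# One drive: the single head must jump between the input and output areas
# for every chunk. Two drives: each head positions once, then streams.
# The chunk count and the 9 ms seek figure are assumptions.

CHUNKS = 1000  # e.g. 500 MB processed in 512 KB chunks

def one_drive_seeks(chunks):
    seeks = 0
    head = None                  # which file area the head currently sits over
    for _ in range(chunks):
        if head != "input":      # jump to the input region to read
            seeks += 1
            head = "input"
        if head != "output":     # jump to the output region to write
            seeks += 1
            head = "output"
    return seeks

def two_drive_seeks(chunks):
    return 2                     # one initial positioning per drive

print("one drive :", one_drive_seeks(CHUNKS), "long seeks")   # 2000
print("two drives:", two_drive_seeks(CHUNKS), "long seeks")   # 2

# At ~9 ms per long seek, the single-drive run spends about 18 s just
# moving the head -- dead time that the two-drive run avoids.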
 

David Candy

See, you are wrong again.

Files are cached. Caching makes fragmentation irrelevant. So that leaves files read for the first time. Your OS files will probably be unfragmented after install. That leaves data files. How many data files does one read at a time? I only load one Word doc at a time. Even if it's in a thousand pieces, it will still load faster than I can react.

Now, a file copy of lots of fragmented files will take a while longer. Big deal; how often does one do that? The first application startup and system startup will take longer, but prefetch reduces the effect of this. It may not even be measurable.

Defragging a floppy-only system pays good dividends, especially if SMARTDRV isn't being used. But as computers get faster and faster, it doesn't matter.

I use PerfectDisk. I do it out of habit and because I like tidy computers.
 

Andrew

Thanks, Kelly.
P.S. Your Korner's a great source for all sorts of stuff! Inspiring.

Kelly said:
<LOL>

Andrew said:
Is that a fact? So I'm smart, and I thought I was just lazy.. :>)
 

Kelly

Thank you, David.

--

All the Best,
Kelly (MS-MVP)

Troubleshooting Windows XP
http://www.kellys-korner-xp.com



"David Candy" <.> wrote in message
See you are wrong, again.

Files are cached. Caching makes fragmentation irrelevent. So that leaves
files read for the first time. Your OS files will be probably unfragmented
after install. That leaves data files. How many data files does one read at
a time. I only load one word doc at a time. Even if it's in a thousand
pieces it will still load faster than I can react.

Now a file copy of lots of fragmented files will take a while longer. Big
deal, how often does one do that. The first application startup and system
startup will take longer, but prefetch reduces the effect of this. It may
not be measurable.

Defragging a floppy only system pays good dividends, especially if smartdrv
isn't being used. But as computers get faster and faster it doesn't matter.

I use perfect disk. I do it from habit and because I like tidy computers.
 

Gerhard Fiedler

David Candy said:
See, you are wrong again.

It is my experience that there isn't much "wrong" or "right" outside the realm of religion, and especially not in engineering. It's more a matter of "more or less", or of finding the appropriate qualifier... :)

David Candy said:
Files are cached. Caching makes fragmentation irrelevant.

Correct, but restricted to files that fit into the cache. A single
compilation and debugging run may touch many hundreds of MB of files, and
not all of this will fit into a typical cache. Same goes for audio and
video -- and other than compilation runs these are pretty mainstream.

David Candy said:
So that leaves files read for the first time. Your OS files will probably be unfragmented after install.

Correct, but probably not a real-world scenario. How many updates has a
typical user installed since the OS installation? Probably at least SP2,
and hopefully all of the critical updates as they are released. A
(significant?) portion of the system files therefore will be fragmented on
a typical system where temp and data files reside on the same partition as
the system files.

Same goes for applications installed after some time of running the system.
Not everybody installs everything before doing anything with the system. So
the application executables will be fragmented, too, which probably has an
effect on application load times.

David Candy said:
That leaves data files. How many data files does one read at a time? I only load one Word doc at a time. Even if it's in a thousand pieces, it will still load faster than I can react.

Correct, if you're talking about a single Word file. But while that's
probably a quite typical scenario, there are others. Uncompressed databases
can reach substantial sizes, the debug files compilers generate (and the
debug versions of the libraries) are quite big, media files can have sizes
that make disk performance beyond caching significant. In all these cases,
seek times can take up a significant portion of the access time if the
files are highly fragmented. And file access time can take up a significant
portion of the overall processing time.
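
To make that concrete, here is a rough model in Python (my own illustration, not from the thread; the 256 MB cache, 9 ms seek, and 50 MB/s transfer figures are all assumed) of how seeks come to dominate once a file no longer fits in the cache:

# Back-of-the-envelope model of re-reading a large file that only partly
# fits in the system cache. All parameters are assumed for illustration.

def access_time_s(file_mb, fragments, cache_mb=256, seek_ms=9.0, transfer_mb_s=50.0):
    miss_fraction = max(0.0, 1.0 - cache_mb / file_mb)       # part that must come from disk
    transfer = (file_mb * miss_fraction) / transfer_mb_s
    seeks = fragments * miss_fraction * (seek_ms / 1000.0)   # fragments actually hit on disk
    return transfer + seeks

# A 700 MB media file, of which the cache can hold 256 MB:
print(round(access_time_s(700, fragments=1), 1))      # ~8.9 s, nearly all transfer
print(round(access_time_s(700, fragments=5000), 1))   # ~37.4 s, seeks dominate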

And then there's of course the page file, which you didn't mention at all.
I don't know whether, or if so, how, page file access is cached by the
system's disk access cache, but it doesn't really matter. If it is, the
size of the cache available for other files is (possibly substantially)
reduced, and all the problems with limited cache size mentioned above just
become worse. If it isn't (and also if it is, but to a lesser degree,
because usually the page file will be bigger than the cache), the
fragmentation of the page file obviously influences access times to its
content.


So it seems to me that while caching is a very good thing to help lessen
the impact of disk performance on overall system performance, it is not a
technique that makes disk and file access performance completely irrelevant
in all scenarios. The impact of fragmentation of course depends on the
relevant use cases, the amount of memory available for the cache and the
disk seek times, among other factors.


Think about it from a different angle, too... Saying that defragging doesn't matter is at least very close to, if not the same as, saying that disk seek times don't matter. Yet disk manufacturers still strive to minimize seek times. Why would they do that? Are they victims of a general misunderstanding of the real priorities? Is it just technically insignificant marketing? Not impossible, of course.

Gerhard
 
