See you are wrong, again.
It is my experience that there isn't much "wrong" or "right" outside of
the realm of religion, and especially not in engineering. It's more a
matter of "more or less", of finding the appropriate qualifier,
mostly...
Files are cached. Caching makes fragmentation irrelevant.
Correct, but restricted to files that fit into the cache. A single
compilation and debugging run may touch many hundreds of MB of files, and
not all of this will fit into a typical cache. The same goes for audio and
video -- and unlike compilation runs, these are pretty mainstream.
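To put rough numbers on this -- all of them assumed for illustration, not measured -- a crude model shows how quickly the benefit of caching falls off once the working set outgrows the cache:

```python
# Back-of-envelope: how cache size limits the benefit of caching.
# Every number here is an illustrative assumption, not a measurement.

working_set_mb = 800        # files touched by a big compile/debug run (assumed)
cache_mb = 256              # memory available for the file cache (assumed)
cache_access_ms = 0.01      # serving a block from RAM (assumed)
disk_access_ms = 10.0       # average seek + rotational latency (assumed)

# Crude model: hits occur in proportion to how much of the working set fits.
hit_ratio = min(1.0, cache_mb / working_set_mb)
avg_access_ms = hit_ratio * cache_access_ms + (1 - hit_ratio) * disk_access_ms

print(f"hit ratio: {hit_ratio:.0%}, average access: {avg_access_ms:.2f} ms")
```

With these assumed numbers the average access ends up close to the raw disk latency, because the misses dominate -- which is exactly the regime where fragmentation still matters.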
So that leaves files read for the first time. Your OS files will
probably be unfragmented after install.
Correct, but probably not a real-world scenario. How many updates has a
typical user installed since the OS installation? Probably at least SP2,
and hopefully all of the critical updates as they are released. A
(significant?) portion of the system files therefore will be fragmented on
a typical system where temp and data files reside on the same partition as
the system files.
Same goes for applications installed after some time of running the system.
Not everybody installs everything before doing anything with the system. So
the application executables will be fragmented, too, which probably has an
effect on application load times.
That leaves data files. How many data files does one read at a time? I
only load one Word doc at a time. Even if it's in a thousand pieces it
will still load faster than I can react.
Correct, if you're talking about a single Word file. But while that's
probably a quite typical scenario, there are others. Uncompressed databases
can reach substantial sizes, the debug files compilers generate (and the
debug versions of the libraries) are quite big, media files can have sizes
that make disk performance beyond caching significant. In all these cases,
seek times can take up a significant portion of the access time if the
files are highly fragmented. And file access time can take up a significant
portion of the overall processing time.
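A "thousand pieces" really is harmless for a small Word doc, but for the large files mentioned above the seeks add up. A rough sketch, with assumed (not measured) drive figures:

```python
# Rough cost of fragmentation for one large sequential read.
# Assumptions: 700 MB media file, 50 MB/s sustained transfer,
# 9 ms average seek per fragment boundary. All figures illustrative.

file_mb = 700
transfer_mb_per_s = 50.0
seek_ms = 9.0

transfer_s = file_mb / transfer_mb_per_s          # 14 s of pure transfer

for fragments in (1, 100, 1000):
    seek_s = (fragments - 1) * seek_ms / 1000.0   # extra seeks between pieces
    total_s = transfer_s + seek_s
    print(f"{fragments:5d} fragments: {total_s:.1f} s "
          f"({seek_s / total_s:.0%} spent seeking)")
```

Under these assumptions a file in a thousand pieces spends several extra seconds, and a substantial fraction of its total read time, just moving the heads -- negligible for a 50 KB document, not for a 700 MB one.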
And then there's of course the page file, which you didn't mention at all.
I don't know whether, and if so how, page file access is cached by the
system's disk access cache, but it doesn't really matter. If it is, the
size of the cache available for other files is (possibly substantially)
reduced, and all the problems with limited cache size mentioned above just
become worse. If it isn't (and also if it is, but to a lesser degree,
because usually the page file will be bigger than the cache), the
fragmentation of the page file obviously influences access times to its
content.
So it seems to me that while caching is a very good thing to help lessen
the impact of disk performance on overall system performance, it is not a
technique that makes disk and file access performance completely irrelevant
in all scenarios. The impact of fragmentation of course depends on the
relevant use cases, the amount of memory available for the cache and the
disk seek times, among other factors.
Think about it also from a different angle... Saying that defragging
doesn't matter is at least very close to, if not the same as, saying
that disk seek times don't matter. Yet disk manufacturers still strive to
minimize seek times. Why would they do that? Victims of a general
misunderstanding of the real priorities? Or just technically insignificant
marketing? Not impossible, of course.
Gerhard