(PeteCresswell)
My media server is recording TV to a couple of 2-TB onboard
discs.
In trying to track down some weirdness in playback, I noticed
that both discs were heavily fragmented:
http://tinyurl.com/d4r7v84
Each defragging pass consolidates only a small percentage of
the space - the gain from any single pass seems insignificant.
Assuming that defragging is warranted, I'm thinking of two
possibilities:
- Some sort of .Bat or .Cmd file to defrag iteratively, on the
  assumption that eventually a disc's data can be mostly
  consolidated (a sketch follows this list).
- Buy another 2-TB disc and, instead of defragging a disc,
  copy its contents to the third disc and then swap the third
  disc in for the original.
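
For the iterative route, something like this .cmd sketch would
do it - a minimal sketch, assuming Windows' built-in defrag.exe
and that the recording volume is D: (both assumptions; adjust
to suit):

    @echo off
    rem Defrag the same volume repeatedly, on the theory that
    rem each pass consolidates a bit more of the space.
    rem D: and the 10-pass count are assumptions - adjust.
    for /L %%i in (1,1,10) do (
        echo Pass %%i of 10...
        defrag D: /U
    )

For the copy-it-all route, a plain robocopy D:\ E:\ /E (drive
letters again assumed) would carry everything over to the empty
third disc, and files written sequentially to an empty disc
land essentially unfragmented.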
The Questions:
- Could defragging actually have a perceptible effect? Or is
  it one of those things that sounds good in theory but has
  little practical value - at least for media playback?
- Given that several large new files (roughly six gigs per
  hour of HD) are created each day, and about as many are
  deleted on an average day over the week, might fragmentation
  just be a fact of life - and trying to stay on top of it
  something of a fool's errand?
- Is there any hope for the iterative approach? Or is there
  some inherent limit determined by % usage? (A way to measure
  progress between passes is sketched below.)
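
One way to tell whether the passes are getting anywhere:
defrag can analyze without actually defragging, so the reported
fragmentation can be compared before and after each pass. A
minimal sketch, again assuming D: is the recording volume
(Vista or later):

    rem Report fragmentation statistics without defragging.
    defrag D: /A /V

If the fragmentation figure barely moves across many passes,
the amount of free space is probably the limiting factor.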