Partitioning a 2 TB HD for Windows 7 64-bit

Paul

Ed said:
What do you think of Smart Defrag? It's free and appears to do that stuff.

http://www.iobit.com/iobitsmartdefrag.html

There are two aspects to defragmenters.

1) Actual defragmentation. To defragment a file, all that's needed is for the
file to be contiguous. There is nothing to say it has to be placed at any
particular location on the disk. An example of a "pure" defragmenter is the
Sysinternals "contig" program. It just tries to put the file into a set of
clusters next to one another, and considers the job done at that point. If
you then looked at a "defrag map", there would be green dots all over the place
(no apparent order, but no fragmentation visible). There's a small scripted
example of this approach after point 2 below.

2) The second aspect is "optimization". For example, JKDefrag would move
big files to one area of the disk, folders somewhere else, and so on. These
would be "optimization policies". The Windows built-in might "push everything
to the left" as its optimization policy. Optimization is a large part of
what distinguishes the various third-party defragmenters. That, and their
execution speed.
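
Here's that scripted example: a minimal sketch of driving the Sysinternals
contig tool from Python. It assumes contig.exe is on the PATH and that the
target path (which is made up) exists; as far as I recall, -a only analyzes
and reports the fragment count, while running it without -a actually makes
the file contiguous.

    # Minimal sketch: per-file defragmentation with Sysinternals contig.exe.
    # Assumes contig.exe is on the PATH; the target path is only an example.
    import subprocess

    target = r"D:\downloads\bigfile.iso"   # hypothetical file to make contiguous

    # Report how fragmented the file currently is (-a = analyze only).
    subprocess.run(["contig.exe", "-a", target], check=True)

    # Defragment it: contig only cares that the clusters end up contiguous,
    # not where on the disk they land.
    subprocess.run(["contig.exe", target], check=True)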

Some defragmenters even know how to move file system metadata, which is actually
a handy feature, because it makes it easier to "shrink" the file system later.
Apparently, Microsoft doesn't know how to do this (to the same extent),
while some defragmenter developers have figured it out.

The OS has a set of APIs for defragmentation. Most defragmentation products
will be using these, because of their "safety" aspects. A side effect of that
"safety" is that relatively small disk commands are issued as the defragmenter
works. On my disks here (cheapo disks), I get anywhere between 1 MB/sec and
3 MB/sec while the defragmenter works. Frequently, it's better to use
techniques other than defragmentation to clean up a file system (for example,
copy everything off, reformat, and copy it back). When a defragmenter isn't
finished after an all-night run, that's when it's time to use something other
than a defragmenter to do the job.
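
If you just want to know whether a long run is worth scheduling at all, the
built-in tool can be asked for an analysis pass only. A minimal sketch,
assuming an elevated prompt and using C: purely as an example volume:

    # Minimal sketch: analysis-only pass with the built-in Windows defragmenter.
    # Needs an elevated (administrator) prompt; C: is just an example volume.
    import subprocess

    result = subprocess.run(["defrag", "C:", "/A"],   # /A = analyze, move nothing
                            capture_output=True, text=True)
    print(result.stdout)   # prints the fragmentation report, including the percentage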

The Windows 7 defragmenter finishes in no time. And even if I haven't run it
in a while, it doesn't seem to take that long to finish. It's not at all like
my WinXP experience.

Paul
 
Joseph Terner

Daniel Prince said:
What is the best way to partition a 2 TB hard drive for Windows 7 64-bit?
I am thinking of a small "C" drive for Windows and programs and a big
"D" for all my data. Another possibility is a small "C" drive for
Windows, a medium sized "D" for my programs, and a big "E" for all my
data.

Which do you think is better? What size do you think I should make each
logical drive? Thank you in advance for all replies.

Put files for fast read-only access on the outside of the spindle. In
your case this is a small partition for "Windows and your
programs" (There's no difference, because Windows is also a collection of
programs). Put big files, where access speed isn't an issue (downloads,
installation files, disk images, video files etc.), on the inside. Put
the remainder in between.

Leave some space unallocated if you don't need it from the start and
create additional partitions or resize existing ones later as needed.
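
If you prefer to script the layout rather than click through the Windows
installer or Disk Management, the sketch below writes a diskpart script you
can review first. The disk number, sizes, and labels are placeholders only,
not recommendations, and "clean" wipes whatever disk is selected, so check
"list disk" before you run anything.

    # Minimal sketch: generate a diskpart script for one possible 2 TB layout.
    # The disk number, sizes, and labels are placeholders only -- review plan.txt
    # before running it, because "clean" erases the selected disk's partition table.
    PARTITIONS_MB = [("System", 100 * 1024),   # small OS + programs partition (outer tracks)
                     ("Work",   400 * 1024),   # medium partition for everything in between
                     ("Data",   800 * 1024)]   # big partition for large, speed-insensitive files
    # Whatever is left beyond these sizes simply stays unallocated for later.

    lines = ["select disk 1",   # hypothetical disk number -- check with "list disk" first
             "clean",
             "convert gpt"]     # use "convert mbr" instead for a BIOS-boot system disk
    for label, size_mb in PARTITIONS_MB:
        lines += [f"create partition primary size={size_mb}",
                  f"format fs=ntfs label={label} quick",
                  "assign"]     # let Windows pick the next free drive letter

    with open("plan.txt", "w") as f:
        f.write("\n".join(lines) + "\n")

    print("Review plan.txt, then run it with:  diskpart /s plan.txt")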

Joseph
 
Rod Speed

Joseph Terner said:
Daniel Prince wrote
Put files for fast read-only access on the outside of the spindle. In
your case this is a small partition for "Windows and your programs"

Waste of time if you have enough of a clue to only reboot every month
or so, and don’t close apps when you stop using them for a while.
(There's no difference, because Windows is also a collection of programs).

That's just plain wrong, given what gets loaded at boot time.
Put big files, where access speed isn't an issue (downloads,
installation files, disk images, video files etc.), on the inside.

No point in doing that.
Put the remainder in between.

No point in doing that either. It makes more sense
to have the non-OS files and apps on the outer part of the
drive too, because they mostly do get used at times.
Leave some space unallocated if you don't need it from the start

That makes no sense either.
and create additional partitions

That makes no sense either.
or resize existing ones later as needed.

Easier said than done to do that safely if it’s the only 2TB drive you have.
 
Alias

The problem with memory bit rot is when critical system files get read
into bad memory then written back out to disk. You get a slow,
insidious corruption of the system and the subsequent lockups and
crashes are put down to "Windows just does that".

I never back up the system, only data.
 
Rod Speed

Just the puerile shit that's all it can ever manage when
it's got done like a ****ing dinner, as it always is.
 
Rod Speed

GreyCloud said:
Rod Speed wrote
Yet I have provided a first time study into the matter.

It isn't anything like the first.
"A two-and-a-half year study of DRAM on 10s of thousands Google servers
found DIMM error rates are hundreds to thousands of times higher than
thought — a mean of 3,751 correctable errors per DIMM per year.

That last is a fantasy and it's completely trivial to prove
that the average PC doesn't get anything like that.
This is the world’s first large-scale study of RAM errors in the field.

Wrong, as always.
It looked at multiple vendors, DRAM densities and DRAM types including
DDR1, DDR2 and FB-DIMM.
Every system architect and motherboard designer should read it carefully."

Anyone with even half a clue can check if the PCs they are using
get anything like that error rate, and use CRCs to ensure that if it
happens, you get notified that it's happened.
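
For what it's worth, the CRC part really is cheap. Here's a minimal sketch,
Python standard library only, that records a CRC32 per file under a folder
(the path is just an example) and flags anything whose checksum changes on
the next run; files you edited on purpose will show up too, of course.

    # Minimal sketch: keep a CRC32 manifest of the files under a folder and flag
    # anything whose checksum changes between runs. The root path is only an
    # example; legitimate edits are flagged the same way as corruption.
    import json, os, zlib

    ROOT = r"D:\data"                 # hypothetical folder to watch
    MANIFEST = "crc_manifest.json"

    def crc32_of(path, bufsize=1 << 20):
        crc = 0
        with open(path, "rb") as f:
            while chunk := f.read(bufsize):
                crc = zlib.crc32(chunk, crc)
        return crc & 0xFFFFFFFF

    def scan(root):
        crcs = {}
        for dirpath, _, names in os.walk(root):
            for name in names:
                full = os.path.join(dirpath, name)
                crcs[full] = crc32_of(full)
        return crcs

    if __name__ == "__main__":
        current = scan(ROOT)
        if os.path.exists(MANIFEST):
            with open(MANIFEST) as f:
                previous = json.load(f)
            for path, crc in current.items():
                if path in previous and previous[path] != crc:
                    print("CRC changed (possible corruption):", path)
        with open(MANIFEST, "w") as f:
            json.dump(current, f, indent=1)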
 
Rod Speed

Funny that my pair of old DEC VAX4000s do have this built into the
hardware.

It's just one way of doing it, and the most expensive way to do it.
Guess that is why they can achieve B3 security certification while the
best UNIX can do is C.
You are way out of your league in this one.
It is rather obvious that you don't know what you are talking about.

It's obvious that you don't have a ****ing clue about
how to check if a file has been corrupted, for peanuts.
 
Rod Speed

GreyCloud said:
I'm sure you are then aware that even though it was considered 32-bit in its
time, the actual RAM and registers were 36 bits wide?
It used the extra 4 bits to not only detect errors but also to correct
those errors... and if it happened too much you'd see it in the error
logs. On the hard drive, if this happened, then the OS would map out the
defective areas so that the data would not go there any more.
Get too many of these and it is time to replace the hard drive. Same goes
for RAM.
It was a very good system and it was a boon to mission-critical services.
During the 9/11 disaster, these systems managed to save all of the market's
transactions because they are fault-tolerant and disaster-recovery systems.

All completely irrelevant to how you can ensure that
files have not been corrupted much more cheaply.
They do cost more...

Not enough more to matter, and CRCs for the files cost nothing.
the mobo that will support it costs more, along with the ECC RAM.

But nothing like your stupid $10K claim.
And mostly found in the cheaper servers.

Because there are better and cheaper ways to ensure that
the files aren't corrupted.
Sure there is.
But you never mention what the checks are.

Everyone can see for themselves that you are lying on that last.

By claiming that there is only one way to detect corruption of files.
And if you do it in hardware, you won't have to do this on your own.

Stupid doing it in hardware. You can do it in software for free.
That is why ZFS file system for now on Intel hardware is about the best
you are going to get.

Wrong, as always. It's just one way of ensuring corruption doesn't happen.
 
Rod Speed

GreyCloud said:
And using another route through a German server.

Wrong, as always. It just happens to be the best
and cheapest reliable Usenet server around, fool.
 
Ed Light

Also, something I'm looking at for pictures is M-Disc DVDs. The writers
are just $10 more than the "normal" DVD writers, the discs are about $3
each, but I don't have THAT many pictures, and the discs will outlast
about anything, unless you try to destroy them. That would be for items
that never change, like pictures.

Nero can do the Secure Disk thing (maybe it requires certain writers -
my Optiarc 7241S is ok), which I haven't tried, but you can control the
amount of extra error correction data it puts in, looking forward to
when some bits become unreadable.
Ah, well done!

Thanks! (Patting self on back.) :)

About Roddy:
He breaks paragraphs
so he can take things out of context almost all the time. It's fine to
remove parts you aren't responding to, but changing the meaning by
careful editing is just being brain dead.

Too insidious of the vituperative one.
--
Ed Light

Better World News TV Channel:
http://realnews.com

Iraq Veterans Against the War and Related:
http://ivaw.org
http://couragetoresist.org
http://antiwar.com

Send spam to the FTC at
(e-mail address removed)
Thanks, robots.
 
Ed Light

I'd have to try it out to really tell you for sure, but the feature set looks
OK. ....
I also don't like the realtime defrag, as it can mess with real-time ops,
but that's from personal experience, and probably outdated.

I, too, totally can't imagine defragging going on while using the
computer. On all my builds for friends, I turn it off in Windows. I'm
assuming it can be turned off in SmartDefrag. There is a self-contained
version, not installed into Windows, at portableapps.com. I've used it in
the past briefly, and it can do a deep defrag where it intends to speed
everything up by arranging files by what's used most, etc.

I tell the friends to defrag and even give them an icon that instantly
runs jkdefrag, but they never do it, and their computers have gotten kind
of slow by the time of the twice-a-year maintenance.

As to the Win 7 defragger, I think it works with Superfetch to optimize
things that you use a lot. Or maybe Superfetch just does that
independently. But someone said that defragger's not so great.

I used to disable Superfetch, but I'm trying it out in case I get
quicker boots. It may take a while for it to learn my habits and
rearrange things.
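
For anyone who'd rather script those two switches than dig through the GUI,
here's a minimal sketch. The task and service names are what I believe
Windows 7 uses (ScheduledDefrag under \Microsoft\Windows\Defrag, and SysMain
for the Superfetch service); verify them on your own machine and run it from
an elevated prompt.

    # Minimal sketch: turn off the built-in scheduled defrag and the Superfetch
    # service from a script. Task and service names are assumptions for Windows 7
    # ("\Microsoft\Windows\Defrag\ScheduledDefrag" and "SysMain"); run elevated.
    import subprocess

    def disable_scheduled_defrag():
        # Disable the Task Scheduler entry that runs the built-in defragmenter weekly.
        subprocess.run(["schtasks", "/Change", "/TN",
                        r"\Microsoft\Windows\Defrag\ScheduledDefrag", "/Disable"],
                       check=True)

    def disable_superfetch():
        # Stop the Superfetch service and keep it from starting at boot.
        subprocess.run(["sc", "stop", "SysMain"], check=False)   # may already be stopped
        subprocess.run(["sc", "config", "SysMain", "start=", "disabled"], check=True)

    if __name__ == "__main__":
        disable_scheduled_defrag()
        disable_superfetch()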

--
Ed Light

Better World News TV Channel:
http://realnews.com

Iraq Veterans Against the War and Related:
http://ivaw.org
http://couragetoresist.org
http://antiwar.com

Send spam to the FTC at
(e-mail address removed)
Thanks, robots.
 
Joseph Terner

On Sun, 27 May 2012 20:16:54 -0400, charlie wrote:
[Rod's nonsense snipped]
The only "safe" thing to do with a 2TB drive is to back it up regularly.
Otherwise, a very large loss of data, etc. is possible.

Separating data by usage pattern and making backup and restore (!) easier
has been the whole point of partitioning such a large drive for decades.

Besides, Windows 7 supports resizing partitions out of the box, so you
can adjust partitions later as needed.
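
And if you'd rather do the resizing from a script than from Disk Management,
diskpart exposes the same shrink/extend operations. A rough sketch, with the
volume letter and the amount as placeholders (shrink can fail or fall short
if immovable files sit near the end of the volume):

    # Minimal sketch: shrink volume D: by 50 GB using diskpart's scripted mode.
    # The drive letter and amount are placeholders; run from an elevated prompt.
    import os, subprocess, tempfile

    script = "select volume D\nshrink desired=51200\n"   # diskpart sizes are in MB

    with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
        f.write(script)
        path = f.name

    subprocess.run(["diskpart", "/s", path], check=True)
    os.remove(path)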

Joseph
 
Rod Speed

Joseph Terner said:
charlie wrote
Separating data by usage pattern and making backup and restore (!)
easier has been the whole point of partitioning such a large drive
for decades.

Even sillier. There is no point in partitioning when
any decent backup app can do incremental backups.
Besides, Windows 7 supports resizing partitions out
of the box, so you can adjust partitions later as needed.

And if you are actually stupid enough to do that without an
image of the entire physical drive, you get what you deserve
when it all goes pear shaped.
 
