Macrium backup larger than contents of drive...?

Doc

I've been using Macrium Reflect for a while and it's worked fine up to
now, but I just encountered an odd glitch. I set it to max compression,
but the backup file is actually larger than the hard drive contents.
This is the first time I've encountered this.

Any suggestions?

Thanks.
 
Paul

Doc said:
[snip]

The software designer has some choices.

For example, if the designer noticed that the compressor does a poor
job on a section of the archive, he has the option of storing the
uncompressed version of that section instead. That would cap the size
of the output archive at no bigger than the original files. The output
archive might then be no more than, say, 1% larger than the original
files, due to bookkeeping overhead.

It sounds like, in this case, the output of the compressor is being
accepted no matter what it puts out, big or small.
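
To illustrate the decision-making version, here is a minimal sketch in
Python (illustrative only; Macrium's actual archive format isn't
public, and the one-byte tag is just an assumption):

    import os
    import zlib

    def pack_chunk(chunk):
        # Compress a chunk, but fall back to storing it raw when
        # compression would make it bigger. One tag byte of overhead.
        compressed = zlib.compress(chunk, 9)
        if len(compressed) < len(chunk):
            return b"C" + compressed   # "C" = compressed
        return b"S" + chunk            # "S" = stored as-is

    # Random bytes stand in for already-compressed data: they won't
    # shrink, so they get stored, and growth is capped at the tag byte.
    chunk = os.urandom(4096)
    print(len(pack_chunk(chunk)))      # 4097, never more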

*******

You could:

1) Create a backup with compression disabled.
2) Run a copy of 7-Zip over the output file yourself.

That would take forever, but it would give good compression.
It would also complicate bare metal recovery later, when you
want to restore from backup. (You'd have to decompress the 7-Zip
archive first, then run Macrium to do the restore, probably
requiring an extra hard drive and a Linux LiveCD.)

I've used such methods, but they're not exactly convenient.
You're probably better off just dealing with the bloated
backup instead.
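
If you did want to script the two-step approach, it could look
something like this (a sketch; the file names are made up, and it
assumes the 7-Zip command-line tool is on your PATH):

    import subprocess

    # Step 2: squeeze an uncompressed Macrium image with 7-Zip.
    # "a" = add to archive, -mx=9 = maximum compression.
    subprocess.run(
        ["7z", "a", "-mx=9", "backup.7z", "backup_uncompressed.mrimg"],
        check=True,
    )

    # Before a restore, reverse it: "x" extracts with full paths.
    subprocess.run(["7z", "x", "backup.7z"], check=True)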

*******

Macrium has a forum, and you can ask about the behavior over
there.

These are links to some of the key technologies involved. Because of
the second one, when the backup runs, the output could be emitted in
cluster order, rather than in file order. So the compressor may not be
looking at an entire, contiguous file when it does a compression. It
could be looking at clusters from different files, sitting next to one
another (if the original disk is fragmented). [And no, this is not a
suggestion to defragment the disk first... :) It's a partial
explanation of the tough job the developer has to do, to make good
compressions.] As far as I know, Macrium is not a "file by file" backup
tool, the way NTBACKUP might be. So it could be a trifle complicated to
have the compressor pick and choose what to compress.
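
Here's a toy demonstration of why that matters (Python; made-up data
and block sizes, nothing to do with Macrium's real on-disk layout).
Assuming the imaging tool compresses in independent fixed-size blocks,
interleaved clusters from two files compress worse than contiguous
ones, because repeats get split across blocks:

    import os
    import zlib

    CLUSTER = 4096
    page = os.urandom(CLUSTER)
    file_a = page * 8                  # one page repeated: redundant
    file_b = os.urandom(CLUSTER * 8)   # unique data: incompressible

    a = [file_a[i:i+CLUSTER] for i in range(0, len(file_a), CLUSTER)]
    b = [file_b[i:i+CLUSTER] for i in range(0, len(file_b), CLUSTER)]

    contiguous = b"".join(a + b)
    interleaved = b"".join(x for pair in zip(a, b) for x in pair)

    def block_compress(data, block=8192):
        # Compress each block independently and total the sizes.
        return sum(len(zlib.compress(data[i:i+block], 9))
                   for i in range(0, len(data), block))

    print(block_compress(contiguous))    # repeats share a block: smaller
    print(block_compress(interleaved))   # repeats split up: larger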

http://en.wikipedia.org/wiki/Macrium_Reflect

"Abraham Lempel LZ based compression..."

http://en.wikipedia.org/wiki/Volume_Snapshot_Service

A bloated archive would result from the developer doing something
like this: simple pipelining of processes, with no decision making
along the way.

disk ---> VSS ---> ALZ ---> write_to_disk
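
You can reproduce the symptom of that naive pipeline directly: push
incompressible data straight through a compressor, and the output
comes out slightly larger than the input (Python sketch):

    import os
    import zlib

    data = os.urandom(1_000_000)   # stands in for incompressible disk contents
    out = zlib.compress(data, 9)
    print(len(data), len(out))     # output is slightly LARGER than input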

HTH,
Paul
 
Desmond

Doc said:
[snip]

Paul said:
[snip]

Hi, this is an interesting topic; I've never used Macrium Reflect. I
have always used Norton Ghost and to date it has never let me down.
Does Macrium Reflect offer better?

Desmond.
 
Paul

Desmond said:
[snip]
Hi, this is an interesting topic; I've never used Macrium Reflect. I
have always used Norton Ghost and to date it has never let me down.
Does Macrium Reflect offer better?

Desmond.

The advantage of Macrium Reflect is that it uses VSS, which allows it
to back up C: while the OS is running. Busy files are not a problem
for VSS.

As far as I know, older versions of Ghost would require a reboot into
the Ghost tools. Before VSS was invented, C: would have had busy files,
and busy files could not be backed up by conventional backup tools.
Booting into the tool to do the backup was the solution at that time.
But then you couldn't use the computer while the backup was running.
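
If you're curious, you can poke at VSS from the command line yourself.
A sketch (it assumes an elevated prompt on Windows; wmic is deprecated
on current releases, so treat it as illustrative):

    import subprocess

    # Ask VSS for a point-in-time snapshot of C:. A backup tool then
    # reads the frozen shadow copy, so busy files are no obstacle.
    subprocess.run(
        ["wmic", "shadowcopy", "call", "create", "Volume=C:\\"],
        check=True,
    )

    # List the shadow copies that currently exist.
    subprocess.run(["vssadmin", "list", "shadows"], check=True)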

I took some screenshots of a Macrium backup and restore operation.
(There is one typo in my descriptive text, near the bottom.)
Hold your mouse over the filmstrip and click to magnify (in Firefox).
What this shows is the backup and restore of a test install of W7
in a virtual machine. That's how I can make screenshots.

http://img31.imageshack.us/img31/4512/macriumrestore.gif

Macrium includes a bootable CD image, and also offers the option
to build a WinPE disc of some sort. That requires downloading the
WAIK kit from Microsoft, a 1GB download. The built-in bootable CD
(which doesn't require you to build anything) worked fine for me,
and doesn't require that huge download. So there are two methods of
making a bootable recovery CD for bare metal restores, and I just
used the easy (included) one. It's a good idea to test that the CD
boots, of course.

Paul
 
Yousuf Khan

Doc said:
[snip]

Is it possible that you are backing up directory junctions or file
softlinks that point to directories on other drives, or even files that
are duplicated within the same filesystem?
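
A quick way to check for that (a sketch, assuming Windows and Python
3.8 or later; the starting path is just an example). Junctions and
symlinks are both NTFS reparse points, so listing them shows what an
image might be counting twice:

    import os
    import stat

    def find_reparse_points(root):
        # Walk a tree and yield junctions/symlinks (reparse points).
        for dirpath, dirnames, filenames in os.walk(root):
            for name in dirnames + filenames:
                full = os.path.join(dirpath, name)
                try:
                    st = os.stat(full, follow_symlinks=False)
                except OSError:
                    continue
                if st.st_file_attributes & stat.FILE_ATTRIBUTE_REPARSE_POINT:
                    yield full

    for p in find_reparse_points(r"C:\Users"):
        print(p)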

Yousuf Khan
 
