GZipStream Decompression Failure

Guest

Hi,

I have been using the new GZipStream classes and have been experiencing
problems when attempting to decompress files, which, from experience, seem
to fail when the original file size exceeds something like 64 MB.

For example, when I attempt to decompress a 1.18 MB compressed text file
back to its original size of 106 MB, I receive the following error message:

System.IO.IOException: Insufficient system resources exist to complete the requested service.
Source: mscorlib
StackTrace:
   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.FileStream.WriteCore(Byte[] buffer, Int32 offset, Int32 count)
   at System.IO.FileStream.Write(Byte[] array, Int32 offset, Int32 count)
   at %MethodName% in %SourceFilePath%:line 96
TargetSite: {Void WinIOError(Int32, System.String)}

However, it works perfectly well for files whose original size is less than
approximately 64 MB.

Can anyone kindly advise whether there is a solution, or whether this is
simply a flaw in the class?

Thanks
 
Jon Skeet [C# MVP]

I have been using the new GZipStream classes and have been experiencing
problems when attempting to decompress files, which, from experience, seem
to fail when the original file size exceeds something like 64 MB.

Can anyone kindly advise whether there is a solution, or whether this is
simply a flaw in the class?

Well, you haven't shown any of the code you're using to decompress.

Could you produce a short but complete program which demonstrates the
problem?
See http://pobox.com/~skeet/csharp/complete.html for what I mean by
that.

Jon
 
Guest

The Method:

public static void DecompressFile( FileInfo FileToDecompress, String DestinationDirectory, Boolean DeleteOriginalFile)
{
    FileStream fsSource = null;
    FileStream fsDestination = null;
    GZipStream compressedStream = null;

    try
    {
        Byte[] buffer;

        fsSource = new FileStream( FileToDecompress.FullName, FileMode.Open, FileAccess.Read, FileShare.Read);

        // The original file size may be obtained from the footer of the compressed file
        buffer = new Byte[ 4];
        fsSource.Position = Convert.ToInt32( fsSource.Length) - 4;
        fsSource.Read( buffer, 0, 4);
        Int32 _OriginalFileSize = BitConverter.ToInt32( buffer, 0);

        // Read the decompressed file contents
        buffer = new Byte[ _OriginalFileSize];
        fsSource.Position = 0;

        compressedStream = new GZipStream( fsSource, CompressionMode.Decompress, true);
        compressedStream.Read( buffer, 0, _OriginalFileSize);
        compressedStream.Flush();
        compressedStream.Close();

        // Write the decompressed data to a new file
        String _OriginalFileName = FileToDecompress.Name.Substring( 0, FileToDecompress.Name.Length - 5);
        fsDestination = new FileStream( DestinationDirectory + Path.DirectorySeparatorChar + _OriginalFileName, FileMode.Create, FileAccess.Write, FileShare.Write);
        fsDestination.Write( buffer, 0, _OriginalFileSize);
        fsDestination.Close();
        fsSource.Close();

        System.Diagnostics.Debug.WriteLine( String.Format( "The file {0} was decompressed as {1}.", FileToDecompress.Name, _OriginalFileName));

        if( DeleteOriginalFile)
        {
            FileToDecompress.Delete();
            System.Diagnostics.Debug.WriteLine( String.Format( "The following file was deleted: {0}", FileToDecompress.Name));
        }
    }
    catch( System.Exception ex)
    {
        System.Diagnostics.Debug.WriteLine( String.Format( "The following exception occurred within the application:\n{0}", ex.Message));
    }
    finally
    {
        if( fsSource != null) fsSource.Close();
        if( compressedStream != null)
        {
            compressedStream.Flush();
            compressedStream.Close();
        }

        if( fsDestination != null) fsDestination.Close();
    }
}
 
Jon Skeet [C# MVP]

The Method:

Please see http://pobox.com/~skeet/csharp/incomplete.html

However, it looks to me like the problem is that you're trying to read
the whole thing in one go. Aside from anything else, this is
horrendously inefficient in terms of memory. It's much better to use a
buffer, read into that, write from it, then repeat until you're done.

I've got a method in my MiscUtil library for copying a whole stream.
See
http://pobox.com/~skeet/csharp/miscutil

It won't *quite* work out of the box in this case because you've
appended the original size to the end of the compressed stream. Is
this absolutely required? If you really need the information, could
you not put it at the *start* of the stream rather than the end? (That
way it can be skipped over very easily, rather than giving a
compressed stream which has invalid data at the end.)
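
(For what it's worth, here is a minimal, untested sketch of that suggestion; it is
not the MiscUtil code, and the method names are illustrative only. The idea is to
write the original length as a small fixed-size header in front of the GZip data,
then read and skip that header before decompressing.)

// Untested sketch only: prefix the compressed file with the original length,
// then consume that prefix before handing the rest of the stream to GZipStream.
static void CompressWithSizeHeader(string sourcePath, string destPath)
{
    using (FileStream source = File.OpenRead(sourcePath))
    using (FileStream dest = File.Create(destPath))
    {
        // 8-byte length prefix written before the GZip data.
        byte[] sizeHeader = BitConverter.GetBytes(source.Length);
        dest.Write(sizeHeader, 0, sizeHeader.Length);

        using (GZipStream zip = new GZipStream(dest, CompressionMode.Compress))
        {
            byte[] buffer = new byte[4096];
            int read;
            while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
            {
                zip.Write(buffer, 0, read);
            }
        }
    }
}

static long ReadSizeHeader(Stream compressedSource)
{
    // Reads the 8-byte prefix (ignoring the partial-read corner case for brevity);
    // the GZip data starts immediately afterwards, so the caller can wrap the
    // same stream in a GZipStream once this returns.
    byte[] sizeHeader = new byte[8];
    compressedSource.Read(sizeHeader, 0, sizeHeader.Length);
    return BitConverter.ToInt64(sizeHeader, 0);
}

With the length at the front, nothing extra sits after the GZip data, so the
stream decompresses cleanly from the header onwards.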

Jon
 
Marc Gravell

Yowser! Buffer bomb!

You shouldn't be allocating a buffer for the entire file (which could
be huge); instead, create a small buffer and loop over the
data - something like (untested):

static void Main() {
    using (Stream inFile = File.OpenRead("in.gzip"))
    using (GZipStream zip = new GZipStream(inFile, CompressionMode.Decompress))
    using (Stream outFile = File.OpenWrite("out.whatever")) {
        CopyStream(zip, outFile);
        outFile.Close();
    }
}

static long CopyStream(Stream source, Stream destination) {
    if (source == null) throw new ArgumentNullException("source");
    if (!source.CanRead) throw new ArgumentException("Cannot read from source");
    if (destination == null) throw new ArgumentNullException("destination");
    if (!destination.CanWrite) throw new ArgumentException("Cannot write to destination");
    const int BUFFER_SIZE = 4096;
    byte[] buffer = new byte[BUFFER_SIZE];
    int bytesRead;
    long totalBytesRead = 0;
    while ((bytesRead = source.Read(buffer, 0, BUFFER_SIZE)) > 0) {
        destination.Write(buffer, 0, bytesRead);
        totalBytesRead += bytesRead;
    }
    destination.Flush();
    return totalBytesRead;
}

Marc
 
Guest

KABOOOOM!

OK, shoot me now!

I'm sure it's not that obvious that I'm new to Streams... yeah right!
Although I didn't (and still don't) totally understand streams, I do recall that
when I first started carving the code, I suspected it might be dumping the
whole buffer at once. Obvious, really, when you take a careful look at the
code!

I've altered it slightly so that it now only reads and writes a small buffer
at a time, and it works a treat.
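
(The exact altered code wasn't posted; as a rough reconstruction only, reusing
Marc's CopyStream helper from above, it presumably ends up looking something
like this:)

public static void DecompressFile(FileInfo fileToDecompress, string destinationDirectory, bool deleteOriginalFile)
{
    // Strip the five-character ".gzip" extension, as the original code did.
    string originalFileName = fileToDecompress.Name.Substring(0, fileToDecompress.Name.Length - 5);
    string destinationPath = Path.Combine(destinationDirectory, originalFileName);

    using (FileStream source = fileToDecompress.OpenRead())
    using (GZipStream zip = new GZipStream(source, CompressionMode.Decompress))
    using (FileStream destination = File.Create(destinationPath))
    {
        // Stream through a small buffer instead of allocating one byte[]
        // for the entire decompressed file.
        CopyStream(zip, destination);
    }

    if (deleteOriginalFile)
    {
        fileToDecompress.Delete();
    }
}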

Many thanks guys, you've given me the slap I needed!
 
Marc Gravell

[being new to streams]
Please rest assured that I also learnt this the hard way. No doubt Jon
or one of the other stalwarts set me on the right path - sorry if it
sounded condescending, it wasn't my intent - just to stress that a
radical change of direction was needed ;-p
Many thanks guys, you've given me the slap I needed!
No problem

Marc
 
