3
votes

I have a third-party API method that takes a stream as an input parameter and processes (reads) it. The stream I'm passing in is potentially huge (i.e. it doesn't fit into memory). I'd like to wrap this stream in some compression stream that compresses on the fly as the method reads it.

Is there any library that can do that? The compression doesn't need to be super-efficient; something ZIP-like is enough.

Example:

using (var data = File.OpenRead(...))
{
  using (var pack = new PackStream(data))
  {
    ThirdPartyApiMethod(pack);
  }
}

void ThirdPartyApiMethod(Stream s)
{
  // process the stream until EOF,
  // i.e. "while (s.Read(...) > 0) { ... }"
}

So the theoretical PackStream should read from the underlying stream (as little as possible at a time), compress the data, and return it to whoever asked.
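For reference, a pull-style wrapper like this can be hand-rolled over the BCL's DeflateStream, which writes compressed output into a buffer that the wrapper then serves on Read. A rough sketch; the PackStream name, the chunk size, and the buffering strategy are illustrative, not an existing API:

```csharp
using System;
using System.IO;
using System.IO.Compression;

// Sketch of a read-only Stream that compresses its source on demand.
class PackStream : Stream
{
    readonly Stream _source;
    readonly MemoryStream _buffer = new MemoryStream(); // compressed bytes land here
    readonly DeflateStream _deflater;
    readonly byte[] _chunk = new byte[81920];
    long _readPos;          // how far the caller has consumed _buffer
    bool _sourceExhausted;

    public PackStream(Stream source)
    {
        _source = source;
        // leaveOpen: true so disposing the deflater doesn't close _buffer
        _deflater = new DeflateStream(_buffer, CompressionMode.Compress, leaveOpen: true);
    }

    public override int Read(byte[] dest, int offset, int count)
    {
        // Pull from the source until the buffer has unread compressed bytes
        while (_buffer.Length - _readPos == 0 && !_sourceExhausted)
        {
            int n = _source.Read(_chunk, 0, _chunk.Length);
            if (n == 0)
            {
                _deflater.Dispose();   // emits the final deflate block
                _sourceExhausted = true;
            }
            else
            {
                _deflater.Write(_chunk, 0, n);
                _deflater.Flush();
            }
        }
        int available = (int)Math.Min(count, _buffer.Length - _readPos);
        Array.Copy(_buffer.GetBuffer(), _readPos, dest, offset, available);
        _readPos += available;
        // Reclaim the buffer once the caller has consumed everything in it
        if (_readPos == _buffer.Length) { _buffer.SetLength(0); _readPos = 0; }
        return available;   // 0 signals EOF to the caller
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing) _source.Dispose();
        base.Dispose(disposing);
    }

    public override bool CanRead => true;
    public override bool CanSeek => false;
    public override bool CanWrite => false;
    public override long Length => throw new NotSupportedException();
    public override long Position
    {
        get => throw new NotSupportedException();
        set => throw new NotSupportedException();
    }
    public override void Flush() { }
    public override long Seek(long o, SeekOrigin s) => throw new NotSupportedException();
    public override void SetLength(long v) => throw new NotSupportedException();
    public override void Write(byte[] b, int o, int c) => throw new NotSupportedException();
}
```

The output is raw deflate data, so the receiver would decompress it with a DeflateStream in Decompress mode. Note that on older .NET Framework versions DeflateStream.Flush is a no-op, but compressed bytes still appear progressively as the deflater's internal buffer fills.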

Update based on comments (see below for full story):
The method reads data in blocks and uploads each block to a server, one at a time, just storing it there. Because my file is about 100x bigger than a block, and even simple ZIP compression was able to shrink the file to about a tenth of its size, I believe compressing it is a good idea.

1
System.IO.Compression.GZipStream ? - I4V
@I4V I leaped at that too, but the problem is that GZipStream, in compression mode, will want to write to data, not read from it. It will only read from data in decompression mode. - Marc Gravell♦

1 Answer

1
votes

Use DotNetZip's ZlibStream. Unlike the BCL's GZipStream, DotNetZip's compression streams work in both directions: you can read from a stream in compression mode and receive compressed bytes, which is exactly the pull model you need. zlib itself is designed for stream compression.
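A minimal sketch of how this slots into the question's example, assuming the DotNetZip package is referenced (Ionic.Zlib namespace); the file name and the ThirdPartyApiMethod body are placeholders:

```csharp
using System.IO;
using Ionic.Zlib;   // DotNetZip

class Program
{
    static void Main()
    {
        using (var data = File.OpenRead("huge.dat"))   // placeholder path
        using (var pack = new ZlibStream(data, CompressionMode.Compress))
        {
            // Reading from `pack` pulls from `data`, compresses on the fly,
            // and yields zlib-compressed bytes; nothing is buffered in full.
            ThirdPartyApiMethod(pack);
        }
    }

    static void ThirdPartyApiMethod(Stream s)
    {
        var block = new byte[81920];
        while (s.Read(block, 0, block.Length) > 0)
        {
            // upload the block to the server
        }
    }
}
```

The server ends up storing zlib-format data, so whatever reads it back would wrap the download in a ZlibStream with CompressionMode.Decompress.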