4
votes

I'm using the OpenSSL Crypto library in my C# project to encrypt/decrypt files. Here is my code:

byte[] key = System.Text.Encoding.ASCII.GetBytes("password");
byte[] iv = System.Text.Encoding.ASCII.GetBytes("1234");

OpenSSL.Crypto.CipherContext cc = new OpenSSL.Crypto.CipherContext(
    OpenSSL.Crypto.Cipher.AES_256_ECB);

FileStream fIn = new FileStream("C:\\file.txt", FileMode.Open, 
    FileAccess.Read);
FileStream fOut = new FileStream("C:\\encrypted.txt", FileMode.OpenOrCreate,
    FileAccess.Write);
fOut.SetLength(0);

byte[] bin = new byte[100];
long rdlen = 0;
long totlen = fIn.Length;
int len;

DateTime start = DateTime.Now;
while (rdlen < totlen)
{
    // argument 1
    len = fIn.Read(bin, 0, 100);         
    // argument 2
    fOut.Write(cc.Crypt(bin,key,iv,true),0,100);                 
    rdlen = rdlen + len;
}

fOut.Flush();  
fOut.Close();
fIn.Close();

As a result I got this exception:

Offset and length were out of bounds for the array or count is greater than the number of elements from index to the end of the source collection.

When I changed the values of argument 1 and argument 2 from 100 to 64 (bin is still byte[100]), it worked: the file was encrypted and decrypted, but the decrypted file was larger than the original and had one or two extra lines at the end of the text file.


2 Answers

3
votes

I don't know the library, but one problem here is that you're encrypting chunks of 100 bytes while AES has a 16-byte (128-bit) block size; the 256 in AES-256 refers to the key length, not the block size. Your chunks should be multiples of 16 bytes. The extra bytes at the end of the decrypted file are likely just the padding that rounds the final block up to a multiple of 16 bytes.

As in Philip's answer, though, the likely cause of the crash is the hard-coded 100 in the Write. The Crypt function will be returning some multiple of the 16-byte block size, which need not be 100: if the output for a chunk comes up short of 100 bytes, Write(..., 0, 100) throws exactly the out-of-bounds exception you're seeing. And in the case that appears to work, chances are your 100 bytes of input are being padded out to 112 encrypted bytes, so you're losing the last 12 bytes of each chunk when you only write 100.
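If the wrapper uses OpenSSL's default PKCS#7 padding (a reasonable assumption, though I haven't verified it for this library), the padded output length is easy to predict. A quick sketch of the arithmetic, in Python for brevity:

```python
def pkcs7_padded_len(n, block=16):
    # PKCS#7 always appends at least one padding byte,
    # rounding the total up to the next multiple of the block size
    return (n // block + 1) * block

print(pkcs7_padded_len(100))  # 112 -> writing a fixed 100 drops 12 bytes
print(pkcs7_padded_len(64))   # 80
print(pkcs7_padded_len(37))   # 48 -> shorter than 100, so Write(..., 0, 100) throws
```

So neither "exactly 100 out" nor "same size out as in" ever holds; only the returned array's own length is safe to write.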

Also

  1. you're passing in an IV in ECB mode; ECB doesn't use one, so it's unnecessary.
  2. by repeatedly calling Crypt you're presumably redoing the key setup for every 100 bytes, which is inefficient; you only need to do it once at the start of the encryption. Look for a way to initialise the context with the key (and, in other modes, the IV) and then feed it blocks of data, rather than passing the key to Crypt every time. I don't know what that call is in this library, but it ought to exist. As it stands you also can't use CBC or any similar chained mode, because you'd be applying the IV every 100 bytes rather than once, and never chaining the last block between adjacent 100-byte chunks.
  3. if you want to use Crypt, why not just load the whole file into memory at once? I realise that won't scale to gigabytes of data, but it's likely fine in your normal use case. Or at least pick a much larger chunk size, e.g. 256 KB. Either way, you'd still face the repeated key setup and the broken CBC chaining once you go above one chunk.
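To illustrate point 2: the shape to look for is "initialise once, update per chunk", where the cipher object carries its state across calls. A toy sketch in Python using an XOR keystream (emphatically not real encryption, and not this library's API; purely to show that per-chunk calls must share state rather than restart):

```python
class ToyCipher:
    """Stand-in for a streaming cipher context: key setup happens once."""
    def __init__(self, key: bytes):
        self.key = key   # imagine this is the expensive key schedule
        self.pos = 0     # running position = state carried across chunks

    def update(self, data: bytes) -> bytes:
        # XOR each byte against a repeating keystream, continuing
        # from wherever the previous chunk left off
        out = bytes(b ^ self.key[(self.pos + i) % len(self.key)]
                    for i, b in enumerate(data))
        self.pos += len(data)
        return out

one_shot = ToyCipher(b"secret").update(b"hello world")

c = ToyCipher(b"secret")                      # key setup once
chunked = c.update(b"hello ") + c.update(b"world")

assert chunked == one_shot  # state carries across chunks correctly
```

Calling Crypt with the key every iteration is the equivalent of constructing a fresh ToyCipher per chunk, which both repeats the setup cost and resets the chaining state.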
1
votes

When you specify 100 in your call to fIn.Read(...), you are saying "read up to 100 bytes"; the actual number read may be smaller, so you should use the return value to determine how many bytes were actually read.

And in the call to fOut.Write, you are assuming that the output of cc.Crypt(bin, key, iv, true) will be exactly 100 bytes, which is not a valid assumption. Also note that you are always encrypting all 100 bytes of bin, even if you only read 1 byte from the file. If you read fewer than 100, you would be encrypting whatever was left over in bin (zeros, unless previously overwritten).

Fix these length issues, and you should be on the right track. Something like:

while (rdlen < totlen)
{
    len = fIn.Read(bin, 0, 100);
    // encrypt only the bytes actually read, not all of bin
    byte[] chunk = new byte[len];
    Array.Copy(bin, chunk, len);
    byte[] encrypted = cc.Crypt(chunk, key, iv, true);
    // write however many bytes Crypt actually returned
    fOut.Write(encrypted, 0, encrypted.Length);
    rdlen = rdlen + len;
}
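The "read up to N bytes" contract is the same in most stream APIs, not just FileStream. A small Python sketch (illustrative only) of relying on the return value instead of the requested size:

```python
import io

src = io.BytesIO(b"x" * 250)   # a 250-byte "file"
buf = bytearray(100)
sizes = []
while True:
    n = src.readinto(buf)      # may return fewer than 100 on the last pass
    if n == 0:
        break                  # 0 means end of stream
    sizes.append(n)            # only buf[:n] is valid data

print(sizes)  # [100, 100, 50]
```

The last chunk is 50 bytes, which is exactly the situation where a hard-coded length of 100 goes out of bounds.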