I'm working on a piece of cross-platform (Windows and Mac OS X) C code that needs to encrypt/decrypt blobs using AES-256 in CBC mode with a block size of 128 bits. Among the various libraries and APIs available, I've chosen OpenSSL.
This code then uploads the blob using a multipart-form PUT to a server, which decrypts it using the same settings in .NET's crypto framework (Aes, CryptoStream, etc.).
The problem I'm facing is that the server-side decryption works fine when the local encryption is done on Windows, but it fails when the encryption is done on Mac OS X - the server throws a "Padding is invalid and cannot be removed" exception.
I've looked at this from many perspectives:
- I verified that the transport is correct - the byte array received by the server's decrypt method is exactly the same as the one sent from Mac OS X and from Windows
- The actual content of the encrypted blob, for the same key, is different between Windows and Mac OS X. I tested this with a hardcoded key, running the same code on both platforms against the same blob
- I'm sure the padding is correct, since it is taken care of by OpenSSL and the same code works on Windows. Even so, I tried implementing the padding scheme exactly as it appears in Microsoft's reference source for .NET, but still no go
- I verified that the IV is the same on Windows and Mac OS X (I thought maybe there was a problem with some of the special characters, such as ETB, that appear in the IV, but there wasn't)
- I've tried LibreSSL and mbedtls, with no positive results. In mbedtls I also had to implement the padding myself because, as far as I know, padding is the responsibility of the API's user
- I've been at this problem for almost two weeks now and I'm starting to pull my (ever scarce) hair out
For reference, I'll post the C client's code for encrypting and the server's C# code for decrypting. Some minor details on the server side are omitted (they don't affect the crypto code).
Client:
/*++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++*/
void
__setup_aes(EVP_CIPHER_CTX *ctx, const char *key, qvr_bool encrypt)
{
    static const char *iv = ""; /* for security reasons, the actual IV is omitted... */

    if (encrypt)
        EVP_EncryptInit(ctx, EVP_aes_256_cbc(), key, iv);
    else
        EVP_DecryptInit(ctx, EVP_aes_256_cbc(), key, iv);
}
/*++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++*/
void
__encrypt(void *buf,
          size_t buflen,
          const char *key,
          unsigned char **outbuf,
          size_t *outlen)
{
    EVP_CIPHER_CTX ctx;
    int blocklen = 0;
    int finallen = 0;
    int remainder = 0;

    __setup_aes(&ctx, key, QVR_TRUE);

    EVP_CIPHER *c = ctx.cipher;
    blocklen = EVP_CIPHER_CTX_block_size(&ctx);
    //*outbuf = (unsigned char *) malloc((buflen + blocklen - 1) / blocklen * blocklen);
    remainder = buflen % blocklen;
    *outlen = remainder == 0 ? buflen : buflen + blocklen - remainder;
    *outbuf = (unsigned char *) calloc(*outlen, sizeof(unsigned char));

    EVP_EncryptUpdate(&ctx, *outbuf, outlen, buf, buflen);
    EVP_EncryptFinal_ex(&ctx, *outbuf + *outlen, &finallen);
    EVP_CIPHER_CTX_cleanup(&ctx);
    //*outlen += finallen;
}
Server:
static Byte[] Decrypt(byte[] input, byte[] key, byte[] iv)
{
    try
    {
        // Check arguments.
        if (input == null || input.Length <= 0)
            throw new ArgumentNullException("input");
        if (key == null || key.Length <= 0)
            throw new ArgumentNullException("key");
        if (iv == null || iv.Length <= 0)
            throw new ArgumentNullException("iv");

        byte[] unprotected;
        using (var encryptor = Aes.Create())
        {
            encryptor.Key = key;
            encryptor.IV = iv;
            using (var msInput = new MemoryStream(input))
            {
                msInput.Position = 0;
                using (var cs = new CryptoStream(msInput, encryptor.CreateDecryptor(),
                                                 CryptoStreamMode.Read))
                using (var data = new BinaryReader(cs))
                using (var outStream = new MemoryStream())
                {
                    byte[] buf = new byte[2048];
                    int bytes = 0;
                    while ((bytes = data.Read(buf, 0, buf.Length)) != 0)
                        outStream.Write(buf, 0, bytes);
                    return outStream.ToArray();
                }
            }
        }
    }
    catch (Exception ex)
    {
        throw ex;
    }
}
Does anyone have any clue as to why this could be happening? For reference, this is the .NET class from Microsoft's reference source that (I think) does the decryption: https://gist.github.com/Metaluim/fcf9a4f1012fdeb2a44f#file-rijndaelmanagedtransform-cs
Comments:
- Check EVP_CIPHER_iv_length() and EVP_CIPHER_key_length() (mostly to ensure that you aren't using data past the end of either the key or IV arrays). I'd also suggest testing with trivial values (e.g. key and IV set to all 0, or all ~0). - Hasturkun
- Try encrypting with openssl enc as well and check whether you get the same results. (Btw, my suspicion is that either the Windows or the OS X build is using uninitialized data as part of the IV or key, leading to differing results. Hence my suggestions to check the key, the IV, and their lengths.) - Hasturkun
- Your outlen computation fails: PKCS#7 always pads, so you need to add 1 to 16 bytes to the size, not 0 to 15, i.e. it should always be buflen + blocklen - remainder. - Maarten Bodewes