We have been prototyping a scheme where we encrypt and decrypt data between two systems, one in .NET and the other in Java, using simple 128-bit AES encryption.
The problem I am facing is trivial, but I cannot find a proper solution; perhaps my understanding of AES, or of encryption in general, is lacking.
Assume we have a predefined key, represented by the following hex string: "9c361fec3ac1ebe7b540487c9c25e24e". This is a 16-byte (128-bit) key. The encryption part in Java would be:
import java.nio.charset.StandardCharsets;
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

final byte[] rawKey = hexStringToByteArray("9c361fec3ac1ebe7b540487c9c25e24e");
final SecretKeySpec skeySpec = new SecretKeySpec(rawKey, "AES");
// Instantiate the cipher; plain "AES" leaves mode and padding to the JCE provider
final Cipher cipher = Cipher.getInstance("AES");
cipher.init(Cipher.ENCRYPT_MODE, skeySpec);
// Encode the plaintext with an explicit charset so both platforms produce the same bytes
final byte[] encrypted = cipher.doFinal(plainText.getBytes(StandardCharsets.UTF_8));
The 'hexStringToByteArray' function converts the hex string to a byte array. The problem is that in Java, bytes are signed, so the byte 0x9C comes out as -100 rather than 156 (as it would in .NET).
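For completeness, here is a minimal sketch of what 'hexStringToByteArray' is assumed to do; the actual implementation isn't shown above, so this is just one plausible version:

public static byte[] hexStringToByteArray(final String hex) {
    final byte[] out = new byte[hex.length() / 2];
    for (int i = 0; i < out.length; i++) {
        // Parse two hex digits into an int (0-255), then narrow to a byte;
        // the cast keeps the low 8 bits, so 0x9C becomes the signed value -100
        out[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
    }
    return out;
}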
In Java the key array prints as: -100, 54, 31, -20, 58, -63, -21, -25, -75, 64, 72, 124, -100, 37, -30, 78
In .NET, however, it prints as: 156, 54, 31, 236, 58, 193, 235, 231, 181, 64, 72, 124, 156, 37, 226, 78
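Note that these two listings describe the same bit patterns; Java simply prints each byte through a signed lens. A quick check, assuming the rawKey array from the snippet above:

System.out.println(rawKey[0]);               // prints -100 (signed view of 0x9C)
System.out.println(rawKey[0] & 0xFF);        // prints 156 (unsigned view, as .NET shows it)
System.out.println(rawKey[0] == (byte) 156); // prints true: identical 8-bit pattern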
Question: Given that the printed representation of the key bytes differs between the two platforms, would it affect the encryption process itself? This is simple encryption, with no CBC mode or padding explicitly configured.
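One way to test this directly is to pin the transformation string on the Java side and hex-encode the ciphertext so it can be diffed byte-for-byte against the .NET output. A rough sketch (the transformation string and sample plaintext here are assumptions for the test, not necessarily what either system ends up using):

final byte[] rawKey = hexStringToByteArray("9c361fec3ac1ebe7b540487c9c25e24e");
// Spell out mode and padding rather than relying on provider defaults
final Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(rawKey, "AES"));
final byte[] encrypted = cipher.doFinal("Hello World".getBytes(StandardCharsets.UTF_8));

// Hex-encode so the ciphertext can be compared with the .NET side
final StringBuilder hex = new StringBuilder();
for (final byte b : encrypted) {
    hex.append(String.format("%02x", b & 0xFF)); // mask to print the unsigned value
}
System.out.println(hex);

If both platforms print the same hex string for the same key and plaintext, the signed/unsigned printing difference evidently did not matter.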