I have the following java code:
public static void main(String[] args) {
    int a = 1;
    byte b = -14;
    int c = (a << 8) | b;
    System.out.println(c);
}
Which produces:
-14
I would have expected 498. As I understand things, the OR operation should look like this after shifting: 1 0000 0000 | 1111 0010, and I would expect the result to be an int that looks like 0000 0001 1111 0010, for a value of 498. Clearly Java is somehow sign-extending the byte when it ORs it into int c. Can anyone show me the correct way to do this?
What I'm trying to do is make an int where the least significant byte is b and the 2nd least significant byte is a.
int c = ((a << 8) | ~(~0 << 8) & b);
This will give you 498. The expression ~(~0 << 8) evaluates to 0xFF, which masks off the high bits produced when b is sign-extended during its promotion to int. – chauhraj
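For what it's worth, here is a minimal sketch of the same fix using the more conventional `b & 0xFF` idiom, which does the identical masking (0xFF == ~(~0 << 8)); the class name `ByteMaskDemo` is just for illustration:

```java
public class ByteMaskDemo {
    public static void main(String[] args) {
        int a = 1;
        byte b = -14;
        // b is sign-extended to 0xFFFFFFF2 when promoted to int;
        // masking with 0xFF keeps only its low 8 bits (0xF2 = 242).
        int c = (a << 8) | (b & 0xFF);
        System.out.println(c); // prints 498
    }
}
```

The key point is that any arithmetic or bitwise operation on a byte first promotes it to int with sign extension, so a negative byte must be masked back down to 8 bits before it is ORed into a wider value.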