I'm probably very wrong here, so please point out any misconceptions or mistakes I've made.
The input for this assignment is a string of characters followed by a series of 4-byte unsigned integers in little-endian form. I read the input into an STL string and used substr() to isolate just the integer bytes into a new string.
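For context, the reading step looks roughly like this (a simplified sketch; the function name and headerLength are made up for this post, not my actual assignment code):

```cpp
#include <fstream>
#include <iterator>
#include <string>

// Read the whole file into a std::string (binary mode so no bytes are
// altered), then drop the leading text so only the raw integer bytes remain.
// headerLength stands in for however the real assignment finds that boundary.
std::string readIntegerBytes(const std::string& path, std::size_t headerLength)
{
    std::ifstream in(path, std::ios::binary);
    std::string data((std::istreambuf_iterator<char>(in)),
                     std::istreambuf_iterator<char>());
    return data.substr(headerLength);
}
```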
From what I understand, the least significant byte is stored first, i.e. in the leftmost (lowest-addressed) position. The unsigned integer 131071 (decimal), whose binary representation is 0000 0000 0000 0001 1111 1111 1111 1111, would then be stored as 1111 1111 1111 1111 0000 0001 0000 0000.
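To double-check that, I printed the bytes of that value on my machine (which I'm assuming is little-endian); just a small sketch of the check:

```cpp
#include <cstdio>
#include <cstring>

int main()
{
    unsigned int value = 131071;            // 0x0001FFFF
    unsigned char bytes[sizeof value];
    std::memcpy(bytes, &value, sizeof value);

    // On a little-endian machine this prints: ff ff 01 00
    for (unsigned char b : bytes)
        std::printf("%02x ", b);
    std::printf("\n");
}
```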
Parsing this new string 4 bytes at a time, I thought the 1st byte would contain bits 0-7 (values 0-255), the 2nd byte bits 8-15, and so forth for the 3rd and 4th bytes.
I took the 1st byte, cast it to an unsigned int, and shifted it 24 bits; I did the same with the 2nd and 3rd bytes, except with 16- and 8-bit shifts respectively.
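Per 4-byte chunk, what I'm doing looks roughly like this (simplified; the function and variable names are invented for this post, but the shifts are the ones I described):

```cpp
#include <string>

// chunk is one 4-byte slice of the integer portion of the input string.
// These are the shifts described above: 1st byte shifted 24 bits,
// 2nd byte 16 bits, 3rd byte 8 bits, 4th byte left as-is.
unsigned int toUint(const std::string& chunk)
{
    unsigned int b0 = static_cast<unsigned char>(chunk[0]);
    unsigned int b1 = static_cast<unsigned char>(chunk[1]);
    unsigned int b2 = static_cast<unsigned char>(chunk[2]);
    unsigned int b3 = static_cast<unsigned char>(chunk[3]);
    return (b0 << 24) | (b1 << 16) | (b2 << 8) | b3;
}
```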
I got 255 and 65535 for the 1st and 2nd bytes as expected, but 0 for the 3rd byte. I believe I need to use the & operator somewhere, but I can't figure out where. If anyone could give me some advice, or suggest a less foolish way of accomplishing the same task, I would greatly appreciate it.