Though GPUs are primarily designed for floating-point data types, I'd be interested in how fast a GPU can process bitwise operations. These are among the fastest possible operations on a CPU, but does a GPU emulate bitwise operations, or are they fully executed in hardware? I'm planning to use them inside shader programs written in GLSL. I'd also assume that if bitwise operations run at full performance, integer data types should too, but I'd like confirmation of that.
To be more precise, the targeted versions are OpenGL 3.2 and GLSL 1.50. The hardware that should run this is any Radeon HD graphics card and the GeForce 8 series and newer. If there are any major changes in newer versions of OpenGL and GLSL related to the processing speed of bitwise operations/integers, I'd be glad if you could point them out.
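To illustrate the kind of usage I have in mind, here is a rough sketch of a GLSL 1.50 fragment shader that unpacks channels from a packed integer with shifts and masks (the uniform name is made up, this is just an example of the operations in question):

```glsl
#version 150

// Hypothetical example: an RGBA8 color packed into a single uint uniform.
uniform uint packedColor;

out vec4 fragColor;

void main()
{
    // Extract each 8-bit channel with right shifts and bitwise AND.
    uint r = (packedColor >> 24u) & 0xFFu;
    uint g = (packedColor >> 16u) & 0xFFu;
    uint b = (packedColor >>  8u) & 0xFFu;
    uint a =  packedColor         & 0xFFu;

    // Convert to normalized floats for output.
    fragColor = vec4(r, g, b, a) / 255.0;
}
```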