I have two shader programs: one for rendering textured sprites and a second one for rendering polygons. I have enabled blending and the Z-buffer like so:
GLES20.glEnable(GLES20.GL_BLEND);
GLES20.glBlendFunc(GLES20.GL_ONE, GLES20.GL_ONE_MINUS_SRC_ALPHA);
GLES20.glEnable(GLES20.GL_DEPTH_TEST);
GLES20.glDepthFunc(GLES20.GL_LEQUAL);
GLES20.glDepthMask(true);
GLES20.glDepthRangef(0, maxZDepth); // maxZDepth = 100f
My rendering consists of two glDrawElements calls: one for the sprites, followed immediately by one for the polygons, each with its own shader program. The objects' data (vertices etc.) is submitted sorted from the lowest Z value to the highest, and I also had to add this instruction to the sprite fragment shader:
if (gl_FragColor.a == 0.0)
    discard; // skip fully transparent fragments so they don't write to the depth buffer
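
In simplified form, the render loop looks roughly like this (just a sketch; the program/handle names and the bind* helpers are placeholders, not my exact code):

// clear both color and depth before the two passes
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);

// Pass 1: sprites, submitted sorted by ascending Z
GLES20.glUseProgram(spriteProgram);
bindSpriteAttributesAndTexture(); // placeholder: attribute pointers, texture bind
GLES20.glDrawElements(GLES20.GL_TRIANGLES, spriteIndexCount,
        GLES20.GL_UNSIGNED_SHORT, 0);

// Pass 2: polygons, also sorted by ascending Z
GLES20.glUseProgram(polygonProgram);
bindPolygonAttributes(); // placeholder: attribute pointers
GLES20.glDrawElements(GLES20.GL_TRIANGLES, polygonIndexCount,
        GLES20.GL_UNSIGNED_SHORT, 0);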
Now, blending and the Z-buffer work correctly, but only within the scope of one shader program at a time. The blending of objects drawn by the first shader doesn't seem to be taken into account when the second shader draws... Here's an example:

The sprite here has a higher Z value than the brown polygon beneath it, which is why it's drawn on top of the polygon, but blending fails and you can see the grey background (set by glClearColor) showing through around the sprite...
Does anybody know a good solution to this problem? I thought about combining the two shader programs into one, so there would be only one draw call, which I hope would solve it, but I'd prefer to keep two separate shader programs for sprites and polygons...
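
For reference, the combined fragment shader I have in mind would just branch on a uniform flag; this is only a sketch of the idea (uniform/varying names are made up), not code I've actually written:

precision mediump float;

uniform sampler2D uTexture;
uniform int uUseTexture; // 1 = sprite path, 0 = polygon path
varying vec2 vTexCoord;
varying vec4 vColor;

void main() {
    if (uUseTexture == 1) {
        // sprite path: sample the texture and drop fully transparent fragments
        gl_FragColor = texture2D(uTexture, vTexCoord);
        if (gl_FragColor.a == 0.0)
            discard;
    } else {
        // polygon path: plain vertex color
        gl_FragColor = vColor;
    }
}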