2
votes

Criteria: I’m using OpenGL with shaders (GLSL) and trying to stay with modern techniques (e.g., trying to stay away from deprecated concepts).

My questions, in a very general sense (see below for more detail) are as follows:

  1. Do shaders allow you to do custom blending that help eliminate z-order transparency issues found when using GL_BLEND?
  2. Is there a way for a shader to know what type of primitive is being drawn without “manually” passing it some sort of flag?
  3. Is there a way for a shader to “ignore” or “discard” a vertex (especially when drawing points)?

Background: My application draws points connected with lines in an ortho projection (vertices have varying depth in the projection). I’ve only recently started using shaders in the project (trying to get away from deprecated concepts). I understand that standard blending has ordering issues with alpha testing and depth testing: basically, if a “translucent” pixel at a higher z level is drawn first (thus blending with whatever colors were already drawn to that pixel at a lower z level), and an opaque object is then drawn at that pixel but at a lower z level, depth testing prevents changing the pixel that was already drawn for the “higher” z level, thus causing blending issues. To overcome this, you need to draw opaque items first, then translucent items in ascending z order. My gut feeling is that shaders wouldn’t provide an (efficient) way to change this behavior—am I wrong?

Further, for speed and convenience, I pass information for each vertex (along with a couple of uniform variables) to the shaders, and they use the information to find a subset of the vertices that need special attention. Without duplicating that logic in the app itself (and slowing things down) I can’t know a priori what subset of vertices that is, so I send all vertices to the shader. However, when I draw “points” I’d like the shader to ignore all the vertices that aren’t in the subset it determines. I think I can get the effect by setting alpha to zero and using an alpha function in the GL context that will prevent drawing anything with alpha less than, say, 0.01. However, is there a better or more “correct” GLSL way for a shader to say “just ignore this vertex”?
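(For context, a minimal sketch of the fragment-shader alternative to the alpha-test trick, since the fixed-function alpha test is itself deprecated in core profiles. The variable names here are placeholders, not anything from my actual code; the vertex shader is assumed to pass a 0.0/1.0 flag through to the fragment stage.)

```glsl
#version 330 core

in float vInSubset;     // assumed per-vertex flag forwarded by the vertex shader
out vec4 fragColor;

uniform vec4 pointColor;

void main()
{
    // Drop the fragment entirely: no color write, no depth write, no blending.
    if (vInSubset < 0.5)
        discard;

    fragColor = pointColor;
}
```

Note that `discard` operates per fragment, after the vertex has already been processed, so the vertex work isn’t saved; the point simply never reaches the framebuffer.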

1
-1: For asking several, rather unrelated, questions as one single question. – Nicol Bolas
I see your point, but they are all related to the same issue for me. I want to draw points and lines from a subset of vertices as opaque in one pass and then draw only translucent lines for the rest of the vertices in a second pass. I do this to overcome the blending issue since I don’t know a better way (my first question). I want to send the same vertices each time (along with a couple of variables) and have the shader determine which vertices to draw and which to drop (3rd question). Finally, as it turns out, I only need the shader to drop vertices when drawing the points (2nd question). – FTLPhysicsGuy
Not to press the point, but I've also noticed that asking a simple question often leads to the obvious follow-up of "but why do you want to do that?" I didn't want to try to follow those sorts of questions across multiple threads when they all came back to the same overall desired behavior. – FTLPhysicsGuy

1 Answer

7
votes

Do shaders allow you to do custom blending that help eliminate z-order transparency issues found when using GL_BLEND?

Sort of. If you have access to GL 4.x-class hardware (Radeon HD 5xxx or better, or GeForce 4xx or better), then you can perform order-independent transparency. Earlier versions have techniques like depth peeling, but they're quite expensive.

The GL 4.x-class version essentially builds a per-pixel "linked list" of transparent samples, which you then resolve into the final sample color with a full-screen pass. It's not free of course, but it isn't as expensive as other OIT methods. How expensive it would be for your case is uncertain; the cost is proportional to the depth complexity of your transparent geometry, i.e. how many transparent fragments overlap each pixel.
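A rough sketch of the list-building pass, assuming GL 4.2-class image load/store and atomic counters (the buffer and image names here are placeholders, not a fixed API):

```glsl
#version 420 core

// One head-pointer per screen pixel, initialized to an "end of list" sentinel.
layout (binding = 0, r32ui) uniform coherent uimage2D headPointers;
// Next free slot in the node buffer.
layout (binding = 0, offset = 0) uniform atomic_uint nodeCounter;
// Node storage: packed color, depth bits, next-pointer, padding.
layout (binding = 1, rgba32ui) uniform coherent writeonly uimageBuffer nodeBuffer;

in vec4 vColor;

void main()
{
    // Allocate a node and splice it in at the head of this pixel's list.
    uint node = atomicCounterIncrement(nodeCounter);
    uint prev = imageAtomicExchange(headPointers, ivec2(gl_FragCoord.xy), node);

    imageStore(nodeBuffer, int(node),
               uvec4(packUnorm4x8(vColor),           // fragment color
                     floatBitsToUint(gl_FragCoord.z), // fragment depth
                     prev,                            // next node in the list
                     0u));
}
```

A second full-screen pass then walks each pixel's list, sorts the (usually small) set of samples by depth, and blends them back-to-front against the opaque background.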

You still have to draw opaque stuff first, and you have to draw transparent stuff using special shader code.

Is there a way for a shader to know what type of primitive is being drawn without “manually” passing it some sort of flag?

No.

Is there a way for a shader to “ignore” or “discard” a vertex (especially when drawing points)?

No in general, but yes for points. A Geometry shader can conditionally emit vertices, thus allowing you to discard any vertex for arbitrary reasons.

Discarding a vertex in non-point primitives is possible, but it will also affect the interpretation of that primitive. The reason it's simple for points is that a point vertex is a whole primitive by itself, while a vertex in a triangle or line is only part of one. You can discard whole lines, but discarding a single vertex within a line is... of dubious value.
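For the point case, the geometry shader is short. A sketch (the `vSubset` flag is an assumed output of your vertex shader, not anything standard):

```glsl
#version 330 core

layout (points) in;
layout (points, max_vertices = 1) out;

// Per-vertex flag from the vertex shader; points have exactly one input vertex.
in float vSubset[];

void main()
{
    if (vSubset[0] > 0.5) {
        gl_Position = gl_in[0].gl_Position;
        EmitVertex();
        EndPrimitive();
    }
    // Otherwise emit nothing: the point is culled before rasterization ever sees it.
}
```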

That being said, your explanation for why you want to do this is of dubious merit. You want to tag vertex data with what is essentially a boolean value that says "do stuff with me" or not. That means that, every frame, you have to modify your data to say which points should be rendered and which shouldn't.

The simplest and most efficient way to do this is simply not to render them. That is, arrange your data so that the only things on the GPU are the points you want to render. Then there's no need to do anything special at all. If you're going to be constantly updating your vertex data, then you're already condemned to dealing with streaming vertex data. So you may as well stream it in a way that makes rendering efficient.