
EDIT:

My question was unclear at first; I'll try to rephrase it:

How do I use different shaders to perform different rendering operations on the same mesh polygons? For example, I want to add lighting with one shader and add fog with another. I need to use the color interpolated by the first shader in the calculation of the second, but I don't know how to do that if I can't (or rather am not supposed to) pass the color buffer around between shaders. Also (and this is where my question started), I need the same world-view-projection calculations for both shaders, so am I supposed to calculate them separately in every shader? Am I supposed to use one big shader for all my rendering operations?

Original question:

Say I have two different shader programs. The first one calculates the vertex positions in the vertex shader and does some operations in the fragment shader.

Let's say I want to use a different fragment shader to do different calculations, but I still want to use the same vertex positions calculated by the first vertex shader. Do I have to calculate the vertex positions again, or is there a way to share state between different shader programs?

I want to use the same interpolated vertex positions in different fragment shaders in different shader programs without recalculating the vertex positions every time. – asaf92
OK, and how about adding together color calculations from different fragment shaders? For example, having one shader program calculate some lighting and another shader program calculate another type of lighting? – asaf92
@PanthersFan92: All of your questions seem to stem from a fundamental misunderstanding of what shaders do. Shaders are part of the process of rendering. VSs compute the vertex data needed for rasterization; FSs compute the products of that rasterization, to be fed into post-processing stages (blending, etc.). These shaders are invoked when you render something. They're a part of the rendering pipeline, and the data they receive and generate is used for rendering the particular thing you asked to render. – Nicol Bolas
@PanthersFan92: And yet, what you just said shows that you didn't really understand what I just said. Shaders don't do "calculations" into "buffers"; they do rendering. If you want to use shaders to do "calculations", then you have to make the calculations look like rendering, and therefore you have to structure your operation to work within the confines of the rendering pipeline. – Nicol Bolas
@PanthersFan92: OK, so what you're saying is that you want to render a thing, and you want different program objects to be responsible for different aspects of that rendering (lighting, fog, etc.). Is that what you're saying? If so, please put that in your question. – Nicol Bolas

1 Answer


You have several options:

  1. Multi pass

    This one usually renders the geometry into depth and "color" buffers first, and then in subsequent passes uses those as input textures while rendering a single rectangle covering the whole screen/view. Deferred shading is an example of this, but there are many other implementations of effects that are not related to deferred shading. Here is an example of a multi-pass effect:

    In the first pass the planets, stars, and other geometry are rendered; in the second, the atmosphere is added.

    You can combine the passes either by blending or by direct rendering. Direct rendering requires rendering each pass to a texture and producing the final image in the last one; blending changes the color of the output in each pass.
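
    For illustration, here is a minimal sketch of what the second-pass fragment shader could look like, assuming the first pass rendered its lit color and depth into textures bound as sceneColor and sceneDepth (those names and the simple exponential fog formula are placeholders of mine, not from any particular engine):

        #version 330 core
        // Second pass: drawn as a full-screen quad; reads the first pass results and adds fog.
        in vec2 uv;                     // texture coordinate interpolated from the quad's vertex shader
        out vec4 fragColor;

        uniform sampler2D sceneColor;   // lit color written by pass 1
        uniform sampler2D sceneDepth;   // depth buffer written by pass 1
        uniform vec3  fogColor;
        uniform float fogDensity;
        uniform float near, far;        // projection planes used in pass 1

        void main()
        {
            vec3  color = texture(sceneColor, uv).rgb;
            float z     = texture(sceneDepth, uv).r;
            // reconstruct linear view-space distance from the non-linear depth buffer value
            float linZ  = (2.0 * near * far) / (far + near - (2.0 * z - 1.0) * (far - near));
            float fog   = 1.0 - exp(-fogDensity * linZ);    // simple exponential fog
            fragColor   = vec4(mix(color, fogColor, fog), 1.0);
        }

    Note that the fog pass never needs the original geometry or its world-view-projection transform; it only consumes what the first pass already rasterized.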

  2. Single pass

    What you describe sounds more like you should encode the different shaders as functions within a single fragment shader... Yes, you can combine several shaders into a single one if they are compatible, and combine their results into the final output color.

    A big shader is a performance hit, but I think it would still be faster than multiple passes doing the same work.

    Take a look at this example:

    This one computes environmental reflection, lighting, and geometry color, and combines them into a single output color.
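
    As a rough sketch of that idea, here is one fragment shader where "lighting" and "fog" are just functions whose results feed into each other (the uniform names and the simple diffuse/fog math are my own assumptions, not taken from the example above):

        #version 330 core
        in vec3 viewPos;                // view-space position interpolated from the vertex shader
        in vec3 viewNormal;             // view-space normal
        out vec4 fragColor;

        uniform vec3  lightDir;         // normalized direction from the surface toward the light (view space)
        uniform vec3  baseColor;
        uniform vec3  fogColor;
        uniform float fogDensity;

        // the "lighting shader" folded into a function
        vec3 lighting(vec3 color, vec3 n)
        {
            float diff = max(dot(normalize(n), lightDir), 0.0);
            return color * (0.1 + 0.9 * diff);              // ambient + diffuse
        }

        // the "fog shader" folded into a function; it consumes the color that lighting() produced
        vec3 fog(vec3 color, float dist)
        {
            float f = 1.0 - exp(-fogDensity * dist);
            return mix(color, fogColor, f);
        }

        void main()
        {
            vec3 c = lighting(baseColor, viewNormal);       // first "shader"
            c      = fog(c, length(viewPos));               // second "shader" reuses its result
            fragColor = vec4(c, 1.0);
        }

    The vertex shader and its world-view-projection math exist only once in this program, which is what the question was after.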

  3. Exotic shaders

    There are also exotic shaders that work around the pipeline limitations, like this one:

    These are used for things that are commonly believed to be impossible to implement in the GL/GLSL pipeline. Anyway, if the limitations are too restrictive, you can still use a compute shader...
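
    If you do step outside the rasterization pipeline, a minimal compute shader looks roughly like this (requires GL 4.3+; the image binding and the gradient it writes are just placeholders for whatever per-texel computation you actually need):

        #version 430
        layout(local_size_x = 16, local_size_y = 16) in;
        layout(rgba8, binding = 0) uniform writeonly image2D outImage;

        void main()
        {
            ivec2 p = ivec2(gl_GlobalInvocationID.xy);
            if (any(greaterThanEqual(p, imageSize(outImage)))) return;  // guard against partial work groups
            // arbitrary per-texel work goes here; a simple UV gradient as a stand-in
            vec2 uv = vec2(p) / vec2(imageSize(outImage));
            imageStore(outImage, p, vec4(uv, 0.0, 1.0));
        }

    You launch it with glDispatchCompute and then sample or read back the image, completely independent of the vertex/fragment stages.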