3 votes

I can send a color to the shader as 4 floats, no problem. However, I want to send it as an integer (or unsigned integer, it doesn't really matter; what matters is 32 bits) and have it decomposed into a vec4 in the shader.
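
For context, this is roughly how I pack each color on the C# side (a sketch; PackColor is just an illustrative helper, assuming R sits in the lowest byte):

// Sketch: pack RGBA bytes into one 32-bit int, low byte = R (0xAABBGGRR layout).
static int PackColor(byte r, byte g, byte b, byte a)
{
    return r | (g << 8) | (b << 16) | (a << 24);
}

int opaqueRed = PackColor(255, 0, 0, 255); // bit pattern 0xFF0000FF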

I'm using OpenTK as a C# wrapper for OpenGL (though it should be pretty much a direct wrapper).

Let's consider one of the simplest shaders, with vertices containing position (xyz) and color (rgba).

Vertex shader:

#version 150 core

in vec3 in_position;
in vec4 in_color;
out vec4 pass_color;
uniform mat4 u_WorldViewProj;

void main()
{
    gl_Position = vec4(in_position, 1.0f) * u_WorldViewProj;
    pass_color = in_color;
}

Fragment shader:

#version 150 core

in vec4 pass_color;
out vec4 out_color;

void main()
{
    out_color = pass_color;
}

Let's create the vertex buffer:

public static int CreateVertexBufferColor(int attributeIndex, int[] rawData)
{
    var bufferIndex = GL.GenBuffer();
    GL.BindBuffer(BufferTarget.ArrayBuffer, bufferIndex);
    GL.BufferData(BufferTarget.ArrayBuffer, sizeof(int) * rawData.Length, rawData, BufferUsageHint.StaticDraw);
    GL.VertexAttribIPointer(attributeIndex, 4, VertexAttribIntegerType.UnsignedByte, 0, rawData);
    GL.EnableVertexAttribArray(attributeIndex);
    GL.BindBuffer(BufferTarget.ArrayBuffer, 0);
    return bufferIndex;
}

And I'm getting all zeros for the vec4 'in_color' in the vertex shader. Not sure what's wrong.

The closest thing I found: https://www.opengl.org/discussion_boards/showthread.php/198690-I-cannot-send-RGBA-color-as-unsigned-int .

Also, in VertexAttribIPointer I'm passing 0 as the stride, because I have a VertexBufferArray and keep the data separated, so the colors come tightly packed: 32 bits (one color) per vertex.

This is GLSL and not HLSL. The type has to be ivec4 for an integral datatype. – Rabbid76

I don't think that integer attributes are what you want. You want just UNORM, hence you use in vec4 in conjunction with glVertexAttribPointer, not glVertexAttribIPointer. – derhass

2 Answers

1 vote

You have to use VertexAttribPointer, not VertexAttribIPointer, when the input in your shader is floating-point (vec4).

Set the normalized parameter to GL_TRUE.

The spec says:

For glVertexAttribPointer, if normalized is set to GL_TRUE, it indicates that values stored in an integer format are to be mapped to the range [-1,1] (for signed values) or [0,1] (for unsigned values) when they are accessed and converted to floating point. Otherwise, values will be converted to floats directly without normalization.
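
In OpenTK terms, the fixed attribute setup would look roughly like this (a sketch reusing the names from the question):

GL.BindBuffer(BufferTarget.ArrayBuffer, bufferIndex);
// VertexAttribPointer (not the I variant) with normalized = true: each
// unsigned byte is mapped to a float in [0,1] before it reaches the shader.
GL.VertexAttribPointer(attributeIndex, 4, VertexAttribPointerType.UnsignedByte, true, sizeof(int), IntPtr.Zero);
GL.EnableVertexAttribArray(attributeIndex);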

1 vote

Alright, so this did the job for me:

With the managed data as an int[] that is a tightly packed array of colors only (where the int layout is RGBA, meaning 0xAABBGGRR), define the vertex attribute as:

GL.VertexAttribPointer(index, 4, VertexAttribPointerType.UnsignedByte, true, sizeof(int), IntPtr.Zero);

and use it in the shader as:

in vec4 in_color;
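
For completeness, here is the question's function with those changes applied (a sketch under the same setup: one VBO holding only the packed colors):

public static int CreateVertexBufferColor(int attributeIndex, int[] rawData)
{
    var bufferIndex = GL.GenBuffer();
    GL.BindBuffer(BufferTarget.ArrayBuffer, bufferIndex);
    GL.BufferData(BufferTarget.ArrayBuffer, sizeof(int) * rawData.Length, rawData, BufferUsageHint.StaticDraw);
    // Non-I pointer with normalized = true: each unsigned byte becomes a float in [0,1].
    // IntPtr.Zero is an offset into the bound VBO, not a client-side array.
    GL.VertexAttribPointer(attributeIndex, 4, VertexAttribPointerType.UnsignedByte, true, sizeof(int), IntPtr.Zero);
    GL.EnableVertexAttribArray(attributeIndex);
    GL.BindBuffer(BufferTarget.ArrayBuffer, 0);
    return bufferIndex;
}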