shader storage block length returns zero or less

I am getting zero or a negative value when I call .length() on an unsized array in a shader storage block.

Setting up the storage (Java/LWJGL):


geometryBuffer = ByteBuffer.allocateDirect(12 * 4)   // 12 floats, 4 bytes each = 48 bytes
    .order(ByteOrder.nativeOrder());

// vec4 a (bytes 0-15)
geometryBuffer.putFloat(1f);
geometryBuffer.putFloat(0.25f);
geometryBuffer.putFloat(0.5f);
geometryBuffer.putFloat(0.75f);
// myVec[0] (bytes 16-31)
geometryBuffer.putFloat(1.1f);
geometryBuffer.putFloat(0.35f);
geometryBuffer.putFloat(0.6f);
geometryBuffer.putFloat(0.85f);
// myVec[1] (bytes 32-47)
geometryBuffer.putFloat(1.2f);
geometryBuffer.putFloat(0.45f);
geometryBuffer.putFloat(0.7f);
geometryBuffer.putFloat(0.95f);

geometryBuffer.flip();

geometryBufferId = GL15.glGenBuffers();
GL15.glBindBuffer(GL43.GL_SHADER_STORAGE_BUFFER, geometryBufferId);
System.out.println("bb" + GL11.glGetError());
GL15.glBufferData(GL43.GL_SHADER_STORAGE_BUFFER, geometryBuffer, GL15.GL_STATIC_DRAW);
System.out.println("bd" + GL11.glGetError());
// binds 36 of the buffer's 48 bytes to binding point 0
GL30.glBindBufferRange(GL43.GL_SHADER_STORAGE_BUFFER, 0, geometryBufferId, 0, 36);
System.out.println("br" + GL11.glGetError());
GL15.glBindBuffer(GL43.GL_SHADER_STORAGE_BUFFER, 0);
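
(One way to sanity-check that the data store really ended up being 48 bytes is something like the following — glGetBufferParameteri here is LWJGL's convenience wrapper around glGetBufferParameteriv.)

GL15.glBindBuffer(GL43.GL_SHADER_STORAGE_BUFFER, geometryBufferId);
// ask the driver how big the allocated data store actually is
int storeSize = GL15.glGetBufferParameteri(GL43.GL_SHADER_STORAGE_BUFFER, GL15.GL_BUFFER_SIZE);
System.out.println("store size: " + storeSize); // should print 48 for the 12 floats above
GL15.glBindBuffer(GL43.GL_SHADER_STORAGE_BUFFER, 0);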

The frag shader:


#version 430

out vec4 outColour;

layout(std430, binding=0) buffer Geometry {
    vec4 a;
    vec4 myVec[];   // unsized array; its length comes from the buffer range bound to binding 0
};

void main() {
    // if (myVec[0].y == 0.35) {
    if (myVec.length() == -1) {
        outColour = vec4(1,0,0,1);
    } else {
        outColour = vec4(0,0,1,1);
    }
}

I expect the length to be 2, but in this example it returns -1. I can still read the values in the array, because the test …

 if (myVec[0].y == 0.35)

… also results in the red output.

According to Interface Block (GLSL) - OpenGL Wiki and the GLSL spec v4.5, section 4.1.9, the length is computed at run time: take the size of the bound buffer (or buffer range), subtract the size of all the fixed-size members in the block, and divide by the stride of one array element. What I actually seem to be getting is zero minus the number of vec4s that precede the unsized array in the block.
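
To make that concrete for the block above (assuming std430, where vec4 a occupies bytes 0-15 and each myVec element has a 16-byte stride):

48-byte buffer:  (48 - 16) / 16 = 2
36-byte range:   (36 - 16) / 16 = 1  (rounded down)

Neither of those is the -1 I am actually seeing.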

Why doesn’t length() return the value I expect? I’m running on a GTX 660M under Windows 10.

[QUOTE=svcop3;1283442]
According to Interface Block (GLSL) - OpenGL Wiki and the GLSL spec v4.5, section 4.1.9, the length is computed at run time: take the size of the bound buffer (or buffer range), subtract the size of all the fixed-size members in the block, and divide by the stride of one array element. What I actually seem to be getting is zero minus the number of vec4s that precede the unsized array in the block.[/QUOTE]
So the result is as if the buffer had zero length?

Is there a buffer bound to that block? Is the block’s data (not just myVec.length()) being used by the shader?
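
One quick way to check whether the block survived compilation is to query the linked program for it — something along these lines (assuming your program object handle is programId):

// an SSBO block whose contents are never read or written can be optimised
// away entirely, in which case it is not reported as an active resource
int blockIndex = GL43.glGetProgramResourceIndex(programId, GL43.GL_SHADER_STORAGE_BLOCK, "Geometry");
if (blockIndex == GL31.GL_INVALID_INDEX) {
    System.out.println("Geometry block was optimised out");
}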

That was the problem. When I change it to this …


    if (myVec.length() == 2) {
        outColour = vec4(myVec[0].y, 0, 0, 1);   // now actually reads the array data
    } else {
        outColour = vec4(0,0,1,1);
    }

… it shows the correct colour because I am actually referencing the array and the compiler isn’t optimizing it out.

Problem solved. Thanks so much for your help!