Drawing a simple square in WebGL

Hi all,

I have a very basic question. I am new to WebGL and I'm trying to draw a simple square. I am using the glMatrix library for matrix manipulation.

JavaScript code:


    squareVertexPositionBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, squareVertexPositionBuffer);
    vertices = [
         0.9,  0.9,  0.0, 1.0,
        -0.9,  0.9,  0.0, 1.0,
         0.9, -0.9,  0.0, 1.0,
        -0.9, -0.9,  0.0, 1.0
    ];

    squareVertexPositionBuffer.itemSize = 4;
    squareVertexPositionBuffer.numItems = 4;

    mat4.identity(pMatrix);
    mat4.perspective(45, gl.viewportWidth / gl.viewportHeight, 0.1, 100.0, pMatrix);
    mat4.identity(mvMatrix);
    mat4.translate(mvMatrix, [-1.5, 0.0, -7.0]);

    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertices), gl.STATIC_DRAW);
    gl.vertexAttribPointer(shaderProgram.vertexPositionAttribute, squareVertexPositionBuffer.itemSize, gl.FLOAT, false, 0, 0);
    setMatrixUniforms();
    gl.drawArrays(gl.TRIANGLE_STRIP, 0, squareVertexPositionBuffer.numItems);



Shader code:


    attribute vec3 aVertexPosition;

    uniform mat4 uMVMatrix;
    uniform mat4 uPMatrix;

    varying vec3 debug;

    void main(void) {
        gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition.xyz, 1.0);
        debug = aVertexPosition;
    }


This seems to work out fine. Here I am passing the model-view and perspective matrices as uniforms to the shader program and multiplying them with the vertex coordinates there. But if I multiply the model-view and perspective matrices in the JavaScript and then pass the transformed vertices to the shader, it doesn't seem to work:


    mat4.multiply(mvMatrix, pMatrix, mvMatrix);
    mat4.multiply(mvMatrix, vertices, vertices);
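For reference, what this step is meant to do to each vertex is the same mat4 × vec4 product the shader computes. Here is a hand-rolled sketch of that per-vertex transform, independent of any particular glMatrix version (`transformVec4` and `transformAll` are made-up helper names, not glMatrix API):

```javascript
// Column-major 4x4 * vec4, the same product the shader's mat4 * vec4 computes.
// glMatrix stores matrices column-major: m[c * 4 + r] is row r of column c.
function transformVec4(m, v) {
  const out = [0, 0, 0, 0];
  for (let r = 0; r < 4; r++) {
    out[r] = m[r] * v[0] + m[4 + r] * v[1] + m[8 + r] * v[2] + m[12 + r] * v[3];
  }
  return out;
}

// Apply a matrix to every (x, y, z, w) vertex in a flat array.
function transformAll(m, vertices) {
  const out = [];
  for (let i = 0; i < vertices.length; i += 4) {
    out.push(...transformVec4(m, vertices.slice(i, i + 4)));
  }
  return out;
}

// Example: a translation by (-1.5, 0, -7) keeps the offset in elements
// 12..14 of the flat array, which is the layout mat4.translate produces.
const t = [1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 1, 0,  -1.5, 0, -7, 1];
const moved = transformAll(t, [0.9, 0.9, 0.0, 1.0]);
// moved ≈ [-0.6, 0.9, -7.0, 1.0]
```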


Shader:



  
    void main(void) {
        gl_Position = vec4(aVertexPosition.xyz, 1.0);
        debug = aVertexPosition;
    }



I'm not able to spot the mistake. Any help is highly appreciated!

gl_Position =  uPMatrix * uMVMatrix * vec4(aVertexPosition.xyz, 1.0);

This multiplies the MVP matrix, a 4x4 matrix, with the vertex position, a vec4, transforming that position.

mat4.multiply(mvMatrix, pMatrix, mvMatrix);
mat4.multiply(mvMatrix, vertices, vertices);

This multiplies the MVP matrix, a 4x4 matrix, with an array of 16 floats, which probably doesn't throw an error because the internal representation of a matrix in the glMatrix library is also a flat array. But the result of this is something completely different from what the shader does.

@ralph

The vertex shader takes one vertex at a time, which is why it multiplies with a vec4. In JavaScript we are doing the same multiplication for all the vertices at once, which is why it's a 4x4 matrix.
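That intuition can be checked numerically: because glMatrix stores matrices column-major, reading the flat vertex array as a mat4 puts one (x, y, z, w) vertex in each column, and a matrix-matrix product transforms each column separately. A self-contained sketch with hand-rolled helpers (the function names are mine, not glMatrix API):

```javascript
// Column-major 4x4 * 4x4 multiply, the operation mat4.multiply performs.
function matMul(a, b) {
  const out = new Array(16).fill(0);
  for (let c = 0; c < 4; c++) {
    for (let r = 0; r < 4; r++) {
      for (let k = 0; k < 4; k++) {
        out[c * 4 + r] += a[k * 4 + r] * b[c * 4 + k];
      }
    }
  }
  return out;
}

// Column-major 4x4 * vec4, what the shader does for a single vertex.
function transformVec4(m, v) {
  const out = [0, 0, 0, 0];
  for (let r = 0; r < 4; r++) {
    out[r] = m[r] * v[0] + m[4 + r] * v[1] + m[8 + r] * v[2] + m[12 + r] * v[3];
  }
  return out;
}

// Four vertices; read as a column-major mat4, each vertex is one column.
const vertices = [
   0.9,  0.9, 0.0, 1.0,
  -0.9,  0.9, 0.0, 1.0,
   0.9, -0.9, 0.0, 1.0,
  -0.9, -0.9, 0.0, 1.0,
];
// Any transform; here a translation by (-1.5, 0, -7).
const m = [1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 1, 0,  -1.5, 0, -7, 1];

const asMatrix = matMul(m, vertices);                      // vertices treated as a mat4
const perVertex = transformVec4(m, vertices.slice(0, 4));  // first vertex alone
// Column 0 of asMatrix equals transformVec4 applied to the first vertex.
```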

Even if you use a vec4 in JavaScript, the results of the two are still completely different, although they should be identical.

n/m

My question is, @ralph: why is what glMatrix does completely different from what the shader does?