determining surface normal

I am using the following C routine, adapted from the OpenGL wiki, to calculate surface normals:


void setnormal (vect P1, vect P2, vect P3) {  /* from OpenGL wiki pseudo-code */
        vect U,V;
/*      Set Vector U to (Triangle.p2 minus Triangle.p1)
        Set Vector V to (Triangle.p3 minus Triangle.p1)
        Set Normal.x to (multiply U.y by V.z) minus (multiply U.z by V.y)
        Set Normal.y to (multiply U.z by V.x) minus (multiply U.x by V.z)
        Set Normal.z to (multiply U.x by V.y) minus (multiply U.y by V.x) */
        vectminus(P2,P1,U);
        vectminus(P3,P1,V);
        normal[0]=(U[1]*V[2])-(U[2]*V[1]);
        normal[1]=(U[2]*V[0])-(U[0]*V[2]);
        normal[2]=(U[0]*V[1])-(U[1]*V[0]);
}

vectminus() just does the component-wise subtraction. “normal” is a global float[3]. There is no scaling in the scene. The routine gets called this way:
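For reference, a minimal vectminus() along the lines described; this is my reconstruction, assuming vect is a three-float array type (the exact typedef isn't shown in the post):

```c
/* assumed typedef; the post only says vectminus "does the simple arithmetic" */
typedef float vect[3];

/* out = a - b, component-wise */
void vectminus(vect a, vect b, vect out) {
    out[0] = a[0] - b[0];
    out[1] = a[1] - b[1];
    out[2] = a[2] - b[2];
}
```

With that in place, vectminus(P2, P1, U) leaves the edge vector P2−P1 in U, exactly as the commented pseudo-code describes.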


setnormal(one,three,two); glNormal3fv(normal);

Two and three are reversed because the same vectors are usually also being used as the corners of some polygon, and this ordering seems to give the correct result. I have a scene where I can move both the camera and the single light, and the normals seem fine except that the intensity of the light is out of proportion.
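Swapping the argument order matters because the cross product is anti-commutative: reversing the two edge vectors negates every component of the normal, flipping it to the other side of the polygon. A small sketch of that property (the helper name is mine, not from the post):

```c
/* n = u × v; swapping u and v negates every component of n */
void cross(const float u[3], const float v[3], float n[3]) {
    n[0] = u[1]*v[2] - u[2]*v[1];
    n[1] = u[2]*v[0] - u[0]*v[2];
    n[2] = u[0]*v[1] - u[1]*v[0];
}
```

For example, with u along +X and v along +Y, cross(u, v, n) gives +Z while cross(v, u, n) gives −Z, which is why setnormal(one, three, two) rather than setnormal(one, two, three) can be the "best arrangement" for a given winding.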

Here the light is about 85 degrees from vertical in relation to the origin, where the pyramid is (i.e. it is about to “set”). But these are the colors and light intensity:


float darkorangered[3]={0.5f,0.1f,0.0f};
float ambient[4]={0.1f,0.1f,0.1f,1.0f};
float diffuse[4]={0.1f,0.1f,0.1f,1.0f};

So, viewed without GL_LIGHTING: (screenshot)

Obviously, when facing the light the color values of the pyramid are being multiplied to excess, which also seems strange considering the very low intensity of the light. I presume this has something to do with the normals?

If anyone has a clue that would be great. Here’s the setup routine for the lighting:


void init(float R, float G, float B) {
//      glEnable(GL_CULL_FACE);
        glEnable(GL_DEPTH_TEST);
        glClearColor(R,G,B,1.0f);
        if (LIT) {
                glEnable(GL_LIGHTING);
                glEnable(GL_COLOR_MATERIAL);
                glColorMaterial(GL_FRONT,GL_AMBIENT_AND_DIFFUSE);
                glLightfv(GL_LIGHT0,GL_AMBIENT,ambient);
                glLightfv(GL_LIGHT0,GL_DIFFUSE,diffuse);        
                glEnable(GL_LIGHT0);
                arrangelamp(0.0f);
        }
}

arrangelamp() just involves the light coordinates.

Why are you using glColorMaterial? I’m not sure what it is supposed to do. I always use glMaterial when lighting is enabled.

All of the tutorials and both of the books I have access to use glColorMaterial() to enable color tracking, so you won’t have to specify a glMaterial for every surface.

However, it doesn’t make any difference if I comment that out – the line that seems to count is “glEnable(GL_COLOR_MATERIAL)”.

I don’t think this has anything to do with the problem, unfortunately. I also tried setting the GL_SPECULAR intensity of light0 to zero; that didn’t matter…

If you have a doubt about your normals, do glEnable(GL_NORMALIZE); so that you remove the doubt.
http://www.opengl.org/resources/faq/technical/lights.htm#ligh0090

When I first looked at your lit picture, before reading the text, I had the impression that the specular was very strong. You may want to set the specular explicitly to black/zero and check whether it has an effect.

EDIT: ok, so no specular. Are you completely positive that only light0 is enabled?

Thanks much ZbuffeR – GL_NORMALIZE did the trick.

Although I don’t glScale anything, I now realize that in order to get a “unit length” surface normal, I have to use unit-length vectors in setnormal() (I hadn’t bothered doing any math, I just implemented the algorithm). So I went back and put unit-length vectors in, took GL_NORMALIZE out, and it works.

Thanks! Thanks! Thanks!

(P.S. If you look back, you can see the ground is actually shaded correctly, because that code used unit-length vectors originally.)