Texture and lighting + class structure.

Edit: Sorted some things out, but now I have a strange lighting effect where only one planet is properly lit (light on one side, dark on the other) and the rest are fully lit. Odd.

I have a fairly large solar system simulator project (with the UI done in Qt), but I can’t quite get texturing and lighting to work, and I think it’s mainly because my class structure has become a bit of a mess and I’m no longer sure what’s happening when. I’m sure I’ll get stick for the class structure, but this is just the way it’s worked out.

Here’s the simple class breakdown.

There’s a SpaceObject base class which has derived classes of Planets, Stars, etc…
There’s a SpaceSystem class in which the solar system and all integrators and textures are initialized.
There’s an OpenGL class (derived from QGLWidget - a Qt OpenGL class).

Each SpaceObject has its own draw function, declared virtual in the SpaceObject base class, since certain objects (stars, satellites) require different rendering instructions.
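Roughly, the shape of it is something like this (just an illustrative sketch; the types and names are my approximations, not the actual declarations):

// Illustrative sketch only -- not the real declarations from the project.
class SpaceObject
{
public:
    virtual ~SpaceObject() {}

    // Overridden per object type (planets, stars, satellites...).
    virtual void draw(double yaw, double pitch,
                      double xMove, double yMove, double zMove) = 0;
};

class Planet : public SpaceObject
{
public:
    void draw(double yaw, double pitch,
              double xMove, double yMove, double zMove);
};

class Star : public SpaceObject
{
public:
    void draw(double yaw, double pitch,
              double xMove, double yMove, double zMove);
};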

SpaceSystem has a draw function which draws the skybox background, a faint grid, position markers for the objects and calls the draw function for each SpaceObject.

OpenGL has a paint function (as all QGLWidgets do) which sets up two projection passes, one for far objects and one for near objects, so the scene can use actual solar system scales without needing a single tiny zNear and huge zFar. It calls the SpaceSystem draw function for each pass.

I’ve become quite confused as to which material and light calls are still active at any given time, and I’m not sure when they actually need to be called and how often! I should say that I’m a maths guy and this is more of a maths exercise than anything; I don’t really enjoy graphics programming, which is why there are likely to be a lot of mistakes in the code below…

The paint function in the OpenGL class


glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

// Mouse-driven rotation, kept within [0, 360).
yaw   += 4.0*yawTemp;
pitch += 4.0*pitchTemp;

if (yaw >= 360.0f) { yaw -= 360.0f; }
else if (yaw < 0.0f) { yaw += 360.0f; }
if (pitch >= 360.0f) { pitch -= 360.0f; }
else if (pitch < 0.0f) { pitch += 360.0f; }

// First pass: far objects, projection with zNear = 0.1 and zFar = 120.
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(45.0f, (GLfloat)width/(GLfloat)height, 0.1f, 120.0f);

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
gluLookAt(5.0, 0.0, 0.0,  0.0, 0.0, 0.0,  0.0, 0.0, -1.0);

solarSystem->draw(yaw, pitch, xMove, yMove, zMove, markersOn);

// Second pass: near objects, with a much tighter depth range.
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(45.0f, (GLfloat)width/(GLfloat)height, 0.0001f, 0.1f);

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
gluLookAt(5.0, 0.0, 0.0,  0.0, 0.0, 0.0,  0.0, 0.0, -1.0);

solarSystem->draw(yaw, pitch, xMove, yMove, zMove);

The draw function in SpaceSystem


// Skybox background: textured, drawn without depth testing so it sits behind everything.
glEnable(GL_TEXTURE_2D);
glEnable(GL_LIGHTING);
glDisable(GL_DEPTH_TEST);

drawBackground(yaw, pitch);

// Position markers and grid: plain lines, no texturing or lighting.
glDisable(GL_TEXTURE_2D);
glDisable(GL_LIGHTING);
glEnable(GL_DEPTH_TEST);

if (drawMarkers)
{
    drawPositionMarkers(yaw, pitch, xMove, yMove, zMove);
}

drawGrid(yaw, pitch, xMove, yMove, zMove);

// The SpaceObjects themselves: textured (lighting is handled inside drawObjects).
glEnable(GL_TEXTURE_2D);

drawObjects(yaw, pitch, xMove, yMove, zMove);

The drawObjects function calls the draw function of every SpaceObject and also binds the textures.
The draw function in the SpaceObject class uses the icosahedron algorithm to draw the sphere, works out whether the object is actually visible, and draws an orbital trail.

I have a feeling this whole thing needs a total overhaul; it has just become procedural programming wrapped in classes, and it’s starting to look awful from an outside perspective.

What I’d like to know is: where do I put the light call and the material calls? The light call is currently inside the drawObjects function, which draws the sun with lighting off, then turns lighting on and draws everything else using the sun’s position. The material calls are inside the SpaceObject class, which has a setUpGL function that sets up the material properties for each object type (the sun will of course be different from the planets and moons). It gets called when the SpaceObject is initialized, and it is…


// Per-object colour, used for both ambient and diffuse.
float mColour[] = {R, G, Blue, 1.0f};
glMaterialfv(GL_FRONT, GL_AMBIENT_AND_DIFFUSE, mColour);

GLfloat mat_specular[] = {0.3f, 0.3f, 0.3f, 1.0f};
GLfloat mat_shininess[] = {40.0f};
glMaterialfv(GL_FRONT, GL_SPECULAR, mat_specular);
glMaterialfv(GL_FRONT, GL_SHININESS, mat_shininess);

// Only takes effect while GL_COLOR_MATERIAL is enabled.
glColorMaterial(GL_FRONT, GL_AMBIENT_AND_DIFFUSE);

glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
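For context, the light handling inside drawObjects is along these lines (a from-memory sketch, so the light index, the sun’s position and the names are approximations rather than the exact code):

// Rough sketch of the current light handling in drawObjects() -- the
// light index, sun position and names are approximations.
glDisable(GL_LIGHTING);
glBindTexture(GL_TEXTURE_2D, textures[0]);
spaceObjects[0]->draw(yaw, pitch, xMove, yMove, zMove);   // the sun, drawn unlit

GLfloat sunPosition[] = {0.0f, 0.0f, 0.0f, 1.0f};         // w = 1 -> positional light
glLightfv(GL_LIGHT0, GL_POSITION, sunPosition);           // uses the sun's position
glEnable(GL_LIGHTING);
glEnable(GL_LIGHT0);

// ...then the remaining objects are drawn with lighting on,
// as in the loop shown further down.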

Since I am disabling and enabling lighting and texturing around the different draw commands, do I need to re-issue all the material and light calls each time? I have a feeling I’ll get a link to a beginner guide on how to do this basic stuff, but that really won’t help, because I’ve never found a guide that deals with this sort of thing across individual class objects which all have slightly different properties. It’s always drawing a single textured, lit object inside one class. I can do that no bother, but all this class stuff has made it hellishly confusing.

(FYI the rest of the classes, integrator, UI etc, are all set up nicely, I’ve just always found it difficult to set up classes when dealing with graphics)

I’ve just always found it difficult to set up classes when dealing with graphics

Yes, probably because the state and properties required are most likely outside the class performing the drawing.
That’s the difference between a contrived demo and something more structured and generic (i.e. an engine).

What I’d like to know is, where do I put the light call and the material calls?

All I can say is that fixed-function lighting is dependent on the current modelview matrix at the time you call the glLight*v commands.
If you are drawing planets, try to create a separate model matrix for each one, independent of the others and of the camera.
So, when rendering, you could do something like this:

glLoadIdentity();
gluLookAt(ex, ey, ez,  cx, cy, cz,  0, 1, 0);
for (each planet)
{
    glPushMatrix();
    glMultMatrixf(planet_ModelMatrix);
    renderPlanet(planet);
    glPopMatrix();
}

Here renderPlanet would set the material properties and call glVertex3f, etc.
Each frame, as you rotate and move the planets, you would rebuild each model matrix (containing the scale, rotation and position).
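To make the light part concrete as well: since the light position is transformed by the modelview matrix at the moment you call glLightfv, one option is to specify it once per frame, right after the camera transform and before any per-planet matrices (GL_LIGHT0 and a sun at the world origin are assumptions here, adjust to your setup):

// Sketch only -- GL_LIGHT0 and a sun at the world origin are assumptions.
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
gluLookAt(ex, ey, ez,  cx, cy, cz,  0.0, 1.0, 0.0);   // camera/view transform only

GLfloat sunPos[] = {0.0f, 0.0f, 0.0f, 1.0f};          // w = 1 -> positional light
glLightfv(GL_LIGHT0, GL_POSITION, sunPos);            // transformed by the view matrix, once per frame
glEnable(GL_LIGHTING);
glEnable(GL_LIGHT0);

for (unsigned int i = 0; i < planets.size(); i++)
{
    glPushMatrix();
    glMultMatrixf(planetModelMatrix[i]);              // per-planet scale/rotation/position
    renderPlanet(planets[i]);                         // materials, texture bind, vertices
    glPopMatrix();
}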

OK, I’ll give that a look. This is roughly how I was doing it already:


glEnable(GL_LIGHTING);

glPushMatrix();

    // Rotate the scene about the camera position at (5,0,0) using the mouse-driven yaw and pitch.
    glTranslatef(5.0, 0.0, 0.0);
    glRotatef(pitch, 0.0f, -1.0f, 0.0f);
    glRotatef(yaw, 0.0f, 0.0f, -1.0f);
    glTranslatef(-5.0, 0.0, 0.0);

    // Keyboard movement: shift the universe relative to the camera.
    glTranslatef(xMove, yMove, zMove);

    // 180-degree flip because the y and z axes were the wrong way round.
    glRotatef(180.0, 1.0, 0.0, 0.0);

    // Moon (secondary) textures start at index 10.
    int textureIndex = 10;

    // spaceObjects[0] (the sun) is drawn earlier with lighting off, so start at 1.
    for (unsigned int i = 1; i < spaceObjects.size(); i++)
    {
        glBindTexture(GL_TEXTURE_2D, textures[i]);
        spaceObjects[i]->draw(yaw, pitch, xMove, yMove, zMove);

        for (unsigned int j = 0; j < spaceObjects[i]->numberSecondary; j++)
        {
            glBindTexture(GL_TEXTURE_2D, textures[textureIndex]);
            textureIndex += 1;
            spaceObjects[i]->secondary[j]->draw(yaw, pitch, xMove, yMove, zMove);
        }
    }

glPopMatrix();

Now when I add a light (anywhere, absolutely anywhere, including inside the SpaceObject rendering function) it only lights the first planet. I’m going to have to restructure everything, I guess.

Edit: In your example, where would you set up the light? In each render(Planet) function?!?

And I have no idea what you mean by a planet’s current model matrix.

it only lights the first planet

A light has an attenuation factor.
Maybe the other planets are too far away for the attenuation you have set.
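For reference, those factors are per-light state, and the defaults give no fall-off with distance at all (GL_LIGHT0 assumed here):

// attenuation = 1 / (constant + linear*d + quadratic*d*d)
// Defaults: constant = 1, linear = 0, quadratic = 0  (no fall-off with distance).
glLightf(GL_LIGHT0, GL_CONSTANT_ATTENUATION,  1.0f);
glLightf(GL_LIGHT0, GL_LINEAR_ATTENUATION,    0.0f);
glLightf(GL_LIGHT0, GL_QUADRATIC_ATTENUATION, 0.0f);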

    glTranslatef(5.0,0.0,0.0);
    glRotatef(pitch, 0.0f, -1.0f, 0.0f);
    glRotatef(yaw, 0.0f, 0.0f, -1.0f);
    glTranslatef(-5.0,0.0,0.0);
    glTranslatef(xMove,yMove,zMove);
    glRotatef(180.0,1.0,0.0,0.0);

What are all these rotations doing?
There are no comments next to them, so it’s impossible to tell what’s going on.

planets current model matrix

A model matrix is the matrix which affects the drawing of the model. Each time you issue glTranslatef or glScalef you are creating a matrix, which GL multiplies against the current modelview matrix. I’m saying you can pre-compute the model matrix for each planet after you move it in its orbit.
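As a sketch of one way to pre-compute it each frame after the orbit update, you can build it on the matrix stack and read it back (the variable names here are just placeholders):

// Build the planet's model matrix on the stack and read it back for later use.
GLfloat planetModelMatrix[16];

glMatrixMode(GL_MODELVIEW);
glPushMatrix();
glLoadIdentity();
glTranslatef(orbitX, orbitY, orbitZ);                 // position along the orbit
glRotatef(spinAngle, 0.0f, 0.0f, 1.0f);               // axial rotation
glScalef(radius, radius, radius);                     // planet size
glGetFloatv(GL_MODELVIEW_MATRIX, planetModelMatrix);  // store for use at render time
glPopMatrix();

// Later, when rendering:
// glPushMatrix(); glMultMatrixf(planetModelMatrix); drawSphere(); glPopMatrix();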

I thought it might be that (the light being too far away), but I tried starting the drawing index one planet later and it’s always the first planet drawn that gets lit, regardless of distance, so there’s definitely more going on. Some other planets darken randomly as well; I have no idea why it’s happening.

I figured the rotations and so on weren’t important to the problem, but they shift the whole scene around. I start by setting up the ‘camera’ using gluLookAt, and then I can travel around the solar system using the keyboard; xMove, yMove and zMove simply move the universe depending on how the camera moves. Yaw and pitch rotate the universe depending on where I move the mouse. The 180-degree rotation is because the y-axis and z-axis were the wrong way round, so I flipped the whole scene.

That all works perfectly fine; everything does, until I try to add a light. The annoying thing is that I remember the lighting working fine a few days ago, and I have no idea what broke it, as I guess I never noticed at the time.

I know it has a lot to do with where my material calls and that sort of thing sit, but surely then it would fail for all the planets? If the first one works, I would imagine the rest would too, unless there’s something in SpaceObjects->draw(…) that’s causing the light to turn off… I don’t know what’s happening; it’s bizarre.

I’m starting to think it has something to do with Qt, though. I sometimes get crazy behaviour from it where it doesn’t compile new code until you restart it.

I’d start by re-enabling all the lighting state before rendering each planet, in case the light has been disabled somewhere.
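Something like this around each draw call would guard against a stray disable (one option; glPushAttrib/glPopAttrib saves and restores the enable flags, and GL_LIGHT0 is assumed):

// Force the lighting state on around each object's draw, then restore it.
glPushAttrib(GL_ENABLE_BIT);
glEnable(GL_LIGHTING);
glEnable(GL_LIGHT0);
spaceObjects[i]->draw(yaw, pitch, xMove, yMove, zMove);
glPopAttrib();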

Ah, found it: a stray disable hidden away in a debug test inside one of the derived classes’ draw functions.