Alpha Channel creates color banding???

Hi All,

I was wondering if someone could explain this.

I have an image that looks fine when saved as a 24-bit (RGB) texture (.tga format), but when I add an alpha channel and save it as a 32-bit image, the resulting texture suffers from banding (I think that is the right term).

It looks like I am converting a 24-bit picture to 8-bit without dithering.

Any reason why this might be happening, and what can I do to fix it?

As always, thanks for any replies.
jpummill

Does your TGA loading code correctly check for the Alpha channel?

Not sure about correctly.

It is the tgaload.cpp file that can be found in NeHe's texture loading tutorials.

I have also specified what I think are the proper flags to the tgaload function:
tgaLoad ("alpha512a.tga", &temp_image, TGA_ALPHA );

I just took a brief look at that TGA loading code and it looks all right. And I would assume that if NeHe had a bug in his TGA loading code, it would have been found pretty quickly.

When you are giving OpenGL the pixel data, do you tell it that it’s RGBA as opposed to RGB?

Also, do you have blending or alpha testing enabled?
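
For a 32-bit TGA, the glTexImage2D call needs GL_RGBA as the format of the pixel data; if it still says GL_RGB, the bytes get read with the wrong stride. Roughly what the upload should look like (width, height, and pixels are placeholders for whatever your loader uses):

glTexImage2D(GL_TEXTURE_2D, 0,           // target, mip level 0
             GL_RGBA,                    // internal format (keeps the alpha)
             width, height, 0,           // size, no border
             GL_RGBA, GL_UNSIGNED_BYTE,  // layout/type of the pixel data
             pixels);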

It could be one of two things:

  1. You don't have sufficient bits of alpha, and it is the alpha, not the color, that is causing the banding.

  2. Going from an RGB to an RGBA internal texture format has decreased the color bits used for your texture; for example, you may now be using an RGBA 4444 texture where before you were using an RGB 888 or RGB 565.

It's not clear which of these has happened because you're a little short on details; one way to check is sketched below.
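
To see which format the driver actually picked, you can ask OpenGL how many bits it allocated for the texture. A minimal sketch (standard GL 1.1 queries; run it while the texture is bound, with <stdio.h> included for the printf):

GLint r, g, b, a;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE,   &r);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &g);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE,  &b);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_ALPHA_SIZE, &a);
printf("texture bits: R%d G%d B%d A%d\n", r, g, b, a);  // e.g. 4/4/4/4 means case 2

If that prints 4 bits per channel, the driver chose an RGBA 4444 format and case 2 is your problem.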

I appreciate the replies. If anyone would like a look at my code, here it is (minus the tgaload.cpp from NeHe's tutorials).

//---------------------
//-- Include Files --
//---------------------
#include <windows.h>
#include <GL/glut.h>
#include "tgaload.h"

#define MAX_NO_TEXTURES 2
#define TEXTURE_0 0
#define TEXTURE_1 1

//------------------------
//-- Global Variables --
//------------------------
GLuint texture_id[MAX_NO_TEXTURES];

//---------------------------
//-- Function Prototypes --
//---------------------------
void programInit(void);
void display(void);

//---------------------
//-- Main Function --
//---------------------
int main(int argc, char** argv)
{
glutInit ( &argc, argv);
glutInitDisplayMode ( GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA );
glutInitWindowPosition ( 10, 10);
glutInitWindowSize ( 512, 512);
glutCreateWindow ( "Glut Test Window");

programInit            ( );

glutDisplayFunc        (display);
glutIdleFunc           (display);

glutMainLoop           ( );

return 0;   // unreachable (glutMainLoop never returns), but main must return int
}

//-----------------------------
//-- Program Init Function --
//-----------------------------
void programInit(void)
{
glShadeModel(GL_SMOOTH);
glClearColor(0.0f, 0.0f, 0.0f, 0.5f);

glEnable(GL_COLOR_MATERIAL);
glColorMaterial(GL_FRONT, GL_AMBIENT_AND_DIFFUSE);

glColor4f(1.0, 1.0, 1.0, 0.7);  // -- DEBUG

glEnable(GL_TEXTURE_2D);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glGenTextures(MAX_NO_TEXTURES, texture_id);  // generate a name for each texture, not just one

image_t temp_image;

glBindTexture(GL_TEXTURE_2D, texture_id[TEXTURE_0]);
tgaLoad ("alpha512a.tga", &temp_image, TGA_ALPHA );

glBindTexture(GL_TEXTURE_2D, texture_id[TEXTURE_1]);
tgaLoad ("alpha512b.tga", &temp_image, TGA_ALPHA );

// glEnable(GL_CULL_FACE);
// glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);

glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_BLEND);

}

//-----------------------------
//-- Glut Display Function --
//-----------------------------
void display(void)
{
static int rot = 0;

glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
glLoadIdentity();


glPushMatrix();
glRotatef(rot, 0.0f, 0.0f, 1.0f);
glBindTexture(GL_TEXTURE_2D, texture_id[TEXTURE_0]);
glBegin(GL_TRIANGLES);
    glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0, -1.0, 0.0);
    glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0, -1.0, 0.0);
    glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0,  1.0, 0.0);
    glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0,  1.0, 0.0);
    glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0, -1.0, 0.0);
    glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0,  1.0, 0.0);
glEnd();
glPopMatrix();


glPushMatrix();
glRotatef(-rot, 0.0f, 0.0f, 1.0f);
glBindTexture(GL_TEXTURE_2D, texture_id[TEXTURE_1]);
glBegin(GL_TRIANGLES);
    glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0,  1.0, 0.0);
    glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0, -1.0, 0.0);
    glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0,  1.0, 0.0);
    glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0, -1.0, 0.0);
    glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0, -1.0, 0.0);
    glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0,  1.0, 0.0);
glEnd();
glPopMatrix();

rot = (rot + 1) % 360;   // advance the rotation each frame; it was never updated before
glutSwapBuffers();

}

Anyway, thanks again for all the help,
jpummill

Try commenting out the enabling of GL_BLEND and see what the texture looks like. If it looks normal that way, then you know that the texture is loading correctly and that your blending is causing the banding.

You also have this line:

glColor4f(1.0, 1.0, 1.0, 0.7); // -- DEBUG

You set the alpha to 0.7, which is going to modulate with the texture since you don't change the glTexEnv mode anywhere (the default is GL_MODULATE). That happens in addition to whatever alpha channel you have in your TGA.
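
If you want the texture's alpha used unmodified, either set the color back to opaque white or switch the texture environment away from the default. A one-line sketch for programInit (glTexEnvi is standard GL):

glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);  // use texel color/alpha as-is

Note that GL_REPLACE also ignores lighting and vertex color entirely, so only do that if that's what you want.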

Thanks for the reply Deiussum.

I commented out the glEnable(GL_BLEND) line and also removed the glColor4f line.

I still have the banding problem. I really have no clue why this is happening.

If I posted my code to my web page tonight, would anyone be willing to download it and have a look?

jpummill

I’d be willing to have a look, but I’m not sure when I’ll get around to it.

Hi All,

I am posting a link to my source and executable files. They are saved in .zip format.

If anyone has time to look at this, I would sure appreciate some feedback.

Again, the problem is that I am seeing banding in my colors even though my .tga file contains 32-bit color and my desktop is set to 32-bit depth.

Here is the link: http://home.attbi.com/~jpummill/projects/gltx/gltx.zip

Thanks again for any replies,
jpummill

Thanks for the replies, guys, but I think I just found my problem.

Once again, it is with crappy ATI video drivers.

On my Nvidia GeForce card, the application works as it should.

Thanks again for the help though,
jpummill

Well, this topic is old, but anyway…
I believe that is a problem with the ATI drivers. I have a Radeon card too, and it was showing the 32-bit TGAs correctly, but after I installed the latest Catalyst drivers everything looked the same as your problem.
And when I went back to the older drivers the problem was still there… even after I reformatted the computer.
I think the crappy drivers changed something on the card itself.
I have no clue how to fix it.

Yes, the problem is with the ATI 3.7 drivers. I uninstalled them and installed 3.5, both the display driver and the control panel, then from the control panel increased the quality of OpenGL textures.
Tadaaa, it works like the good old days :]

Don't blame ATI for the fact that you don't use the internalformat parameter. If you set it to GL_RGBA8 instead of just GL_RGBA (or 4), it will be correct.
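
Concretely, wherever the loader calls glTexImage2D, pass a sized internal format; with the generic GL_RGBA (or 4) the driver is free to pick something like RGBA 4444, which is exactly the banding described above. A sketch, with placeholder names for the loader's variables:

glTexImage2D(GL_TEXTURE_2D, 0,
             GL_RGBA8,                   // sized format: request 8 bits per channel
             width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);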