View Full Version : Using glTexImage2D with GL_ALPHA

05-11-2012, 02:16 AM
I am trying to use GL_ALPHA as the internal format in glTexImage2D.
My code is as follows:

GLubyte data[4] = { 255, 16, 8, 64 };

glGenTextures(1, &textureId);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, textureId);

glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, width, height, 0, GL_ALPHA, GL_UNSIGNED_BYTE, data);

Here I am a bit confused about the width and height values.
When I pass (2, 2) I don't get the expected output: glReadPixels() never returns a pixel with alpha value 8 or 64.

Am I doing something wrong ?

05-17-2012, 03:18 AM
Issue resolved.