What's wrong with my texture

My texture does not render: the triangle and the sphere are drawn with the last-used blue color instead.

What’s wrong with it? Thanks a lot!

On init:


......
FIBITMAP* tmp = FreeImage_Load(FIF_JPEG,"gstar.jpg",0);
FIBITMAP* img = FreeImage_ConvertTo32Bits(tmp);
FreeImage_Unload(tmp);
	
glGenTextures( 1, &texture );
glBindTexture(GL_TEXTURE_2D,texture);
glTexImage2D(
	GL_TEXTURE_2D, 0, GL_RGBA,
	FreeImage_GetWidth(img), FreeImage_GetHeight(img),
	0, GL_RGBA, GL_UNSIGNED_INT_8_8_8_8,
	FreeImage_GetBits(img) );
printf("generated texture: %d
",texture);
......

On drawing:


......
// draw axis
glBegin(GL_LINES);
	glColor3f(1,0,0);
	glVertex3d(0,0,0);
	glVertex3d(100,0,0);
		
	glColor3f(0,1,0);
	glVertex3d(0,0,0);
	glVertex3d(0,100,0);
		
	glColor3f(0,0,1);
	glVertex3d(0,0,0);
	glVertex3d(0,0,100);
glEnd();
	
// textured triangle
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D,texture);
glTexEnvf(GL_TEXTURE_ENV,GL_TEXTURE_ENV_MODE,GL_REPLACE);
glBegin(GL_TRIANGLES);
	glNormal3d(1,1,1);
	
	glTexCoord2f(0,0);
	glVertex3d(5,0,0);
		
	glTexCoord2f(1,0);
	glVertex3d(0,5,0);
	
	glTexCoord2f(0,1);
	glVertex3d(0,0,5);
glEnd();
glDisable(GL_TEXTURE_2D);

glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D,texture);
gluSphere(quad,2,32,32);
glDisable(GL_TEXTURE_2D);

......

GL_UNSIGNED_INT_8_8_8_8? You probably meant GL_UNSIGNED_BYTE. Anyways, why call glTexEnv? If you remove it, your texture should show up.
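For reference, a minimal untested sketch of that upload with GL_UNSIGNED_BYTE, reusing the FreeImage variables from your init code:

/* Aside, as an assumption to verify: FreeImage's 32-bit buffers are
   BGRA-ordered on little-endian machines, so passing GL_BGRA as the
   pixel format (core since GL 1.2) may be needed to avoid swapped
   red/blue channels. */
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
    FreeImage_GetWidth(img), FreeImage_GetHeight(img),
    0, GL_RGBA, GL_UNSIGNED_BYTE,
    FreeImage_GetBits(img));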

[quote=“DarkGKnight”]
GL_UNSIGNED_INT_8_8_8_8? You probably meant GL_UNSIGNED_BYTE.
[/QUOTE]
I’ve tried GL_UNSIGNED_BYTE. It doesn’t work either.

What about removing glTexEnv?

Have you loaded mipmaps for each level, or alternatively disabled mipmaps for the texture? According to the spec, a texture that is missing required mipmap levels is incomplete, and an incomplete texture behaves as if texturing were disabled.

On my system, the triangle is textured but the sphere is not; texturing the sphere additionally requires a call to gluQuadricTexture.
So at minimum your triangle should be textured, and you are most likely hitting a driver bug. Are you testing this code on Intel GPU hardware?

Use gluBuild2DMipmaps instead of glTexImage2D if you are just starting with GL. This will avoid textures unexpectedly failing to render.
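For example, an untested sketch of the loader with gluBuild2DMipmaps in place of glTexImage2D, reusing your FreeImage variables (note it takes no level or border argument):

glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
/* gluBuild2DMipmaps scales the image as needed, builds and uploads
   every mipmap level, and returns 0 on success */
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGBA,
    FreeImage_GetWidth(img), FreeImage_GetHeight(img),
    GL_RGBA, GL_UNSIGNED_BYTE,
    FreeImage_GetBits(img));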

My platform is: Fedora 14, with Nvidia 8XXXM chip, using the driver from rpmfusion, version 1:260.19.36-1.fc14, using the libGL.so provided by Nvidia driver.

Moreover, I forgot to mention that texturing is in fact enabled on the quadric:


int main(int argc, char **argv) {
	srand48(1);
	
	glutInit(&argc, argv);
	glutInitDisplayMode(GLUT_RGB|GLUT_DOUBLE|GLUT_DEPTH);
	glutInitWindowSize(400,400);
	glutInitWindowPosition(100,100);
	glutCreateWindow("run");
	
	glutDisplayFunc(on_display);
	glutMouseFunc(on_mouse);
	glutMotionFunc(on_motion);
	glutSpecialFunc(on_keyboard);
	init();
	
	// init global objects
	quad = gluNewQuadric();
	gluQuadricOrientation(quad,GLU_OUTSIDE);
	gluQuadricDrawStyle(quad,GLU_FILL);
	gluQuadricNormals(quad,GLU_SMOOTH);
	gluQuadricTexture(quad,GL_TRUE);
	
	glutMainLoop();
	exit(0);
}

It works!! Thanks a lot!!!
So the system isn’t happy if I don’t provide it the full series of mipmaps?

If this is your only texture loading code:

glGenTextures( 1, &texture );
glBindTexture(GL_TEXTURE_2D,texture);
glTexImage2D(
GL_TEXTURE_2D, 0, GL_RGBA,
FreeImage_GetWidth(img), FreeImage_GetHeight(img),
0, GL_RGBA, GL_UNSIGNED_INT_8_8_8_8,
FreeImage_GetBits(img) );
printf("generated texture: %d
",texture);

Then you aren’t filling all the mipmap levels of the texture, so the texture won’t be complete, and binding an incomplete texture to a texture unit causes texturing to be disabled for that unit.

You have several options:

  1. You can generate the required levels from the level 0 image, using glGenerateMipmap(GL_TEXTURE_2D) (recommended), gluBuild2DMipmaps, or glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE); see the sketch after this list.

  2. You can load each level of the texture manually, e.g. for an 8x8 texture image you would also need to load 4x4, 2x2 and 1x1 images:
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 8, 8, 0, GL_RGBA, GL_UNSIGNED_BYTE, img_level[0]);
    glTexImage2D(GL_TEXTURE_2D, 1, GL_RGBA, 4, 4, 0, GL_RGBA, GL_UNSIGNED_BYTE, img_level[1]);
    glTexImage2D(GL_TEXTURE_2D, 2, GL_RGBA, 2, 2, 0, GL_RGBA, GL_UNSIGNED_BYTE, img_level[2]);
    glTexImage2D(GL_TEXTURE_2D, 3, GL_RGBA, 1, 1, 0, GL_RGBA, GL_UNSIGNED_BYTE, img_level[3]);

  3. You can disable mipmaps (not recommended except in certain situations):
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
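To illustrate option 1, here's a minimal sketch reusing the FreeImage variables from your loader (assuming glGenerateMipmap is available on your driver; it is core in GL 3.0 and otherwise exposed by GL_ARB_framebuffer_object / GL_EXT_framebuffer_object):

glBindTexture(GL_TEXTURE_2D, texture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
    FreeImage_GetWidth(img), FreeImage_GetHeight(img),
    0, GL_RGBA, GL_UNSIGNED_BYTE,
    FreeImage_GetBits(img));          /* upload level 0 only */
glGenerateMipmap(GL_TEXTURE_2D);      /* derive levels 1..N from level 0 */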

[quote]
It works!! Thanks a lot!!!
So the system isn’t happy if I don’t provide it the full series of mipmaps?
[/QUOTE]

Yes, the initial value for a texture object’s minification filter is GL_NEAREST_MIPMAP_LINEAR, so you need to provide all levels or change this value. I assume they chose this default because it’s the most commonly used value, but it often catches people out.

Each mipmap level is half the size of the previous one in each direction (never dropping below 1), until you reach a 1x1 texture.

E.g. with a 64x8 texture, you would have:
0: 64x8
1: 32x4
2: 16x2
3: 8x1
4: 4x1
5: 2x1
6: 1x1
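If it helps to see the halving rule in code, here's a small stand-alone C snippet that reproduces the chain above (print_mip_chain is a hypothetical helper for illustration only, not a GL call):

#include <stdio.h>

/* Print the mipmap chain for a w x h base level: each dimension is
   halved per level, never dropping below 1, until both reach 1. */
static void print_mip_chain(int w, int h)
{
    int level = 0;
    for (;;) {
        printf("%d: %dx%d\n", level, w, h);
        if (w == 1 && h == 1)
            break;
        if (w > 1) w /= 2;
        if (h > 1) h /= 2;
        level++;
    }
}

int main(void)
{
    print_mip_chain(64, 8);  /* matches the 64x8 example above */
    return 0;
}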

Whilst debugging the code I was using (a modified NeHe Lesson 6), I noticed that the texture upload function was using gluBuild2DMipmaps internally (which isn't recommended any more). The OpenGL Wiki has a section on the common mistakes users make when creating textures.