Create a texture from a [OpenGL-type] bitmap

I’m trying to create a texture object in OpenGL from a bitmap. (And by bitmap, I mean a true one-bit-per-pixel OpenGL type bitmap, not that Microsoft abomination file type.)

I would think that I could use glTexImage2D just like for any other texture, but I can’t seem to get this to work. Below is a test program that I’m using; perhaps you can see what I am doing wrong. The program displays a quad and should show the texture, but it currently only shows a solid red rectangle.

The following program uses SDL as the window manager. It should work in both Windows and Linux, but has only been tested in Linux.

#include <stdio.h>
#ifdef WIN32
#include <windows.h>
#endif
#include <GL/gl.h>
#include <SDL/SDL.h>

#if 0
const unsigned char BITMAP_DATA[] = {
0x41, 0x00, 0x00, 0x00,
0x41, 0x00, 0x00, 0x00,
0x42, 0x00, 0x00, 0x00,
0x22, 0x00, 0x00, 0x00,
0x22, 0x00, 0x00, 0x00,
0x14, 0x00, 0x00, 0x00,
0x14, 0x00, 0x00, 0x00,
0x1C, 0x00, 0x00, 0x00,
0x0C, 0x00, 0x00, 0x00,
0x08, 0x00, 0x00, 0x00,
0x08, 0x00, 0x00, 0x00,
0x30, 0x00, 0x00, 0x00,
0x00, 0x00, 0x00, 0x00,
0x00, 0x00, 0x00, 0x00,
0x00, 0x00, 0x00, 0x00,
0x00, 0x00, 0x00, 0x00
};
#endif

//#if 0
const unsigned char BITMAP_DATA[] = {
0x41,
0x41,
0x42,
0x22,
0x22,
0x14,
0x14,
0x1C,
0x0C,
0x08,
0x08,
0x30,
0x00,
0x00,
0x00,
0x00
};
//#endif

const int BITMAP_PITCH = 1;
const int BITMAP_ROWS = 16;
const int BITMAP_COLS = 8 * BITMAP_PITCH;
const int NUM_PIXELS = BITMAP_COLS * BITMAP_ROWS;

const int SCREEN_WIDTH = 900;
const int SCREEN_HEIGHT = 750;

int main () {

// Init SDL
SDL_Init (SDL_INIT_VIDEO);
SDL_Surface *screen = SDL_SetVideoMode (SCREEN_WIDTH, SCREEN_HEIGHT, 32, SDL_OPENGL | SDL_HWSURFACE | SDL_DOUBLEBUF);
if (screen == NULL){ return 0; }
SDL_WM_SetCaption ("Test Bitmap OpenGL Texture Program", NULL);

// Init OpenGL
glViewport (0, 0, SCREEN_WIDTH, SCREEN_HEIGHT);
glMatrixMode (GL_PROJECTION);
glLoadIdentity ();
glOrtho (0, SCREEN_WIDTH, 0, SCREEN_HEIGHT, -1, 1);
glMatrixMode (GL_MODELVIEW);

glClearColor (0.0, 0.0, 0.0, 0.0);
glClearDepth (1.0);
glEnable (GL_DEPTH_TEST);
glDepthFunc (GL_LEQUAL);
glPolygonMode (GL_FRONT, GL_FILL);
glPolygonMode (GL_BACK, GL_LINE);

glEnable (GL_TEXTURE_2D);
//glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

// Create texture
unsigned int texObj;
glGenTextures (1, &texObj);

if (texObj == 0){
fprintf (stderr, "Unable to create an OpenGL texture object\n");
SDL_Quit();
return 1;
}

glBindTexture (GL_TEXTURE_2D, texObj);
glTexImage2D (
/* target         */ GL_TEXTURE_2D,
/* level          */ 0,
/* internalFormat */ GL_INTENSITY,
/* width          */ BITMAP_COLS,
/* height         */ BITMAP_ROWS,
/* border         */ 0,
/* format         */ GL_LUMINANCE,
/* type           */ GL_BITMAP,
/* texels         */ BITMAP_DATA
);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// Show a quad with the texture
glClear (GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity ();
glTranslatef (200.0f, 50.0f, 0.0f);
//glBindTexture (GL_TEXTURE_2D, texObj);
glBegin (GL_QUADS);
glColor3f (1.0, 0.0, 0.0);
glTexCoord2f (0.0, 1.0); glVertex3f ( 0.0f, 0.0f, 0.0f);
glTexCoord2f (1.0, 1.0); glVertex3f (400.0f, 0.0f, 0.0f);
glTexCoord2f (1.0, 0.0); glVertex3f (400.0f, 600.0f, 0.0f);
glTexCoord2f (0.0, 0.0); glVertex3f ( 0.0f, 600.0f, 0.0f);
glEnd ();

SDL_GL_SwapBuffers ();

SDL_Event event;
bool done = false;

while (!done){

  SDL_PollEvent (&event);
  
  if (event.type == SDL_QUIT){
  	done = true;
  }else if (event.type == SDL_KEYUP){
  
  	if (event.key.keysym.sym == SDLK_ESCAPE)
  		done = true;
  }

}

glDeleteTextures (1, &texObj);  // delete the texture before the GL context goes away
SDL_Quit ();

return 0;
}

Thanks for your help.

Is it even possible to create a texture from a bitmap?

Try enabling 2D textures

glEnable(GL_TEXTURE_2D);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glBindTexture(GL_TEXTURE_2D, texName[1]); // if you had multiple textures, this bind selects which one is current - not really needed with just 1 texture

Draw stuff

glDisable(GL_TEXTURE_2D);

Hope it helps

[This message has been edited by shinpaughp (edited 03-07-2003).]

Originally posted by shinpaughp:
Try enabling 2D textures ... Hope it helps

Thanks for the idea, but I couldn’t get that to work.

Also, move your

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

between the bind and glTexImage2D

and if that doesn’t work, add this right after the glTexImage2D call:

GLenum err = glGetError();
if (err)
    exit(-1);

Let me know if it does error out and exit, and if so, use a breakpoint in your debugger to tell me what value err is. It kinda looks like your rows and columns are not correct. I only see a 16 x 4 array instead of the 16 x 8 you are passing to glTexImage2D.

Okay, I got rid of your SDL stuff only because I didn’t feel like messing with it. Converted it into GLUT code. Anyway, unimportant. It was erroring out at the glTexImage2D function but it just posts the error and moves on.

The problem was with glTexImage2D: it didn’t like GL_BITMAP, and I don’t know exactly what it is. Looked it up in the blue book and it was mentioned but gave no details. Anyway, changed it to GL_UNSIGNED_BYTE.

[This message has been edited by shinpaughp (edited 03-08-2003).]

It didn’t like GL_BITMAP and I don’t know exactly what it is. Looked it up in the blue book and it was mentioned but gave no details.

The error generated was probably GL_INVALID_ENUM, and this, among other things, is what MSDN says about it.

GL_INVALID_ENUM: type was GL_BITMAP and format was not GL_COLOR_INDEX.

GL_LUMINANCE and GL_BITMAP is not a valid combination of format and type.

[This message has been edited by Bob (edited 03-08-2003).]

Originally posted by shinpaughp:
I only see a 16 x 4 array instead of the 16 x 8 you are passing to glTexImage2D.

The array is 16x1 bytes, which means the bitmap is 16x8 pixels. Either way, that conforms to OpenGL’s power-of-two texture dimension requirements. In case it is unclear, the pitch refers to the number of bytes per row. In this case, the pitch is one. The number of columns/pixels per row is 8*pitch.
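
In code terms (just a sketch, not something from the test program above), reading pixel (row, col) out of a bitmap laid out like that would be:

// 1 means "on", 0 means "off"; each row is pitch bytes wide and the
// most significant bit of a byte is the leftmost pixel
int bitmap_pixel (const unsigned char *data, int pitch, int row, int col)
{
    return (data[row * pitch + col / 8] >> (7 - (col % 8))) & 1;
}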

Regardless, it’ll work if you switch GL_BITMAP to GL_UNSIGNED_BYTE in your call to glTexImage2D. Bob gave the reason why.

glGetError() is a very useful debugging tool.
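
For example, right after the glTexImage2D call in the test program, something like this (just a sketch) will drain and print anything in the error queue:

GLenum err;
while ((err = glGetError()) != GL_NO_ERROR)
    fprintf (stderr, "GL error after glTexImage2D: 0x%X\n", err);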

Will that give the results he wants though? GL_UNSIGNED_BYTE will use a full byte for each color component, and I was under the impression he wants to use a single bit for each color component.

Anyway, if the only valid format for GL_BITMAP is GL_COLOR_INDEX, it seems to me that it probably isn’t going to work quite like you were hoping.

Shouldn’t matter. According to the notes section of opengl.org for alpha, luminance and intensity, you can use as few as 4 bits per pixel. At least that is the way it looks in the Internal Texture Formats table.

According to my testing of his code and Bob’s explanation of the reason that GL_BITMAP wouldn’t work (requires GL_COLOR_INDEX), if he doesn’t like the results he can change internal formats.

Oddly, I’m having trouble using glColorTable in combination with glTexImage2D using GL_BITMAP with GL_COLOR_INDEX. I’m trying to use the GL_COLOR_INDEX1_EXT, GL_COLOR_INDEX4_EXT, and GL_COLOR_INDEX8_EXT internal formats, but there seems to be no documentation except for the extension registry. Not that it worked when I used any other formats either. Ah, well.

[This message has been edited by shinpaughp (edited 03-10-2003).]

4 bits is not 1 bit. It sounds like what he’s trying to do is take a bitmap (not a BMP file, but a true bitmap where a bit value of 1 is “on” and 0 is “off”) and use it as a texture.

I suppose if color index mode were used where the “on” and “off” colors were the only ones in the palette, it would still be possible to get the desired effect using GL_COLOR_INDEX with GL_BITMAP. Other than that, the bitmap will probably have to be converted to a different format, or glBitmap will have to be good enough.

Edit: Thought a bitmap example might help describe what is wanted.

Here’s the BITMAP_DATA being used converted to binary. See the pattern?

0x41 =01000001
0x41 =01000001
0x42 =01000010
0x22 =00100010
0x22 =00100010
0x14 =00010100
0x14 =00010100
0x1C =00011100
0x0C =00001100
0x08 =00001000
0x08 =00001000
0x30 =00110000
0x00 =00000000
0x00 =00000000
0x00 =00000000
0x00 =00000000

The 1’s appear to form a y.

[This message has been edited by Deiussum (edited 03-10-2003).]

Then he should probably try GL_COLOR_INDEX1_EXT as the internal format for glTexImage2D, in conjunction with glColorTable to set up the 2-color palette. I got 8 and 4 working okay under certain circumstances, and yet sometimes it freezes my whole system up, but that probably has more to do with my texture class and how it is set up. More work to do.

There is a good example of glColorTable and glTexImage2D with GL_COLOR_INDEX8_EXT at http://www.opengl.org/discussion_boards/ubb/Forum2/HTML/008931.html

Anyway, it seems he hasn’t been back of late, so it probably doesn’t matter.

I’m not really convinced that changing the internal format will do much. What he needs is to tell it what format his data is already in, not how to store it. glTexImage2D will read the data based on how you tell it the data is arranged, so no matter how you store it internally, if you use GL_UNSIGNED_BYTE, it will think the data you give it uses a full byte per color component.

I’ve never really played with color index textures before. Maybe I should play with that a bit myself and see if I can get it working.

Anyway, as you said, the original poster doesn’t appear to be around anymore.

I do see what you mean about the internal format, and I guess after rereading the poster’s original post that is what he was looking for. Although doing it with GL_UNSIGNED_BYTE gave a really cool effect. Almost looked like a wavy red cloth or something.

But, anyways, I have been completely unable to get GL_BITMAP to work under any circumstances. However, when I use RealTech VR’s extension viewer it states that there is no 8-bit palette support. Not sure if that is the problem.

If you or SpiffGQ get it working, please let me know. Too bad there isn’t a better reference book; it would be nice if they came out with a red book for v1.4 or 2.0 with everything in it. Oh, well.

And, I did find an example at http://evcswhw.mujweb.cz/Priklad_VisualC.htm . I haven’t looked at it yet.

Originally posted by shinpaughp:
If you or SpiffGQ get it working, please let me know. Too bad there isn’t a better reference book; it would be nice if they came out with a red book for v1.4 or 2.0 with everything in it. Oh, well.

I’ve been playing around with that color index stuff y’all posted, but I haven’t been able to get it working right.

Either way, it seems like it would be easier to convert the bitmap into a pixmap with one byte per pixel. This would allow me to create a texture that would behave as I want it to.
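
Something like this is what I have in mind (just a sketch, using the BITMAP_DATA and constants from my first post; expand each bit into a luminance byte and upload that instead):

unsigned char pixels[NUM_PIXELS];
int row, col;

for (row = 0; row < BITMAP_ROWS; row++){
    for (col = 0; col < BITMAP_COLS; col++){
        int bit = (BITMAP_DATA[row * BITMAP_PITCH + col / 8] >> (7 - (col % 8))) & 1;
        pixels[row * BITMAP_COLS + col] = bit ? 255 : 0;  // one full byte per pixel now
    }
}

glPixelStorei (GL_UNPACK_ALIGNMENT, 1);  // safe default for tightly packed rows
glTexImage2D (GL_TEXTURE_2D, 0, GL_INTENSITY, BITMAP_COLS, BITMAP_ROWS, 0,
              GL_LUMINANCE, GL_UNSIGNED_BYTE, pixels);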

What I mean by the internal format is that the internal format parameter just tells the driver how to store the data internally. It doesn’t have anything to do with how the data you are passing to glTexImage2D is currently stored (i.e. how your data is arranged in the memory you are passing to glTexImage2D).
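
For instance (a sketch with made-up width/height/pixels names):

// internalFormat (GL_RGB8) only asks the driver how to *store* the texture;
// format/type (GL_LUMINANCE, GL_UNSIGNED_BYTE) describe how the data at pixels
// is laid out in memory, i.e. one byte per pixel gets read during the upload.
glTexImage2D (GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0,
              GL_LUMINANCE, GL_UNSIGNED_BYTE, pixels);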

Anyway, it probably would be a bit easier to convert the bitmap to RGB image data, but now that I got started thinking about this problem, it’s got the wheels turning in my head. Working on it a little bit now. If I get something working, I’ll post some code.

Edit: Looks like I’ll have to wait until I get home to test my ideas. The Intel graphics chip on my work computer doesn’t support paletted textures. Argh!!

[This message has been edited by Deiussum (edited 03-11-2003).]

I’m about ready to give up. I got paletted textures working, but can’t seem to use a bitmap to index into a 2-color palette.

I looked at the extension registry for GL_EXT_paletted_texture here

And found this as an addition to the “Texturing” section of the OpenGL spec.

  If format is given as COLOR_INDEX then the image data is
  composed of integer values representing indices into a table of colors
  rather than colors themselves.  If internalformat is given as one of the
  color index formats from table 3.8 then the texture will be stored
  internally as indices rather than undergoing index-to-RGBA mapping as
  would previously have occurred.  In this case the only valid values for
  type are BYTE, UNSIGNED_BYTE, SHORT, UNSIGNED_SHORT, INT and
  UNSIGNED_INT.

Note that GL_BITMAP isn’t in there. It seems to imply that if the internal format isn’t one of the COLOR_INDEX values, it will get converted to the format you specify. For instance, if you specified GL_RGB8, it sounds like it should pull the RGB values from the palette and store them in the texture like that. From my testing, that doesn’t seem to be the case, though.

Anyway, this paletted texture extension is one that I might have to play with some more. Seems like a good way to do some of the old palette tricks I used to mess with back in the days of programming Mode 13 in BASIC, or C/C++.

[This message has been edited by Deiussum (edited 03-12-2003).]

From the way those extension registry documents read, it sounds as if they are additions and modifications to an existing document. Wish I could find a copy on the web somewhere that had all the addendums folded in; maybe then it might make more sense. I read the GL_EXT_paletted_texture doc several times and it still doesn’t make much sense. I’ve looked at docs for pixelmap, pixeltransfer, pixelstore, colortable, getcolortable, index, etc., and there is probably one line that was accidentally omitted from some document that would make it all seem like DOH!

I am thinking however that it may be a good question for the advanced forum. It seems as though some people may have actually used GL_BITMAP and gotten it to work before. Though mostly for text/font stuff.

Indexing into a pixel map with only two entries seems to work for me…

float index[] = {0.0, 1.0};

glPixelMapfv(GL_PIXEL_MAP_I_TO_R, 2, index);
glPixelMapfv(GL_PIXEL_MAP_I_TO_G, 2, index);
glPixelMapfv(GL_PIXEL_MAP_I_TO_B, 2, index);
glPixelMapfv(GL_PIXEL_MAP_I_TO_A, 2, index);

glTexImage2D(GL_TEXTURE_2D, 0, 4, w, h, 0, GL_COLOR_INDEX, GL_BITMAP, textureImage);
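
For what it’s worth, dropped into SpiffGQ’s original test program that would look something like the sketch below (using his BITMAP_DATA and constants; I think the glPixelStorei line matters because each row of his 8-wide bitmap is only one byte, and the default unpack alignment of 4 expects padded rows):

glPixelStorei (GL_UNPACK_ALIGNMENT, 1);  // 1-byte rows, so don't assume 4-byte row padding

float index[] = {0.0f, 1.0f};            // bit 0 -> black, bit 1 -> white
glPixelMapfv (GL_PIXEL_MAP_I_TO_R, 2, index);
glPixelMapfv (GL_PIXEL_MAP_I_TO_G, 2, index);
glPixelMapfv (GL_PIXEL_MAP_I_TO_B, 2, index);
glPixelMapfv (GL_PIXEL_MAP_I_TO_A, 2, index);

glTexImage2D (GL_TEXTURE_2D, 0, GL_RGBA, BITMAP_COLS, BITMAP_ROWS, 0,
              GL_COLOR_INDEX, GL_BITMAP, BITMAP_DATA);

With his red glColor3f and the default GL_MODULATE texture environment, the “on” bits should come out red and the “off” bits black.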