stencil buffer

hello,

how can I check whether my graphics card supports a stencil buffer?

Can I do this with:

glGetIntegerv(GL_STENCIL_BITS, &stencil);

Because when I print the integer stencil, the value is 0.

thanks

helda

How do you create your window? You probably didn't create a window that has a stencil buffer. If you're using Win32, there's a cStencilBits field in the PIXELFORMATDESCRIPTOR struct that you need to set. If you're using GLUT, include GLUT_STENCIL among the flags you pass to glutInitDisplayMode.
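For example, a minimal GLUT sketch (assuming all you want is to verify how many stencil bits you actually got):

#include <GL/glut.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    GLint stencilBits = 0;

    glutInit(&argc, argv);
    /* Ask for a stencil buffer in addition to RGB color and a depth buffer. */
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STENCIL);
    glutCreateWindow("stencil test");

    /* The query is only meaningful once a window/context exists. */
    glGetIntegerv(GL_STENCIL_BITS, &stencilBits);
    printf("stencil bits: %d\n", stencilBits);

    return 0;
}

If that prints 8 (or more), the window really does have a stencil buffer.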


Oh ****, I'm so stupid. Thank you for that tip.

Hmmm, does anyone have any idea why the stencil buffer might be slow? I tried it with 16, 8, 4, 2, and 1 bits, but it's always slow as hell. My computer is a P4 1.4 with a GeForce 2 MX 400 (64 MB) and 256 MB of system memory.
One thing I noticed is that it gets faster as the object that is rendered into the stencil buffer moves away from the camera.

Are you asking for a stencil buffer in 16-bit color depth? If so, you're not getting hardware-accelerated rendering. Stencil buffers are only supported in 32-bit color depth on the earlier GeForce series.

So I need to set the color depth to 32? And what about the stencil bits? Does that matter much?

Stencil should be 8 bits. And yes, it matters a lot; it's the difference between software and hardware rendering.
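In Win32 terms, a rough sketch of the pixel format setup (the function name SetupStencilPixelFormat is just a placeholder, and hDC is assumed to be the device context of your already-created window):

#include <windows.h>

BOOL SetupStencilPixelFormat(HDC hDC)
{
    PIXELFORMATDESCRIPTOR pfd;
    int format;

    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize        = sizeof(PIXELFORMATDESCRIPTOR);
    pfd.nVersion     = 1;
    pfd.dwFlags      = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType   = PFD_TYPE_RGBA;
    pfd.cColorBits   = 32;  /* 32-bit color: needed for hardware stencil on early GeForce cards */
    pfd.cDepthBits   = 24;  /* depth and stencil typically share one 32-bit buffer (24 + 8) */
    pfd.cStencilBits = 8;   /* 8-bit stencil */
    pfd.iLayerType   = PFD_MAIN_PLANE;

    format = ChoosePixelFormat(hDC, &pfd);
    if (format == 0)
        return FALSE;
    return SetPixelFormat(hDC, format, &pfd);
}

With that, glGetIntegerv(GL_STENCIL_BITS, ...) should report 8 and the stencil operations should stay in hardware.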