How do I change to 16-bit for voodoo cards?

I’m using NeHe’s framework for my OpenGL apps (because GLUT is too inflexible).

Here’s the problem: Voodoo cards before the Voodoo4 and 5 can only handle 16-bit color, right? So how can I modify NeHe’s framework to detect a Voodoo card and select 16-bit mode?

One way to detect the video card is to parse the GL_RENDERER string, which is supposed to be hardware-specific. I notice that for my Voodoo2, OpenGL returns “3dfx/Voodoo2/2 TMUs/4 MB/3DNow!/stand-alone (Jan 12 2000)” for the GL_RENDERER string. But you can’t get that string until you have an RC. So you might be better off using either the Win32 API or DirectX to enumerate the display devices. Then, if the user chooses to use an old Voodoo card, you can switch the video mode if necessary (again with either DirectX or the Win32 API) and then create the OpenGL RC.
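A minimal sketch of the Win32 enumeration route, in C: it uses the explicit ANSI EnumDisplayDevicesA()/DISPLAY_DEVICEA variants, assumes Win98/2000 or later, and assumes the 3dfx driver reports “Voodoo” or “3dfx” somewhere in DISPLAY_DEVICE.DeviceString. Whether a 3D-only add-on board like the Voodoo2 is listed there at all depends on the driver, so treat this as a starting point rather than a guaranteed check. IsVoodooCard() is just an illustrative helper name; CreateGLWindow() in the trailing comment is NeHe’s own function.

#include <windows.h>
#include <string.h>

/* Sketch: return TRUE if any enumerated display device looks like a 3dfx board. */
BOOL IsVoodooCard(void)
{
    DISPLAY_DEVICEA dd;
    DWORD i = 0;

    ZeroMemory(&dd, sizeof(dd));
    dd.cb = sizeof(dd);

    /* Walk every display device the system knows about. */
    while (EnumDisplayDevicesA(NULL, i, &dd, 0))
    {
        if (strstr(dd.DeviceString, "Voodoo") != NULL ||
            strstr(dd.DeviceString, "3dfx")   != NULL)
            return TRUE;

        ZeroMemory(&dd, sizeof(dd));
        dd.cb = sizeof(dd);
        ++i;
    }
    return FALSE;
}

/* In WinMain, pick the bit depth before creating the window, e.g.:
       int bits = IsVoodooCard() ? 16 : 32;
       CreateGLWindow("My App", 640, 480, bits, fullscreen);
   NeHe's fullscreen path then copies the same value into
   DEVMODE.dmBitsPerPel before calling ChangeDisplaySettings(). */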


PIXELFORMATDESCRIPTOR. Check out the VC++ help on that.
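A rough sketch of what that looks like, assuming hDC is the window’s device context (the one NeHe’s CreateGLWindow() gets from GetDC()); SetSixteenBitPixelFormat() is just an illustrative helper name, and the only real change from the usual setup is forcing cColorBits and cDepthBits to 16.

#include <windows.h>

/* Sketch: select a 16-bit RGBA, double-buffered pixel format on the given DC. */
BOOL SetSixteenBitPixelFormat(HDC hDC)
{
    PIXELFORMATDESCRIPTOR pfd;
    int pixelFormat;

    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(PIXELFORMATDESCRIPTOR);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 16;   /* 16-bit color for the Voodoo */
    pfd.cDepthBits = 16;   /* 16-bit depth buffer         */
    pfd.iLayerType = PFD_MAIN_PLANE;

    pixelFormat = ChoosePixelFormat(hDC, &pfd);
    if (pixelFormat == 0)
        return FALSE;

    return SetPixelFormat(hDC, pixelFormat, &pfd);
}

In fullscreen mode the DEVMODE passed to ChangeDisplaySettings() needs dmBitsPerPel set to 16 as well; that is the video-mode switch the previous reply talks about.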