Are there any rules for creating an optimal window for OpenGL, specifically with ChangeDisplaySettings and SetPixelFormat? For example, if I tested and found my display was using 24-bit colour, should I try to switch it to 16-bit or 32-bit colour?
I also noticed that when I enable the depth buffer my framerate drops to half. I was wondering if I was ending up in some software-only mode. If that is the case, what can I do to detect which modes are better than others on any given computer and choose the best one?
I’m using VC 6 (no MFC, no GLUT) and I don’t want to change that for now.
Originally posted by Furrage: Are there any rules for creating an optimal window for OpenGL, specifically with ChangeDisplaySettings and SetPixelFormat? For example, if I tested and found my display was using 24-bit colour, should I try to switch it to 16-bit or 32-bit colour?
Try to match the display colour depth and the rendering depth: with a 16-bit display mode, request a 16-bit pixel format with a 16-bit z-buffer (if you need one); with a 32-bit display, a 32-bit pixel format with a 32-bit z-buffer.
Now, if you really have a 24-bit display, go for a 32-bit pixel format. If you can get a genuine 24-bit pixel format you might as well take it, but I doubt you will; that's very uncommon. I believe Matrox was one of the few chip vendors with that option.
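The rule above can be sketched as a small helper that maps the current display depth (e.g. from GetDeviceCaps(hdc, BITSPIXEL)) to the bits you'd put in the PIXELFORMATDESCRIPTOR before calling ChoosePixelFormat. The function name is hypothetical, and the exact depths follow the advice in this thread; ChoosePixelFormat will return the closest format the driver actually supports anyway.

```c
#include <assert.h>

/* Hypothetical helper: pick the colour and z-buffer bits to request,
 * matching the display depth and promoting a 24-bit display to a
 * 32-bit pixel format, as suggested above. */
static void pick_format_bits(int display_bpp, int *color_bits, int *depth_bits)
{
    if (display_bpp <= 16) {
        *color_bits = 16;   /* 16-bit display -> 16-bit format, 16-bit z */
        *depth_bits = 16;
    } else {
        *color_bits = 32;   /* 24- or 32-bit display -> 32-bit format, 32-bit z */
        *depth_bits = 32;
    }
}
```

You would then copy these values into the cColorBits and cDepthBits members of the PIXELFORMATDESCRIPTOR you pass to ChoosePixelFormat.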
I also noticed that when I enable the depth buffer my framerate drops to half. I was wondering if I was ending up in some software only mode. If that is the case, what can I do to detect which modes are better than which on any computer and choose that best mode?
What kind of scene was that?
If you tested that just with an (almost) empty screen, that’s exactly what you should expect. After all, the card has to move twice the amount of data.
If you hit a software path, performance usually drops a lot more.
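One way to check whether you landed in Microsoft's software-only implementation is to look at the strings returned by glGetString(GL_VENDOR) and glGetString(GL_RENDERER) after the context is current: the generic GDI implementation reports "Microsoft Corporation" / "GDI Generic". (You can also inspect the PFD_GENERIC_FORMAT and PFD_GENERIC_ACCELERATED flags that DescribePixelFormat returns.) The helper below is a sketch that just inspects those strings; the function name is made up for illustration.

```c
#include <string.h>

/* Hypothetical check: given the GL_VENDOR and GL_RENDERER strings,
 * return 1 if they identify Microsoft's unaccelerated GDI renderer. */
static int is_software_renderer(const char *vendor, const char *renderer)
{
    return strcmp(vendor, "Microsoft Corporation") == 0
        || strstr(renderer, "GDI Generic") != NULL;
}
```

Call it once after wglMakeCurrent succeeds; if it returns 1, you did not get a hardware-accelerated pixel format.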
What should I do in these cases? Also, it says no alpha. Does that mean I will not be able to show transparent objects?
[Edit] How do I avoid/minimise software paths (I know these cards are not fully OpenGL compliant)? And if I want the fastest rendering/texturing, do I have to use trial and error, or are there some guidelines for this? Assume I've already taken care of LOD, BSP trees, and other stuff that is not OpenGL specific. I'm more concerned about the device mode, the pixel format, and what to glEnable/glDisable. Face culling doesn't count here because I am only experimenting with three quads.
[This message has been edited by Furrage (edited 04-15-2002).]
Btw, first check whether Quake 3 runs at a reasonable speed on your machine. If it doesn't, get the latest drivers from SiS, and if that doesn't help, perhaps you can't get OpenGL hardware acceleration at all.