powervr
September 18, 2004, 8:01pm
1
Hi, in my code, I have:
glBegin()
…
glEnd()
glutSwapBuffers();
//get the framebuffer
GLfloat tmpBuffer[800][800][4];
glReadPixels(0,0,800,800, GL_RGBA ,GL_FLOAT ,tmpBuffer);
//-------------------------------
I get a stack overflow when I run this…
What's wrong with my code?
Thanks…
system
September 18, 2004, 9:01pm
2
hey powervr, try allocating the buffer memory on the heap, or move the array outside the function. under msvc, for example, the default stack size is 1 meg. 800x800x4 is 2,560,000 floats, and at 4 bytes per GLfloat that's about 10 megs, way over the limit
powervr
September 19, 2004, 2:26am
3
Bingo! I solved the problem by moving the array outside the function.
Thanks!
powervr
September 19, 2004, 6:28am
4
Another problem; here is my code:
…
GLuint tmpBuffer[200][200];
…
glBegin(GL_POLYGON)
…
glEnd()
…
glReadPixels(0, 0, 200, 200, GL_RGBA, GL_BITMAP, tmpBuffer);
Why does tmpBuffer come back with no values?
If I replace GL_BITMAP with GL_UNSIGNED_BYTE, it works fine.
Any suggestions?
Thanks
Bob
September 19, 2004, 10:18am
5
GL_BITMAP is not a valid type for glReadPixels unless the format is GL_COLOR_INDEX or GL_STENCIL_INDEX, i.e. you’re in indexed color mode.
powervr
September 19, 2004, 9:06pm
6
So if I am in RGBA, double-buffered mode,
what should I do to read the buffer back as something like this?
e.g.
000001111111111100000
000001000000000100000
000001000000000100000
000001000000000100000
000001000000000100000
000001000000000100000
000001111111111100000
(I don’t care about the RGB values; I just care whether any pixel was drawn at that position on screen.)
Thanks
Bob
September 20, 2004, 1:02am
7
Then I suggest you use the stencil buffer to mark pixels as drawn, and read it back. If you need a bitmask of all pixels, then I’m afraid you have to build the mask yourself from the individual pixels.