glReadPixels problem

I used glReadPixels to read back the color buffer, but the result doesn't seem correct, and the alpha value of the pixel is always 0xFF. What am I doing wrong? Any suggestions?

The code is listed below:

int iCursorX = 88;
int iCursorY = 104;
GLuint aPixel;

glReadPixels( iCursorX, iCursorY, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, &aPixel );

TInt8 iSelZ = aPixel&0xFF;
TInt8 iSelY = (aPixel>>8)&0xFF;
TInt8 iSelX = (aPixel>>16)&0xFF;

LOG("Select Piexel =%X x=%d, y=%d, z=%d
", aPixel,iSelX, iSelY, iSelZ);

Hi.

Have you taken into account that the OpenGL ES (and OpenGL) pixel coordinate system is "upside down" compared to normal 2D? The origin is in the bottom-left corner, with y growing upwards. Read the specification of glReadPixels() for more information.
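For illustration, a minimal sketch of the flip (my own example, not from your code; windowHeight stands for the height of your render target, and x, y are a top-left-origin cursor position):

GLint glY = windowHeight - 1 - y;   // flip: GL's origin is the bottom-left corner
glReadPixels( x, glY, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, &aPixel );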

–jani;

The screen coordinate system can't influence my result, because the pixel I want to read is at the center of the screen, and there is obviously a colored pixel there. But I still get the wrong result.

>TInt8 iSelZ = aPixel&0xFF;
>TInt8 iSelY = (aPixel>>8)&0xFF;
>TInt8 iSelX = (aPixel>>16)&0xFF

Depending on your platform, you get different results from this code. glReadPixels() writes four individual bytes into the integer, which the code above then accesses as an int, so the result is endianness dependent. Are you sure the shifts and masks are OK?
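One endianness-safe way (just a sketch, reusing your variable names) is to let glReadPixels() write into four separate bytes; with GL_UNSIGNED_BYTE the components land in R, G, B, A order regardless of platform endianness:

GLubyte aPixel[4];
glReadPixels( iCursorX, iCursorY, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, aPixel );
TUint8 iSelX = aPixel[0];   // R; TUint8 avoids sign trouble for values above 127
TUint8 iSelY = aPixel[1];   // G
TUint8 iSelZ = aPixel[2];   // B, and aPixel[3] is the alpha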

You also mentioned that your alpha is always 255. Do you have a destination surface with destination alpha? (E.g., in the Series 60 OpenGL ES implementation, using a 16-bit format implies that the frame buffer format is 5-6-5, i.e. R=5, G=6, B=5, A=0.)
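A quick way to check (a sketch, using your LOG macro) is to ask GL how many alpha bits the framebuffer actually has; 0 means there is no destination alpha:

GLint alphaBits = 0;
glGetIntegerv( GL_ALPHA_BITS, &alphaBits );
LOG( "Alpha bits = %d", alphaBits );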

–jani;

Thank you for the reply.
After I adjusted my EGL initialization code, it still doesn't work quite right, so I decided to post more detail here. I hope it helps in finding the fault.

// The EGL initialization routine is listed below.

void InitGL()
    {
    // Pick the EGL color buffer size to match the window's display mode.
    EGLint attribList[ ] =
        {
        EGL_BUFFER_SIZE, 32,
        EGL_DEPTH_SIZE,  16,
        EGL_NONE
        };

    switch( iWindow.DisplayMode() )
        {
        case EColor4K:  attribList[1] = 12; break;
        case EColor64K: attribList[1] = 16; break;
        case EColor16M: attribList[1] = 24; break;
        default:        attribList[1] = 32; break;
        }

    EGLint numConfigs;
    EGLint majorVersion;
    EGLint minorVersion;

    iGldisplay = eglGetDisplay( EGL_DEFAULT_DISPLAY );
    if( iGldisplay == EGL_NO_DISPLAY )
        {
        User::Panic( _L("GL No Display"), 0 );
        }
    if( !eglInitialize( iGldisplay, &majorVersion, &minorVersion ) )
        {
        User::Panic( _L("GL Init"), 0 );
        }
    if( !eglChooseConfig( iGldisplay, attribList, &iGlconfig, 1, &numConfigs ) )
        {
        User::Panic( _L("GL Config"), 0 );
        }

    iGlcontext = eglCreateContext( iGldisplay, iGlconfig, EGL_NO_CONTEXT, NULL );
    if( iGlcontext == EGL_NO_CONTEXT )
        {
        User::Panic( _L("GL Context"), 0 );
        }

    iGlsurface = eglCreateWindowSurface( iGldisplay, iGlconfig, &iWindow, NULL );
    if( iGlsurface == EGL_NO_SURFACE )
        {
        User::Panic( _L("GL Surface"), 0 );
        }

    eglMakeCurrent( iGldisplay, iGlsurface, iGlsurface, iGlcontext );

    glClearColor( 0.0f, 0.0f, 0.0f, 0.0f );

    glEnable( GL_CULL_FACE );
    glEnable( GL_MULTISAMPLE );
    glEnable( GL_DEPTH_TEST );
    glDepthFunc( GL_LEQUAL );
    glShadeModel( GL_FLAT );
    glDisable( GL_LIGHTING );

    // Set up a fixed frustum and a viewport matching the window size.
    TSize Size = iWindow.Size();
    glMatrixMode( GL_PROJECTION );
    glLoadIdentity();
    glFrustumf( -1.f, 1.f, -1.f, 1.f, 3.f, 100.f );
    glMatrixMode( GL_MODELVIEW );
    glViewport( 0, 0, Size.iWidth, Size.iHeight );
    }

// The render routine is also listed.
void Render()
    {
    glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
    glLoadIdentity();
    iCamera.Apply();

    // Selection pass: draw in plain ID colors, then read the pixel back.
    RenderForSelect();
    ReadPixelFromPt( x, y );

    // Clear again and draw the visible frame.
    glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
    RenderNormal();
    }

/* ReadPixelFromPt(x, y) does the framebuffer readback; it is the same code as
shown before. */

The problem is that when RenderNormal() is not called, the result is quite good, but when RenderNormal() is called, everything goes wrong. The only difference between RenderNormal() and RenderForSelect() is that the former activates texture mapping.

So the question is: why does RenderNormal() influence ReadPixelFromPt() so much?

What's more, the alpha value is still 0xFF.

I believe you should add "EGL_ALPHA_SIZE, N," to your attribList (N > 0). Currently you don't ask for an alpha buffer (and thus probably don't get one). When there is no destination alpha buffer, the alpha values returned are always 1.0 (i.e. 0xFF).
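For example, a sketch of the fix in your InitGL() attribute list (8 bits is just a plausible choice here):

EGLint attribList[ ] =
    {
    EGL_BUFFER_SIZE, 32,
    EGL_ALPHA_SIZE,   8,   // ask for destination alpha
    EGL_DEPTH_SIZE,  16,
    EGL_NONE
    };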

Thanks Petri Kero, the alpha value reads correctly now. But why does RenderNormal() influence RenderForSelect()'s result? I think it shouldn't. I really need RenderNormal() not to influence RenderForSelect(); what can I do?

You're welcome, jessee,

You didn't post the RenderNormal() function, so I'm just making guesses here. One of the most common mistakes people make with OpenGL (ES) is state leaking, and I'm guessing that's what happens here as well.

What this means is that in RenderNormal() (or some other piece of code you have), you set some state, say GL_LIGHTING, with glEnable() and forget to switch it back to its original value. The same can happen with pretty much any state in OpenGL ES, e.g. texture parameters/env variables, any light parameter, active textures, blending mode, stenciling; the list is endless.
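For illustration, a minimal sketch of the save/restore pattern (OpenGL ES 1.x has no glPushAttrib(), so the old value has to be queried and put back by hand):

GLboolean hadLighting = glIsEnabled( GL_LIGHTING );   // remember the old state
glEnable( GL_LIGHTING );
// ... draw with lighting ...
if( !hadLighting )
    {
    glDisable( GL_LIGHTING );   // restore it on the way out
    }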

A leak like that would cause the code to run correctly on the first frame; then, once RenderNormal() has been called, the state leak has occurred and everything happens differently from that point on.

I would recommend checking that every state you set is correctly restored, or setting the states to the proper values right before executing the rendering command that is affected by them (i.e. the one from which you're getting unsatisfactory results).
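Alternatively, a sketch of resetting the state at the top of RenderForSelect(); the exact list here is a guess, since you didn't post RenderNormal(), so adjust it to whatever states that function actually touches:

void RenderForSelect()
    {
    glDisable( GL_TEXTURE_2D );   // RenderNormal() activates texture mapping
    glDisable( GL_BLEND );
    glDisable( GL_LIGHTING );
    glShadeModel( GL_FLAT );      // keep the selection ID colors exact
    // ... draw each pickable object in its ID color ...
    }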

I hope this helps.

Cheers,
Petri

It's amazing!!
After I adjusted my code just as you said, it works well now. Thanks Petri Kero, and thanks all.
