Hello,
I’m developing a Unity3D plugin for Android, and I’ve run into the strange problem described in the title.
The process is:
[ol]
[li]glBindTexture
[/li][li]glReadPixels
[/li][/ol]
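For context, here is a sketch of the full native read-back as I understand it should look. As far as I know, glReadPixels reads from the currently bound framebuffer rather than from the bound texture, so this sketch attaches the texture to a read FBO first (this is my guess at what ES 3.0 may require; `read_texture_rgba` and `fbo` are my own names, and this needs a live EGL context):

```c
#include <GLES3/gl3.h>
#include <stdlib.h>

/* Sketch: read back an RGBA8 texture by attaching it to a framebuffer
 * object, since glReadPixels reads from the bound framebuffer, not from
 * the bound texture. Requires a current EGL context. Caller frees the
 * returned buffer. */
GLubyte *read_texture_rgba(GLuint textureId, GLsizei width, GLsizei height)
{
    GLuint fbo = 0;
    GLubyte *pixels = (GLubyte *) malloc((size_t) 4 * width * height);
    if (!pixels)
        return NULL;

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);  /* ES 3.0 has a separate read binding */
    glFramebufferTexture2D(GL_READ_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, textureId, 0);

    if (glCheckFramebufferStatus(GL_READ_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE)
        glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    /* Restore default binding and clean up the helper FBO. */
    glBindFramebuffer(GL_READ_FRAMEBUFFER, 0);
    glDeleteFramebuffers(1, &fbo);
    return pixels;
}
```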
I have two cameras in the Unity3D scene: one is the main camera and the other is a render camera. I pass the render camera’s texture ID to OpenGL ES with
renderCamera.targetTexture.GetNativeTexturePtr();
I bind the texture with
glBindTexture(GL_TEXTURE_2D, (GLuint) textureId);
checkGLError("glBindTexture");
Then I read the pixel data with
pixels = (GLubyte *) malloc(4 * width * height);
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
However, with OpenGL ES 2.0 the pixel data is correct and comes from the render camera, while with OpenGL ES 3.0 it is wrong: it comes from the main camera, and it covers only a one-fourth region of the render texture!
I’m stuck here.
Can anyone help me?