GPU produces inaccurate results on Nexus 10

Edit: Solved. The problem was that I wasn’t setting the float precision, so the shaders ran at low (or possibly medium) precision. Adding “precision highp float;” at the start of my vertex shaders made the issue go away.
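
For reference, the whole fix is one extra line at the top of each vertex shader. Applied to the simplest shader (posted further down), it looks roughly like this; the comments are mine, added for illustration:

precision highp float; // explicitly request full float precision for this shader

attribute vec3 aPosition;
attribute vec3 aColor;
uniform mat4 uModelViewProjectionMatrix;
varying vec4 vColor;
void main() {
 // With highp, positions well beyond 65535 units survive the matrix multiply intact.
 gl_Position = uModelViewProjectionMatrix * vec4(aPosition, 1.0);
 vColor = vec4(aColor, 1.0);
}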

I’m encountering a bunch of related errors on one particular device (a Nexus 10).

To summarize:

- The farther a model is from the origin, the less accurately it is positioned when rendered.
- If the depth buffer is enabled, models will often fail to be drawn, even though they are fully between the clipping planes. This effect gets worse the farther the camera and models are from the origin.
- If the depth buffer is enabled, models will sometimes register the wrong depth, making it look like the player character is buried in the ground.
- Models farther than 65535 units from the origin are not rendered.
That last one strikes me as particularly important - 65535 is the highest value an unsigned short can take on. If this is enough of a clue, feel free to stop reading here. Assuming it’s not, I have a bunch more analysis that may help.
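
For what it’s worth, 65535 is also suspiciously close to the largest finite value of a 16-bit half-precision float, which is how mediump floats are commonly implemented on mobile GPUs:

(2 - 2^{-10}) \times 2^{15} = 65504

so a precision problem (see the edit above) would produce essentially the same cutoff.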

I am almost certain that this is related to the behavior of my vertex shaders. I’m using several, but they all produce exactly the same error, so I’ll just post the simplest one.

Initially, the shader looked like this:


attribute vec3 aPosition;
attribute vec3 aColor;
uniform mat4 uModelMatrix;
uniform mat4 uViewProjectionMatrix;
varying vec4 vColor;
void main() {
 gl_Position = uViewProjectionMatrix * (uModelMatrix * vec4(aPosition, 1.0));
 vColor = vec4(aColor, 1.0);
}

While writing this post, I realized that I should combine the matrices on the CPU and pass the result to the shader as a single matrix. After that change, my shader looked like this:


attribute vec3 aPosition;
attribute vec3 aColor;
uniform mat4 uModelViewProjectionMatrix;
varying vec4 vColor;
void main() {
 gl_Position = uModelViewProjectionMatrix * vec4(aPosition, 1.0);
 vColor = vec4(aColor, 1.0);
}

I’m posting both versions because the change made a serious difference. Before it, everything failed equally (ignoring the skybox): everything vibrated as the camera moved, flickered, and eventually vanished beyond 65535 units. After the change, only certain models failed.

For some models, I placed their vertices near the origin, and then relied on the model matrix to move them into position. For other models, I precomputed the location of their vertices in space, and left the model matrix as an identity matrix. It was this latter group that continued to vibrate, flicker, and vanish.

It seems to me that this device’s GPU gives the best results if it only deals with small numbers. (Not perfect results, but good approximations.)

For example, the player character no longer vibrates, flickers, or vanishes. This is because its vertex buffer contains only low coordinate values (as opposed to other models, whose vertex buffers contain values above 65535). Furthermore, its ModelViewProjection matrix does not translate it very far. Therefore, when rendering it, the GPU never has to deal with particularly large numbers.

However, back when I was using that first shader, the GPU had to deal with an intermediate step. First the Model matrix was applied, moving the model to +65535 units. Then the ViewProjection matrix was applied, moving the model most of the way back to the origin. But because the GPU had to calculate that intermediate value, some sort of error was introduced, and the player vanished along with everything else.
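
To make that concrete, here is the first shader again as an annotated sketch; the magnitudes in the comments are illustrative guesses based on my scene, not measurements:

// Without an explicit "precision highp float;", this apparently ran at reduced precision on the Nexus 10.
attribute vec3 aPosition;           // small values, within a few units of the model's own origin
uniform mat4 uModelMatrix;          // translates the model out to world coordinates around 65535
uniform mat4 uViewProjectionMatrix; // brings those coordinates back down to clip space
varying vec4 vColor;
attribute vec3 aColor;
void main() {
 // Intermediate world-space position: components up to ~65535, right at the
 // limit of what a 16-bit half float (a typical mediump float) can hold.
 vec4 worldPosition = uModelMatrix * vec4(aPosition, 1.0);
 // The final value is small again, but the damage is done: the intermediate
 // step has already rounded or overflowed at reduced precision.
 gl_Position = uViewProjectionMatrix * worldPosition;
 vColor = vec4(aColor, 1.0);
}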

I’d like to emphasize that I’ve only seen these errors on this one device. On other devices and targets, including an old Nexus S and a new Windows laptop, the program performs exactly as expected.
