I've implemented mouse picking of objects using the standard GPU method: the scene is rendered with no lighting, each object drawn in a false color encoding its assigned id number, so that the pixel under the mouse yields the object's id. The pick rendering is done in a hidden canvas (style.visibility = "hidden"). This works on Windows and Mac but fails on Ubuntu. I've finally traced the failure to the fact that, apparently, no objects are actually rendered to the hidden canvas, so readPixels always returns the color to which the background was cleared. This looks like a bug in WebGL on Ubuntu, in both Firefox and Chrome.
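For reference, the encode/readback step I'm describing looks roughly like this (a sketch; pickGl is the hidden canvas's context, and the shader setup that draws each object in its false color is omitted):

```javascript
// Pack an object id into an RGB false color (one byte per channel),
// suitable for passing to the pick shader as a uniform.
function idToColor(id) {
  return [
    ((id >> 16) & 0xff) / 255,
    ((id >> 8) & 0xff) / 255,
    (id & 0xff) / 255,
  ];
}

// After the false-color render, read the single pixel under the mouse
// and reconstruct the id. WebGL y-coordinates run bottom-up, hence the flip.
function pickIdAt(pickGl, mouseX, mouseY) {
  var pixel = new Uint8Array(4);
  pickGl.readPixels(
    mouseX,
    pickGl.drawingBufferHeight - mouseY - 1,
    1, 1,
    pickGl.RGBA, pickGl.UNSIGNED_BYTE,
    pixel
  );
  return (pixel[0] << 16) | (pixel[1] << 8) | pixel[2];
}
```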

Presumably a more natural approach is to render to texture memory associated with the main display canvas (no hidden canvas), but I'm puzzled about how to handle antialiasing. For objects to look good, antialiasing must be on, as is indeed the default when you create a WebGL context. But for false-color rendering it's critically important that antialiasing be off, because antialiasing averages pixel colors between an object and the background, or between two adjacent objects; a pixel on the border between objects 8 and 10 can read back as 9, giving a wrong identification for the mouse pick.
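What I have in mind for the render-to-texture route is roughly the following (a hypothetical sketch, not code I've verified on the affected machines): a framebuffer with a texture color attachment on the main context, which in WebGL 1 is single-sampled, though I'm unsure whether that is guaranteed when the context itself was created with antialias: true.

```javascript
// Create an offscreen framebuffer for pick rendering on the main context.
// Framebuffer texture/renderbuffer attachments in WebGL 1 are not
// multisampled, which is the behavior wanted for false-color picking.
function createPickFramebuffer(gl, width, height) {
  var tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                gl.RGBA, gl.UNSIGNED_BYTE, null);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

  var depth = gl.createRenderbuffer();
  gl.bindRenderbuffer(gl.RENDERBUFFER, depth);
  gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_COMPONENT16,
                         width, height);

  var fb = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, tex, 0);
  gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT,
                             gl.RENDERBUFFER, depth);
  gl.bindFramebuffer(gl.FRAMEBUFFER, null);
  return fb;
}
```

The idea would be to bind this framebuffer for the pick render, call readPixels while it is bound, then rebind the default framebuffer for normal display.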

As near as I can tell, there is only one opportunity to set antialias, which is at the time of creating the context: var gl = WebGLUtils.setupWebGL(canvasElement, {antialias: false}), where the default is antialias: true. I can't find any way to change the antialias behavior dynamically (e.g. turn it off just for a pick render).
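Without the WebGLUtils helper, the equivalent raw call would look something like this sketch; note that the attribute is only a request, and getContextAttributes() reports what the browser actually granted:

```javascript
// Create a WebGL 1 context with antialiasing requested off.
// The "experimental-webgl" fallback is for older browsers.
function createNonAntialiasedContext(canvasElement) {
  var attrs = { antialias: false };
  var gl = canvasElement.getContext("webgl", attrs) ||
           canvasElement.getContext("experimental-webgl", attrs);
  // antialias: false is a request, not a guarantee; check what we got.
  if (gl && gl.getContextAttributes().antialias) {
    console.warn("antialias: false was requested but not honored");
  }
  return gl;
}
```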

Can someone give me guidance on how to deal with this? The only examples I've seen of this kind of GPU-based mouse picking in WebGL didn't deal with the antialiasing problem, and perhaps didn't notice it because the objects in the demos were spaced far apart from each other.