GPU picking on Ubuntu

I’ve implemented mouse picking of objects using the standard method: render the scene with no lighting, giving each object a false color that encodes its assigned id number, so that the pixel under the mouse yields the picked object’s id. The pick rendering is done in a hidden canvas (style.visibility = "hidden"). This works on Windows and Mac but fails on Ubuntu. I’ve finally traced the failure to the fact that apparently no objects are actually rendered to the hidden canvas, so readPixels always returns the color the background was cleared to. This would seem to be a bug in WebGL on Ubuntu, in both Firefox and Chrome.

Presumably a more natural way is to render to texture memory associated with the main display canvas (no hidden canvas), but I’m puzzled about how to handle antialiasing. For objects to look good, it’s important that antialiasing be on, as is indeed the default when you create a WebGL context. But for false-color rendering it’s critically important that antialiasing be off, because antialiasing averages a pixel’s color between an object and the background, or between two adjacent objects: a pixel on the border between objects 8 and 10 can come back as 9, identifying the wrong object.
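
For concreteness, here is a minimal sketch of the render-to-texture idea (createPickFramebuffer is my own name, and gl is assumed to be the main context). One point that may matter here: as far as I can tell, WebGL framebuffer attachments are never multisampled, so a pick render into such a framebuffer would not be antialiased even on a context created with antialias: true.

    function createPickFramebuffer(gl, width, height) {
        var framebuffer = gl.createFramebuffer()
        gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer)

        // color attachment: a texture that is never displayed
        var texture = gl.createTexture()
        gl.bindTexture(gl.TEXTURE_2D, texture)
        gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                      gl.RGBA, gl.UNSIGNED_BYTE, null)
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST)
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST)
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE)
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE)
        gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                                gl.TEXTURE_2D, texture, 0)

        // depth attachment, so occlusion is correct during the pick render
        var depth = gl.createRenderbuffer()
        gl.bindRenderbuffer(gl.RENDERBUFFER, depth)
        gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_COMPONENT16, width, height)
        gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT,
                                   gl.RENDERBUFFER, depth)

        gl.bindFramebuffer(gl.FRAMEBUFFER, null)
        return framebuffer
    }

A pick would then bind this framebuffer, render with the flat-color shaders, call readPixels, and rebind the default framebuffer (null) for regular rendering.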

As near as I can tell, there is only one opportunity to set antialias, which is at the time the context is created: var gl = WebGLUtils.setupWebGL(canvasElement, {antialias: false}), where the default is antialias: true. I can’t find any way to change the antialiasing behavior dynamically (e.g. to turn it off just for a pick render).
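
The only run-time hook I’ve found is read-only: getContextAttributes reports what the context was actually created with, but there is no corresponding setter.

    // verifies the creation-time setting; it cannot be changed afterward
    var attrs = gl.getContextAttributes()
    console.log("antialias:", attrs.antialias) // fixed when the context was created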

Can someone give me guidance on how to deal with this? The only examples I’ve seen of this kind of GPU-based mouse picking in WebGL didn’t deal with the antialiasing problem, perhaps because the objects in their demos were spaced far enough apart that it never showed up.

I removed the hidden style from the pick canvas but kept the transparent style; no change. Nor does it matter whether the pick canvas has the transparent style at all.

Here is the actual code I’m using:

    this.__canvas_element = document.createElement('canvas')
    this.__pick_element = document.createElement('canvas')
    this.__overlay_element = document.createElement('canvas')

    var cv = this.__canvas_element // the main display canvas
    cv.style.position = "absolute"

    var cvpick = this.__pick_element
    cvpick.style.position = "relative"
    cvpick.style.visibility = "hidden" // can be omitted
    cvpick.style.backgroundColor = "transparent" // can be omitted

    var overlay = this.__overlay_element
    overlay.style.position = "relative"
    overlay.style.backgroundColor = "transparent"

    this.__renderer = new WebGLRenderer( this, cv, cvpick, overlay )

    function WebGLRenderer(canvas, canvasElement, pickElement, overlay) {

        var renderer = this
        this.gl = WebGLUtils.setupWebGL(canvasElement) // main canvas
        // context for picking; antialiasing off so border pixels aren't blended:
        this.glpick = WebGLUtils.setupWebGL(pickElement, {antialias: false})
        canvas.overlay_context = overlay.getContext("2d") // for 2D labels

If the main render routine is called with pick = true, gl is set to this.glpick, special shaders are invoked (no lighting), and the objects’ colors are set to their id numbers.
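
To make the false-color scheme concrete, here is a sketch of the id/color round trip (idToColor and colorToId are illustrative names, not from the actual code):

    function idToColor(id) {
        // pack a 24-bit id into RGB, as normalized floats for a shader uniform
        return [ ((id >> 16) & 255) / 255,
                 ((id >> 8) & 255) / 255,
                 (id & 255) / 255 ]
    }

    function colorToId(pixel) {
        // pixel is a Uint8Array(4) filled by gl.readPixels
        return (pixel[0] << 16) | (pixel[1] << 8) | pixel[2]
    }

    // after the pick render (readPixels counts y from the bottom):
    var pixel = new Uint8Array(4)
    gl.readPixels(mouseX, gl.drawingBufferHeight - mouseY - 1, 1, 1,
                  gl.RGBA, gl.UNSIGNED_BYTE, pixel)
    var id = colorToId(pixel)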

I’ve “fixed” my problem, in that I’m now able to pick on Linux (as well as on Windows and Mac), but I don’t understand why the fix works.

To pick, I bound to the hidden canvas (with antialiasing turned off) a fragment shader that was simply gl_FragColor = mat_color, which fails on Linux. What works on all platforms is to bind to the hidden canvas the same fragment shader used for regular rendering, modified to contain the following, where pick is a uniform equal to 0 for regular rendering and 1 for picking:

    if (pick != 0) {
        gl_FragColor = mat_color;
    } else {
        vec3 color = lighting(); // lighting code not shown here
        gl_FragColor = vec4( color, 1.0 );
    }
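
On the JavaScript side, the switch is just a per-pass uniform update (program is assumed to be the linked shader program, with pick declared as an int uniform; in real code the location would be cached):

    var pickLoc = gl.getUniformLocation(program, "pick")
    gl.uniform1i(pickLoc, 1) // before a pick render
    // ... render the scene, call readPixels ...
    gl.uniform1i(pickLoc, 0) // back to regular lighted rendering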

I don’t understand why this works and the simpler fragment shader doesn’t, nor why the simpler scheme works on Windows and Mac but not on Linux.