Anaglyphs without glColorMask?

Hi,

I want to develop an anaglyph application where I can control the color values. This means, for example, being able to make the blue a little darker, etc.

In other words, how can I create anaglyphs without glColorMask?

Can anybody help me?
Thanks!

In your shaders, you could set red to 0.0 and green to 0.0 on the first pass.
In the second pass, green is 0.0 and blue is 0.0.
The second pass would need to be rendered to an RTT.
Then render the RTT onto the framebuffer with glBlendFunc(GL_ONE, GL_ONE); the two passes will add up.
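
A rough sketch of that compositing step (rttTexture is just a placeholder name for whatever texture your second pass ends up in):

/* Draw the RTT result over the first pass with additive blending. */
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, rttTexture);
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE);   /* source + destination: the two passes add up */
glDepthMask(GL_FALSE);         /* keep the quad out of the depth buffer */
/* ... draw a screen-filling textured quad here ... */
glDepthMask(GL_TRUE);
glDisable(GL_BLEND);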

I’m a beginner in OGL.
How can I set up shaders? Which ones do you mean? Maybe I just don't know the term but have already used them.

What is RTT?

Thanks for your answer

RTT = render to texture.
If you are a beginner, I’d stick to glColorMask. It is a bit crappy, I mean you can’t do optimised anaglyph with it, but at the same time it’s extremely fast.

If you want to do this you need to render the scene to texture (maybe one texture for left eye, one for right), then write a shader (GLSL) to remap the colours.

http://www.3dtv.at/knowhow/anaglyphcomparison_en.aspx

Look for “optimised anaglyph” on that page; it’s the one least likely to give you eye cancer. If you look at the images, you can see that red has totally disappeared. The idea is to make the brightness of the left and right eye the same. It’s still gonna screw with your eyes, but it won’t suck as hard as regular anaglyph.
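
If I remember the matrices on that page correctly, “optimised” means the output red channel is built only from the left eye’s green and blue, and green/blue are taken straight from the right eye. Roughly, as a GLSL sketch (sampler names are placeholders; do double-check the coefficients against the page):

// Sketch of the "optimised anaglyph" mix as remembered from the 3dtv.at page.
// Note the left eye's red input is not used at all, which is why red disappears.
uniform sampler2D leftTex;   // placeholder sampler names
uniform sampler2D rightTex;

void main(void)
{
    vec3 l = texture2D(leftTex,  gl_TexCoord[0].st).rgb;
    vec3 r = texture2D(rightTex, gl_TexCoord[0].st).rgb;

    float red = 0.7 * l.g + 0.3 * l.b;   // the page also suggests an extra gamma (~1.5) on this channel
    gl_FragColor = vec4(red, r.g, r.b, 1.0);
}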

Thanks.

I would like to use glColorMask.
But my exercise is to develop an optimised anaglyph application.

I think I have to learn GLSL.

Now I know how to work with shaders and I want to use them for ColorCode anaglyphs.

But I don’t understand why I have to render to texture, and how the shader knows about the two textures (left and right eye).

Can you help me again?

Why: because it will allow fine-tuning of the colors for the best anaglyph reproduction.
How the shader knows the left and right eye: because you will tell it.

Outline of a shader-based solution:

  1. render left eye scene to texture 1, full color
  2. render right eye scene to texture 2, full color
  3. bind texture 1 and 2
  4. use a shader that samples the left eye from texture 1 and the right eye from texture 2
  5. ColorCode is yellow/blue, right? So roughly, the GLSL fragment shader will look like this:

vec2 lefteye = texture2D(leftTexture, currentTexcoord).rg * 0.7; // take only the red and green components from the left eye
float righteye = texture2D(rightTexture, currentTexcoord).b;     // take only the blue component from the right eye
gl_FragColor = vec4(lefteye, righteye, 1.0);                     // combine into one yellow/blue output color
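
Written out as a complete fragment shader (still just a sketch, using the fixed-function texture coordinates; leftTexture and rightTexture are sampler uniforms that you point at texture units 0 and 1 from the application with glUniform1i):

// Complete version of the snippet above (ColorCode-style yellow/blue).
uniform sampler2D leftTexture;
uniform sampler2D rightTexture;

void main(void)
{
    vec2 lefteye   = texture2D(leftTexture,  gl_TexCoord[0].st).rg * 0.7; // red + green from the left eye
    float righteye = texture2D(rightTexture, gl_TexCoord[0].st).b;        // blue from the right eye
    gl_FragColor   = vec4(lefteye, righteye, 1.0);
}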

EDIT: why did you not use the simple glColorMask solution? For such a simple setup, it will work well and be easier to implement (no need for RTT or shaders, just render twice with different color masks, as sketched below).
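
Something like this (setLeftEyeCamera/setRightEyeCamera and drawScene are placeholders for your own code, and the masks assume red/cyan glasses):

/* Sketch of the glColorMask approach. */
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

glColorMask(GL_TRUE, GL_FALSE, GL_FALSE, GL_TRUE);   /* left eye: red only */
setLeftEyeCamera();                                  /* placeholder: left frustum + gluLookAt */
drawScene();

glClear(GL_DEPTH_BUFFER_BIT);                        /* share the color buffer, reset depth */
glColorMask(GL_FALSE, GL_TRUE, GL_TRUE, GL_TRUE);    /* right eye: green + blue */
setRightEyeCamera();                                 /* placeholder: right frustum + gluLookAt */
drawScene();

glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);     /* restore */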

I don’t use the glColorMask method because the colors don’t match the colors of the glasses. So I want to change the colors at runtime of the application, and that doesn’t work with the color mask, only with the accumulation buffer.

I developed anaglyphs where I can change the colors (for example, making red a little darker) to adjust them to the colored glasses.

So I simply draw one object in one color (e.g. red) and one in the other color (e.g. blue).

Now suppose I have an existing application. Of course I can change all the colors and textures manually, but I would like an easier way to do it.

And my question is whether this is possible when I want to adjust the colors to the glasses.

Maybe with shaders? Could someone give me an example?

Your post sounds like you did not read my post just above, because I outlined exactly that solution for you.
I conveniently numbered each important step.
Can you please tell me which part(s) you need explained in more detail?

However, you have to know that the color spectrum of each primary screen color (red, green, blue) is fixed. No amount of color tuning can help if more than one of these primaries shows through both filters significantly.
Even if the eye response is the same, you can have very different spectra for the same apparent color.

Most of the time, pure red versus pure blue works without ghosting, but incorporating the green component often brings ghosting. Finely tuned filters for your screen should be used to avoid this.

Now I have read a lot about GLSL and written a shader for my anaglyphs. I’m doing the following steps:

  1. rendering left eye into a texture with GL_LUMINANCE value
  2. rendering right eye into a texture with GL_LUMINANCE value
  3. activate shader
  4. render the resulting textures on a full-screen quad (roughly as sketched below)
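
Roughly, steps 3 and 4 amount to this (a simplified sketch, not my exact code; texLeft, texRight and p_shader are the same names I use in my code):

/* steps 3 + 4, simplified: both textures bound, shader on, one big quad */
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, texLeft);
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, texRight);

glUseProgram(p_shader);

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

glBegin(GL_QUADS);                     /* screen-filling quad */
  glTexCoord2f(0.0, 0.0); glVertex2f(-1.0, -1.0);
  glTexCoord2f(1.0, 0.0); glVertex2f( 1.0, -1.0);
  glTexCoord2f(1.0, 1.0); glVertex2f( 1.0,  1.0);
  glTexCoord2f(0.0, 1.0); glVertex2f(-1.0,  1.0);
glEnd();

glUseProgram(0);   /* deactivate the shader again */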

But the problem I have is that the background color always takes part in the calculation in the shader.
For example, with red-blue anaglyphs: if I have a white background (left image), after activating the shader it should be pink, but it is red, and the blue side seems transparent in some way.
If I change the background to black (right image), the red overlapping part changes to blue.

The two cubes in front look good; their color doesn’t change. So I thought it must depend on the background color.

I tried changing the blending and alpha values, but I didn’t get a correct result.

Can somebody help me? I don’t know what I’m doing wrong.

My shader:


uniform sampler2D left;
uniform sampler2D right;

void main(void)
{
  vec4 cleft = texture2D(left, gl_TexCoord[0].st).rgba;
  vec4 cright = texture2D(right, gl_TexCoord[0].st).rgba;
  vec3 col;

  col.r = (cleft.r + cleft.g + cleft.b) / 3.0;
  col.g = 0.0;
  col.b = (cright.r + cright.g + cright.b) / 3.0;

  gl_FragColor = vec4(col, cleft.a+cright.a);
}

The shader looks good.
Did you forget about step ‘5. deactivate shader’? That is important if steps 1 and 2 do not use shaders.

Can you detail some more how you do steps 1 and 2?
When you say ‘background’, do you mean something done with glClear? Then do a glClear for each step, both 1 and 2.
Are you using some clipping or scissoring?

Yes, of course I deactivate the shader. Sorry, I forgot to mention it.

In detail my steps 1 and 2:


/** LEFT EYE **/
glClear(GL_COLOR_BUFFER_BIT|GL_DEPTH_BUFFER_BIT);
glClearColor(0.0, 0.0, 0.0, 0.0);

glActiveTexture(GL_TEXTURE0);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glFrustum(L_l, L_r, L_b, L_t, Near, Far);

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
gluLookAt(-IOD / 2, 0.0, 5.0,
          0.0, 0.0, 0.0,
          0.0, 1.0, 0.0);

//Render Background (Big quad)
//Render two cubes in front
glFlush();

glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, texLeft);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 5, 5, 0, 0, SIZE - 10, SIZE - 10);


glClear(GL_DEPTH_BUFFER_BIT);
glClearColor(0.0, 0.0, 0.0, 0.0);

/** RIGHT EYE **/
glActiveTexture(GL_TEXTURE0);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glFrustum(R_l, R_r, R_b, R_t, Near, Far);

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
gluLookAt(IOD / 2, 0.0, 5.0,
          0.0, 0.0, 0.0,
          0.0, 1.0, 0.0);
 
//Render Background (Big quad)
//Render two cubes in front
glFlush();

glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, texRight);

glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 5, 5, 0, 0, SIZE - 10, SIZE - 10);
        

/** reset **/
glBindTexture(GL_TEXTURE_2D, 0);
glClearColor... and glClear(...)

And here is my definition of the textures. Maybe there is something wrong here:


    glGenTextures(1, &texLeft);
    glBindTexture(GL_TEXTURE_2D, texLeft);
    glTexImage2D(GL_TEXTURE_2D, 0, 4, SIZE, SIZE, 0, GL_LUMINANCE_ALPHA,
            GL_UNSIGNED_BYTE, textureL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);

    glGenTextures(1, &texRight);
    glBindTexture(GL_TEXTURE_2D, texRight);
    glTexImage2D(GL_TEXTURE_2D, 0, 4, SIZE, SIZE, 0, GL_LUMINANCE_ALPHA,
            GL_UNSIGNED_BYTE, textureR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);


    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, texLeft);
    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, texRight);


    setShaders();


    texLeftSampler = glGetUniformLocation(p_shader, "left");
    glUniform1i(texLeftSampler, 0); // 0 for texture unit 0

    texRightSampler = glGetUniformLocation(p_shader, "right");
    glUniform1i(texRightSampler, 1); // 1 for texture unit 1

And the result is what you can see in the picture.

I have no idea why one side is red and the other isn’t blue but pink.

Now I have made a very important step, and everything looks good. I still have the problem that the borders change color if I change the background color from white to black.

Any ideas?

I thought maybe the solution is to make the background of the “picture” that I render into the texture transparent. Then maybe the background would be ignored by the shader, and the background color would not be tinted as defined in the shader.

But I don’t know whether this works, or how to realize it.