render to ALPHA texture fails with NVIDIA cards?!

Dear all!

I have a rather simple algorithm that renders an object to a texture. Now I want to use several of these textures to fake a kind of "motion blur".
I do the following:

glClearColor(1.0f, 1.0f, 1.0f, 0.0f); // note: alpha cleared to 0!
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
pMyObject->Render();
glBindTexture(GL_TEXTURE_2D,myTextureGLINT);
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, 0, g_Viewport, g_Viewport, 0);
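
For completeness, here is a sketch of an alternative copy path I could switch to: allocate the texture once with an explicit GL_RGBA8 internal format and refresh it per frame with glCopyTexSubImage2D. This is just an idea I have not verified, and it assumes g_Viewport is a size the card accepts as a texture dimension:

// one-time setup: explicit sized RGBA format, so the texture definitely keeps an alpha channel
glBindTexture(GL_TEXTURE_2D, myTextureGLINT);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, g_Viewport, g_Viewport, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// per frame, after pMyObject->Render():
glBindTexture(GL_TEXTURE_2D, myTextureGLINT);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, g_Viewport, g_Viewport);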

Later on I render this texture via:

glColor4f(1.0f, 1.0f, 1.0f, alpha); // alpha fade
glBindTexture(GL_TEXTURE_2D, myTextureGLINT);
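
For completeness, the rest of that draw looks roughly like this (blending enabled, texture environment left on modulate so the glColor4f alpha scales the texture; the quad coordinates are just placeholders):

glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE); // glColor * texel
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);           // blend the faded layer over the framebuffer
glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f,  1.0f);
glEnd();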

So far so good: it works perfectly on ATI cards, but not at all on NVIDIA (latest drivers, no question about that, and several different models). Any ideas on this?
Do I perhaps have to use some "strange" NV_ extension to make this work?
Please help!

Thanks & many greetings,
k.

When my shaders didn't work on ATI cards, did I say "OMFG, ATI sucks!"? No, I said: "Ugh, another crappy bug; OK, no panic, I'll debug this."

I'm quite sure there is some cruft or a mistake in your code, and not necessarily inside your transformation function…

Are you sure your framebuffer actually has an alpha channel?
Please call glGetIntegerv with GL_ALPHA_BITS and check the result.
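
Something like this, assuming a current GL context (and stdio for the printf), is enough to check; if it prints 0 there is no destination alpha in the framebuffer, so glCopyTexImage2D will copy alpha = 1.0 everywhere:

GLint alphaBits = 0;
glGetIntegerv(GL_ALPHA_BITS, &alphaBits);   // query the framebuffer's alpha bitplanes
printf("GL_ALPHA_BITS = %d\n", alphaBits);  // 0 means there is no destination alpha to copy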