render to ALPHA texture sux with NVIDIA cards ?!



hgore
03-11-2005, 04:10 AM
Dear all!

I have a rather simple algorithm that renders an object to a texture. Now I want to use several of these textures to fake a kind of "motion blur".
I do the following:

glClearColor(1.0f, 1.0f, 1.0f, 0.f);   // white background, ALPHA = 0 !!
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
pMyObject->Render();                    // draw the object into the back buffer
glBindTexture(GL_TEXTURE_2D, myTextureGLINT);
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, 0, g_Viewport, g_Viewport, 0);   // copy the framebuffer (RGBA) into the texture
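(As an aside, the same capture can also be set up with the texture storage allocated once and an explicit RGBA internal format - the following is only a sketch for comparison, not the code from my project:)

// one-time setup (sketch): reserve RGBA8 storage; g_Viewport must be a valid
// texture size (power of two unless non-power-of-two textures are supported)
glBindTexture(GL_TEXTURE_2D, myTextureGLINT);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, g_Viewport, g_Viewport, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// per frame, after the object has been rendered:
glBindTexture(GL_TEXTURE_2D, myTextureGLINT);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, g_Viewport, g_Viewport);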

Later on I render this texture via:

glColor4f(1.f, 1.f, 1.f, alpha);   // alpha fade for this copy
glBindTexture(GL_TEXTURE_2D, myTextureGLINT);
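(For completeness, the rest of that draw is basically a blended quad; the following is only a sketch of how it is set up - the blending state and the quad are assumed here, not copied verbatim:)

glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);   // blend by texture alpha * glColor alpha (default GL_MODULATE env)
glColor4f(1.f, 1.f, 1.f, alpha);                     // per-copy fade
glBindTexture(GL_TEXTURE_2D, myTextureGLINT);
glBegin(GL_QUADS);                                   // quad covering the view (projection setup omitted)
    glTexCoord2f(0.f, 0.f); glVertex2f(-1.f, -1.f);
    glTexCoord2f(1.f, 0.f); glVertex2f( 1.f, -1.f);
    glTexCoord2f(1.f, 1.f); glVertex2f( 1.f,  1.f);
    glTexCoord2f(0.f, 1.f); glVertex2f(-1.f,  1.f);
glEnd();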

So far so good - this works perfectly on ATI cards, but not at all on NVIDIA (latest drivers, no question, and several different models). Any ideas on this?
Do I perhaps have to use some "strange" NV_ extension to make this work?
Please help!

thnx & many greetings
k.

Silkut
03-13-2005, 07:20 AM
When my shaders don't work on ATI cards, do I say "OMFG, ATI suxx!"? No, I say: "Ugh, another annoying bug - OK, no panic, I'll debug this."

I'm quite sure there are some mistakes in your code, and not necessarily inside your transformation function.

vincoof
03-13-2005, 12:04 PM
Are you sure your framebuffer actually has an alpha channel?
Please call glGetIntegerv with GL_ALPHA_BITS and check the result.
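For example, something along these lines (a minimal sketch; it assumes a current GL context and the usual GL headers plus <cstdio>):

GLint alphaBits = 0;
glGetIntegerv(GL_ALPHA_BITS, &alphaBits);   // number of alpha bitplanes in the current framebuffer
printf("framebuffer alpha bits: %d\n", alphaBits);

If this prints 0, the framebuffer has no destination alpha, so glCopyTexImage2D reads back alpha = 1.0 everywhere and the "background alpha = 0" trick cannot work. In that case, request alpha bits when creating the context (e.g. cAlphaBits = 8 in the PIXELFORMATDESCRIPTOR on Windows, or the equivalent in GLUT/SDL).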