Vertex shader texture fetching problem



somboon
08-23-2009, 05:19 AM
I'm having a problem with texture fetching in the vertex shader.
I'm trying to implement hardware skinning,
using a texture to store quaternions/translations.

The problem is that vertex texture fetching only works when using an 8-bit texture
format.

When I try a 32-bit float format such as GL_RGBA_FLOAT32_ATI/GL_RGBA32F_ARB, the program doesn't crash,
but the color/data value is always zero/black.

My video card is an ATI HD4670 with the latest driver.

This is my vertex shader.



uniform sampler2D testTex;
varying vec4 debugColor;
void main(){

debugColor = texture2D(testTex,vec2(0.5,0.5));
debugColor[3] = 1.0;//manually replace alpha

gl_Position = ftransform();

}


This is my fragment shader.



varying vec4 debugColor;
void main(){
gl_FragColor = debugColor;
}


And this is my texture setup that doesn't work; debugColor always returns
black.



float* data = new float[16*16*4];
for(int i=0;i<16*16*4;i++){
data[i] = 0.5f;
}

glGenTextures(1,&this->finalRotTextureID);
glBindTexture(GL_TEXTURE_2D,this->finalRotTextureID);
glTexImage2D(GL_TEXTURE_2D,0,GL_RGBA_FLOAT32_ATI,16,16,0,GL_RGBA_FLOAT32_ATI
,GL_FLOAT,data);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER,GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,GL_NEAREST);



I also tried replacing GL_RGBA_FLOAT32_ATI with GL_RGBA32F_ARB.

But if I change the code to

glTexImage2D(GL_TEXTURE_2D,0,GL_RGBA,16,16,0,GL_RGBA
,GL_FLOAT,data);


the mesh is rendered in grey as expected.

Can someone tell me why I can't do a 32-bit vertex texture fetch?

earthquad
08-23-2009, 06:26 AM
Please see the GLSL spec on texture functions. Implicit LODs are unsupported in the vertex shader; you need to use one of the *Lod flavors to sample at the vertex level.

Ilian Dinev
08-23-2009, 11:09 AM
Simply replace that line with
debugColor = texture2DLod(testTex,vec2(0.5,0.5),0.0);

Vertex shaders can't compute derivatives (which are necessary to compute the texture LOD), so you have to specify LOD=0 manually. On nVidia drivers it's done automatically; ATi's drivers are more picky.
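For context, here is the vertex shader from the first post with that one-line change applied, written out as a C string ready to hand to glShaderSource; the sampler name testTex comes from the original post, everything else is just a sketch.

const char* skinningVS =
    "uniform sampler2D testTex;\n"
    "varying vec4 debugColor;\n"
    "void main(){\n"
    "    // explicit LOD 0: a vertex shader has no derivatives to pick a mip level\n"
    "    debugColor = texture2DLod(testTex, vec2(0.5, 0.5), 0.0);\n"
    "    debugColor[3] = 1.0; // manually replace alpha\n"
    "    gl_Position = ftransform();\n"
    "}\n";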

kyle_
08-23-2009, 11:18 AM
Actually, the original code is correct - texture2D just doesn't accept a third parameter in the vertex shader.
In reality NVIDIA does accept it and treats it as an absolute LOD (i.e. just like the *Lod functions), while ATI disregards it (though I've only checked that on recent drivers and hardware).

edit:
glTexImage2D(GL_TEXTURE_2D,0,GL_RGBA_FLOAT32_ATI,16,16,0,GL_RGBA_FLOAT32_ATI
,GL_FLOAT,data);

GL_RGBA_FLOAT32_ATI is an internalformat; use GL_RGBA for the format parameter.
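To illustrate, a minimal sketch of the whole upload with that fix applied, assuming the same 16x16 all-0.5 float texture from the first post and a local GLuint instead of the original member variable (GL_RGBA32F_ARB should work the same way as the internalformat):

float* data = new float[16*16*4];
for(int i = 0; i < 16*16*4; i++){
    data[i] = 0.5f;
}

GLuint tex = 0;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
// internalformat says how the GPU stores the texels (32-bit float per channel);
// format/type describe the client-side data being uploaded (plain RGBA floats).
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA_FLOAT32_ATI, 16, 16, 0,
             GL_RGBA, GL_FLOAT, data);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
delete[] data;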

somboon
08-23-2009, 06:15 PM
Thanks everyone for your answers.

Now it's working as expected.

The performance gain is pretty insane:
from 220 FPS (software skinning) to 800 FPS (vertex texture fetch hardware skinning).

I also tried this on my old GeForce 6600 (non-GT) and got a
performance gain from 115 FPS to 180 FPS.

:D

Dark Photon
08-23-2009, 06:39 PM
The performance gain is pretty insane:
from 220 FPS (software skinning) to 800 FPS (vertex texture fetch hardware skinning).
FPS is a poor way to compare performance since it's not linear. Better to state seconds per frame (not 1/sec):

In your case, 4.5 ms -> 1.3 ms per frame. It sure sounds like that was your main bottleneck, though!


I also tried this on my old GeForce 6600 (non-GT) and got a performance gain from 115 FPS to 180 FPS
8.7ms -> 5.6ms.
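
For reference, frame time in milliseconds is just 1000/FPS; a throwaway sketch using the numbers quoted above:

#include <cstdio>

// Milliseconds per frame is the reciprocal of FPS (times 1000); unlike FPS
// itself it is a linear quantity, so the differences can be compared directly.
static double msPerFrame(double fps){ return 1000.0 / fps; }

int main(){
    std::printf("HD4670:       %.1f ms -> %.2f ms per frame\n",
                msPerFrame(220.0), msPerFrame(800.0));
    std::printf("GeForce 6600: %.1f ms -> %.1f ms per frame\n",
                msPerFrame(115.0), msPerFrame(180.0));
    return 0;
}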