Why is my vertex shader killing the framerate?

I’m simply trying to use a heightmap to make a terrain with GLSL.

This is my vertex shader:

uniform sampler2D texx;
varying vec2 vTexCoord;

void main(void)
{
    // Start from the incoming vertex, then displace y by the heightmap sample.
    vec4 position = vec4(gl_Vertex.x, 0.0, gl_Vertex.z, 1.0);
    gl_TexCoord[0] = gl_MultiTexCoord0;
    vec4 color = texture2D(texx, gl_TexCoord[0].st);
    position.y = gl_Vertex.y + ((color.r + color.g + color.b) * 10.0);
    gl_Position = gl_ModelViewProjectionMatrix * position;
}

It works, but it is very slow. I am using GL_NEAREST for filtering, and I’m pretty sure it’s something stupid that I’ve done.
Thanks in advance.

Hardware? VTF can be very slow on some cards. How many vertices are in your mesh?

If you replace the texture2D() with a constant, I guess it is very fast?

What ZbufferR said. I’d bet money it’s falling back to software for the VTF.

Hardware? ATI by any chance?

I have an NVIDIA 7800 GT, and the mesh is 256x256.
I had expected more from the card =/

Plus, I thought GLSL was implemented purely in hardware in the pipeline?

Performance will be OK (although not exceptional) with VTF on a 7800 GT as long as you use one of the two texture formats that are hardware-accelerated in the vertex shader: GL_LUMINANCE32F_ARB and GL_RGBA32F_ARB. Make sure you stick to NEAREST even with these, to avoid the software fallback.
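
Roughly, the texture setup would look like this (just a sketch; heightmapTex and heightData are placeholder names, not from your code, and GL_LUMINANCE32F_ARB requires the ARB_texture_float extension):

/* VTF-friendly heightmap texture for a GeForce 7 series card. */
GLuint heightmapTex;
glGenTextures(1, &heightmapTex);
glBindTexture(GL_TEXTURE_2D, heightmapTex);

/* NEAREST only - any other filtering drops vertex texture fetch to software. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

/* One of the two hardware-accelerated internal formats:
   GL_LUMINANCE32F_ARB (single channel) or GL_RGBA32F_ARB. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE32F_ARB,
             256, 256, 0,
             GL_LUMINANCE, GL_FLOAT, heightData);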

I know this might sound stupid, but do you mean I should create my textures like this?

glTexImage2D(GL_TEXTURE_2D, 0, type, width, height, 0, GL_RGBA32F_ARB, GL_UNSIGNED_BYTE, imageData);

Because when I tried that, the texture doesn’t get passed down to the shader program at all. =S

You need to bind your texture to the current program: use glGetUniformLocation to retrieve the sampler’s location, then glUniform1i to set its value to the texture unit.
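
Something along these lines (a sketch; “texx” matches the sampler in your shader, while programObject and heightmapTex are placeholder names):

/* Bind the heightmap to texture unit 0 and tell the sampler to read from it. */
glUseProgram(programObject);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, heightmapTex);

GLint texLoc = glGetUniformLocation(programObject, "texx");
glUniform1i(texLoc, 0);   /* 0 = texture unit GL_TEXTURE0, not the texture id */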

Problem solved:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA_FLOAT32_ATI, texture[0].width, texture[0].height, 0, type, GL_UNSIGNED_BYTE, texture[0].imageData);

I had to use GL_RGBA_FLOAT32_ATI; anything else would fall back to software…

Hmmm… That’s impossible; the two enums have the same value, so something else must have changed as well (from glext.h):

#define GL_RGBA32F_ARB                    0x8814
#define GL_RGBA_FLOAT32_ATI               0x8814

>>glTexImage2D(GL_TEXTURE_2D, 0, type, width, height, 0, GL_RGBA32F_ARB, GL_UNSIGNED_BYTE, imageData);<<

This didn’t work because your user data format is invalid.
glGetError() would have told you.
It should have been GL_RGBA.
GL_RGBA32F_ARB belongs in the internalFormat parameter!
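
So, roughly (a sketch reusing the variable names from your earlier post):

glTexImage2D(GL_TEXTURE_2D, 0,
             GL_RGBA32F_ARB,    /* internalFormat: the float format goes here */
             width, height, 0,
             GL_RGBA,           /* format of the user data you pass in */
             GL_UNSIGNED_BYTE,  /* type of the user data you pass in */
             imageData);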

Oh yes, that too :P
