Textures in the vertex shader slow down FPS

Hi everyone,

As I understand it, I should get a higher frame rate (FPS) if I update my data in the vertex shader without spending any CPU cycles. So I stored my height map in a texture, accessed that texture from the VS to update each vertex's Z coordinate with the height, and, to speed up rendering, stored my vertices in a vertex array…
Unfortunately, I got a much lower FPS than without the texture. I say texture because I tried updating the vertices without a texture and got about 95 FPS, but as soon as I used a texture the FPS dropped to 1, which shocked me. Could anyone explain where I went wrong, or share your experience with this idea?

Thanks in advance for any kind of help.
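
To make it concrete, the vertex shader does something along these lines (a simplified sketch with assumed names, not my exact code):

// vertex shader: displace each vertex along Z by a height
// sampled from the height-map texture (vertex texture fetch)
uniform sampler2D heightMap;   // assumed uniform name

void main()
{
    // texture2DLod is the safe call in a vertex shader, since there
    // are no derivatives available to select a mipmap level
    float h = texture2DLod(heightMap, gl_MultiTexCoord0.xy, 0.0).r;
    vec4 pos = gl_Vertex;
    pos.z = h;
    gl_Position = gl_ModelViewProjectionMatrix * pos;
}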

Looks like your vertex texturing runs in software emulation.

Vertex textures are supported on GeForce 6/7 with some very strict limitations (search NVIDIA's developer pages for a PDF on that): essentially only the GL_LUMINANCE32F_ARB and GL_RGBA32F_ARB formats, with nearest filtering. If these restrictions are not met, it will run in software emulation.

Vertex textures are supported with no limitations on DX10-class hardware.

Anything else will run vertex textures in software mode.
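
A minimal setup that stays on the hardware path on GeForce 6/7 looks roughly like this (a C-style sketch; heights, width and height are placeholders):

// create a floating-point height texture the GF6/7 vertex unit can
// fetch in hardware: 32-bit float format plus GL_NEAREST filtering
// (filtered vertex texture fetches fall back to software)
GLuint heightTex;
glGenTextures(1, &heightTex);
glBindTexture(GL_TEXTURE_2D, heightTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE32F_ARB,
             width, height, 0, GL_LUMINANCE, GL_FLOAT, heights);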

Alright. Actually, my desktop has an NVIDIA Quadro FX 4500 graphics card, and I used GPU Caps Viewer for Windows XP:

GPU Caps Viewer: graphics card and GPU information utility (oZone3D.Net)

to get more info about my card. It showed that I have 0 MB of texture buffer size; could that be related to the problem I had…

Moreover, my laptop has an NVIDIA GeForce 8600M GS. Is that fine for using textures on the GPU? GPU Caps Viewer says its texture buffer size is 128 MB, so if my reasoning is right, my textures should work on the laptop. But the problem there is different: my textures don't work at all. That is:
I store my data in the textures, which works fine, but if I try to read a value from a texture like:

vec4 texel = texture2D(texturei, vec2(coord));

the program crashes. Can you see where the problem is now?
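
In case it matters, the sampler is hooked up on the host side in the usual way, roughly like this (a sketch with assumed names, not my exact code):

// bind the texture to unit 0 and point the sampler uniform at it;
// a sampler pointing at the wrong unit is a classic source of trouble
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, heightTex);
GLint loc = glGetUniformLocation(program, "texturei");
glUniform1i(loc, 0);   // takes the texture unit index, not the texture id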

Regarding the graphics card limits: do you know any other way to work around this problem, or do I have to order a new graphics card?

Any comments are appreciated.

Did you try the latest drivers from the nvidia.com website?

Why do you think that?
Doesn't my graphics card support textures?
Anyway, I checked the NVIDIA website, and it says that both my laptop's GeForce 8600M GS and my desktop's Quadro FX 4500 support OpenGL, and the GPU includes textures.

How can it be that they don't support textures, when I can query the maximum size of each texture and the maximum number of textures and texture image units (TIUs) that can be used in the VS and FS?

Moreover, how can I know that my textures are not on the GPU?
As I understand it, that could happen if the size of the textures exceeds the texture limits.

I would appreciate some advice and comments.

The GF 8600, 8800, and so on are DX10 SM 4.0 GPUs. They support all texture formats from within the vertex shader.
The FX 4500 that you have is a DX9 SM 3.0 GPU, according to this:
http://features.cgsociety.org/story_custom.php?story_id=3321&page=

in which case you need to use GL_LUMINANCE32F_ARB or GL_RGBA32F_ARB as the internal format

for more info,
http://www.opengl.org/wiki/Vertex_Texture_Fetch
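
To answer the earlier question about how to tell: you can query how many texture image units the vertex shader actually has. A sketch:

// 0 means no vertex texture fetch at all; even a nonzero value
// does not guarantee the hardware path on SM3.0 parts
GLint maxVertexTextureUnits = 0;
glGetIntegerv(GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS, &maxVertexTextureUnits);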

Okay, I tried to use a suitable format, namely GL_RGBA32F_ARB, but the compiler says it is not defined. How can I make it recognized? Should I use another library?

Do you include glext.h?

no, should I?

In your opinion?

http://www.opengl.org/wiki/Getting_started#OpenGL_2.0.2B_and_extensions
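
With GLEW the usual pattern is something like this (sketch):

#include <GL/glew.h>   // must be included before gl.h

// after the GL context has been created:
if (glewInit() != GLEW_OK) {
    // extension entry points unavailable; bail out
}
if (!GLEW_ARB_texture_float) {
    // GL_RGBA32F_ARB / GL_LUMINANCE32F_ARB will not be accepted
}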

Thanks to all of you, but I still have a problem. I visited the link mentioned for the glext library, but I couldn't get the library: there is no package there (or it wasn't clear to me) with the appropriate binaries for my case, VC++ 2008. I found only .h files, so where can I get the .lib files in order to be ready with glext?
Another question: will there be a problem if I use the glew and glext libraries together, in the sense of the common error that gl.h is already defined?

Did you look at http://glew.sourceforge.net/install.html? It has binaries and install instructions.

Yes, I did, and I found the binary and header files for GLEW. The instructions say to include glext.h (they don't mention anything about a binary for it, e.g. a glext.lib or glext.cpp) and gl.h after glew.h to avoid getting an error, and I put a copy of glext.h in my GL folder. The new thing I found (only on my laptop's GF 8600M GS, because my desktop currently has a virus infection…) is that I can use the texture internal format even if I don't include glext.h. This was when I used all the related texture commands with the ARB suffix, except

glBindTextureARB(),

which it says is not a defined function (maybe because texture objects are core OpenGL, so there is no ARB version of it?), therefore I used

glBindTexture() 

instead. But I still can't use the value returned from the texture as the height of the current vertex. Does that mean my problem is not the internal texture format, and how can I check whether the internal format is working well or not…

Other question if the internal texture format is not supported from my Gf 8600M GS I should get error from the first or ??

And now I would like to know when I have to use glext together with glew, and when glew alone is sufficient… and was my last problem really because I have to use all the functions with the ARB suffix, or without?

Anyway, I can now get the internal texture format (GL_RGBA32F_ARB in my case), but it didn't solve my problem: the program still crashes when I try to get the height from the texture in the VS.
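
To rule out a shader build failure, I also dump the compile and link logs, roughly like this (sketch with assumed names):

GLint ok = 0;
char log[1024];
glGetShaderiv(vs, GL_COMPILE_STATUS, &ok);
if (!ok)
    glGetShaderInfoLog(vs, sizeof(log), NULL, log);    // compiler messages
glGetProgramiv(prog, GL_LINK_STATUS, &ok);
if (!ok)
    glGetProgramInfoLog(prog, sizeof(log), NULL, log); // linker messages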

I did another test: I take the vertex (x, y, z = height from the texture) and pass it to the fragment shader to see where the problem is, like:

// fragment shader

gl_FragColor = new_vertex;   // new_vertex is the varying passed from the VS

With vertex (x, y, 0) it works, but with vertex (x, y, z) it crashes.

So what else can I do to figure out the problem…
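
For example, I thought of outputting just the sampled height as a grey level, to isolate the fetch itself from the position math (a sketch, names assumed):

// vertex shader (debug): fetch the height but only pass it on
varying float dbgHeight;
uniform sampler2D texturei;
void main()
{
    dbgHeight = texture2DLod(texturei, gl_MultiTexCoord0.xy, 0.0).r;
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// fragment shader (debug): show the height as grey (clamped to [0,1])
varying float dbgHeight;
void main()
{
    gl_FragColor = vec4(vec3(dbgHeight), 1.0);
}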

Thanks for any help.

It's me again: the texture internal format works now. It is defined in both glew and glext, so I used glew to get it working.

Okay, now to overcome the crashing problem.

I did another test on the GF 8600M GS, where I found that the only thing that makes the program crash is using:

// vertex shader
texture2D(texturei, coordinates);

in the vertex shader. So what I did is compute the texture coordinates in the vertex shader, send them to the fragment shader, and there use

 texture2D(); 

to get the height value and then assign the new vertex to gl_FragColor, and it works well (with any texture internal format). So does that mean I can't access textures from the vertex shader at all? And how can I be sure whether that's the case on the GF I'm currently working on…
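
Concretely, the version that works looks roughly like this (simplified):

// vertex shader: no texture fetch, just forward the coordinates
varying vec2 uv;
void main()
{
    uv = gl_MultiTexCoord0.xy;
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// fragment shader: the same fetch that crashes in the VS works here
uniform sampler2D texturei;
varying vec2 uv;
void main()
{
    float h = texture2D(texturei, uv).r;
    gl_FragColor = vec4(vec3(h), 1.0);   // height shown as grey
}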

Thanks in advance for any suggestions.
