PDA

View Full Version : Only running a vertex shader



CastleBravo
03-12-2011, 03:51 AM
Recently I've been trying to solve a shader-related problem. I want to run only a vertex shader when rendering some of my elements, since the fragment stage isn't needed for them and running a pixel shader there just costs FPS.
I returned to Cg after I couldn't get my ARB vertex shaders working properly on ATI cards, and now when I only run my Cg vertex shader everything is fine on my Geforce 6 card, but my friend who's using an ATI card is experiencing weird issues.

The texture coordinates on some elements look fine at first, but afterwards they become progressively more corrupted, and I have no idea what causes this. The problem did not occur when I also ran a fragment shader, but I'd be very glad if I didn't have to go back to that. Does anyone have any suggestions?

DarkGKnight
03-12-2011, 04:33 AM
Given that Cg will convert your shader to ARB assembly for non-NVIDIA hardware, it's possible you are hitting an ATI bug. Use Cg to convert the vertex shader to ARB assembly, then try tweaking it by hand (if you haven't already). Since you are targeting a Geforce 6, I'm guessing your friend is running a legacy ATI card (Radeon X series), so any OpenGL driver bugs for those cards won't be fixed.
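For reference, the Cg toolkit's offline compiler (cgc) can dump the ARB assembly so you can inspect and tweak it; the file names and the entry-point name here are placeholders for your own:

```shell
# Compile a Cg vertex shader to ARB vertex program assembly (arbvp1 profile).
# "shader.cg" and the entry function "main" are placeholders.
cgc -profile arbvp1 -entry main -o shader.vp shader.cg
```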

CastleBravo
03-12-2011, 07:48 AM
Thanks for the fast response. I've pretty much given up; I don't think I'll find a solution to this, it must be a bug in ATI's driver itself. I've found other ways to optimize my code, so hopefully that will offset the performance loss.

mobeen
03-12-2011, 08:59 AM
Just a thought: could you double-check that the GL state is fine by calling assert(glGetError() == GL_NO_ERROR); in your display function? In my experience ATI cards follow the spec more strictly than NVIDIA's, so something that runs fine on NVIDIA can show weird behavior on ATI when you aren't following the spec properly.
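A small sketch of that check. The error codes are the values fixed by the OpenGL spec, defined here only so the snippet compiles without GL headers; the helper name is made up:

```c
/* GL error-code values as fixed by the OpenGL specification,
   defined locally so this sketch builds without GL headers. */
#define GL_NO_ERROR          0x0000
#define GL_INVALID_ENUM      0x0500
#define GL_INVALID_VALUE     0x0501
#define GL_INVALID_OPERATION 0x0502
#define GL_STACK_OVERFLOW    0x0503
#define GL_STACK_UNDERFLOW   0x0504
#define GL_OUT_OF_MEMORY     0x0505

/* Map a glGetError() return value to a readable name. */
const char *glErrorName(unsigned int err)
{
    switch (err) {
    case GL_NO_ERROR:          return "GL_NO_ERROR";
    case GL_INVALID_ENUM:      return "GL_INVALID_ENUM";
    case GL_INVALID_VALUE:     return "GL_INVALID_VALUE";
    case GL_INVALID_OPERATION: return "GL_INVALID_OPERATION";
    case GL_STACK_OVERFLOW:    return "GL_STACK_OVERFLOW";
    case GL_STACK_UNDERFLOW:   return "GL_STACK_UNDERFLOW";
    case GL_OUT_OF_MEMORY:     return "GL_OUT_OF_MEMORY";
    default:                   return "unknown GL error";
    }
}

/* In the display function, it's worth draining the whole error queue
   rather than asserting on the first entry, e.g.:
   unsigned int err;
   while ((err = glGetError()) != GL_NO_ERROR)
       fprintf(stderr, "GL error: %s\n", glErrorName(err));
*/
```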

DarkGKnight
03-12-2011, 03:02 PM
I'm surprised that a fragment shader doing a simple texture lookup is costing you FPS. Are you targeting integrated hardware?

CastleBravo
03-13-2011, 02:37 AM
I'm running this on my Geforce 6. I keep two fragment shaders: one that just does a texture lookup, the other that also does fog calculations. I'm already computing the reciprocal of (fogEnd - fogStart) on the CPU and passing it to the shader along with the fog end distance, then using those to compute the fog factor. Even so, fixed-function processing is still much faster than fragment shader processing.
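For what it's worth, that precomputation matches the fixed-function linear fog formula, f = clamp((end - d) / (end - start), 0, 1): hoisting 1/(end - start) to the CPU turns the per-fragment divide into a multiply. A minimal sketch of the per-fragment side (the function and parameter names are made up):

```c
/* Linear fog factor, GL_LINEAR style:
   f = (fogEnd - dist) * invRange, clamped to [0, 1],
   where invRange = 1.0f / (fogEnd - fogStart) is computed once on
   the CPU, as described above, so only a multiply remains per fragment. */
float fogFactor(float dist, float fogEnd, float invRange)
{
    float f = (fogEnd - dist) * invRange;
    if (f < 0.0f) f = 0.0f;
    if (f > 1.0f) f = 1.0f;
    return f;
}
```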


Just a thought: could you double-check that the GL state is fine by calling assert(glGetError() == GL_NO_ERROR); in your display function? In my experience ATI cards follow the spec more strictly than NVIDIA's, so something that runs fine on NVIDIA can show weird behavior on ATI when you aren't following the spec properly.

I'll check that sometime and come back with results. Right now I've had enough of that part, and I still need to get rid of the vector arrays in one section, since they seem to be slow.