Frag shader without vertex shader?

Hi,

I want to write a shader program that has a fragment shader and no vertex shader. My understanding is that this should work, and that the fixed-function vertex pipeline should be used by default.

However, in tests using GLSL, it seems values like gl_FragCoord are not set properly. I wonder if it's a driver problem, a misunderstanding on my part, or whether there is something extra I must do to make this work.

Can someone clarify whether I have misunderstood or whether the drivers are implemented incorrectly? Working from Cg is not an option.
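To be concrete, this is the kind of setup I mean (a minimal sketch; the shader source here is illustrative, not my exact code):

// A program object with only a fragment shader attached; vertex
// processing is expected to fall back to the fixed-function pipeline.
const char *fs =
    "void main(){ gl_FragColor = vec4(gl_FragCoord.xy/512.0, 0.0, 1.0); }";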

Taj

Are you sure you’re not mixing up gl_FragCoord and gl_TexCoord?

N.

Yes, I'm sure. gl_FragCoord gives bad values if I don't have a vertex shader…

Works just fine on my card… Can you post some code and/or screenshot of the wrong values?

N.

Nico, do you have an Nvidia card or an ATi? I have ATi. If you have ATi, I'll assume I'm doing something wrong and investigate.

No. I'm checking and checking, and either something's wrong or I'm being dumb.
Please assume I have an OpenGL window and rendering loop. I draw glRecti(-1,-1,1,1) to fill the screen.

Here are my shaders:


// vertex...
char *vsh="void main(){ gl_Position=ftransform(); }"; 
// fragment...
char *fsh="void main(){ gl_FragColor=gl_FragCoord/1024.0; }"; 


Here is my shader setup routine:


void compileShader() {
    GLuint s, p;

    p = ((PFNGLCREATEPROGRAMPROC)wglGetProcAddress("glCreateProgram"))();

    s = ((PFNGLCREATESHADERPROC)wglGetProcAddress("glCreateShader"))(GL_VERTEX_SHADER);
    ((PFNGLSHADERSOURCEPROC)wglGetProcAddress("glShaderSource"))(s, 1, (const GLchar**)&vsh, NULL);
    ((PFNGLCOMPILESHADERPROC)wglGetProcAddress("glCompileShader"))(s);
    ((PFNGLATTACHSHADERPROC)wglGetProcAddress("glAttachShader"))(p, s);

    s = ((PFNGLCREATESHADERPROC)wglGetProcAddress("glCreateShader"))(GL_FRAGMENT_SHADER);
    ((PFNGLSHADERSOURCEPROC)wglGetProcAddress("glShaderSource"))(s, 1, (const GLchar**)&fsh, NULL);
    ((PFNGLCOMPILESHADERPROC)wglGetProcAddress("glCompileShader"))(s);
    ((PFNGLATTACHSHADERPROC)wglGetProcAddress("glAttachShader"))(p, s);

    ((PFNGLLINKPROGRAMPROC)wglGetProcAddress("glLinkProgram"))(p);
    ((PFNGLUSEPROGRAMPROC)wglGetProcAddress("glUseProgram"))(p);
}


(sorry for rough formatting)
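(A compile/link status check along these lines would also rule out shader errors. This is a rough sketch using the same wglGetProcAddress pattern as above; checkShader is an illustrative name, not code I actually ran:)

// Sketch: query compile status and print the info log on failure.
// Needs <stdio.h> for printf.
void checkShader(GLuint s) {
    GLint ok = 0;
    ((PFNGLGETSHADERIVPROC)wglGetProcAddress("glGetShaderiv"))(s, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        char log[1024];
        ((PFNGLGETSHADERINFOLOGPROC)wglGetProcAddress("glGetShaderInfoLog"))(s, sizeof(log), NULL, log);
        printf("compile failed: %s\n", log);
    }
}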

Now if I run it like that I get what's expected… a gently graded colour across the screen in red and green (no blue)… correct.

If I comment out the four lines that create and attach the vertex shader, I get just green on the screen - a flat colour, no gradation. It's clearly wrong. It seems gl_FragCoord is not set correctly if I have no vertex shader.

I'm using the default projection and an identity modelview. I'm running under Windows XP SP2 with an ATi X1950.

I tested on an Nvidia GeForce Go 7600 and an Nvidia Quadro FX 1600M. I don't have an ATI here to test it on, sorry…

N.

Beginning to look like an ATi driver bug…

I wrote a small glut/glew app for you to try out:


#include <stdio.h>
#include <stdlib.h>
#include <GL/glew.h>
#include <GL/glut.h>

GLhandleARB v,f,p;

void renderScene(void) {

    glClearColor(0.0,0.0,0.0,0.0);
    glClear(GL_COLOR_BUFFER_BIT);

    glUseProgramObjectARB(p);

    glRecti(-1,-1,1,1);

    glUseProgramObjectARB(0);

    glutSwapBuffers();
}

void setShaders() {

    /*
    v = glCreateShaderObjectARB(GL_VERTEX_SHADER_ARB);
    const char *vs = "void main(){ gl_Position=ftransform(); }";
    glShaderSourceARB(v, 1, &vs,NULL);
    glCompileShaderARB(v);
    */

    f = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);
    const char *fs = "void main(){ gl_FragColor=gl_FragCoord/512.0; }";
    glShaderSourceARB(f, 1, &fs,NULL);
    glCompileShaderARB(f);

    p = glCreateProgramObjectARB();
    glAttachObjectARB(p,f);
    //glAttachObjectARB(p,v);

    glLinkProgramARB(p);

}

void changeSize(int w, int h) {

    glViewport(0, 0, w, h);
}

void processNormalKeys(unsigned char key, int x, int y) {

    switch (key) {
        case 27:
        case 'q':
            exit(0);
            break;
    }
}

int main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowSize(512,512);
    glutCreateWindow("GLSL");

    glutDisplayFunc(renderScene);
    glutReshapeFunc(changeSize);
    glutKeyboardFunc(processNormalKeys);

    glewInit();

    if (GLEW_ARB_vertex_shader && GLEW_ARB_fragment_shader)
        printf("Ready for GLSL\n");
    else {
        printf("No GLSL support\n");
        exit(1);
    }

    setShaders();

    glutMainLoop();

    return 0;
}

This is the output I get (screenshot: the red/green gradient):

N.

Some ATI cards do not have hardware support for gl_FragCoord, and the driver emulates it using a varying calculated by the vertex shader. It is possible that when the driver emulates the fixed-function pipeline, it forgets to generate that varying even when the fragment shader needs it.
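If that is the case, the likely workaround is to always attach a do-nothing pass-through vertex shader, so the emulation has a vertex stage to generate the varying from (a sketch; this is the same source as the vertex shader used earlier in the thread):

/* Trivial pass-through vertex shader: gives the driver's gl_FragCoord
   emulation a vertex stage to attach its varying to. */
const char *vs = "void main(){ gl_Position = ftransform(); }";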

Thanks for the help, NiCo. I downloaded GLUT and GLEW, created a VC++ project, and got it running… and it works.

So now I'm down to one of:

  • Somehow including stdlib affects the result (I can't exclude stdlib; GLUT seems to require it, so in your example I have to use it)
  • My code to set up the shaders is wrong (could someone please check my code?)
  • Somehow my window init code is wrong and is affecting the drivers.

Hmm, I'm not sure how to progress. Has anyone got any ideas about what might be wrong?

You could try replacing the GLUT part with your own window init code. Then you can also remove the stdlib header. If it still works, you will have eliminated two possibilities…
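Something along these lines should do for a bare Win32/WGL setup (a rough sketch with error checking omitted; it assumes an existing window and a standard message loop elsewhere):

#include <windows.h>
#include <GL/gl.h>

/* Sketch: create an OpenGL context on an existing window without GLUT. */
void initGL(HWND hwnd, HDC *hdc, HGLRC *hglrc) {
    PIXELFORMATDESCRIPTOR pfd;
    int pf;

    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;

    *hdc = GetDC(hwnd);
    pf = ChoosePixelFormat(*hdc, &pfd);
    SetPixelFormat(*hdc, pf, &pfd);
    *hglrc = wglCreateContext(*hdc);
    wglMakeCurrent(*hdc, *hglrc);
}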

N.

NiCo - it's the weekend, so I get to code again :wink: Anyway, I must have been an idiot last weekend. Your GLEW/GLUT code doesn't work either. Somehow I must not have recompiled properly last weekend.

So, to be clear: X1950, XP SP2, recent drivers…
If I comment out your single line that attaches the vertex shader, I get just green on the screen - clearly wrong, but the same result as my code.

So it's not the window setup, it's not stdlib, and it's not the shader setup code… ATi is definitely broken if no vertex shader is provided.

Aha! An update to the February 13th 2008 release of the Radeon drivers has cured the problem… it seems clear now that it was a driver bug.

Woopty-do!
