OpenGL 3.0 and writing to an integer framebuffer

I’m having a hard time writing to an integer framebuffer using shaders. I’m able to create the integer texture just fine; the problem is writing a pixel shader that outputs the correct values.

Here’s my initialization code:


	glGenTextures(1, &tex);
	glBindTexture(GL_TEXTURE_2D, tex);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
	glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA32UI_EXT, gltClientWidth(), gltClientHeight(), 0, GL_ALPHA_INTEGER_EXT, GL_UNSIGNED_INT, 0);

	glGenFramebuffersEXT(1, &fbo);
	glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
	glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_TEXTURE_2D, tex, 0);

	glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);

Here are my shaders:

Vertex


#version 130

void main()
{
	gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

Fragment


#version 130

out uint gl_PixelColor;

void main()
{
    gl_PixelColor = 25;
}

And here’s my rendering code.


	glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
	glClearColorIuiEXT(0, 0, 0, 0);
	glClear(GL_COLOR_BUFFER_BIT);

	glUseProgram(program);
	
	float colors[] = {
		1.0f, 0.0f, 0.0f,
		0.0f, 1.0f, 0.0f,
		0.0f, 0.0f, 1.0f
	};
	
	float vertices[] = {
		0.0f, 0.0f,
		100.0f, 0.0f,
		100.0f, 100.0f,
		0.0f, 100.0f
	};

	glEnableClientState(GL_COLOR_ARRAY);
	glColorPointer(3, GL_FLOAT, 0, colors);

	glEnableClientState(GL_VERTEX_ARRAY);
	glVertexPointer(2, GL_FLOAT, 0, vertices);

	glDrawArrays(GL_QUADS, 0, 4);

So what I would expect is for the bound integer alpha texture to contain a value of 25 wherever the quad was drawn. But instead, the entire texture stays completely black.

I know a lot of things became deprecated with GLSL 1.3, but I can’t seem to find good documentation on how to write shaders correctly, etc.

So overall, all I want to do is write values to an integer texture using pixel shaders.

If you don’t use a forward-compatible context, then you don’t have to worry about deprecated functions. They should work fine in a full context.

Try binding gl_PixelColor to a draw buffer with the glBindFragDataLocation function (page 236 in the GL 3 spec). I don’t know if it works without binding by default. Also try calling glDrawBuffer(GL_COLOR_ATTACHMENT0) before drawing.
And check the FBO for completeness; maybe it doesn’t support a GL_ALPHA32UI texture as an attachment. Also check for other OpenGL errors with the glGetError function.
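
For example, a quick sanity check could look something like this (a minimal sketch, assuming GLEW provides the EXT entry points):

#include <stdio.h>
#include <GL/glew.h>

void checkFboAndErrors(GLuint fbo)
{
	glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);

	/* GL_FRAMEBUFFER_COMPLETE_EXT means the driver accepts this
	   attachment combination; anything else would explain a black texture. */
	GLenum status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
	if (status != GL_FRAMEBUFFER_COMPLETE_EXT)
		printf("FBO incomplete: 0x%04x\n", status);

	/* glGetError reports only one error per call, so drain the queue. */
	for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
		printf("GL error: 0x%04x\n", err);
}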

Are you calling glBindFragDataLocation somewhere? I think you need to call that to bind your shader’s output variable to some render target. In GL 2.1 this seems to be optional in most situations, but in GL 3.0 it is mandatory AFAIK.

Jan.

You have to use glBindFragDataLocation to direct integer outputs to a buffer.
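
The binding only takes effect when the program is linked, so the call order matters. Something like this (a sketch; "myIntOutput" stands for whatever out variable your shader declares):

glBindFragDataLocation(program, 0, "myIntOutput"); /* color number 0 */
glLinkProgram(program); /* bindings take effect at link time */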

Thanks guys! I’ll try your suggestions.

Hmm, still no luck. My texture is still all black. I’m calling glBindFragDataLocation right before linking my shader program. I’ve also tried calling it during rendering.

init code


	char* vshaderSource = fileio::readAllText("shader.vs");
	char* fshaderSource = fileio::readAllText("shader.fs");

	vshader = glCreateShader(GL_VERTEX_SHADER);
	glShaderSource(vshader, 1, (const GLchar**)&vshaderSource, 0);
	glCompileShader(vshader);

	fshader = glCreateShader(GL_FRAGMENT_SHADER);
	glShaderSource(fshader, 1, (const GLchar**)&fshaderSource, 0);
	glCompileShader(fshader);

	program = glCreateProgram();
	glAttachShader(program, vshader);
	glAttachShader(program, fshader);

	glBindFragDataLocation(program, 0, "myIntOutput");

	glLinkProgram(program);

	printShaderInfoLog(vshader);
	printShaderInfoLog(fshader);
	printProgramInfoLog(program);

rendering


	glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
	glClearColorIuiEXT(0, 0, 0, 0);
	glClear(GL_COLOR_BUFFER_BIT);

	glDrawBuffer(GL_COLOR_ATTACHMENT0_EXT);
	glUseProgram(program);	

	float colors[] = {
		1.0f, 0.0f, 0.0f,
		0.0f, 1.0f, 0.0f,
		0.0f, 0.0f, 1.0f
	};
	
	float vertices[] = {
		0.0f, 0.0f,
		100.0f, 0.0f,
		100.0f, 100.0f,
		0.0f, 100.0f
	};

	glEnableClientState(GL_COLOR_ARRAY);
	glColorPointer(3, GL_FLOAT, 0, colors);

	glEnableClientState(GL_VERTEX_ARRAY);
	glVertexPointer(2, GL_FLOAT, 0, vertices);

	glDrawArrays(GL_TRIANGLES, 0, 3);


fragment shader


#version 130

out uint myIntOutput;

void main()
{
	myIntOutput = uint(25);
}

I was forced to place the version preprocessor token at the top because GLSL kept complaining that “out” wasn’t supported in 120 or under.

How are you setting up the modelview and projection matrices? Maybe the vertices transformed by them get culled away.
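
For the 0..100 vertices in your code, a window-aligned orthographic projection would look something like this (a sketch using the fixed-function matrix stack, which still works in a full 3.0 context, and the gltClientWidth()/gltClientHeight() helpers from your init code):

/* With the default identity matrices, the 0..100 quad lies almost
   entirely outside the -1..1 clip volume. */
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0, gltClientWidth(), 0.0, gltClientHeight(), -1.0, 1.0);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();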

Isn’t gl_ModelViewProjectionMatrix supposed to be deprecated in GLSL 1.3? Or only the matrix stacks?

If I turn off the shader, everything works fine. It’s definitely the shader.

void BindFragDataLocationEXT(uint program, uint colorNumber, const char *name);

glBindFragDataLocation(program, 0, "myIntOutput");

Maybe 0 is not a correct colorNumber. Printing the OpenGL error after glBindFragDataLocation could help.

The error code is 1280, but glewGetErrorString just returns “Unknown Error”.

Try using DrawBuffers enum value instead of 0?

Well, I take back the error. That 1280 was being caused by my call to glDisable(GL_CLAMP_FRAGMENT_COLOR), because I thought that’s how it was supposed to be called.

I commented it out and now no errors are reported. Yet my texture remains black.

I did try what you said about using a DrawBuffers enum value, but a 1281 error is reported immediately afterward. I’ve tried the following:

glBindFragDataLocation(program, GL_DRAW_BUFFER, "myIntOutput");
glBindFragDataLocation(program, GL_DRAW_BUFFER0, "myIntOutput");

Well, I changed my texture format from GL_ALPHA32UI_EXT to GL_RGBA32UI_EXT, and now my texture is filled with the correct integer RGBA values. But I really need this to work with a single-channel texture.

Any suggestions? And thanks to everybody for all your help so far.

Well I discovered using GL_LUMINANCE32UI_EXT instead of GL_ALPHA32UI_EXT worked. Thanks again for everybody’s help!
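
In case it helps anyone else, the only change was the format pair in the glTexImage2D call from my init code (the matching client format for GL_LUMINANCE32UI_EXT is GL_LUMINANCE_INTEGER_EXT):

glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE32UI_EXT, gltClientWidth(), gltClientHeight(), 0, GL_LUMINANCE_INTEGER_EXT, GL_UNSIGNED_INT, 0);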

Maybe R32UI is better than LUMINANCE32UI for forward compatibility.
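
Something like this would be the more forward-looking single-channel setup (a sketch; GL_R32UI pairs with GL_RED_INTEGER, and integer textures can’t be linearly filtered, hence GL_NEAREST):

glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
/* Integer textures are not filterable, so use nearest filtering. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
/* Single-channel 32-bit unsigned integer format from GL 3.0. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_R32UI, gltClientWidth(), gltClientHeight(), 0, GL_RED_INTEGER, GL_UNSIGNED_INT, 0);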

It is deprecated only in a forward-compatible context. In a full 3.0 context you can use whatever you want - even fixed-functionality rendering.

0 is a valid argument for colorNumber in the glBindFragDataLocation function. It means the output will be written to attachment 0 of the FBO. If the function expected an enum there, its signature would include the GLenum type, not uint.
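
With multiple render targets it would look something like this (a sketch; the output names are made up):

/* colorNumber selects a draw buffer, and glDrawBuffers maps
   draw buffers to FBO attachments. */
glBindFragDataLocation(program, 0, "outA"); /* draw buffer 0 */
glBindFragDataLocation(program, 1, "outB"); /* draw buffer 1 */
glLinkProgram(program);

GLenum bufs[2] = { GL_COLOR_ATTACHMENT0_EXT, GL_COLOR_ATTACHMENT1_EXT };
glDrawBuffers(2, bufs);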

Hang on… for integer textures you need to change your texture parameters according to the EXT_texture_integer spec:

void TexParameterIivEXT( enum target, enum pname, int *params );
void TexParameterIuivEXT( enum target, enum pname, uint *params );

You are setting up the textures with plain old normalised glTexParameteri.

Secondly, what makes you think EXT_FBO actually supports integer textures? I have checked ARB_FBO and that definitely does… but reading EXT_texture_integer, I’m not so sure that it applies to EXT_FBO.

TexParameterIivEXT and TexParameterIuivEXT are needed only for the GL_TEXTURE_BORDER_COLOR pname, to specify the border color with integer values.
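
For example (a sketch), this is the one case where the I-variants come into play:

/* An integer border color for an unsigned integer texture;
   plain glTexParameteri is fine for everything else. */
GLuint border[4] = { 0, 0, 0, 0 };
glTexParameterIuivEXT(GL_TEXTURE_2D, GL_TEXTURE_BORDER_COLOR, border);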

I encountered the same problem in my program; the difference is that I used a Cg shader:

glTexImage1D(GL_TEXTURE_1D, 0, GL_ALPHA8UI_EXT,8, 0, GL_ALPHA_INTEGER_EXT, GL_UNSIGNED_BYTE, sBitflag);

param = cgGetNamedParameter(program, par);
cgGLSetTextureParameter(param, tex);
cgGLEnableTextureParameter(param);

How can I solve this problem?
