Regarding drawBuffer in glClearBufferiv

I am rendering geometry to a non-default FBO and applying the result as an integer texture to geometry rendered on the default framebuffer. To clear the color on the non-default FBO, I am using glClearBufferiv(). I don't understand what drawBuffer (the 2nd argument) should be in my case. I tried 0, but it's not working; it does not clear the color.


GLint cl[]={32000,32000,32000,32767};
glClearBufferiv(GL_COLOR, 0, cl);

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

My frag shader:

out ivec4 fragColor;

void main()
{
    fragColor = ivec4(0, 32767, 0, 32767);
}


With this, the geometry renders green, but the area outside the geometry (the cleared background) stays black.

The drawbuffer parameter must be the drawbuffer index, i.e. an index into the array of draw buffers specified through glDrawBuffers.

E.g.:

GLenum drawbuffers[] = { GL_COLOR_ATTACHMENT3, GL_COLOR_ATTACHMENT1 };
glDrawBuffers(2, drawbuffers);
glClearBuffer*(GL_COLOR, 0, color); // clears GL_DRAW_BUFFER0, i.e. GL_COLOR_ATTACHMENT3
glClearBuffer*(GL_COLOR, 1, color); // clears GL_DRAW_BUFFER1, i.e. GL_COLOR_ATTACHMENT1

It works the exact same way with the default FBO, just the draw buffer enums are different there.

E.g.

GLenum drawbuffers[] = { GL_FRONT_LEFT, GL_BACK_LEFT };
glDrawBuffers(2, drawbuffers);
glClearBuffer*(GL_COLOR, 0, color); // clears GL_DRAW_BUFFER0, i.e. GL_FRONT_LEFT
glClearBuffer*(GL_COLOR, 1, color); // clears GL_DRAW_BUFFER1, i.e. GL_BACK_LEFT

You probably forgot to call glDrawBuffers.
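Putting it together for the non-default FBO case, the sequence would look roughly like this (fbo is just a placeholder for your framebuffer object handle, and the clear color is only an example):

glBindFramebuffer(GL_DRAW_FRAMEBUFFER, fbo);   // bind the FBO whose attachment you want to clear

// Make draw buffer index 0 refer to GL_COLOR_ATTACHMENT0.
const GLenum bufs[] = { GL_COLOR_ATTACHMENT0 };
glDrawBuffers(1, bufs);

// Clear draw buffer 0 with signed integer values (for an integer attachment such as RGBA16I).
const GLint clearColor[] = { 32000, 32000, 32000, 32767 };
glClearBufferiv(GL_COLOR, 0, clearColor);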


[QUOTE=aqnuep;1248283]The drawbuffer parameter must be the drawbuffer index, i.e. an index into the array of draw buffers specified through glDrawBuffers. … You probably forgot to call glDrawBuffers.[/QUOTE]

Thanks for your reply.
I did this:


GLenum buffers[] = { GL_COLOR_ATTACHMENT0};
glDrawBuffers(1,buffers);
GLint cl[]={32000,32000,32000,32767};
glClearBufferiv(GL_COLOR, 0, cl); 


I bind the non-default FBO and make the above calls while rendering to it. The FBO has a color attachment (GL_COLOR_ATTACHMENT0) and a depth attachment.
I don't understand what is wrong with the above code… Is it a driver bug?

From what you’ve described, it could be a driver bug (assuming there are no errors in the rest of the code).
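As a sanity check, it may be worth confirming the FBO is complete and that no GL errors come out of the setup and clear calls — a minimal sketch, with fbo standing in for your framebuffer handle:

glBindFramebuffer(GL_DRAW_FRAMEBUFFER, fbo);

// Report incompleteness (e.g. mismatched attachment sample counts).
GLenum status = glCheckFramebufferStatus(GL_DRAW_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE)
    printf("FBO incomplete: 0x%04X\n", status);

// Flush any pending GL errors raised by the preceding calls.
GLenum err = glGetError();
if (err != GL_NO_ERROR)
    printf("GL error: 0x%04X\n", err);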

What is that glClear call with GL_COLOR_BUFFER_BIT included doing in your original post? Was that just a copy/paste error?

Ok, I got it… I removed GL_COLOR_BUFFER_BIT from the glClear call and now it is working. I had assumed glClearBuffer* only sets the values used for clearing the buffer, but glClear(GL_COLOR_BUFFER_BIT) actually performs its own clear of the color values (using the glClearColor state), so it was overwriting what glClearBufferiv had written.
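So the clear code now looks roughly like this:

// Integer clear for draw buffer 0 (the RGBA16I color attachment).
GLint cl[] = {32000, 32000, 32000, 32767};
glClearBufferiv(GL_COLOR, 0, cl);

// Depth is still cleared through glClear, but without GL_COLOR_BUFFER_BIT,
// so it no longer overwrites the integer clear above.
glClear(GL_DEPTH_BUFFER_BIT);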

Now there is one problem. The internal format is GL_RGBA16I. When I clear the buffer with {0, 50, 50, 32767}, the G and B channels come out at the maximum value even though I passed 50. What is the problem here?

The problem is how you’re reading the data. So… how are you reading the data?

I am doing this on a newly created FBO with a texture attached as its color attachment. I then apply this texture to geometry rendered on the system default framebuffer.
So basically, I have these calls:


glTexImage2DMultisample(GL_TEXTURE_2D_MULTISAMPLE, 1, GL_RGBA16I, 32, 32, GL_TRUE);
glFramebufferTexture2D(GL_DRAW_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D_MULTISAMPLE, id1, 0);


and frag shader mentioned above for rendering on newly created FBO.

There seems to have been a failure to communicate.

You are rendering to an integer texture. Then you are reading from that integer texture, either via a sampler in a later rendering operation or with glReadPixels/glGetTexImage.

I’m not asking about how you’re setting up your FBO and rendering to it. I’m asking about how you’re reading the data and thus determining that “it sets G B bits to max values” has happened. I want to see the code wherein you are making that determination.
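For reference, if you were reading it back on the CPU, an integer texture must be read with an *_INTEGER pixel format such as GL_RGBA_INTEGER. A sketch for a plain (non-multisample) RGBA16I texture, since multisample textures can't be read back this way directly; tex is a hypothetical non-multisample texture handle:

GLshort pixels[32 * 32 * 4];            // matches the 32x32 size used above
glBindTexture(GL_TEXTURE_2D, tex);      // tex: hypothetical non-multisample RGBA16I texture
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA_INTEGER, GL_SHORT, pixels);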

[QUOTE=Alfonse Reinheart;1248322]I’m asking about how you’re reading the data and thus determining that “it sets G B bits to max values” has happened. I want to see the code wherein you are making that determination.[/QUOTE]

ok, here is my frag shader for reading that texture:


uniform isampler2DMS tk_diffuseMap;
in vec3 ps_texCoord;
out vec4 fragColor;
uniform int samples;
uniform int factor;

void main(void)
{
    vec2 iTmp = textureSize(tk_diffuseMap);
    vec2 tmp = floor(iTmp * ps_texCoord.xy);
    ivec4 temp;
    vec4 temp1;
    vec4 color;

    for (int i = 0; i < samples; ++i)
    {
        temp = texelFetch(tk_diffuseMap, ivec2(tmp), i);
        temp1 = vec4(temp);
        temp1 = temp1 / 32767;
        color = color + temp1;
    }

    fragColor = vec4(color / samples);
}


I am dividing by 32767 to normalize the data, since my internal format is GL_RGBA16I…

You’re not initializing color. So you end up with a garbage + temp1 = garbage problem.

Don’t know if that’s your complete problem. It just jumped out at me.
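In other words, start the accumulator at zero before the loop — roughly like this, keeping the rest of your shader as it is:

vec4 color = vec4(0.0);
for (int i = 0; i < samples; ++i)
{
    ivec4 texel = texelFetch(tk_diffuseMap, ivec2(tmp), i);
    color += vec4(texel) / 32767.0;   // normalize the 16-bit integer sample
}
fragColor = color / float(samples);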

[QUOTE=Dark Photon;1248365]You’re not initializing color. So you end up with a garbage + temp1 = garbage problem.[/QUOTE]

That's not the issue; I already tried initializing the color variable in the shader.