Problems with rendering to a texture...

I want my texture class to support rendering to a texture… here is the basic code I have so far:

void CTexture::initRenderToTexture(dimension2di size1) {
    size = size1;
    glGenTextures(1, &id);
    glBindTexture(GL_TEXTURE_2D, id);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 0, 0, size.width, size.height, 0);
}

void CTexture::renderToTexture() {
    glBindTexture(GL_TEXTURE_2D, id);
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, size.width, size.height);
}

But this doesn’t work… I have written code for doing this before, but I don’t know what I’ve done wrong this time… looking at my code, it feels like I’m forgetting something obvious…

What am I doing wrong?

The internal format of the data you are copying is probably causing you a problem. Your third parameter, GL_RGB, is the internal format; make sure it matches what you actually render.

If you call glGetError(), OpenGL will probably tell you this.
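A minimal sketch of that kind of check (checkGLError is a made-up helper name; it assumes a current GL context):

#include <cstdio>
#include <GL/gl.h>

// Report every pending GL error (more than one can be queued).
// Call this right after a suspect GL call.
void checkGLError(const char* where) {
    for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
        std::fprintf(stderr, "GL error at %s: 0x%04x\n", where, err);
}

For example, call checkGLError("initRenderToTexture") right after the glCopyTexImage2D call.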

Your glCopyTexSubImage2D call looks OK; it assumes a texture format has already been chosen, so no format parameter is needed. For your glCopyTexImage2D call I would specify GL_RGB8 (or whatever it is you use) as the internal format.
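Something like this, as a sketch (assuming GL_RGB8 matches what you render):

// Same init call, but with an explicitly sized internal format.
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, 0, 0, size.width, size.height, 0);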

And you did enable texturing?

What do you mean by “it doesn’t work”?
Are you getting an empty texture, is it skewed, corrupted…?

Things you could check (see the sketch after this list):

- are you reading from the correct buffer? (glReadBuffer())
- the pack-alignment pixel-store parameter, for widths that are not a multiple of 4
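A sketch of both checks (GL_BACK is assumed to be the buffer you render to):

// Make sure the copy reads from the buffer you actually render to.
glReadBuffer(GL_BACK);
// Byte-align row packing in case the width is not a multiple of 4.
glPixelStorei(GL_PACK_ALIGNMENT, 1);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, size.width, size.height);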

P.S. I noticed you pass size1 as a parameter to your initRenderToTexture function but use size in the implementation; I assume size is a member and the size = size1 assignment covers it, but it’s worth double-checking.

Nico

Why do you copy the image both on “init” and on “update”? It would seem that you only need to copy the data on “update”. I also don’t see you actually creating the texture levels anywhere – that first image copy is likely to fail. Perhaps you meant “glTexImage2D()” at that point?
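A sketch of that split, with glTexImage2D() defining the level once and the update only copying (the format/type values here are assumptions):

// init: define level 0 once, with no initial data
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, size.width, size.height, 0,
             GL_RGB, GL_UNSIGNED_BYTE, NULL);

// update: copy the framebuffer into the already-defined level
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, size.width, size.height);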

An error like that missing allocation is easily caught if you litter your code with “assert(!glGetError());” absolutely EVERYWHERE! Putting this at the beginning and end of each function that calls GL is a good start.
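A minimal sketch of that pattern (the CHECK_GL macro name is just for illustration, and the member function is shown as a free function to keep the sketch self-contained):

#include <cassert>
#include <GL/gl.h>

// Trips the assert if any GL error is pending.
// Caveat: under NDEBUG, assert() compiles away, and so does the
// glGetError() call inside it.
#define CHECK_GL() assert(glGetError() == GL_NO_ERROR)

void renderToTexture(GLuint id, int width, int height) {
    CHECK_GL();  // errors left over from earlier GL calls
    glBindTexture(GL_TEXTURE_2D, id);
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, width, height);
    CHECK_GL();  // errors from the calls above
}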

Originally posted by jwatte:
I also don’t see you actually creating the texture levels anywhere – that first image copy is likely to fail. Perhaps you meant “glTexImage2D()” at that point?
It looks perfectly fine to me. Remember that glCopyTexImage2D IS glTexImage2D, except that it takes its pixel data from the frame buffer instead of system memory.

You’re right; I read that as CopyTex_Sub_Image, which was not what it actually says.

Meanwhile, I think a bunch of assert(!glGetError()) calls would still help find issues :)

What I get is just a blank black texture… The code I posted is the only render-to-texture code in my texture class… I feel like I’m missing something… :/

Hmm, is your texture a power-of-two size?
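A quick way to check, as a sketch (assumes the driver has no non-power-of-two texture support):

// True if x is a positive power of two (classic bit trick).
static bool isPowerOfTwo(int x) {
    return x > 0 && (x & (x - 1)) == 0;
}

Then something like assert(isPowerOfTwo(size.width) && isPowerOfTwo(size.height)); at the top of initRenderToTexture would catch it.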

Originally posted by jwatte:
Meanwhile, I think a bunch of assert(!glGetError()) calls would still help find issues :)
For sure, I use it all the time nowadays, after I saw it in one of your postings. It’s a lifesaver. Not only does it catch the obvious mistakes, it also helps find inconsistencies between driver versions: one driver version might silently accept faulty code while another fails it.

Originally posted by jwatte:

Meanwhile, I think a bunch of assert(!glGetError()) calls would still help find issues :)

How would I do that?