Help with Texture formatting

I’m trying to take a texture out of an IplImage structure. In the following code, try to ignore the OpenCV functions; I only included them to illustrate what I’m trying to do. However, the texture I come up with keeps displaying as a white image. All the example code I’ve seen loads the picture in from a file, so knowing that I’m trying to pass in pieces of a pre-made structure as a texture, am I missing something fundamental in my code?

//Cv Functions for working with a webcam
cvGrabFrame( capture );
frame = cvRetrieveFrame( capture );

//Resizes the IplImage frame into camTexture which is of a preset size (power of 2).
cvResize(frame, camTexture);

//Corrects flipping caused by the copy.
cvFlip(camTexture, NULL, 0);

//Binds texture name 1 and uploads the resized frame as a 2D texture.
glBindTexture(GL_TEXTURE_2D, 1);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, camTexture->width, camTexture->height, 0, GL_RGB, GL_UNSIGNED_BYTE, camTexture->imageData);

Explanations of code snippets (at least, what I THINK they do):
camTexture->height : part of the IplImage structure indicating the height of the image (int).
camTexture->width : ditto… only for width.
camTexture->imageData : a “pointer to the aligned image data” (char*).

Any advice or insights? Anybody?

Specify a GL_RGB8 instead of GL_RGB for your internal texture format.
What does glGetError() return?
Did you enable texturing glEnable(GL_TEXTURE_2D)?
Did you specify filtering and texture environment parameters correctly (glTexParameter/glTexEnv)?
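
For reference, a minimal setup covering those points might look like this (just a sketch, assuming a current GL context and a power-of-two RGB image; pixels/width/height stand in for your own image data):

// Sketch only -- pixels, width, height are placeholders for your data.
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); // non-mipmap filter
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);      // or GL_DECAL/GL_REPLACE
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, pixels);
glEnable(GL_TEXTURE_2D);
// and somewhere after drawing:
// GLenum err = glGetError(); // should be GL_NO_ERROR (0)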


I switched the internal format to GL_RGB8 as recommended; no change. I also changed the input format to GL_BGR_EXT, since the CV functions (apparently) store the image in BGR order.

I do indeed enable texturing with glEnable(GL_TEXTURE_2D). I run it inside a loop: I enable before drawing on each iteration of the loop, and disable after the loop is done.

glGetError() returns 0 both before and after the glTexImage2D call.

Finally, my texture initializations are as follows:
glBindTexture(GL_TEXTURE_2D, 1);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);

What does glGetError() return after rendering? That is, after glEnd() or glDraw*()?

What is the size of the texture? Is it a power of two in width and height?

If you suspect that your texture loading code is failing, create a test texture by hand that you know should show up. I use the checker texture myself: http://www.sgi.com/software/opengl/examples/redbook/source/checker.c
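
The relevant part of that file looks roughly like this (adapted from the linked checker.c; drop it in place of your webcam data to see if the rest of your pipeline works):

// Adapted from the redbook checker.c linked above.
#define checkImageWidth 64
#define checkImageHeight 64
static GLubyte checkImage[checkImageHeight][checkImageWidth][3];

void makeCheckImage(void)
{
    int i, j, c;
    for (i = 0; i < checkImageHeight; i++) {
        for (j = 0; j < checkImageWidth; j++) {
            c = (((i & 0x8) == 0) ^ ((j & 0x8) == 0)) * 255; // 8x8 black/white squares
            checkImage[i][j][0] = (GLubyte) c;
            checkImage[i][j][1] = (GLubyte) c;
            checkImage[i][j][2] = (GLubyte) c;
        }
    }
}

// then upload it in place of camTexture->imageData:
makeCheckImage();
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, checkImageWidth, checkImageHeight,
             0, GL_RGB, GL_UNSIGNED_BYTE, checkImage);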


glGetError() still returns 0.

The test program you linked to runs perfectly, but I should also make clear that my project is a Win32 app, and therefore I am (regretfully) having to avoid GLUT altogether. I doubt that makes a difference, but I felt I should mention it anyway.

However, I am going to take the code for ‘checker’, and alter it to attempt to do the same thing I’ve been trying all along. If THAT works, I’ll know for a fact that I’m messing something up in the initializations… or else I just don’t know what I’m doing.

I’ll post the results of my efforts here (including the complete functions I have altered) when I’m done.

Originally posted by theUnknowing:
The test-program you linked to runs perfectly, but I should also make clear that the project I am making is a Win32 App, and therefore, I am (regretfully) having to avoid GLUT altogether.

Oh, sorry for the confusion; I didn’t mean that you should run the app. In the app they create a temp texture by hand. Just copy and paste that code into yours to see if it works.

Well, after staring at the code for two hours, picking through the memory-dump display in hexadecimal, viewing the contents of variables, arrays, and my soul, I have FINALLY got the setup working, due in large part to the code that you gave me as a template. Here is what the problem was (this is one of those “DOH” moments):

MY CODE:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

glTexImage2D(…stuff…);

SAMPLE CODE:

glTexImage2D(…stuff…);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

Once I put these two lines AFTER my texture is declared, the image is rendered perfectly as a texture and displayed in just the way I want. So for future reference, these lines ARE initializations, but they initialize the textures that are ALREADY LOADED.

I’m still not COMPLETELY clear on the purpose of these functions (nobody EVER writes documentation for the layman), but it is apparent that they are routines to set up a texture that is already loaded into memory. Because of this, and since I am continuously loading a new texture into memory (every loop iteration, a new frame is grabbed from the webcam and made into a texture), I must continuously call these initializations as well.
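
In case it helps anyone else, here’s roughly what my per-frame path looks like now (a sketch from memory; camTexture is the same pre-sized IplImage from my first post, and ‘rendering’ stands in for my actual loop condition):

while (rendering) {
    // grab a frame and massage it into the power-of-two BGR buffer
    cvGrabFrame(capture);
    frame = cvRetrieveFrame(capture);
    cvResize(frame, camTexture);
    cvFlip(camTexture, NULL, 0);

    // re-upload and re-initialize the texture every iteration
    glBindTexture(GL_TEXTURE_2D, 1);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8,
                 camTexture->width, camTexture->height, 0,
                 GL_BGR_EXT, GL_UNSIGNED_BYTE, camTexture->imageData);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // ... draw the textured quad ...
}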

Sometimes it’s the easy, stupid thing that makes a program non-functional. :P

In any case, thank you VERY much for your help and input. It made my task much easier (hm… 3 days to track down a mis-ordering of code). And just so you know, of all the places I’ve posted help questions (the OpenCV Yahoo forum, NeHe’s OpenGL forum, and here), you are the ONLY person to try and help me.

*shakes hand* Thank you.

theUnknowingALittleLess

Hmmmm, in my code there are no problems if you specify the min/mag filters before creating the texture. What’s more, if you specify a mipmapped min filter, you must call gluBuild2DMipmaps or use the GL_GENERATE_MIPMAP_SGIS parameter instead. It seems wrong to require setting the parameters after specifying the texture, because then the following code would be perfectly valid:

gluBuild2DMipmaps(…);
glTexParameteri(…, GL_TEXTURE_MIN_FILTER, GL_NEAREST); // doesn’t make sense

Must check the docs.

Well, it turns out the REAL trick is that I’m running glTexImage2D in a loop, so the initializations DO have to be called each and every time I run that function. But I’ve since been told that if I want to avoid doing that, I should look into using glTexSubImage2D instead.

I haven’t read up on it yet, but from the name, I assume it takes a previously created & initialized texture, and replaces it with another one with the same settings. It figures that there’d be a pre-made function to do exactly what I want, only I didn’t know about it, so I ended up doing it the hard way.
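
Glancing at the reference, the idea would presumably be something like this (an untested sketch; texWidth/texHeight stand in for the preset power-of-two size):

// once, at init: allocate the texture, but pass no data yet
glBindTexture(GL_TEXTURE_2D, 1);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, texWidth, texHeight, 0,
             GL_BGR_EXT, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// every frame: just replace the pixels, keeping all the settings
glBindTexture(GL_TEXTURE_2D, 1);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, texWidth, texHeight,
                GL_BGR_EXT, GL_UNSIGNED_BYTE, camTexture->imageData);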

Oh well. Live and learn, or die in ignorance.

Originally posted by theUnknowing:
places I’ve put out help questions (OpenCV Yahoo forum, Nehe’s OpenGL forum, and here) you are the ONLY person to try and help me.

No problem. Next time you have a problem, see if you can help someone else solve a problem while you’re here.


Originally posted by theUnknowing:
<…>
Here is what the problem was (this is one of those “DOH” moments):

MY CODE:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

glTexImage2D(…stuff…);

SAMPLE CODE:

glTexImage2D(…stuff…);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

Once I put these two lines AFTER my texture is declared, the image is rendered perfectly <…>
This is wrong implementation behaviour. You must call glBindTexture before being able to modify ‘your’ texture object, but that’s it.

What graphics card are you using?

In answer to your question, I do call glBindTexture in my code; I just left it out of that sample.

However, I’ve since had another revelation that apparently is the ULTIMATE cause of my problems. Once again, a DOH moment.

I used a template for a Win32 application that I took out of some sample code. On the whole it’s pretty simple, and it makes it very easy to jump into coding without having to worry about the mechanics of creating a window or dealing with messages.

To make the window object that is created compatible with OpenGL, the writer of the template provided a quick function call that properly configures the window for rendering.

And here comes the stupid little problem: all the initializations I was running to prepare for textures (not to mention depth-buffering, etc.) were called BEFORE the window was initialized for GL. So when the window WAS set up, my settings were either ignored or altered to point at NULLs.
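
For anyone with the same problem, the ordering that matters is roughly this (a sketch; error checking omitted, and hWnd stands in for your window handle):

// 1. Create the GL context FIRST.
HDC hDC = GetDC(hWnd);
PIXELFORMATDESCRIPTOR pfd = {0};
pfd.nSize      = sizeof(pfd);
pfd.nVersion   = 1;
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 24;
pfd.cDepthBits = 16;
SetPixelFormat(hDC, ChoosePixelFormat(hDC, &pfd), &pfd);
HGLRC hRC = wglCreateContext(hDC);
wglMakeCurrent(hDC, hRC);

// 2. Only NOW do GL calls actually take effect.
glEnable(GL_TEXTURE_2D);
glEnable(GL_DEPTH_TEST);
glBindTexture(GL_TEXTURE_2D, 1);
// ... glTexParameteri, glTexImage2D, etc.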

This one goes out to all the others who have spent 3-4 days tracking down a line-ordering problem in their code:

COMMENT YOUR DAMN CODE. It saves on embarrassment.

theUnknowing
“Engineering: Keeping the beer population in check for the past 100 years.”

P.S. My graphics card is an NVIDIA GeForce 4 (AGP 8x).