glGenTextures touching my memory?

Hello,
I’ve got a somewhat annoying problem.
I’m currently working on a class that parses a script and then draws a menu depending on what the script looks like.
Parsing the script seems to work fine, but one variable ends up with the value 0 instead of 5 (the script specifies 5).
After some debugging, the configuration file appears to be parsed just fine, but the variable is somehow ‘reset’ after the glGenTextures call.
Debug output (before and after the glGenTextures call):

HAlignment: 5
HAlignment: 0

Parsing code:

	else if(input.find("HorizontalAlign") != string::npos && input.find("(") > input.find("HorizontalAlign"))
	{
		Align alignment = getAlignment(input);
		if(alignment.hAlignNum == SPECIFY_COORD)
		{
			menuAlignment.hAlignNum = SPECIFY_COORD;
			menuAlignment.left = alignment.left;
		}
		else
			menuAlignment.hAlignNum = alignment.hAlignNum;
	}

I don’t know whether it’s my code or OpenGL that is causing the problem.
But the only time I set the variable is during parsing.
Every other access is just a comparison (all of which happen after the glGenTextures call).
Could anyone point me in the right direction on how to fix this?
Thanx Hylke

perhaps in one of your comparisons you accidentally used = instead of == ?

glGenTextures shouldn’t be doing anything to your variable.

try cleaning and rebuilding your entire project.
perhaps some symbols got mixed up or something else wacky.

…just guessing.

Right now I am writing my own memory manager, and let me tell you, there are plenty of ways you can get such results. It doesn’t need to be visible in the code (= instead of ==); you might write over the boundaries of some array or such.

Of course, it COULD be that glGenTextures does something bad, but in all the years I have used OpenGL I have never had such a problem, so I would say it is safe to assume that drivers are error-free in that respect.

Most certainly it’s somewhere in your code. And, yes, everybody thinks “it can’t be my code!” once in a while and a bit later you find out “oh, it WAS my code…”.
It’s frustrating though.

Jan.

glGenTextures(numTextures, &texIndex);
what’s numTextures set to, and what IS texIndex?

Watch out!
If you create only one texture you should use
glGenTextures(1, &texIndex).

If you create many textures, texIndex should be an array of GLuints:
glGenTextures(numTextures, texIndex). Note that there is no & (address-of) operator here.

And finally, check your array length against the value of numTextures.

yooyo

Originally posted by yooyo:
[b]Watch out!
If you create only one texture you should use
glGenTextures(1, &texIndex).

If you create many textures, texIndex should be an array of GLuints:
glGenTextures(numTextures, texIndex). Note that there is no & (address-of) operator here.

And finally, check your array length against the value of numTextures.

yooyo[/b]
I think you spotted something interesting.

However, this might not always be true, especially when using references:

GLuint tid[MAX]; // or any other array allocation
GLuint& a_tid = tid[0];

glGenTextures (nb_to_generate, &a_tid);

I know this is tricky (or bent :wink: ) but this sort of syntax ensures your code works for any number of textures to be generated. Also, this only works in C++, not C (obviously).

Anyway, a more interesting approach is to use arrays all the time, so that we end up with something like this:

GLuint tid[nb];
glGenTextures (nb, tid);

This sort of code is indeed less exposed to errors.

Hylke:

If you’re using visual studio, you can put a breakpoint on the data location. It’ll break at the code that’s trying to change your variable.

If you’re using VS.net, go under the “New Breakpoint” dialog box, under the “Data” tab, set:

Variable: the hex value of &menuAlignment.hAlignNum
Items: 1
Context: (make sure this is blank).

press OK, and continue running the application.

Make sure you put the breakpoint at the memory address of menuAlignment.hAlignNum. You can get this by adding &menuAlignment.hAlignNum to your watches window.

If you’re on linux, try using GDB, although I wouldn’t be of much help to you there.

Hope this helps.
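For the record, the gdb equivalent of a VS data breakpoint is a watchpoint. A sketch of such a session, assuming a debug build and that menuAlignment is in scope when the watchpoint is set (file and binary names here are just placeholders):

```
$ g++ -g -O0 menu.cpp -o menu
$ gdb ./menu
(gdb) break main
(gdb) run
(gdb) watch menuAlignment.hAlignNum
(gdb) continue
```

gdb should then stop at the exact statement that writes to hAlignNum, whether that’s your own code or something scribbling over it.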

Originally posted by Aeluned:
[b]perhaps in one of your comparisons you accidentally used = instead of == ?

glGenTextures shouldn’t be doing anything to your variable.

try cleaning and rebuilding your entire project.
perhaps some symbols got mixed up or something else wacky.

…just guessing.[/b]
I’ve searched through the whole file but I can’t find anything that could be touching menuAlignment.

menuAlignment is not an array and doesn’t have any member arrays, so that can’t be the problem either.

texIndex is a GLuint, and the value of numTextures depends on your script, but in my current script it’s 3.

Watch out!
If you create only one texture you should use
glGenTextures(1, &texIndex).

If you create many textures, texIndex should be an array of GLuints:
glGenTextures(numTextures, texIndex). Note that there is no & (address-of) operator here.

And finally, check your array length against the value of numTextures.
I just have:
GLuint texIndex;
and for every other texture I just use:
texIndex + i (where i is not bigger than numTextures-1)

I’m using Linux, and indeed gdb did not give me much information because the program didn’t crash.
I also tried valgrind, and it gave me thousands of lines of errors from my nvidia driver (I can post some if it’s important).
Hylke

So, you’re basically corrupting your memory.

If you declare a single GLuint, you can only give a maximum of 1 to glGenTextures.

If you have a maximum of maxN textures, you should declare an array of maxN GLuints and request N of them (with N <= maxN), that way:

GLuint texIndices[maxN];
glGenTextures(N, texIndices);

By the way, this is not an advanced OpenGL question, but a basic C/C++ one.

Y.

I can’t believe it.
That fixed the bug.
Thank y’all very much.
Hylke

Holy mother of moses.

indeed.

i see the advanced forum has gone to hell in a handbasket…

it’s not the donker fella that makes me despair, he’s just thick - it’s aeluned, jan, jide and canuckle!
How many gl.org contributors does it take to change a light bulb? - if it were left up to those 4 they’d have re-wired the house before realising the bulb was gone.

GL_LIGHT … _BULB?

GLuint bulb;
glDisable(GL_LIGHT0); // always turn off the light, so you won’t get electrocuted :wink:
glGenLightBulb(1, &bulb);
glLightParameteri(GL_LIGHT0, GL_LIGHT_BULB, bulb);
glEnable(GL_LIGHT0);

:smiley:

Originally posted by knackered:
it’s not the donker fella that makes me despair, he’s just thick - it’s aeluned, jan, jide and canuckle!
How many gl.org contributors does it take to change a light bulb? - if it were left up to those 4 they’d have re-wired the house before realising the bulb was gone.

Ahah, you’re so funny, Knackered. But I’m pretty sure you’ve never changed a light bulb in your life: this smells blond-brained (from the obviously well-known joke). Anyway, I’m proud not to share your logic :smiley:

When reading your post, I can’t figure out how you got from the first paragraph to the second one: there’s absolutely no logical link at all!!

PS: stop posting off-topic stuff. We’re not here to hear about your feelings.

Hey jide, don’t take offense - you didn’t approach the problem in the best way, you missed the obvious, so I made a friendly joke about it - take it on the chin like a man.

:wink: