
A Problem...



MrShoe
07-27-2001, 07:14 PM
I've always been programming OGL in C; now I'm making the move to C++, but there is a problem. When I try to compile my old proggies as C++, I get the following error:
passing `int *' as argument 2 of `glGenTextures(int, GLuint *)' changes signedness

this corresponds to the line:
glGenTextures(1, &texture[0]);

any ideas?
thanks.

ffish
07-27-2001, 09:19 PM
Maybe your C++ compiler has stronger type checking than your C compiler. Looks like it's because the function is expecting the OpenGL typedef'd equivalent of an unsigned int while you're trying to pass an int. Just change your texture name array to be of type GLuint (or unsigned int).
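Something like this, for example (untested, and the array size and the LoadTexture name are just placeholders):


#include <GL/gl.h>

GLuint texture[1];                 /* was: int texture[1]; */

void LoadTexture(void)
{
    glGenTextures(1, &texture[0]); /* argument 2 is now a GLuint *, so no warning */
    glBindTexture(GL_TEXTURE_2D, texture[0]);
}
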

One more hint - I always surround my OpenGL headers with the following:


#ifdef __cplusplus
extern "C" {
#endif
#include <GL/...>
#ifdef __cplusplus
}
#endif
to give the OpenGL declarations C linkage (no C++ name mangling) when compiling them as C++.

Hope that helps.

Morglum
07-28-2001, 12:19 AM
ok ok
You are trying to call
glGenTextures(1, &texture[0]);
where texture is an array of int, so &texture[0] has type "int *".
But glGenTextures wants a second argument of type "unsigned int *". You have to convert it: just replace your
glGenTextures(1, &texture[0]);
by
glGenTextures(1, (unsigned int *)texture);

If that does not work, then try declaring texture as
unsigned int texture [NumTex];
instead of the
int texture [NumTex];
that you are probably using.

I hope this works.

Michael Steinberg
07-28-2001, 08:29 AM
I don't think it's good to cast a signed pointer to an unsigned pointer, since the underlying data won't be converted (it would probably work, but it's not clean). I would instead change the type of the array, as proposed before.
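To illustrate (just a standalone sketch, nothing OpenGL-specific): the cast only reinterprets the bytes that are already there, it never converts the stored value:


#include <cstdio>

int main()
{
    int value = -1;
    unsigned int *p = (unsigned int *)&value;  /* same bytes, different type */
    std::printf("%u\n", *p);                   /* prints 4294967295 with a 32-bit two's-complement int */
    return 0;
}
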

MrShoe
07-28-2001, 05:18 PM
Ah, thanks all! I changed
int texture[1];
to either
GLuint texture[1];
or
unsigned int texture[1];
both work, thanks again!