View Full Version : glGet macros always return zero -- even at start

05-03-2011, 06:53 PM
Hello... I'm trying to use the GLM routines in my own OpenGL program. I can compile and link everything OK, but when I try to load a Wavefront OBJ model with textures, the program crashes while loading an image -- the debug output reports that the maximum texture size is zero.

glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxSize) -- in fact any glGet* call -- ALWAYS returns zero for ANY parameter, even at program start. I'd like to be able to use textures, so if you have any suggestions, please let me know. If I run the example program (it's in C, not C++ like my program), glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxSize) works OK and gives a value of 32768.
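For context, the classic cause of this symptom is calling glGet* before any OpenGL context is current: in that state the call is effectively a no-op and leaves the output variable unchanged, so a zero-initialized variable stays zero. Below is a minimal C sketch illustrating the difference; it assumes GLUT for window/context creation (the original post doesn't say which toolkit is used), and the variable name maxTex is illustrative.

```c
/* Sketch: glGetIntegerv only returns real values once a GL context is
 * current. Assumes GLUT; compile with e.g. -lglut -lGL. */
#include <GL/glut.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    GLint maxTex = 0;

    glutInit(&argc, argv);

    /* Too early: no context is current yet, so this call is a no-op
     * and maxTex keeps its initial value of 0. */
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTex);
    printf("before context: %d\n", maxTex);

    glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE);
    glutCreateWindow("probe");

    /* A context is now current: the driver's real limit is returned. */
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTex);
    printf("after context: %d\n", maxTex);
    return 0;
}
```

If the C example program queries the limit only after creating its window while the C++ port queries it earlier (e.g. in a constructor that runs before context creation), that ordering difference alone would produce exactly the all-zero behavior described.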

I'm using the version of GLM from here:

Dark Photon
05-03-2011, 06:57 PM
Don't cross-post.

Thread redirect: here (http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=296766#Post296766)