Part of the Khronos Group
OpenGL.org


Thread: OpenGL C++ Texture Loading Troubles


  1. #1
    Newbie Newbie
    Join Date
    Oct 2012
    Posts
    1

    OpenGL C++ Texture Loading Troubles

    As the title suggests, I'm having a hard time loading textures with OpenGL, specifically on Windows (XP). I'm basically just using thecplusplusguy's (youtube.com/thecplusplusguy) code for loading textures, found here. It compiles and runs fine on Ubuntu Linux, as well as Linux Mint, but when I tried to compile it on Windows I got this error: "GL_UNSIGNED_INT_8_8_8_8 was not declared in this scope". So I put "#define GL_UNSIGNED_INT_8_8_8_8 0x8035" at the top of my file. That compiled fine, but when I ran my program, the part of my plane (GL_QUADS) where my texture should be was just completely white. I then tried to #include <GL/glext.h> instead, but that had the exact same effect: just a white plane.

    I tried removing the .jpg image (I also tried the .bmp format) from the folder my program was in, and as expected the program crashed, which I guess is a good thing. I also tried the texture loading function here, and again, just a white plane.

    I'm sorry if I'm using bad terminology or anything, as I am a C++ and OpenGL noob. Any help would be awesome, as I am completely stuck and on a bit of a deadline (this is a project for school). Thanks in advance,

    Peter

  2. #2
    Junior Member Newbie
    Join Date
    Apr 2010
    Posts
    11
    Windows' OpenGL headers are stuck at version 1.1, so the first thing I'd do is solve that problem. While your missing #define was easy to fix, there'll be a lot more of this kind of thing if you don't.

    I'd suggest GLEW, http://glew.sourceforge.net/, since it gives easy access to everything, including extensions. #include "glew.h" instead of "gl/gl.h", then call glewInit() once your context is current, that is, after both wglCreateContext() and wglMakeCurrent() have succeeded.
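    The call order can be sketched like this (a minimal sketch only, assuming GLEW is installed; initGLOnWindows and hdc are placeholder names, not anything from your code):

    ```cpp
    // Sketch of GLEW setup on Windows. Assumes GLEW is installed and you
    // already have a valid device context (hdc) for your window.
    #include <windows.h>
    #include <GL/glew.h>   // include this instead of (or before) <gl/gl.h>

    bool initGLOnWindows(HDC hdc)
    {
        HGLRC hglrc = wglCreateContext(hdc);
        if (!hglrc || !wglMakeCurrent(hdc, hglrc))
            return false;               // context creation failed

        // glewInit() needs a *current* context, so it must come after
        // wglMakeCurrent() has succeeded, not merely after wglCreateContext().
        return glewInit() == GLEW_OK;
    }
    ```

    After that, the 1.2+ enums like GL_UNSIGNED_INT_8_8_8_8 are declared for you and the manual #define goes away.
    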

    Regarding your specific problem, white textures often occur when the texture is incomplete (when using mipmaps, which you aren't) or an error occurred during uploading. You can try calling glGetError() after glTexImage2D() or run it under gDEBugger (http://developer.amd.com/tools/hc/gD...s/default.aspx) to see what's going wrong. In gDEBugger you can also break the program and inspect the texture as loaded into OpenGL to make sure that it's a texture loading problem and not a rendering problem.
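    To make those glGetError() checks readable, a tiny helper like the one below works; the hex values are the standard GL error codes, so it compiles without any GL headers (glErrorName is just my name for it, not a GL function). In real code you'd pass it the result of glGetError() right after glTexImage2D().

    ```cpp
    // Map the standard OpenGL error codes to their names so a check like
    //   std::cout << glErrorName(glGetError()) << "\n";
    // after glTexImage2D() tells you what actually went wrong.
    #include <string>

    std::string glErrorName(unsigned int err)
    {
        switch (err) {
            case 0:      return "GL_NO_ERROR";
            case 0x0500: return "GL_INVALID_ENUM";
            case 0x0501: return "GL_INVALID_VALUE";
            case 0x0502: return "GL_INVALID_OPERATION";
            case 0x0505: return "GL_OUT_OF_MEMORY";
            default:     return "unknown error";
        }
    }
    ```

    With the stuck-at-1.1 headers, GL_INVALID_ENUM right after glTexImage2D() would strongly suggest the type/format combination is the problem.
    
    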

    I've never used SDL nor have I used GL_UNSIGNED_INT_8_8_8_8 before but it might be worth trying GL_BGRA or GL_RGBA instead (depending on the colour order SDL is using). Another thing to look out for is packing and alignment (set with glPixelStorei() using GL_UNPACK_ALIGNMENT and GL_UNPACK_ROW_LENGTH).
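    On the alignment point: by default GL_UNPACK_ALIGNMENT is 4, meaning GL assumes every source row starts on a 4-byte boundary, which silently skews tightly packed RGB images whose row size isn't a multiple of 4. This little standalone helper (paddedRowBytes is purely illustrative) shows the stride GL will assume:

    ```cpp
    // Row stride OpenGL assumes when reading client pixel data with a given
    // GL_UNPACK_ALIGNMENT: the row size rounded up to the alignment.
    int paddedRowBytes(int width, int bytesPerPixel, int alignment)
    {
        int row = width * bytesPerPixel;
        return ((row + alignment - 1) / alignment) * alignment;
    }
    ```

    So for a 5-pixel-wide RGB image, each row is 15 bytes, but with the default alignment GL reads 16 bytes per row and every row after the first is shifted. Either call glPixelStorei(GL_UNPACK_ALIGNMENT, 1) before uploading, or use 4-byte pixels (RGBA), and the problem goes away.
    
    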

  3. #3
    Senior Member OpenGL Guru
    Join Date
    May 2009
    Posts
    4,948
    I've never used SDL nor have I used GL_UNSIGNED_INT_8_8_8_8 before but it might be worth trying GL_BGRA or GL_RGBA instead (depending on the colour order SDL is using).
    Instead? GL_RGBA/BGRA specify the pixel transfer format. GL_UNSIGNED_INT_8_8_8_8 is the pixel transfer type. You can't replace one with the other.
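    To illustrate the distinction without any GL calls (packed8888 and firstByteInMemory are illustrative helpers, not GL functions): with type GL_UNSIGNED_BYTE the format names the byte order in memory directly (R, G, B, A one byte at a time), whereas GL_UNSIGNED_INT_8_8_8_8 treats each pixel as one packed 32-bit integer, so the in-memory byte order depends on host endianness:

    ```cpp
    // GL_UNSIGNED_INT_8_8_8_8 with format GL_RGBA means each pixel is one
    // 32-bit value whose most significant byte is R. On a little-endian x86
    // machine that value sits in memory as A,B,G,R -- the reverse of what
    // GL_UNSIGNED_BYTE + GL_RGBA (bytes R,G,B,A) would mean.
    #include <cstdint>
    #include <cstring>

    // Pack R,G,B,A the way GL_UNSIGNED_INT_8_8_8_8 interprets a pixel.
    uint32_t packed8888(uint8_t r, uint8_t g, uint8_t b, uint8_t a)
    {
        return (uint32_t(r) << 24) | (uint32_t(g) << 16) |
               (uint32_t(b) << 8)  |  uint32_t(a);
    }

    // First byte of that packed pixel as it sits in this machine's memory.
    uint8_t firstByteInMemory(uint32_t px)
    {
        uint8_t b;
        std::memcpy(&b, &px, 1);
        return b;
    }
    ```

    On little-endian hardware the first byte in memory of packed8888(0x11, 0x22, 0x33, 0x44) is the alpha byte, not red. That byte-order mismatch is exactly why an SDL surface whose pixels are plain R,G,B,A bytes usually uploads correctly with GL_UNSIGNED_BYTE but not with GL_UNSIGNED_INT_8_8_8_8.
    
    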

  4. #4
    Junior Member Newbie
    Join Date
    Apr 2010
    Posts
    11
    Quote Originally Posted by Alfonse Reinheart View Post
    Instead? GL_RGBA/BGRA specify the pixel transfer format. GL_UNSIGNED_INT_8_8_8_8 is the pixel transfer type. You can't replace one with the other.
    Oops, sorry -- long day battling with OpenGL fried my brain. What I should have said was "Try GL_UNSIGNED_BYTE instead".
