Windows Bitmaps

I have a texture loader up and running for parsing uncompressed Windows BMP image files.

I have quite happily got it up and running for files with 24-bit or 32-bit colour modes; however, I have had no joy with the 8-bit or 16-bit colour modes. I know the files are being parsed properly into memory, but I don’t think I am passing the right image format parameters to the glTexImage2D() function. Two of its parameters (format and type) specify how the pixel data is structured.

I must admit it is not urgent or pressing to be able to use 8-bit or 16-bit colour bitmaps; it is just that I am on a roll at the moment and wanted to crack this problem “because I can”. I can class it as digital image processing revision… maybe…

Use DevIL (Developer’s Image Library, formerly called OpenIL):
http://openil.sourceforge.net

8-bit (palettized) colour is not supported by GL. Convert it to 16, 24, or 32 bit.

For 16-bit, you can try:
glTexImage2D(…, GL_RGB5, …, GL_UNSIGNED_BYTE_5_6_5, pixels);

but perhaps just convert to 24 or 32 bit instead.
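
If you do convert, the unpacking is just bit shifts. Here is a minimal sketch in C, assuming the source pixels really are packed 5-6-5 (16-bit BMPs actually default to 5-5-5 unless a BI_BITFIELDS header says otherwise); the function name and buffers are made up for illustration:

/* Expand packed 5-6-5 pixels to 8-bit-per-channel RGB.
   'src' holds pixelCount little-endian 16-bit pixels and
   'dst' must have room for pixelCount * 3 bytes. */
void Expand565ToRGB888(const unsigned short *src,
                       unsigned char *dst,
                       int pixelCount)
{
    int i;
    for (i = 0; i < pixelCount; ++i)
    {
        unsigned short p = src[i];
        unsigned char r = (unsigned char)((p >> 11) & 0x1F);
        unsigned char g = (unsigned char)((p >> 5)  & 0x3F);
        unsigned char b = (unsigned char)( p        & 0x1F);

        /* Replicate the high bits into the low bits so the
           5/6-bit values span the full 0..255 range. */
        dst[i * 3 + 0] = (unsigned char)((r << 3) | (r >> 2));
        dst[i * 3 + 1] = (unsigned char)((g << 2) | (g >> 4));
        dst[i * 3 + 2] = (unsigned char)((b << 3) | (b >> 2));
    }
}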

GL_UNSIGNED_BYTE_5_6_5 and its variants don’t seem to be present in my gl.h file, which was supplied with Visual Studio 2005. Will I need to update my gl.h?

No, gl.h is never updated, but glext.h is.
Download it from http://opengl.org/registry/
You might also want wglext.h (for Windows users) if you want to use certain extensions.

Originally posted by Carl Jokl:
GL_UNSIGNED_BYTE_5_6_5 and its variants don’t seem to be present in my gl.h file, which was supplied with Visual Studio 2005. Will I need to update my gl.h?
Also, your graphics card vendor might provide an SDK which could have a more recent gl.h.

I’m pretty sure nVidia does this.

I have sorted this problem now. I do conversions for any files below 24 bit, so I have support for all uncompressed bitmaps now.
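
For anyone following along, the 8-bit case boils down to a palette lookup per pixel. A rough sketch of the kind of conversion meant here, assuming the BMP colour table has already been read into memory (entries are stored blue, green, red, reserved); the names are illustrative, not the actual loader code:

/* Expand an 8-bit palettized image to 24-bit RGB using the
   BMP colour table (4 bytes per entry: B, G, R, reserved). */
void ExpandPalettedToRGB888(const unsigned char *indices,
                            const unsigned char *colourTable,
                            unsigned char *dst,
                            int pixelCount)
{
    int i;
    for (i = 0; i < pixelCount; ++i)
    {
        const unsigned char *entry = colourTable + indices[i] * 4;
        dst[i * 3 + 0] = entry[2]; /* red   */
        dst[i * 3 + 1] = entry[1]; /* green */
        dst[i * 3 + 2] = entry[0]; /* blue  */
    }
}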

Originally posted by J.R. Hass:
Also, your graphics card vendor might provide an SDK which could have a more recent gl.h. I’m pretty sure nVidia does this.
No, there is no updated gl.h.
It hasn’t changed in 10 years or so.
If you want to update, take all the stuff from glext.h and copy it into gl.h.

Originally posted by Carl Jokl:
GL_UNSIGNED_BYTE_5_6_5 and its variants don’t seem to be present in my gl.h file
That token cannot exist: 5 bits + 6 bits + 5 bits = 16 bits, and one byte only has 8 bits. A 16-bit value is of type SHORT, so the token should be GL_UNSIGNED_SHORT_5_6_5.

By the way, on my system (SUSE Linux 10.0) GL_UNSIGNED_SHORT_5_6_5 is defined in both gl.h and glext.h (both provided by NVIDIA). gl.h contains definitions for GL 1.4 and has a copyright remark for 1999-2002, so it is only 5 years old, not 10 ;)
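
With the corrected token, the 16-bit upload suggested earlier in the thread would look something like this (a sketch, assuming the pixel data really is packed 5-6-5 and that glext.h is included for the token):

/* Upload packed 5-6-5 pixels directly. Needs the GL 1.2-level
   GL_UNSIGNED_SHORT_5_6_5 token from glext.h. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5, width, height, 0,
             GL_RGB, GL_UNSIGNED_SHORT_5_6_5, pixels);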

Actually, I guess that was my mistake.
In glext.h, in the GL 1.2 section, there is the following:

#define GL_UNSIGNED_BYTE_2_3_3_REV        0x8362
#define GL_UNSIGNED_SHORT_5_6_5           0x8363
#define GL_UNSIGNED_SHORT_5_6_5_REV       0x8364
#define GL_UNSIGNED_SHORT_4_4_4_4_REV     0x8365
#define GL_UNSIGNED_SHORT_1_5_5_5_REV     0x8366
#define GL_UNSIGNED_INT_8_8_8_8_REV       0x8367
#define GL_UNSIGNED_INT_2_10_10_10_REV    0x8368

gl.h only includes GL 1.1 stuff, so you won’t find them in the VC++ headers.
I don’t know what other compilers ship.

Stupid question, but why does glext.h exist for Linux?
Why not just ship a brand new gl.h?

I would only say that’s a stupid question if I could completely answer it ;)

To be honest, I don’t know. I don’t know either whether ATI handles the header files the same way NVIDIA does. With my NVIDIA driver I got a gl.h, glx.h, glext.h and glxext.h. The gl.h can apparently be used for both Windows and Linux programming, since it contains platform-independent stuff, so there is a reason to separate gl.h from the rest.

gl.h, by the way, includes glext.h automatically (unless you #define GL_GLEXT_LEGACY before including gl.h), and glx.h includes gl.h, so you actually only have to include glx.h. I guess that is some kind of backwards-compatibility mechanism.
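
In code, the include behaviour described above looks like this:

/* Normal case on Linux: glx.h includes gl.h, which in turn
   includes glext.h automatically. */
#include <GL/glx.h>

/* Alternatively, stop gl.h from pulling in glext.h: */
#define GL_GLEXT_LEGACY
#include <GL/gl.h>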

but why does glext.h exist for Linux?
gl.h defines everything that’s statically exported by the library (opengl32.dll on Windows, libGL.so on Linux).

glext.h defines everything else, that is, everything that has to be queried using GetProcAddress.

It is a common misconception that glext.h and the extension loading mechanism are there to work around the lack of updates on Windows. These mechanisms exist to give applications the chance to work around missing features, instead of crashing at program start because of an unresolved import. This issue exists on every operating system, not just on Windows.
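
To make that concrete, here is a minimal sketch of the run-time loading mechanism on Windows, using glActiveTexture (a post-1.1 entry point) purely as an example:

#include <windows.h>
#include <GL/gl.h>
#include "glext.h" /* for the PFNGLACTIVETEXTUREPROC typedef */

PFNGLACTIVETEXTUREPROC pglActiveTexture = NULL;

/* Call this after a GL context has been created and made current. */
void LoadEntryPoints(void)
{
    pglActiveTexture =
        (PFNGLACTIVETEXTUREPROC)wglGetProcAddress("glActiveTexture");
    if (pglActiveTexture == NULL)
    {
        /* The feature is missing: take a fallback path here
           instead of crashing at load time on an unresolved
           import. */
    }
}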

Originally posted by Overmind:
glext.h defines everything else, that is, everything that has to be queried using GetProcAddress.

It is a common misconception that glext.h and the extension loading mechanism are there to work around the lack of updates on Windows. These mechanisms exist to give applications the chance to work around missing features, instead of crashing at program start because of an unresolved import. This issue exists on every operating system, not just on Windows.
It’s an interesting theory.

Keep in mind that Windows is the leading platform, and since a new opengl32.dll is out of the question, I think wglGetProcAddress became the way out.

The real reason wglGetProcAddress exists is to load extensions, but it became applicable to core functions as well.

Also, when I said a new opengl32.dll, I meant a file with a new name, like:
opengl_1_2.dll
opengl_1_3.dll
opengl_1_4.dll
opengl_1_5.dll
opengl_2_0.dll
opengl_2_1.dll

It’s an interesting theory.
That’s not a theory, it’s a fact.

Believe what you want, but the extension loading mechanism is the only way to give the application a chance to code fallback paths. In your scheme, once an application links against opengl_2_1.dll, it won’t work at all if the driver does not support GL 2.1, with no fallback possible…

Originally posted by V-man:

Stupid question, but why does glext.h exist for Linux?
Why not just ship a brand new gl.h?

Video drivers can be, and are, updated independently from libGL.so on Linux (e.g. the ATI proprietary driver alongside X.org), just as on Windows the ICD contains many functions not exposed through OPENGL32.DLL. So there needs to be some way to define interfaces to, and access, extension functionality specific to the driver. It’s not exactly the same issue on Windows and Linux, but close enough that having a single glext.h for both makes sense.

You can link against the new DLL. When you run your application, check the GL version and the driver will respond.
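
The version check itself is simple once a context is current; a minimal sketch (the GL_VERSION string begins with "major.minor"):

#include <stdio.h>
#include <GL/gl.h>

/* Reads the GL version reported by the driver.
   Only valid with a current GL context. */
void GetGLVersion(int *major, int *minor)
{
    const char *version = (const char *)glGetString(GL_VERSION);
    *major = 1;
    *minor = 0;
    if (version != NULL)
        sscanf(version, "%d.%d", major, minor);
}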

There was a chance to do this on Linux systems. And isn’t libGL.so version 1.2?

Now I’m not sure about Vista. It is supposed to be a 1.4 DLL, right? We would just need a new link library for VC++.

It looks like I might have inadvertently started a heated debate. I have managed to get support for pretty much all uncompressed bitmaps working. This has been done, however, by converting anything below 24 bit to RGB using the colour tables.

The particular code I am working on is a scene which is supposed to look like something from Tron. I am simulating backlight compositing by using 8-bit greyscale bitmaps and then colourising them afterwards (one possible approach is sketched below). It works fairly well, but it does not quite achieve the spectacular glows of real backlight compositing. As long as I get a decent mark for this coursework, though.

Now I have to find or build a Tron Recognizer. I have code to parse .obj files. I did find a Recognizer 3D model, but the author wanted $40 for it. At that price I think I would rather build it myself. It is not as if it is a really complex model.
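
For the colourising step mentioned above, one common fixed-function approach (a sketch, not necessarily what the coursework code does) is to upload the greyscale image as a luminance texture and let GL_MODULATE tint it with the current colour:

/* Tint a greyscale image in fixed-function GL. Assumes 'pixels'
   is width * height bytes of unpacked greyscale data. */
glPixelStorei(GL_UNPACK_ALIGNMENT, 1); /* rows tightly packed */
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE, pixels);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glColor3f(0.2f, 0.6f, 1.0f); /* e.g. a Tron-style blue tint */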
The particular code I am working on is a scene which is supposed to look like something from Tron. I am simulating back light compositing by using 8-bit greyscale bitmaps and then colourising them afterwards. It works fairly well but doesn ot quite achieve the spectacular glows of back light compositing. As long as I get a decent mark though for this coursework. Now I have to find or build a Tron Recognizor. I have code to parse .obj files. I did find a Recognizor 3D model but the authour wanted $40 for it. At that price I think I would rather build it myself. It is not like it is a really complex model.