View Full Version : glTexImage2D() parameter
06-28-2001, 04:27 PM
I am using the following function call:
where data is a Width*Height array of GLfloats.
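(The call itself did not survive in this archive. A plausible sketch of what is being described, assuming the texture was uploaded with a GL_LUMINANCE source format and the legacy integer internal format that the reply about the third parameter suggests was used:)

```c
/* Reconstruction -- the original call was not preserved in the archive.
   The integer internal format (1 = one component) is an assumption,
   matching the reply's remark about the third parameter. */
glTexImage2D(GL_TEXTURE_2D, 0, 1, Width, Height, 0,
             GL_LUMINANCE, GL_FLOAT, data);
```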
It works fine and shows me a grayscale image of my raw data.
Now, if I change GL_LUMINANCE to GL_LUMINANCE_ALPHA, it should make the darker values see-through, right?
But it doesn't; it just shows the whole texture as black!
P.S. Blending is setup and enabled...
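(For reference, the blending setup the poster describes usually looks like this; the actual state calls in the original program are not shown, so this is a sketch of the typical alpha-blending configuration:)

```c
/* Typical alpha-blending setup; assumes a current GL context. */
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
```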
06-28-2001, 05:06 PM
Yeah, I've been wondering how to get that to work as well. Let's say I want an explosion scar. Isn't data supposed to be the explosion scar texture?
06-28-2001, 05:10 PM
hmm... have you looked into using lightmaps at all?
06-28-2001, 07:13 PM
Have you got a reference for glTexImage2D anywhere? It's probably outdated, since the third parameter shouldn't be an integer value any more. From memory, the older MSDN help describes the function with an integer there, so you may want to update your MSDN documentation.
The third parameter is the internal format of the texture, i.e. how OpenGL stores the texture internally. Integer values for the internal format are only kept for compatibility with OpenGL 1.0. What you should have for a luminance-alpha texture is GL_LUMINANCE_ALPHA. You can also specify the bits used for each component, e.g. GL_LUMINANCE8_ALPHA8.
So, in your case, your call should look like this:
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE_ALPHA, Width, Height, 0, GL_LUMINANCE_ALPHA, GL_FLOAT, data);
Of course, I'm assuming you've set up your texture data correctly, with a luminance and an alpha value for each texel?
Hope that helps.