Help with multitexturing

I’m trying to write a small program that draws a multitextured quad.

The code crashes on the line:
glActiveTextureARB(GL_TEXTURE0_ARB);

There is a global:
PFNGLACTIVETEXTUREARBPROC glActiveTextureARB = NULL;

I check that the extension is supported and then I call:
glActiveTextureARB = (PFNGLACTIVETEXTUREARBPROC)
wglGetProcAddress("glActiveTextureARB");

Anyone know what is wrong?

Thanks once again for the help.

Header:

#define GL_TEXTURE0_ARB 0x84C0

typedef void (APIENTRY * PFNGLACTIVETEXTUREARBPROC)(GLenum target);

Init:

PFNGLACTIVETEXTUREARBPROC glActiveTextureARB = NULL;

// GL_ARB_multitexture
if(QueryExtension("GL_ARB_multitexture") == true)
{
    glActiveTextureARB = (PFNGLACTIVETEXTUREARBPROC)wglGetProcAddress("glActiveTextureARB");
    if(glActiveTextureARB == NULL)
    {
        Error_ExitProgram();
    }
}
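
(QueryExtension() is just a helper that looks for the complete token in the GL_EXTENSIONS string; a minimal sketch of such a check could look like this, though the exact name and signature are only an assumption:)

#include <string.h>
#include <GL/gl.h>

// Returns true if 'name' occurs as a complete, space-delimited token in
// the GL_EXTENSIONS string. Needs a current rendering context, otherwise
// glGetString() returns NULL.
bool QueryExtension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    if(ext == NULL)
        return false;

    const size_t len = strlen(name);
    const char *pos  = ext;

    while((pos = strstr(pos, name)) != NULL)
    {
        bool startOK = (pos == ext) || (pos[-1] == ' ');
        bool endOK   = (pos[len] == ' ') || (pos[len] == '\0');

        if(startOK && endOK)
            return true;

        pos += len;
    }

    return false;
}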

Are you doing it in a similar way?

Diapolo

I did some more investigating.

There are a few things that are wrong and I don't know why. I don't have the code in front of me, but here is the gist of it.

glGetString(GL_EXTENSIONS) only returns 3 or 4 extensions, but the same call in another program returns dozens of extensions. I am using the same glext.h file in both programs. Other than that, I do everything the way you have it.

Originally posted by lucidmm:
glGetString(GL_EXTENSIONS) only returns 3 or 4 extensions, but the same call in another program returns dozens of extensions. I am using the same glext.h file in both programs. Other than that, I do everything the way you have it.

The version of glext.h isn't important; it has nothing to do with the number of extensions you get back from a call to glGetString(GL_EXTENSIONS).

A possible reason is that one program is hardware accelerated and the other is running in software; use glGetString(GL_VENDOR) to check.
Make sure you have a valid OpenGL rendering context first.
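Roughly like this, right after the window and context are created (just a sketch, the function name is made up):

#include <stdio.h>
#include <GL/gl.h>

// Call this once the rendering context is current (e.g. right after
// glutCreateWindow()). "Microsoft Corporation" / "GDI Generic" means you
// are on the unaccelerated software implementation.
void PrintGLInfo(void)
{
    printf("GL_VENDOR string is: %s\n",   (const char *)glGetString(GL_VENDOR));
    printf("GL_RENDERER string is: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GL_VERSION string is: %s\n",  (const char *)glGetString(GL_VERSION));
}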

Right, make sure you are not rendering via the MS OpenGL implementation. In that case the strings are:
GL_VENDOR string is: Microsoft Corporation
GL_RENDERER string is: GDI Generic

Diapolo

The glGetString calls for GL_VENDOR and GL_RENDERER returned:

GL_VENDOR string is: Microsoft Corporation
GL_RENDERER string is: GDI Generic

I am using glut and I had the function call:
glutInitDisplayMode ( GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_ALPHA);

When I changed the call to:
glutInitDisplayMode ( GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);

everything worked and GL_VENDOR returned NVIDIA. Can someone explain what happened?
Thanks a lot for all the help.

You’re not running in 16 bit, are you?

– Tom

Yep, what does that do?

You can’t get destination alpha in 16 bits, so if you request it, it’ll probably boot you into software mode. People reported exactly the same problem to me when I recently released a per-pixel lighting demo on my site (www.delphi3d.net).
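
One way to guard against it is to ask for destination alpha only when the desktop is deep enough. On Win32 something along these lines would do (just a sketch, the helper name is mine):

#include <windows.h>
#include <GL/glut.h>

// Request destination alpha only if the desktop runs at more than 16 bpp;
// otherwise a pixel format with alpha will likely fall back to software.
void InitDisplayMode(void)
{
    HDC screen = GetDC(NULL);
    int bpp    = GetDeviceCaps(screen, BITSPIXEL);
    ReleaseDC(NULL, screen);

    unsigned int mode = GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH;
    if(bpp > 16)
        mode |= GLUT_ALPHA;

    glutInitDisplayMode(mode);
}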

I don’t get it. This is the 21st century, for cryin’ out loud! Why are so many people still running their displays in dark-age color depths?

– Tom

Thanks. I put it in 32 bit mode.

I don’t get it. This is the 21st century, for cryin’ out loud! Why are so many people still running their displays in dark-age color depths?

Abandon the option for 16-bit color depth; use fullscreen and only allow 32-bit modes.

Diapolo

You don’t have to go fullscreen; you can just change the bit depth using ChangeDisplaySettings, then create your window.
I know it’s rude, but sod 'em.
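
Something along these lines before creating the window (a sketch; the function name is mine, and remember to call ChangeDisplaySettings(NULL, 0) on exit to restore the user's settings):

#include <windows.h>

// Switch the desktop to 32-bit colour without touching the resolution.
// Returns true on success.
bool ForceDesktopTo32Bit(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize       = sizeof(dm);
    dm.dmBitsPerPel = 32;
    dm.dmFields     = DM_BITSPERPEL;

    return ChangeDisplaySettings(&dm, 0) == DISP_CHANGE_SUCCESSFUL;
}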

You are right, but I dislike windowed mode.

Diapolo

I’m the other way round: if I download a demo, I can’t help but sigh when my monitor clicks as it changes my resolution, especially if I’ve got lots of apps running at the same time, as the swap time becomes very annoying (and if I’m lucky, they won’t have messed up all my icon positions).