opengl extension problem

Hi,
I have an Nvidia GeForce 8800 card, but I am apparently using OpenGL version 1.2. I want to use extensions like GL_ARB_multisample and others, but I can't, and I don't know where the problem is. I have tried the following:

  1. Used glee.h and glee.c in the programs.
  2. Tried using glew.h exactly as described on http://glew.sourceforge.net/ (put all the DLL and LIB files in place, etc.).

With all this I never get any compilation error, and my programs run; for instance, the multisampling program runs, but without the multisampling effect. Can somebody please help?
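As a first diagnostic, a minimal sketch like this, run once a context exists (for instance at the top of the display callback), shows which GL you actually got; a vendor string of Microsoft means a software context:

// minimal diagnostic sketch: query the GL strings after context creation
// (on Windows, include <windows.h> before <GL/gl.h>)
#include <stdio.h>
#include <GL/gl.h>

void print_gl_info(void)
{
	printf("GL_VERSION  : %s\n", (const char *) glGetString(GL_VERSION));
	printf("GL_VENDOR   : %s\n", (const char *) glGetString(GL_VENDOR));
	printf("GL_RENDERER : %s\n", (const char *) glGetString(GL_RENDERER));
}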

GeForce 8800 and OpenGL 1.2??? You should have at least OpenGL 2.1 with such a card. Try getting your graphics driver from http://www.nvidia.com; otherwise you may only get software GL.

Well, I have already installed the driver on this machine, and on my other machine, which has an Nvidia Quadro FX 360M card, I have also installed the driver. I tried the following on both machines.

I have made a simple multisampling program, with

glutInitDisplayMode(…|GLUT_MULTISAMPLE);

glEnable(GL_MULTISAMPLE);

I have included glee.h and glee.c (version 5_21), but multisampling doesn't work, though the program runs OK without any errors.
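For comparison, a minimal sketch of that setup which also asks the context how many samples it actually got (GL_SAMPLE_BUFFERS and GL_SAMPLES come with ARB_multisample, so GLee or glew must supply them; if samples comes back 0, glut never created a multisampled pixel format):

// minimal sketch: glut multisample setup plus a sample-count check
#include <GL/glew.h>  // or GLee.h; the extension loader header comes first
#include <GL/glut.h>
#include <stdio.h>

int main(int argc, char **argv)
{
	glutInit(&argc, argv);
	glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_MULTISAMPLE);
	glutCreateWindow("multisample test");
	glewInit();  // the context exists now, so extensions can be resolved

	GLint buffers = 0, samples = 0;
	glGetIntegerv(GL_SAMPLE_BUFFERS, &buffers);
	glGetIntegerv(GL_SAMPLES, &samples);
	printf("sample buffers: %d, samples: %d\n", buffers, samples);

	glEnable(GL_MULTISAMPLE);
	// ... register callbacks and enter glutMainLoop() ...
	return 0;
}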

Try testing for GL errors with glGetError(). Read the related docs.
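For example, a minimal sketch to drain the error queue after the suspect calls (enabling a cap the context does not know should leave GL_INVALID_ENUM there):

GLenum err;
while ((err = glGetError()) != GL_NO_ERROR)
	printf("GL error: 0x%04X\n", err);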

Try to download and run the code at the bottom of this page :
http://nehe.gamedev.net/data/lessons/lesson.asp?lesson=46

If that one has correct multisampling, then it means that your problem comes from glut. Apparently freeglut (a free, open-source implementation of glut) has been patched to provide multisampling, so try it.

I have tried lesson 46 from nehe.gamedev.net and it works perfectly, so as you say, I guess the problem comes from glut. I am going to try freeglut now. Just one question: is freeglut superior to glut, and is it a drop-in replacement?

I tried using freeglut as well, but it still does not work…

Did you do

glutInitDisplayMode(…|GLUT_MULTISAMPLE);
glewInit (); // for glew to initialize the extensions

glEnable(GL_MULTISAMPLE);

?

This is the one thing many beginners forget about glew: you need to call that one function AFTER context creation.
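Spelled out as a minimal sketch, inside your init function once the context is current (GLEW_ARB_multisample is glew's availability flag for that extension):

GLenum err = glewInit();  // must run AFTER the context is created and current
if (err != GLEW_OK)
	printf("glewInit failed: %s\n", (const char *) glewGetErrorString(err));
else if (!GLEW_ARB_multisample)
	printf("GL_ARB_multisample is not available\n");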

Jan.

Greetings everyone!
I have the same problem.
I've built GLee with MinGW.
After lengthy corrections of some amazingly stupid mistakes it compiles and runs with no errors, but glGetString(GL_VERSION) returns 1.1.0, and therefore GLee exposes no extensions. I have no idea what the cause could be.
Edit: GLeeGetErrorString() is blank, as if there were no errors. And no wonder, because GLee loads extensions based on glGetString(GL_VERSION). So the problem is something before GLee.
The test machine's graphics adapter is a GeForce 6600GT, with drivers installed. The extensions viewer detects OpenGL 2.0 and a few functions from 2.1. Still, glGetString(GL_VERSION) gives 1.1.
Might some additional initialization be needed?
Which libraries does the reported OpenGL version depend on?
Has anyone else had similar problems?

By the way, GLee.c seems to call GLeeInit() in every extension-based GL function, like:

void __stdcall _Lazy_glGenQueriesARB(GLsizei n, GLuint * ids) {if (GLeeInit()) glGenQueriesARB(n, ids);}
PFNGLGENQUERIESARBPROC pglGenQueriesARB=_Lazy_glGenQueriesARB;

My C++ is bad, but couldn't that affect performance?!

I don’t think so.

I assume that GLee has its function pointers initialized by default to the "Lazy_gl…" stubs. The first time such a call is made, GLeeInit is called, which initializes the library properly and overwrites the function pointers with the correct ones. The SECOND call to any extension function will therefore call the function directly.

That is a very elegant and robust solution, since it does not require any awareness from the user about when to initialize the library (as glew does). It is just a lot more work for the library writers.
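In miniature, the mechanism presumably looks like this toy sketch (illustrative only, not GLee's actual code):

// toy sketch of lazy function-pointer initialization (not GLee's real code)
#include <stdio.h>

static void real_work(void) { printf("real function\n"); }

static void lazy_work(void);             // forward declaration of the stub
static void (*pwork)(void) = lazy_work;  // pointer starts at the lazy stub

static void lazy_work(void)
{
	pwork = real_work;  // first call: resolve and overwrite the pointer
	pwork();            // forward this first call to the real function
}

int main(void)
{
	pwork();  // goes through the stub, pays the init cost once
	pwork();  // now calls real_work directly
	return 0;
}

So only the very first call pays the initialization cost; every later call goes straight through the pointer, which is why the performance worry above should be unfounded.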

Jan.

to Jan: you're probably right.

I found a similar (no extensions) problem described at http://www.gamedev.net/community/forums/topic.asp?topic_id=373109
Color bits, etc. It didn't help.

It was mentioned there that "if glGetString(GL_VENDOR) returns Microsoft then this means you do not have a hardware accelerated context", and that the pixel format could be the cause. Is that possible?
Is passing the "!PFD_GENERIC_FORMAT" flag (so as not to include it) legal?

Erm, is "!PFD_GENERIC_FORMAT" the code you actually passed? Or was that only shorthand for the real code?

Since this is a flag, if you want to exclude it, you should simply not mention it at all.
If you want to explicitly EXCLUDE it, you need to do this:

unsigned int someflags = …;

someflags &= ~PFD_GENERIC_FORMAT; // AND-ing the inverted mask clears just that bit

You cannot "invert" flags with the ! operator; you need to use the tilde (~) and then AND the result into the other flags to mask the bit out.
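Concretely, assuming the usual <wingdi.h> value PFD_GENERIC_FORMAT = 0x00000040:

someflags |=  PFD_GENERIC_FORMAT;   // set the bit
someflags &= ~PFD_GENERIC_FORMAT;   // clear the bit (~ gives 0xFFFFFFBF)
// !PFD_GENERIC_FORMAT evaluates to 0, so OR-ing it in would do nothing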

But actually omitting it altogether should do the same in this case.

Here is my code to pick a pixelformat:

	int iColors = 24;
	int iBPC    = 8;  // bits per color channel

	PIXELFORMATDESCRIPTOR pfd =
	{
		sizeof (PIXELFORMATDESCRIPTOR),
		1,                          // version
		PFD_DRAW_TO_WINDOW |
		PFD_SUPPORT_OPENGL |
		PFD_DOUBLEBUFFER |
		PFD_SWAP_EXCHANGE,
		PFD_TYPE_RGBA,
		iColors,                    // color bits
		iBPC, 0, iBPC, 0, iBPC, 0,  // red, green, blue bits and shifts
		iBPC, 0,                    // alpha bits and shift
		0, 0, 0, 0, 0,              // accumulation bits (total, r, g, b, a)
		24, 8, 0,                   // depth bits, stencil bits, aux buffers
		PFD_MAIN_PLANE,
		0, 0, 0, 0                  // reserved and layer masks
	};
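That descriptor is then handed to GDI roughly like this (a sketch; hdc is the window's device context):

int pf = ChoosePixelFormat (hdc, &pfd);  // returns 0 if nothing matches
if (pf != 0)
	SetPixelFormat (hdc, pf, &pfd);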

Hope that helps,
Jan.

Wow, what do people learn in programming classes nowadays? Oo I mean, this is the basics of the basics of binary algebra, which every programmer must know!

P.S. Seems I am becoming a real grumbler :stuck_out_tongue:

[quote=“Zengar”]

Wow, what do people learn in programming classes nowadays? Oo I mean, this is the basics of the basics of binary algebra, which every programmer must know!

P.S. Seems I am becoming a real grumbler :stuck_out_tongue: [/QUOTE]
Now it's about "real" programmers? ))… I haven't taken any programming class in my life. Furthermore, I haven't even met a programmer. I am a CG artist. Still, I made my first DirectDraw and OpenGL games (on BCB, though) when I was 15.
Does binary algebra include C++ syntax?

Well, I started programming at age 8, and I started by learning boolean algebra :stuck_out_tongue: I never learned C++, nor am I going to. Well, I am not a "real" programmer myself either; I am a linguist :stuck_out_tongue:

Please don't take me too personally; I never mean to offend (as already said, I am a grumbler). If you are a CG artist and still write programs, that is a feat and I respect it very much. Design is something I could never do; no talent :frowning: Still, if you are going to write programs, you should probably learn some basics. Going by instinct alone is OK, but some training is really useful.

to Jan:
Thank you for the clarification on flags )
I tried your pixel format descriptor, but unfortunately the result was the same. I checked the suitable pixel formats by hand, and they all included PFD_GENERIC_FORMAT.
Considering the freshly reinstalled drivers, and the fact that they work, the remaining explanations are pretty silly.
Has anyone actually tried GLee? It reports no errors, but does it work? Maybe the MinGW WinAPI libraries are somehow cut down…

Wow, 8 years old… I barely knew the multiplication table, and that was it…
Sure, everyone should study. Knowledge is always good. I could know English better (the spell checker literally underlines every third word), and perhaps algebra too…

I haven't used GLee personally; I use glew and it works wonderfully. However, glew only needs one initialization function, and since GLee does the lazy initialization thing, I am a bit confused that it does not work.

At the moment I am pretty much out of ideas, but maybe you should check the following:

Is it possible that you call one of the extension functions BEFORE you have set up your context properly? That would mean that GLee tries to initialize its function pointers without a valid OpenGL context and thus finds nothing, which would leave you stuck at version 1.1 even AFTER the context is available. Make sure that you call no OpenGL functions before the context is complete. Also make sure to have a wglMakeCurrent (hdc, hrc) call immediately after context creation (I assume you work under Windows, no?).
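In outline (a sketch for Windows, error checks omitted), the order that matters is:

// the pixel format has already been set on hdc at this point
HGLRC hrc = wglCreateContext (hdc);
wglMakeCurrent (hdc, hrc);  // the context is current from HERE on
// only now are gl* calls legal, including the first extension call
// that implicitly triggers GLeeInit()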

If everything else fails, PM me and I can send you my whole OpenGL initialization code (for Windows); THAT should work.

Jan.

// pixel format indices are 1-based; DescribePixelFormat returns the maximum index
PIXELFORMATDESCRIPTOR pfd;
int format_count = DescribePixelFormat(hDC, 1, sizeof(pfd), &pfd);

for (int f = 1; f <= format_count; f++)
{
	DescribePixelFormat(hDC, f, sizeof(pfd), &pfd);

	if ((pfd.dwFlags & PFD_GENERIC_FORMAT) && !(pfd.dwFlags & PFD_GENERIC_ACCELERATED))
		MessageBox (NULL, "No support", "…", MB_OK);      // software (GDI generic)

	else if ((pfd.dwFlags & PFD_GENERIC_FORMAT) && (pfd.dwFlags & PFD_GENERIC_ACCELERATED))
		MessageBox (NULL, "Some support", "!!!", MB_OK);  // generic, but accelerated (MCD)

	else if (!(pfd.dwFlags & PFD_GENERIC_FORMAT) && !(pfd.dwFlags & PFD_GENERIC_ACCELERATED))
		MessageBox (NULL, "FULL SUPPORT", "!!!", MB_OK);  // full ICD hardware driver
}

All pixel formats come out as software. There are 26 of them.
Extensions Viewer shows 255 formats, all different, all with WGL_ACCELERATION_ARB = true; wglinfo shows about 50 pixel formats, 30 of them window-capable, most of them capable of hardware acceleration…

FAQ of '99…
http://www.opengl.org/resources/faq/technical/openglfaq.txt
"To use SGI’s OpenGL for Windows you must link with OPENGL.LIB and GLU.LIB instead of OPENGL32.LIB and GLU32.LIB respectively. Also make sure that OPENGL.LIB and GLU.LIB precede GDI32.lib in the library list. Since SGI’s OpenGL for Windows library overloads the ChoosePixelFormat() function in GDI you need to make sure that you link the libraries in the right order so you don’t pick up the wrong version of ChoosePixelFormat().

DescribePixelFormat()? Wrong GDI lib?

First of all, is there a reason why you need to enumerate the pixel formats? If you need an OpenGL-renderable format, throw PFD_DRAW_TO_WINDOW, PFD_SUPPORT_OPENGL and 32-bit RGBA into ChoosePixelFormat and you will get the accelerated format if there is one.
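A sketch of that, with a follow-up DescribePixelFormat call to verify that the format actually chosen is hardware accelerated (generic and not generic-accelerated means pure software):

// sketch: let ChoosePixelFormat pick, then check what it picked
PIXELFORMATDESCRIPTOR pfd;
memset (&pfd, 0, sizeof(pfd));  // needs <string.h>
pfd.nSize      = sizeof(pfd);
pfd.nVersion   = 1;
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 32;
pfd.cDepthBits = 24;

int pf = ChoosePixelFormat (hdc, &pfd);
DescribePixelFormat (hdc, pf, sizeof(pfd), &pfd);
if ((pfd.dwFlags & PFD_GENERIC_FORMAT) && !(pfd.dwFlags & PFD_GENERIC_ACCELERATED))
	MessageBox (NULL, "software format, driver not picked up", "check", MB_OK);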

I don't understand why you only see 26 formats when there should be at least 50 visible. What DC are you using? Please make sure to use an SDI window (not a child window) and specify CS_OWNDC for its class. Hm, did I forget anything?

P.S. The FAQ you quote is about the SGI software implementation that was popular some time ago (it was a fast software implementation) but became obsolete when 3D hardware became common.