PDA

View Full Version : Nvidia 8600m GT and OpenGL 3.0 support?



soconne
03-08-2009, 10:10 PM
I have a Dell m1530 with an Nvidia 8600m GT card. I downloaded the desktop 182.06 drivers with a modified inf file so setup would recognize my card. I'm able to create an OpenGL 3.0 context, but I'm unable to initialize any extensions.

If I create the context and then call glGetString(GL_EXTENSIONS), it always returns NULL. If I initialize a normal OpenGL 2.0 context, all extensions are returned properly.

Has anybody had any luck with the 8600m card and OpenGL 3.0? Is there a compatible notebook driver? I've tried 179.48, but it doesn't have support for OpenGL 3.0.

Also, on a side note, I'm using GLEW, and glewInit() returns no error under both 2.0 and 3.0. But GLEW_ARB_vertex_buffer_object is always false, and so is every other extension flag.

randall
03-09-2009, 02:09 AM
Try setting glewExperimental to GL_TRUE before glewInit().
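A minimal sketch of that, assuming GLEW and an already-created GL 3.0 context (context creation omitted; this can't run standalone without one):

```c
#include <GL/glew.h>
#include <stdio.h>

/* Call after the 3.0 context is made current, before using any extension
   entry points. */
int initExtensions(void)
{
    /* glewInit() normally discovers extensions via glGetString(GL_EXTENSIONS);
       that query is deprecated in a forward-compatible 3.0 context, so without
       this flag GLEW leaves every GLEW_* extension flag false. glewExperimental
       tells GLEW to load every entry point it can resolve regardless. */
    glewExperimental = GL_TRUE;
    GLenum err = glewInit();
    if (err != GLEW_OK) {
        fprintf(stderr, "glewInit failed: %s\n", glewGetErrorString(err));
        return 0;
    }
    return 1;
}
```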

soconne
03-09-2009, 07:13 AM
Thanks randall, GLEW_ARB_vertex_buffer_object is no longer false. But calling glGetString(GL_EXTENSIONS) still returns NULL :-/

martinsm
03-09-2009, 07:17 AM
Are you creating a full context or a forward-compatible one? In a forward-compatible context, GL_EXTENSIONS is a deprecated argument for glGetString. Instead of glGetString you should use glGetStringi(GL_EXTENSIONS, i).

soconne
03-09-2009, 07:17 AM
I just noticed that by removing the following attribute pair from the call to wglCreateContextAttribsARB:
WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB

I get all extensions now. That's very odd. Any ideas? Also, if I keep those flags in, nothing gets rendered when I attempt to draw geometry using VBOs. But in a 2.0 context, or with the flags above commented out, the geometry renders.

Here's the code.



void display()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    float colors[] = {
        1.0f, 0.0f, 0.0f,
        0.0f, 1.0f, 0.0f,
        0.0f, 0.0f, 1.0f
    };

    float vertices[] = {
        0.0f,   0.0f,
        100.0f, 0.0f,
        100.0f, 100.0f
    };

    unsigned int indices[] = {
        0, 1, 2
    };

    static GLuint vao = 0;
    static GLuint cbo = 0, vbo = 0;
    if (!cbo && !vbo) {
        if (!GLEW_ARB_vertex_buffer_object)
            printf("vertex_buffer_object not supported\n");

        /*glGenVertexArrays(1, &vao);
        glBindVertexArray(vao);*/

        glGenBuffersARB(1, &cbo);
        glBindBufferARB(GL_ARRAY_BUFFER_ARB, cbo);
        glBufferDataARB(GL_ARRAY_BUFFER_ARB, sizeof(colors), colors, GL_STATIC_DRAW_ARB);

        glGenBuffersARB(1, &vbo);
        glBindBufferARB(GL_ARRAY_BUFFER_ARB, vbo);
        glBufferDataARB(GL_ARRAY_BUFFER_ARB, sizeof(vertices), vertices, GL_STATIC_DRAW_ARB);
    }

    glEnableClientState(GL_COLOR_ARRAY);
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, cbo);
    glColorPointer(3, GL_FLOAT, 0, 0);

    glEnableClientState(GL_VERTEX_ARRAY);
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, vbo);
    glVertexPointer(2, GL_FLOAT, 0, 0);

    //glBindVertexArray(vao);
    //glDrawElements(GL_TRIANGLES, 3, GL_UNSIGNED_INT, indices);
    glDrawArrays(GL_TRIANGLES, 0, 3);

    gltSwapBuffers();
}





Are you creating a full context or a forward-compatible one? In a forward-compatible context, GL_EXTENSIONS is a deprecated argument for glGetString. Instead of glGetString you should use glGetStringi(GL_EXTENSIONS, i).

Thanks, I didn't know glGetString(GL_EXTENSIONS) was deprecated. But I still get NULL for each call to glGetStringi :-(

And yes, I am adding the forward-compatible context attribute to my list of attribs when calling wglCreateContextAttribsARB.

martinsm
03-09-2009, 07:24 AM
Are you using correct values for i? i can range from 0 to the value glGetIntegerv returns for GL_NUM_EXTENSIONS, minus 1.

If nothing gets rendered, try checking the error code of each OpenGL call with glGetError. In a forward-compatible context a lot of functionality is deprecated; read the OpenGL 3.0 spec, Appendix E. For example, the fixed-function pipeline, client-side vertex arrays, and GLSL 1.10 and 1.20 are all deprecated and cannot be used.
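That deprecation list covers exactly what the display() code earlier in the thread relies on: glColorPointer/glVertexPointer and the client-state arrays are fixed-function. A rough sketch of the forward-compatible setup, assuming an already-linked GLSL 1.30 program `prog` with `in` attributes named "position" and "color" (shader compilation omitted; all names here are illustrative, not from the original post):

```c
/* Forward-compatible replacement for the client-state arrays above.
   Assumes a current GL 3.0 context and a linked program `prog`. */
GLuint vao, vbo, cbo;
GLint posLoc = glGetAttribLocation(prog, "position");  /* -1 if not active */
GLint colLoc = glGetAttribLocation(prog, "color");

glGenVertexArrays(1, &vao);
glBindVertexArray(vao);  /* a bound VAO is required; the default VAO 0 is gone */

glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
glVertexAttribPointer((GLuint)posLoc, 2, GL_FLOAT, GL_FALSE, 0, 0);
glEnableVertexAttribArray((GLuint)posLoc);

glGenBuffers(1, &cbo);
glBindBuffer(GL_ARRAY_BUFFER, cbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(colors), colors, GL_STATIC_DRAW);
glVertexAttribPointer((GLuint)colLoc, 3, GL_FLOAT, GL_FALSE, 0, 0);
glEnableVertexAttribArray((GLuint)colLoc);

glUseProgram(prog);
glBindVertexArray(vao);
glDrawArrays(GL_TRIANGLES, 0, 3);
```

Note the matrix-stack calls (glMatrixMode, glLoadIdentity) are also deprecated; the transform would move into the vertex shader as a uniform.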

soconne
03-09-2009, 07:35 AM
I'm querying the extensions like this.



GLint n, i;
glGetIntegerv(GL_NUM_EXTENSIONS, &n);
for (i = 0; i < n; i++) {
    printf("%s\n", glGetStringi(GL_EXTENSIONS, i));
    printf("ERROR: %s\n", gluErrorString(glGetError()));
}


gluErrorString is printing out "Invalid enumerant" for every glGetStringi call.

martinsm
03-09-2009, 08:01 AM
Maybe an error had already happened before glGetStringi? Try calling glGetError before the extension queries to clear any earlier error.

I have no problems querying extensions in this way on my laptop with 8400M GS video card.
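A sketch combining both suggestions: since glGetError can have several errors queued from earlier calls, drain the queue first, then attribute any new error to the query itself (illustrative only; assumes a current GL 3.0 context):

```c
/* Drain stale errors left by earlier GL calls so they aren't blamed on
   glGetStringi. */
while (glGetError() != GL_NO_ERROR)
    ;  /* discard pending errors */

GLint n = 0;
glGetIntegerv(GL_NUM_EXTENSIONS, &n);
for (GLint i = 0; i < n; i++) {
    const GLubyte *ext = glGetStringi(GL_EXTENSIONS, (GLuint)i);
    GLenum err = glGetError();
    if (err != GL_NO_ERROR)
        printf("glGetStringi(%d) error: 0x%04x\n", i, err);
    else
        printf("%s\n", ext);
}
```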

M/\dm/\n
03-09-2009, 11:29 AM
I have the same problem with 8800GT. glGetError returns 0, but glGetStringi returns NULL pointers on forward compatible context. I started a thread not too long ago about GL 3.0 here with source code in it.

BTW, I'm also from Latvia :D

martinsm
03-09-2009, 02:05 PM
Heh, I tried installing the 182.06 driver. Now I also get NULL from glGetStringi. Previously I was using the 181.00 driver.

Great. Now you broke my OGL3 applications :)

Y-tension
03-11-2009, 08:46 AM
I noticed that using normal context creation (not 100% sure about default initialization, but with wglChoosePixelFormat it worked for me) results in a 3.0 context being created. Check it out. Now, I know this shouldn't happen... but anyway, it's there.

Y-tension
03-12-2009, 07:48 AM
Just saw that no context is created when calling wglCreateContextAttribsARB with the forward-compatible bit set. Can anyone verify this? If it happens for anyone else, maybe we can report a driver bug.

EDIT:
No problem after all; I just forgot to end my attribute list with 0. Sorry for the hassle. I'll post if I find any problems with extensions.
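For anyone hitting the same thing: wglCreateContextAttribsARB walks the list as name/value pairs until it sees a 0, so a missing terminator sends the driver reading past the end of the array. A minimal self-contained sketch of a correct list, with the enum values copied from the WGL_ARB_create_context spec since they normally come from wglext.h (the check helper is just for illustration):

```c
#include <assert.h>
#include <stddef.h>

/* Values from the WGL_ARB_create_context extension spec (normally in wglext.h). */
#define WGL_CONTEXT_MAJOR_VERSION_ARB          0x2091
#define WGL_CONTEXT_MINOR_VERSION_ARB          0x2092
#define WGL_CONTEXT_FLAGS_ARB                  0x2094
#define WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB 0x0002

/* Returns 1 if a name/value attribute list of at most maxPairs pairs
   is properly 0-terminated. */
static int attribListTerminated(const int *attribs, size_t maxPairs)
{
    for (size_t i = 0; i < maxPairs * 2; i += 2)
        if (attribs[i] == 0)
            return 1;
    return 0;
}

static const int attribs[] = {
    WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
    WGL_CONTEXT_MINOR_VERSION_ARB, 0,
    WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
    0  /* terminator: leaving this out reproduces Y-tension's failure */
};
```

The list would then be passed as the last argument to wglCreateContextAttribsARB.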