
View Full Version : NVIDIA Driver Bug?



ViolentHamster
05-20-2009, 06:44 AM
I downloaded the ogl2brick example from here (http://3dshaders.com/home/index.php?option=com_weblinks&catid=14&Itemid=34).

If I use the GLEW library I get the red brick wall as expected, but if I remove the GLEW linkage and the call to glewInit, the wall is black. Removing GLEW should make the program use the calls supplied in the NVIDIA headers, right? Am I missing something?

I'm using 64-bit openSUSE and the NVIDIA 180.51 driver (OpenGL 3.0.0).

dletozeun
05-20-2009, 07:42 AM
If it is a driver bug, the Mesa3D driver is affected too, since I got almost the same issue without using GLEW, as I said in your last thread. IMO, it is more likely that the problem comes from the ogl2brick application itself.

Does that happen with other simple applications that use GLSL shaders?

ViolentHamster
05-20-2009, 08:06 AM
I guess it could be some sort of GLUT issue. I tried writing a simple application using GLUT that accessed a texture in a shader, but once again everything was black.

It's hard to debug something that doesn't give errors and renders completely black objects.

Thanks for your help, dletozeun.

ViolentHamster
05-20-2009, 09:16 AM
I figured out the problem with my program. GL_TEXTURE_MIN_FILTER was set to GL_NEAREST_MIPMAP_NEAREST, but I wasn't generating mipmaps. Once I changed it to GL_LINEAR or GL_NEAREST, it worked.

I still have no clue what's up with the brick example though.

Thanks.

Jan
05-20-2009, 12:03 PM
"Removing GLEW should make the program use the calls supplied in the NVIDIA headers, right? "

Definitely not, it just breaks the program.

ViolentHamster
05-20-2009, 01:55 PM
Thanks Jan. Can you elaborate?

V-man
05-20-2009, 03:17 PM
So when you remove GLEW, what do you do? Do you get the function pointers with glXGetProcAddress?

Jan
05-20-2009, 04:02 PM
If I understand you correctly, you simply remove the call to glewInit and the linker setting that links against GLEW.

If you don't do anything else, that simply means your extension function pointers won't be initialized AT ALL. There is no default fallback that kicks in. You would need to do everything yourself (or go through another library such as GLee).

If you include glext.h you at least get some enums defined, which you might be able to use, but no function pointers get initialized. So basically, if your app still tries to call any of those functions, you will dereference a null pointer and crash.

So how much your app gets "broken" by removing GLEW depends on how much it uses extensions, but in most cases you will run into trouble fast. What I really meant is that by removing GLEW you remove functionality without it being replaced automatically; if you were not aware of that, you basically introduced a "bug" that might bite you later on (but now you know ;-) ).


On Linux this might be less of a problem than on Windows; afaik on Linux you get much higher OpenGL versions by default, whereas on Windows the system opengl32.dll only exposes OpenGL 1.1 directly, so using GLEW is essential there. On Linux I am not up to date on what you get by default.

Jan.

ViolentHamster
05-22-2009, 06:25 AM
On Linux this might be less of a problem than on Windows; afaik on Linux you get much higher OpenGL versions by default, whereas on Windows the system opengl32.dll only exposes OpenGL 1.1 directly, so using GLEW is essential there. On Linux I am not up to date on what you get by default.

Jan.

Yes, on Linux this is much less of a problem than on Windows. From what I understand, Microsoft hasn't updated its OpenGL DLL in forever. On Linux, when you install an NVIDIA driver, you get the latest headers and the library is updated, so you just use the GL calls directly. So you don't need a library like GLEW.

dletozeun
05-22-2009, 02:27 PM
I figured out the problem with my program. GL_TEXTURE_MIN_FILTER was set to GL_NEAREST_MIPMAP_NEAREST, but I wasn't generating mipmaps. Once I changed it to GL_LINEAR or GL_NEAREST, it worked.


Yes, the default minification filter requires mipmaps to be generated. I have never understood why; it is a source of many problems, especially when rendering to a texture with an FBO...

Anyway, since your hardware/driver supports OpenGL 3.0, your program does not need any extensions to use GLSL shaders, since they were promoted to core functionality in OpenGL 2.0.

About the ogl2brick example, I don't have any more ideas, since I haven't looked into its code.

Overmind
05-24-2009, 10:11 AM
On Linux this might be less of a problem than on Windows; afaik on Linux you get much higher OpenGL versions by default, whereas on Windows the system opengl32.dll only exposes OpenGL 1.1 directly, so using GLEW is essential there. On Linux I am not up to date on what you get by default.

Jan.

Yes, on Linux this is much less of a problem than on Windows. From what I understand, Microsoft hasn't updated its OpenGL DLL in forever. On Linux, when you install an NVIDIA driver, you get the latest headers and the library is updated, so you just use the GL calls directly. So you don't need a library like GLEW.



That's not quite true. Afaik the Linux OpenGL ABI specifies version 1.2 as the minimum; everything above it *should* be loaded with glXGetProcAddress.

With some drivers it might work without the extension mechanism, but with others it won't. So if you care about your application running on every system, you had better use the extension mechanism. If you just statically link against a higher GL version, you'll get a cryptic linker error at application startup on some drivers.

The bottom line is: the extension mechanism is there for a reason. It is *not* a hack to work around some restriction in the Windows DLL.

One thing is really better on Linux, though: You can call glXGetProcAddress before creating a context to get these new context creation entries ;)

Stephen A
05-24-2009, 02:25 PM
One thing is really better on Linux, though: You can call glXGetProcAddress before creating a context to get these new context creation entries
Unless you are using ATI hardware with the closed-source drivers, where this is a recipe for a crash. :)