Lesson 1: How to crash the NVIDIA Linux Detonator Driver

Before I continue with this post, I have to admit that I really love the binary NVIDIA drivers: they’re rock solid when playing any OpenGL-accelerated game (including UT2K3), and I’m very thankful to the NVIDIA guys for making them available.
(Otherwise I would have to buy a Matrox card without all of those nice GL extensions.)

Anyway, although everything seems very stable, I do have one big problem with these drivers: some (GL-only) programs crash/reset my X server.

I’ve been having this problem for 4 months now and I’m really getting sick of it.
Not only is this bug very irritating, it’s also VERY hard to debug because of the crashing X server.
And it’s even more difficult to develop software when X crashes every time I do a test run.

I’ve tried many solutions from many websites, including setting all of the environment variables to every possible value and combination, and doing the same for the options in my XF86Config.

Statistics:

  • All programs that use the wxGLCanvas from wxGTK (wxWindows) crash my X server.
  • Some programs that use gtkglext (the GL widget for GTK 2) also crash X.
  • The NeHe GLX tutorials don’t crash.
  • Humus’ demos also don’t crash.

What I currently know:

  • When I comment out the calls to glXSwapBuffers(), X doesn’t crash.
  • From gtkglext, only the pixmap demos crash… I’ve noticed that this demo does not crash when I comment out the display list calls (drawing without display lists also makes X crash).
  • 16 bpp, 32 bpp… it doesn’t matter (neither for screen depth nor for visuals, although I do know that wrong visuals make most GL programs quit with a segfault).
  • These crashes don’t happen with the open source ‘nv’ driver (but I’m not going to use that one because it’s too slow).

My system specs:

  • Red Hat 8.0
  • Kernel 2.4 and 2.5 (crashes on both)
  • RH8 XFree86 4.2.0
  • AGPGART does not get loaded because my dumb VIA chipset is not supported (neither is it by NvAGP)
  • To be sure that it’s not an AGP bug (on my side at least), I’ve disabled support for both AGPGART and NvAGP in my XF86Config
  • GeForce3 Ti 200
  • Latest Linux Detonator driver (not 40.xx)
  • Chipset: VIA ProSavage PM133. This chipset already caused a lot of problems for me when I was using Win2K: every Direct3D program crashed with it. The crashes went away when I switched to WinXP (with better support for VIA chipsets, I believe; the Win2K VIA AGP driver was flaky, including the 4-in-1 drivers).

I really hope someone knows more about this, because it’s making me really mad…
I’m also not going to buy another mainboard or video card; it should work the way it is.

Hmm… just a random brainfart coming up…

In this pixbuf demo (from gtkglext) that I was talking about, I just noticed that it’s trying to render to an offscreen pixmap.

That doesn’t sound all that good to me; I thought one would need the pbuffer extension to render to offscreen buffers?
(gtkglext doesn’t use any such extension as far as I can see.)

Hm… this could explain the crash when starting to render things (even a simple GL_POINT crashes X)
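For comparison, the pbuffer route would look roughly like this. This is only a sketch using the GLX 1.3 entry points (glXChooseFBConfig / glXCreatePbuffer); the attribute values are made up and it’s not what gtkglext actually does:

#include <X11/Xlib.h>
#include <GL/glx.h>
#include <stdio.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

    /* Ask for an FBConfig that supports pbuffer drawables. */
    int fb_attribs[] = {
        GLX_DRAWABLE_TYPE, GLX_PBUFFER_BIT,
        GLX_RENDER_TYPE,   GLX_RGBA_BIT,
        GLX_RED_SIZE, 8, GLX_GREEN_SIZE, 8, GLX_BLUE_SIZE, 8,
        None
    };
    int n = 0;
    GLXFBConfig *cfgs = glXChooseFBConfig(dpy, DefaultScreen(dpy), fb_attribs, &n);
    if (!cfgs || n == 0) { fprintf(stderr, "no pbuffer-capable FBConfig\n"); return 1; }

    /* The offscreen buffer itself; no X window or X pixmap involved. */
    int pb_attribs[] = { GLX_PBUFFER_WIDTH, 256, GLX_PBUFFER_HEIGHT, 256, None };
    GLXPbuffer pbuf = glXCreatePbuffer(dpy, cfgs[0], pb_attribs);

    GLXContext ctx = glXCreateNewContext(dpy, cfgs[0], GLX_RGBA_TYPE, NULL, True);
    glXMakeContextCurrent(dpy, pbuf, pbuf, ctx);

    glClearColor(0.2f, 0.2f, 0.2f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glFinish();   /* single-buffered, so no glXSwapBuffers needed here */

    glXMakeContextCurrent(dpy, None, None, NULL);
    glXDestroyContext(dpy, ctx);
    glXDestroyPbuffer(dpy, pbuf);
    XFree(cfgs);
    XCloseDisplay(dpy);
    return 0;
}

Whether this driver/X combination actually exposes GLX 1.3 pbuffers is something I’d have to check with glxinfo first.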

Out of curiosity I checked the gtkglext demos on my machine and everything runs fine. Even all the pixmap examples.

Red Hat 8.0
Kernel 2.4.20
XFree86-4.2.0-72
NVIDIA 3123
ALi 1541 (I think, or was it 1547) chipset
GeForce2 MX

Sorry that I can’t offer any help.

Well, even if you don’t have the problem, I do think this is a bug…
(maybe in combination with my chipset, which is known to be stupid)

I’ve just been browsing the code from wxGLCanvas and it looks like wxWindows is also trying to render into an offscreen pixmap.

To prevent people from saying that this is not true: in wxGLContext::SwapBuffers there’s the following code:

if (m_glContext)
{
    GdkWindow *window = GTK_PIZZA(m_widget)->bin_window;
    glXSwapBuffers( GDK_DISPLAY(), GDK_WINDOW_XWINDOW( window ) );
}

That GdkWindow is not an X window in this case, which means wxWindows is also rendering into a pixmap (afaik), and that (partly) explains my problem, yeay!

I’m going to make a tiny demo app to prove that I’m right… and I really hope I am…
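Something along these lines is what I have in mind. It’s only a rough sketch in plain Xlib/GLX (no wx or gtkglext involved, and the visual attributes are just a guess): create an X pixmap, wrap it in a GLXPixmap, draw a single point into it and then call glXSwapBuffers on it, which is the call I suspect of taking X down.

#include <X11/Xlib.h>
#include <GL/glx.h>
#include <stdio.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

    int attribs[] = { GLX_RGBA, GLX_DEPTH_SIZE, 16, None };
    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (!vi) { fprintf(stderr, "no suitable visual\n"); return 1; }

    /* Plain X pixmap, wrapped in a GLXPixmap so GL can draw into it. */
    Pixmap xpix = XCreatePixmap(dpy, RootWindow(dpy, vi->screen),
                                256, 256, vi->depth);
    GLXPixmap glxpix = glXCreateGLXPixmap(dpy, vi, xpix);

    /* Indirect context; that's the safer bet for pixmap rendering. */
    GLXContext ctx = glXCreateContext(dpy, vi, NULL, False);
    glXMakeCurrent(dpy, glxpix, ctx);

    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_POINTS);
    glVertex2f(0.0f, 0.0f);
    glEnd();
    glFinish();

    /* The suspicious part: swapping a (single-buffered) pixmap drawable. */
    glXSwapBuffers(dpy, glxpix);

    glXMakeCurrent(dpy, None, NULL);
    glXDestroyContext(dpy, ctx);
    glXDestroyGLXPixmap(dpy, glxpix);
    XFreePixmap(dpy, xpix);
    XCloseDisplay(dpy);
    return 0;
}

Compile with something like gcc -o pixmaptest pixmaptest.c -lGL -lX11 and see whether X survives.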

Oh, and don’t get confused by the GDK_WINDOW_XWINDOW macro…

As one can see on the following page: click

It can return a window or pixmap (yeay!)
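To illustrate (just a small sketch assuming GTK+ 2.x with the X11 backend; the window and pixmap here are made up, not taken from gtkglext or wxWindows): the same macro family hands back an XID whether the underlying GdkDrawable is a real window or an offscreen pixmap.

#include <gtk/gtk.h>
#include <gdk/gdkx.h>

int main(int argc, char **argv)
{
    gtk_init(&argc, &argv);

    GtkWidget *win = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    gtk_widget_realize(win);

    /* XID of a real on-screen window... */
    Window xwin = GDK_WINDOW_XWINDOW(win->window);

    /* ...and the XID of an offscreen pixmap, obtained the same way. */
    GdkPixmap *pix = gdk_pixmap_new(win->window, 64, 64, -1);
    Pixmap xpix = GDK_PIXMAP_XID(pix);

    g_print("window XID: 0x%lx, pixmap XID: 0x%lx\n",
            (unsigned long) xwin, (unsigned long) xpix);
    return 0;
}

So whatever glXSwapBuffers gets handed depends entirely on what kind of drawable sits behind that GdkWindow pointer.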

To use a generic VIA AGP driver, go to your kernel’s source tree and find the file drivers/char/agp/agpgart_be.c; you want to set agp_try_unsupported to 1 and rebuild your kernel (or just the module, depending on how things are set up on your system)… that should get you AGP support.

Dan

Thanks, I saw that message about enabling agp_try_unsupported in my boot logs some time ago, but after a while I totally forgot to change it.

I will try it in a few hours…

Woah!

That’s a huge speed increase
(UT2K3 for example is a lot faster now, and the textures seem to be larger in size)

Status: Enabled
Driver: AGPGART
AGP Rate: 4x
Fast Writes: Disabled
SBA: Disabled

You don’t have to modify the driver though…
I’ve just added the following option to my /etc/modules.conf to make it work:

add options agpgart agp_try_unsupported=1

(and all of that works without any reboot… man, I love Linux )

Btw, it doesn’t make my system any more or less stable though.
Rendering to offscreen pixmaps (without using any extensions) is still impossible without making X crash/reset.
