NVIDIA GeForce 2 graphics corruption using OpenGL

The application I’ve created runs just fine on most graphics cards, except on an NVIDIA card in a Dell running Windows 98. As far as I know I have up-to-date drivers. The graphics show all the lines between polygons and textures; you can actually see how the scene was drawn. In addition, the frame rate is extremely low. How do I fix this?

[This message has been edited by DarthPaul (edited 02-01-2001).]

I am having a (sort-of) similar problem with the GeForce2. I write an app on a workstation with a 16 MB Rage or something, and everything runs fine. When it’s ported to the GeForce2 machine, I don’t even get half the framerate, and the lighting and fog are all out of whack. I’ve tried to contact nVidia, but they don’t want to hear from a non-registered coder, so I called my Dell support rep and …, well, they just don’t know how to support programming questions. I’m thinking it must be a software problem, but I’d really appreciate it if someone could shed some light on the subject.

Thanks,
Dave

This is a bit of a wild stab in the dark, but we have solved various problems on a GeForce2 by doing the following:

Upgrading the BIOS (this fixed a BIG problem for us).

Checking whether your AGP settings are correct.

These may not help, but they are worth trying.

Sounds like a driver problem. Which drivers are you using? The ones supplied by the card manufacturer (Creative/ASUS etc.) are usually not as good as the “Detonator 3” series provided directly by nVidia.

Check out their website (http://www.nvidia.com/products.nsf/htmlmedia/detonator3.html) for more…

Hope this helps - Jules

By the way, when I said ‘upgrade the BIOS’ I meant on the GeForce…

OK, so I’m still having this problem. I know it has been a while, but I’ve been busy. Do I have to use NVIDIA’s extensions to correct the problem? Am I going to have to write card-specific code just to get it to work?

Perhaps this will help; here is the code I use to set up OpenGL.
// set pixel format
PIXELFORMATDESCRIPTOR pixel_format_description;
// clear the descriptor, then fill in the requested attributes
ZeroMemory(&pixel_format_description, sizeof(PIXELFORMATDESCRIPTOR));
pixel_format_description.nSize = sizeof(PIXELFORMATDESCRIPTOR);
pixel_format_description.nVersion = 1;
pixel_format_description.dwFlags = PFD_SUPPORT_OPENGL | PFD_DRAW_TO_WINDOW | PFD_DOUBLEBUFFER;
pixel_format_description.dwLayerMask = PFD_MAIN_PLANE;
pixel_format_description.cColorBits = window_depth;
pixel_format_description.cAlphaBits = window_depth;
pixel_format_description.cDepthBits = window_depth;
// determine format selection
display_context_window = GetDC(window_handle);
int pixel_format = ChoosePixelFormat(display_context_window, &pixel_format_description);
if (!pixel_format)
{
return false;
}
// finalize format selection
if (!SetPixelFormat(display_context_window, pixel_format, &pixel_format_description))
{
return false;
}
if (!DescribePixelFormat(display_context_window, pixel_format, sizeof(PIXELFORMATDESCRIPTOR), &pixel_format_description))
{
return false;
}
// create rendering context
display_context_opengl = wglCreateContext(display_context_window);
if (!display_context_opengl)
{
return false;
}
// enable rendering context
if (!wglMakeCurrent(display_context_window, display_context_opengl))
{
return false;
}
// create OpenGL frustum
glViewport(0, 0, GetSystemMetrics(SM_CXSCREEN), GetSystemMetrics(SM_CYSCREEN));
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
display_aspect = (double)(GetSystemMetrics(SM_CXSCREEN)) / (double)(GetSystemMetrics(SM_CYSCREEN));
gluPerspective(display_fov, display_aspect, display_front/100, display_back);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

// load OpenGL state
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDisable(GL_CULL_FACE);
glCullFace(GL_BACK);
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LESS);
glDisable(GL_DITHER);
glEnable(GL_TEXTURE_2D);
glEnable(GL_POLYGON_SMOOTH);
glHint(GL_FOG_HINT, GL_NICEST);
glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);
glHint(GL_POLYGON_SMOOTH_HINT, GL_NICEST);
glShadeModel(GL_SMOOTH);

I’ve heard of this problem before but am still extremely confused by it, because I have never once seen a screenshot of it, nor complete source code and an executable for an app that triggers it.

Suffice it to say that I’ve never encountered the problem.

- Matt

This is a guess (that might narrow down the problem): try disabling blending and see what happens.

Sorry, but you get what you programmed.

>>glEnable(GL_BLEND);
>>glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
>>glDisable(GL_CULL_FACE);
>>glEnable(GL_POLYGON_SMOOTH);
>>glEnable(GL_DEPTH_TEST);
>>glDepthFunc(GL_LESS);

These are the problematic function calls.
You have switched on polygon antialiasing and z-buffering at the same time. The edge between two adjacent triangles gets blended only once: the first fragment writes the depth buffer, so the GL_LESS test fails for the fragment of the adjacent edge that would have contributed the rest of the coverage values.

All graphics boards that don’t show the cracks are doing the wrong thing, probably by ignoring polygon antialiasing.

The recommended way to render antialiased polygons is with the depth test disabled, drawing sorted from front to back with glBlendFunc(GL_SRC_ALPHA_SATURATE, GL_ONE).

Read the Red Book v1.1, Chapter 6, Polygon Antialiasing, example 6.5.
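For reference, a minimal sketch of that kind of setup might look like this (drawPolygonsFrontToBack() is just a placeholder for your own code, which would have to submit the polygons already sorted front to back):

// Red Book style polygon antialiasing: no depth test, coverage blending
glDisable(GL_DEPTH_TEST);                    // ordering comes from the front-to-back sort
glEnable(GL_POLYGON_SMOOTH);
glHint(GL_POLYGON_SMOOTH_HINT, GL_NICEST);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA_SATURATE, GL_ONE);  // accumulate coverage instead of ordinary alpha blending
glClear(GL_COLOR_BUFFER_BIT);
drawPolygonsFrontToBack();                   // placeholder for your own sorted draw calls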

Remove all calls that have to do with polygon antialiasing, including the glHint, and the “cracks” will disappear. Performance should be back to full speed then.
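In the setup code posted above, that would mean something like the following (just commenting out the polygon smoothing state and keeping the rest as it is):

// glEnable(GL_POLYGON_SMOOTH);               // remove: causes the cracks and the slowdown
// glHint(GL_POLYGON_SMOOTH_HINT, GL_NICEST); // remove
glEnable(GL_DEPTH_TEST);                      // keep normal z-buffered rendering
glDepthFunc(GL_LESS);
glEnable(GL_BLEND);                           // keep ordinary alpha blending if you need it
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);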

Same problem was solved here before: http://www.opengl.org/discussion_boards/ubb/Forum2/HTML/000931.html

[This message has been edited by Relic (edited 03-01-2001).]

Doesn’t enabling GL_POLYGON_SMOOTH make your app resort to software rendering on a GeForce? (It is typically ignored on older hardware.)

-Won

It works! I commented GL_POLYGON_SMOOTH out of my source and now it runs fine. Thanks, Won and Relic.