Part of the Khronos Group
OpenGL.org


Thread: Legacy OpenGL 2.0 support on Windows 7 and modern NVIDIA cards

  1. #1
    Junior Member Newbie
    Join Date
    Mar 2010
    Posts
    4

    Legacy OpenGL 2.0 support on Windows 7 and modern NVIDIA cards

    I humbly come in search of help from the collective.

    A very long time ago (2007), I wrote some OpenGL 2.0 code to produce some 3D preview software that I use for work. I learned OpenGL back in 1997 or so, and simply reworked some of my old code rather than rewriting from scratch with up-to-date OpenGL.

    It was a simple affair: a light, no textures, averaged normals, some transparency, and a little dinky GLUI interface. But it worked for displaying the data I needed, until recently. In the last year or so, I've noticed the code crashing on some Windows systems due to an apparent lack of legacy OpenGL 2.0 support. I know it's a legacy-support issue because the program runs perfectly until I start hitting the immediate-mode stuff, and then it dies. My understanding was that NVIDIA still supported legacy OpenGL code, but apparently not.

    Obviously rewriting my code is now on my radar, but I remain extremely busy and cannot drop everything to do it. Does anyone know a good workaround that can give me back legacy OpenGL 2.0 support on Windows 7 with modern NVIDIA cards? What's bizarre to me is that at home, my Sandy Bridge machine running a GTX 465 runs my old code fine, but my work machine, an Ivy Bridge running a GTX 660, doesn't. Both are Windows 7 boxes. Either NVIDIA dropped OpenGL 2.0 support at some point between those generations (can anyone verify this?), or there exists some sort of driver-level solution.

    Even more bizarre, I've noticed that some Windows 7 machines will run my code for some time and then one day stop supporting it, with no change in video driver, OS, or hardware. Perhaps there is a dynamic-linking issue going on? Can anyone shed some light on this problem?

    Mino

  2. #2
    Senior Member OpenGL Pro
    Join Date
    Jan 2012
    Location
    Australia
    Posts
    1,117
    Not sure what your problem is. If you initialize your context in compatibility mode (which is the default), it should work fine. We run a lot of old code on NVIDIA cards without any problems.

  3. #3
    Member Regular Contributor
    Join Date
    Jun 2013
    Posts
    490
    I don't know of a single vendor that doesn't support legacy OpenGL. There's too much software which uses it.

    It's far more likely that your code just has an intermittent bug, e.g. an out-of-bounds array access.

  4. #4
    Junior Member Newbie
    Join Date
    Mar 2010
    Posts
    4
    Quote Originally Posted by tonyo_au View Post
    Not sure what your problem is. If you initialize your context in compatibility mode (which is the default), it should work fine. We run a lot of old code on NVIDIA cards without any problems.
    I've never done anything with my drivers before (I didn't know how to). What is compatibility mode, and how do I initialize it?

    Quote Originally Posted by GClements View Post
    It's far more likely that your code just has an intermittent bug, e.g. an out-of-bounds array access.
    Sorry, this is not the case. Input that used to work has stopped working. Also, the code worked flawlessly for almost six years on a huge range of inputs, and I thoroughly debugged it for memory leaks and array issues. Given that the code is basically a "draw a sphere and rotate it with the mouse" tutorial with a few minor features, the level of complexity is too low for me to have plausibly introduced a bug there. I wish it were my own bug, though; I could have fixed it by now.

    Also, it is interesting to note that drawing the axes with GL_LINES appears to work without a problem:

    glBegin(GL_LINES);
    glVertex3f(0.0f, 0.0f, 0.0f);
    glVertex3f(15.0f, 0.0f, 0.0f);
    glEnd();

    But attempting to draw any GL_TRIANGLES causes a crash.



  5. #5
    Senior Member OpenGL Pro
    Join Date
    Jan 2007
    Posts
    1,183
    Despite what you say, I'm reasonably certain that it is your own bug.

    Here's a test. Try an older OpenGL game - download a demo for one of the Quakes, for example. If that works, then you have confirmation: it's your own bug.

  6. #6
    Member Regular Contributor
    Join Date
    Jun 2013
    Posts
    490
    Quote Originally Posted by mhagain View Post
    Try an older OpenGL game - download a demo for one of the Quakes, for example.
    At least one of the Quake games has a bug caused by copying the result of glGetString(GL_EXTENSIONS) to a fixed-size buffer. Of course, the buffer was more than adequate for OpenGL implementations of that era, but that string is typically much longer on modern cards.

  7. #7
    Senior Member OpenGL Pro
    Join Date
    Jan 2007
    Posts
    1,183
    Quote Originally Posted by GClements View Post
    At least one of the Quake games has a bug caused by copying the result of glGetString(GL_EXTENSIONS) to a fixed-size buffer. Of course, the buffer was more than adequate for OpenGL implementations of that era, but that string is typically much longer on modern cards.
    NV drivers will silently truncate the extensions string to work around this (confirmed with driver 320.18). They used to have an explicit option for it in their control panel, but that's long gone; nowadays they just truncate it whether you want it or not.

  8. #8
    Junior Member Newbie
    Join Date
    Mar 2010
    Posts
    4
    Quote Originally Posted by mhagain View Post
    Despite what you say, I'm reasonably certain that it is your own bug.

    Here's a test. Try an older OpenGL game - download a demo for one of the Quakes, for example. If that works, then you have confirmation: it's your own bug.
    I stand corrected. I ran Quake 2 with "default OpenGL" (man, that brought back memories) and it worked fine. All this time there must have been some sort of OpenGL abuse going on in my code. I guess I'll look into it even more.

    Thanks for proposing this test. It clears everything up. I must be wrong.

  9. #9
    Senior Member OpenGL Guru Dark Photon's Avatar
    Join Date
    Oct 2004
    Location
    Druidia
    Posts
    3,194
    Quote Originally Posted by Minotaar View Post
    ...what is compatibility mode, and how do I initialize it?
    It's what you get by default, with a normal {glX,wgl}CreateContext.

    If you want any special options on your OpenGL context creation, you need to use one of the new context creation calls ({glX,wgl}CreateContextAttribsARB).

    With a compatibility profile, you can gradually excise "the old way" code (immediate mode, built-in matrices, etc.) from your codebase as time/money permits, while not breaking your application as you migrate up.
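For the record, the "new context creation calls" mentioned above look roughly like this on Windows. This is a sketch only: fetching the function pointer via wglGetProcAddress, the dummy-context dance, and error handling are all omitted, and the helper name is made up. The tokens come from the WGL_ARB_create_context_profile extension.

```c
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>   /* WGL_CONTEXT_* tokens, PFN... typedef */

/* Assumes the caller already fetched wglCreateContextAttribsARB
 * from an existing (dummy) context via wglGetProcAddress. */
HGLRC create_compat_context(HDC hdc,
    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB)
{
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 2,
        /* Compatibility profile: immediate mode, built-in
         * matrices, etc. remain available alongside newer
         * features, so legacy code keeps running. */
        WGL_CONTEXT_PROFILE_MASK_ARB,
        WGL_CONTEXT_COMPATIBILITY_PROFILE_ARB,
        0   /* attribute list terminator */
    };
    return wglCreateContextAttribsARB(hdc, NULL, attribs);
}
```

If you don't need any special options, a plain wglCreateContext already gives you this compatibility behavior by default, as noted above.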
