Thread: Shader/glDrawArrays crash only on nVidia/Win32


  1. #1 -- Junior Member Newbie, joined Jan 2013, 4 posts

    Shader/glDrawArrays crash only on nVidia/Win32

    I'm the author of an open-source, cross-platform freeware 3D engine, dim3 (www.klinksoftware.com). It has two components: the engine and the editor. The editor is all fixed-function and works everywhere. The engine is all shader code. It works on iOS (ES2), on OS X with Intel, nVidia, or ATI (AMD) GPUs, and on the PC with ATI (AMD) or Intel GPUs.

    On the PC with nVidia, it always crashes on the first shader call (a glDrawArrays), with pretty much any shader, no matter how simple.

    Sadly, I don't have this configuration myself, and sending debug builds to my users isn't getting me anywhere. Has anybody encountered this? Is there a way to get further debug information? I'm sure it's something simple that the PC nVidia drivers are doing differently; maybe they require things to be set in a certain order?

    Further notes: no built-in variables are used; all vertex positions, UVs, matrices, etc., are passed in via vertex attribute arrays or uniforms. Everything uses VBOs. The shaders compile correctly, and all the locations come back as valid integers.
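
    For reference, here's a minimal sketch of that kind of draw path (a hypothetical function, not dim3's actual code; the attribute names and VBO layout are invented, and a current GL context and linked program are assumed):

    #include <GL/glew.h>

    /* Hypothetical sketch: generic attributes only, positions followed
       by UVs in one VBO. Not dim3's actual code. */
    void draw_mesh(GLuint program, GLuint vbo, GLsizei vertex_count)
    {
        /* attribute names are invented for this example */
        GLint loc_pos = glGetAttribLocation(program, "in_position");
        GLint loc_uv  = glGetAttribLocation(program, "in_uv");

        glUseProgram(program);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);

        glEnableVertexAttribArray((GLuint)loc_pos);
        glVertexAttribPointer((GLuint)loc_pos, 3, GL_FLOAT, GL_FALSE, 0,
            (const GLvoid *)0);

        glEnableVertexAttribArray((GLuint)loc_uv);
        glVertexAttribPointer((GLuint)loc_uv, 2, GL_FLOAT, GL_FALSE, 0,
            (const GLvoid *)((size_t)vertex_count * 3 * sizeof(GLfloat)));

        /* this is the call that crashes on PC/nVidia */
        glDrawArrays(GL_TRIANGLES, 0, vertex_count);
    }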

    You can find the latest PC build at the URL above. It should crash right away if you have a PC/nVidia setup.

    Any ideas? Anything for me to try?

    [>] Brian

  2. #2 -- Junior Member Newbie, joined May 2002, 22 posts
    Quote Originally Posted by ggadwa
    I'm the author of an open-source, cross-platform freeware 3D engine, dim3 (www.klinksoftware.com). It has two components: the engine and the editor. The editor is all fixed-function and works everywhere. The engine is all shader code. It works on iOS (ES2), on OS X with Intel, nVidia, or ATI (AMD) GPUs, and on the PC with ATI (AMD) or Intel GPUs.

    On the PC with nVidia, it always crashes on the first shader call (a glDrawArrays), with pretty much any shader, no matter how simple. ... [>] Brian
    What is the crash message?

  3. #3 -- Junior Member Newbie, joined Jan 2013, 4 posts
    It's an access violation.

    As always, the minute I post this, I think I've found the problem, but I'll need to verify with my users, hopefully by tomorrow. It has to do with glEnableVertexAttribArray and glVertexAttribPointer.

    I've got some glEnableVertexAttribArray calls leaking into draws where no matching glVertexAttribPointer call is made -- because, as always happens, I was getting a bit too aggressive with the optimizations. But here's the interesting part: those attribute indices aren't hooked up to any attributes in the shader code. It works everywhere except PC/nVidia. That driver must be doing some kind of pre-flight check on the enabled arrays, and that's causing the access violation.

    I got away with this for a long time, until I ran into a user with that setup.

    For instance, I might have attribute arrays A, B, and C all enabled, while only A and B have offsets into the VBO set, and only A and B are used in the shader; C is never pointed at or referenced at all.
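
    In code, the broken state looks something like this (a sketch with invented attribute indices and parameter names, not dim3's actual code):

    #include <GL/glew.h>

    /* invented attribute indices for this example */
    #define ATTRIB_A 0
    #define ATTRIB_B 1
    #define ATTRIB_C 2   /* enabled by an earlier draw, never disabled */

    void draw_with_leak(GLsizei vertex_count, size_t uv_offset)
    {
        glEnableVertexAttribArray(ATTRIB_A);
        glVertexAttribPointer(ATTRIB_A, 3, GL_FLOAT, GL_FALSE, 0,
            (const GLvoid *)0);

        glEnableVertexAttribArray(ATTRIB_B);
        glVertexAttribPointer(ATTRIB_B, 2, GL_FLOAT, GL_FALSE, 0,
            (const GLvoid *)uv_offset);

        /* ATTRIB_C is still enabled from a previous draw; no
           glVertexAttribPointer call is made for it here, and the
           shader never references it. Most drivers shrug this off;
           the Windows nVidia driver apparently touches the stale
           array state and access-violates. */

        glDrawArrays(GL_TRIANGLES, 0, vertex_count);
    }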

    Is what the driver is doing right or wrong? It's certainly checking data it will never use; then again, I shouldn't be enabling arrays without setting a pointer for them. I'll have more later, once I know whether this is the real cause.

    If anything, it's an interesting difference in the drivers.

    [>] Brian

  4. #4 -- Junior Member Newbie, joined Jan 2013, 4 posts
    Yes, that's what it was.

    So, for anybody else who searches and stumbles onto this:

    On nVidia's PC drivers only (not OS X): if you enable a vertex attribute array without setting a pointer for it, and that array isn't referenced by the shader, you'll get an access violation when you attempt to draw with that shader. Other drivers ignore this, as it's effectively a no-op.
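
    One defensive fix (a sketch, not dim3's actual code; attrib_has_pointer is an invented per-attribute flag the engine would track) is to make sure every attribute index is either pointed at or explicitly disabled before each draw:

    #include <GL/glew.h>

    /* Before each draw, enable only the arrays that have a pointer set
       and disable everything else, so no stale enable leaks through.
       attrib_has_pointer[] is hypothetical engine-tracked state. */
    void draw_safely(const GLboolean *attrib_has_pointer, GLsizei vertex_count)
    {
        GLint max_attribs, i;

        glGetIntegerv(GL_MAX_VERTEX_ATTRIBS, &max_attribs);

        for (i = 0; i < max_attribs; i++) {
            if (attrib_has_pointer[i])
                glEnableVertexAttribArray((GLuint)i);
            else
                glDisableVertexAttribArray((GLuint)i);
        }

        glDrawArrays(GL_TRIANGLES, 0, vertex_count);
    }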

    [>] Brian

  5. #5 -- Aleksandar, Senior Member OpenGL Pro, joined Jul 2009, 1,072 posts
    Thanks for the "reference". A few years ago, Alfonse said it was nonsense when I said that all unused attributes have to be disabled to prevent an application crash. I've been using NV hardware for years, and this behavior is quite natural to me.

  6. #6 -- Senior Member OpenGL Guru, joined May 2009, 4,948 posts
    Yes, and it's still in violation of the OpenGL specification. Nowhere does the spec allow such a thing to end in program termination; therefore, it should not.

    Complain to NVIDIA about it, not to me.
