Thread: Drawing without attrib array enabled gets "optimized out" by AMD driver?

  1. #1
    Junior Member Newbie
    Join Date
    Aug 2012

    Drawing without attrib array enabled gets "optimized out" by AMD driver?

    I recently had to replace my laptop and in the process got something with an AMD GPU to complement my Nvidia "testing hardware". It turned out that two of my examples didn't work. The first was a clear case of the Nvidia driver letting me get away with something it shouldn't have (I wasn't binding the whole relevant range of a buffer with BindBufferRange), but the second one is puzzling me.
    In my tessellation example I draw without any vertex attribs by deriving vertex positions from gl_VertexID and gl_InstanceID and then tessellating the result according to a "displacement texture". On AMD it simply does nothing: black screen, no errors, and a GL_PRIMITIVES_GENERATED query reveals that it doesn't even generate primitives. Just creating a dummy VAO+VBO and specifying an attrib array with stride 0 (one that isn't read in the shader) makes it work.
    So my interpretation is that the driver "optimizes out" the draw call when no attrib array is enabled? I can make it work with that dummy array, but what I really wonder is whether that is valid behavior. So far I haven't been able to clarify this by reading the spec, but I might be looking in the wrong places.
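    For reference, the workaround described above might look roughly like this. This is an untested sketch (it assumes a valid GL 4.x context and a linked tessellation program already exist, and `vertex_count`/`instance_count` are placeholders); the dummy attribute is never read by the shader, it only exists so the driver sees at least one enabled attrib array:

    ```c
    /* Assumes a GL function loader (glad/glew) and an active context. */
    #include <GL/gl.h>

    static GLuint vao, vbo;

    static void setup_dummy_attrib(void)
    {
        glGenVertexArrays(1, &vao);
        glBindVertexArray(vao);

        /* One float of dummy data -- the shader never reads attrib 0. */
        static const GLfloat dummy = 0.0f;
        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, sizeof dummy, &dummy, GL_STATIC_DRAW);

        /* Stride 0 attrib array, as described in the post above. */
        glVertexAttribPointer(0, 1, GL_FLOAT, GL_FALSE, 0, (void *)0);
        glEnableVertexAttribArray(0);
    }

    /* The draw itself is unchanged: positions still come from
     * gl_VertexID / gl_InstanceID in the vertex shader. */
    static void draw_patches(GLsizei vertex_count, GLsizei instance_count)
    {
        glBindVertexArray(vao);
        glDrawArraysInstanced(GL_PATCHES, 0, vertex_count, instance_count);
    }
    ```

    Whether this should be necessary at all is exactly the open question: core profile requires a bound VAO to draw, but the spec does not obviously require any attrib array to be *enabled* when the shader consumes no attributes.
    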
