Thread: glDrawElements

  1. #1
    Junior Member Newbie · Join Date: Jul 2017 · Posts: 14

    glDrawElements

    Hi everyone!

    I'm having some trouble with the glDrawElements function, since I've found different explanations of how it works. Basically, I don't understand why code like this doesn't work:
    Code :
    GLuint array[3] = {0, 1, 2};
    glDrawElements(GL_POINTS, 3, GL_UNSIGNED_INT, (GLvoid *) array);

    However, everything works fine if I change it to:
    Code :
    glDrawArrays(GL_POINTS, 0, 3);

    Why is that? In OpenGL 2 it worked, as far as I remember.

  2. #2
    Senior Member OpenGL Pro · Join Date: Jan 2007 · Posts: 1,716
    This won't work in modern OpenGL core contexts because elements must be sourced from a GL_ELEMENT_ARRAY_BUFFER object.
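
    For reference, here's a minimal sketch of how that looks in a core profile, applied to the code from the first post (the vao/ebo names are mine, and vertex attribute setup is omitted for brevity):
    Code :
    GLuint vao, ebo;
    GLuint indices[3] = {0, 1, 2};
     
    glGenVertexArrays(1, &vao);     /* core profile requires a VAO to be bound */
    glBindVertexArray(vao);
     
    glGenBuffers(1, &ebo);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ebo);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);
     
    /* the last argument is now a byte offset into the bound buffer, not a pointer */
    glDrawElements(GL_POINTS, 3, GL_UNSIGNED_INT, (const GLvoid *) 0);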

  3. #3
    Junior Member Newbie · Join Date: Jan 2017 · Posts: 13
    Hi, I'm not the one who asked, but you're an "OpenGL Pro", so you must know something. I never used that parameter; I used to pass nullptr instead and do what mhagain said. What I never understood is why, when they deprecate something, there are still leftovers. Like the matrix functions; they're a kind of ghost in some ways...

  4. #4
    Member Regular Contributor · Join Date: Jul 2012 · Posts: 425
    Quote Originally Posted by DirtyBlasion
    why when they deprecate something, still has left overs. Like matrices, they are some kind a ghost in some ways...
    Well, you have it in your own sentence: deprecated does not mean removed. What you might be interested in is a GL header organized by version. Along slightly different lines, you might also be interested in the glcorearb.h header, where only core functions and enums are available.

  5. #5
    Senior Member OpenGL Guru Dark Photon · Join Date: Oct 2004 · Location: Druidia · Posts: 4,156
    Quote Originally Posted by richman.feynard
    Why is it so? In OpengGL 2 it worked, as far as I remember.
    Just create an OpenGL compatibility profile context of any version, and it should work fine. That's the default type of context created unless you tell OpenGL which kind you want.
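
    For example, assuming you're using GLFW (you didn't say which windowing library you have; others expose equivalent profile flags):
    Code :
    #include <GLFW/glfw3.h>
     
    glfwInit();
    /* request a compatibility profile explicitly; leaving the hints at their
       defaults also gives you a context where the legacy features still work */
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_COMPAT_PROFILE);
    GLFWwindow *window = glfwCreateWindow(640, 480, "demo", NULL, NULL);
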
    Last edited by Dark Photon; 11-02-2017 at 07:07 PM.

  6. #6
    Junior Member Newbie · Join Date: Jul 2017 · Posts: 14
    Thanks for your replies.

    Quote Originally Posted by Dark Photon
    Just create an OpenGL compatibility profile context of any version, and it should work fine. That's the default type of context created unless you tell OpenGL which kind you want.
    Is it good to mix different versions of OpenGL?

  7. #7
    Senior Member OpenGL Guru Dark Photon · Join Date: Oct 2004 · Location: Druidia · Posts: 4,156
    Quote Originally Posted by richman.feynard
    Is it good to mix different versions of OpenGL?
    It's fine. OpenGL isn't like Direct3D where they erase the board and start over for each new version (only game developers will put up with being jerked around like that). OpenGL versions build incrementally on each other. That way, older OpenGL programs will often still compile and work just fine on a newer GL driver.

    The main reason you might choose to stop using functionality exposed by an older version in favor of functionality exposed by a newer one is that in some cases the latter provides better performance (classic example: switching from "immediate mode" to vertex arrays).

  8. #8
    Senior Member OpenGL Pro · Join Date: Jan 2007 · Posts: 1,716
    Quote Originally Posted by richman.feynard
    Is it good to mix different versions of OpenGL?
    Because each version of OpenGL (aside from 1.0) is incrementally built on top of the prior version, by definition you're always mixing OpenGL versions.

    For example: glDrawElements is GL 1.1, glBindBuffer is GL 1.5, glEnableVertexAttribArray is GL 2.0, so put them together and you're mixing versions.
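
    To make that concrete (vbo and ebo here are hypothetical buffer names, assumed created and filled elsewhere):
    Code :
    glBindBuffer(GL_ARRAY_BUFFER, vbo);                                  /* GL 1.5 */
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (const GLvoid *) 0); /* GL 2.0 */
    glEnableVertexAttribArray(0);                                        /* GL 2.0 */
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ebo);                          /* GL 1.5 */
    glDrawElements(GL_POINTS, 3, GL_UNSIGNED_INT, (const GLvoid *) 0);   /* GL 1.1 */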

  9. #9
    Junior Member Newbie · Join Date: Jul 2017 · Posts: 14
    Quote Originally Posted by mhagain
    Because each version of OpenGL (aside from 1.0) is incrementally built on top of the prior version, by definition you're always mixing OpenGL versions.

    For example: glDrawElements is GL 1.1, glBindBuffer is GL 1.5, glEnableVertexAttribArray is GL 2.0, so put them together and you're mixing versions.
    Then I don't understand why it doesn't work now, as I described at the beginning of the thread.

  10. #10
    Intern Newbie · Join Date: Mar 2017 · Posts: 47
    I think GClements gave a nice explanation of glDrawElements' functionality. I don't know what version of OpenGL you're using, but I'll assume that when you tried the code from your initial post, you were using 3.3 core or above.

    I've only been learning 3.3 core, so I'll write about it from that perspective; maybe it'll help. As GClements said, in 3.3 core the last argument to glDrawElements is an offset to the first element you want drawn, within the currently bound element array buffer object. (The element array buffer binding is part of VAO state, which is why you bind a VAO before calling a drawing command like glDrawElements.) Here's what section 2.8.3 of the 3.3 core spec says (DrawElementsOneInstance is equivalent to DrawElements; the spec writes it that way for spec-writing purposes):

    The command 'void DrawElementsOneInstance( enum mode, sizei count, enum type, const void *indices )' ... constructs a sequence of geometric primitives by successively transferring the count elements whose indices are stored in the currently bound element array buffer (see section 2.9.7) at the offset defined by --indices-- to the GL.
    (I added the dashes to indicate the "indices" argument. Same for the later quote.)

    So in 3.3 core, that last argument is a byte offset into the bound element array buffer.
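
    For instance (my own made-up example, assuming an element array buffer holding {0, 1, 2} is bound), a nonzero offset skips stored indices:
    Code :
    /* start one GLuint into the buffer, so only indices 1 and 2 are transferred */
    glDrawElements(GL_POINTS, 2, GL_UNSIGNED_INT, (const GLvoid *)(1 * sizeof(GLuint)));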

    -----

    I've never used OpenGL 2 specifically, but I took a look at the 2.1 spec. It looks like the last argument to DrawElements there actually wants an array of indices, unlike 3.3 core, which wants an offset. Here's what section 2.8 of the 2.1 spec says:

    The command 'void DrawElements( enum mode, sizei count, enum type, void *indices );' constructs a sequence of geometric primitives using the count elements whose indices are stored in --indices--.
    Interestingly, it looks like OpenGL 2 wants that argument to be an array. As GClements puts it, in this case the function treats the last argument as a pointer to client memory. That makes sense: you pass an array defined in your program (which resides in what's called the client), and passing an array in C/C++ technically passes a pointer to its first element (a C/C++ element in this sense, not an OpenGL element). I'm not really sure what GClements means about having "no buffer bound to that target", since I've never tried OpenGL 2; in 3.3 core, I've found I must bind a VAO whose state includes a bound element array buffer object. But nonetheless, I can see from the spec that OpenGL 2 expects an array, as shown in the sketch below.
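
    To illustrate, a sketch of that legacy client-memory path (my own example, not from the spec; the data is made up):
    Code :
    GLfloat positions[9] = {0.0f, 0.0f, 0.0f,  0.5f, 0.0f, 0.0f,  0.0f, 0.5f, 0.0f};
    GLuint  indices[3]   = {0, 1, 2};
     
    glEnableClientState(GL_VERTEX_ARRAY);        /* legacy fixed-function vertex array */
    glVertexPointer(3, GL_FLOAT, 0, positions);  /* vertex data in client memory */
    glDrawElements(GL_POINTS, 3, GL_UNSIGNED_INT, indices);  /* indices in client memory */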

    ------

    So yeah, looking at the specs, there does seem to be a difference in DrawElements functionality between OpenGL 2 and OpenGL 3.3 core, probably because of the core profile introduced with OpenGL 3 (OpenGL 2 doesn't have the notion of core or compatibility profiles). I glanced at the OpenGL 3.3 compatibility spec, and it mentions the "array" behavior like the GL 2 spec does. In general, the compatibility profile for 3.3 lets you pass either an array or an offset, depending on whether a buffer object is bound to GL_ELEMENT_ARRAY_BUFFER at draw time. The core profile, aiming to be more modern, doesn't allow the "array" form at all.

    ------

    Overall, from what I'm seeing in the specs: in OpenGL 2 you pass DrawElements an array, while in 3.3 core you must pass an offset. So I can see how passing an array in 3.3 core wouldn't give the same results as in OpenGL 2. In fact, since passing an array really passes a pointer, 3.3 core would just reinterpret that pointer value as a byte offset into the bound element array buffer, which is almost certainly not what you intended.

    And in 3.3 compatibility, I think you're able to do both: pass an array or pass an offset, depending on whether an element array buffer is bound. Hope that helps with the clarification.
