Part of the Khronos Group
OpenGL.org


Thread: Official feedback on OpenGL 4.1 thread

  1. #21
    Advanced Member Frequent Contributor
    Join Date
    Apr 2009
    Posts
    578

    Re: Official feedback on OpenGL 4.0 thread

    I just want to say thank you to the people who made the GL 4.1 specification, and thank you that it happened so quickly. A major thank you for putting separate shader objects in! The following is not a request for features, just a request about the spec's presentation:

    There are a lot of shader stages now; I would love to see a diagram of the resulting "pipeline" in the GL specification, or maybe on the reference card? Something like what GL_ARB_tessellation_shader.txt gives in the answer to Issue (1).

    and again thank you!

  2. #22
    Super Moderator Frequent Contributor Groovounet's Avatar
    Join Date
    Jul 2004
    Posts
    934

    Re: Official feedback on OpenGL 4.0 thread

    If we use transform feedback or glBindAttribLocation (is it really a problem?), and maybe a couple of other linking-related cases, we can't use separate shaders, or only with a big workaround. As I am at SIGGRAPH I haven't had time to take a deep look at everything, so I might have missed something here.

    Quote Originally Posted by Alfonse Reinheart
    I had high expectations for separate shaders, but the extension feels quite limited in a lot of scenarios, so we still have to use a unified program.
    Limited in what way? It even gives you a choice of rendezvous by resource or by name. I don't know how it could be more flexible. Encapsulating combinations in state objects only makes sense.
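
    ("Rendezvous by resource" here means matching stage interfaces by explicit location rather than by variable name, which is what lets separately linked programs be mixed and matched. A sketch in GLSL 4.10 syntax, with made-up variable names:)

```glsl
// Vertex stage (linked into its own separable program):
layout(location = 0) out vec4 v_color;

// Fragment stage (linked into a different separable program):
// receives v_color by location, even though the names differ.
layout(location = 0) in vec4 f_color;
```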

  3. #23
    Administrator Regular Contributor
    Join Date
    Aug 2001
    Location
    NVIDIA, Fort Collins, CO, USA
    Posts
    184

    Re: Official feedback on OpenGL 4.0 thread

    Quote Originally Posted by kRogue

    There are a lot of shaders running around now, I would love to see a diagram of the "pipeline" of such in the GL specification or maybe on the reference card? Something that is in GL_ARB_tessellation_shader.txt in the answer to Issue (1).

    and again thank you!
    Your wish is our command :-) Seriously, here is what I have right now (see attachment). Is this what you are after?

    Barthold
    (with my ARB hat on)

  4. #24
    Senior Member OpenGL Guru
    Join Date
    May 2009
    Posts
    4,948

    Re: Official feedback on OpenGL 4.0 thread

    Quote Originally Posted by Groovounet
    If we use transform feedback or glBindAttribLocation (is it really a problem?), and maybe a couple of other linking-related cases, we can't use separate shaders, or only with a big workaround. As I am at SIGGRAPH I haven't had time to take a deep look at everything, so I might have missed something here.
    I don't see any unpleasant interactions with either of these. The only potential problem is if you want to use glCreateShaderProgram all the time. Since it goes directly from source strings to a linked program object, there's no chance to call any of the pre-link setup functions. However, separate shader objects doesn't require that you use glCreateShaderProgram; it's simply a convenience function for those whose needs are simple enough to allow them to use it.

    It would, however, be great if there were a version of glCreateShaderProgram that didn't perform compiling or linking. It would return some kind of object on which you could call the pre-link setup functions, and then you would call another function to do the compiling and linking.

    Either that, or set up transform feedback parameters in the shader source somehow.
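
    (For reference, the non-convenience path might look like the sketch below. It mirrors roughly what glCreateShaderProgramv does internally, but leaves room for the pre-link calls; the shader source, the attribute name "position", and the varying name "out_position" are made-up placeholders, and a current GL 4.1 context with loaded function pointers is assumed.)

```c
/* Sketch: building a separable program "by hand" so that pre-link
 * state (attribute locations, transform feedback varyings) can be set.
 * GL types and entry points assumed to come from a loader. */
GLuint build_separable_vs(const char *src)
{
    GLuint sh = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(sh, 1, &src, NULL);
    glCompileShader(sh);

    GLuint prog = glCreateProgram();
    glProgramParameteri(prog, GL_PROGRAM_SEPARABLE, GL_TRUE);
    glAttachShader(prog, sh);

    /* Pre-link settings that glCreateShaderProgramv cannot express: */
    glBindAttribLocation(prog, 0, "position");
    const GLchar *varyings[] = { "out_position" };
    glTransformFeedbackVaryings(prog, 1, varyings, GL_INTERLEAVED_ATTRIBS);

    glLinkProgram(prog);
    glDetachShader(prog, sh);
    glDeleteShader(sh);
    return prog;
}
```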

  5. #25
    Super Moderator Frequent Contributor Groovounet's Avatar
    Join Date
    Jul 2004
    Posts
    934

    Re: Official feedback on OpenGL 4.0 thread

    One other thing I wonder: would it be reasonable to release a project with just the shader binaries? For example: if I build an OpenGL 3.3 application with GLSL 3.3 shaders, could we ship just one binary for AMD and one for NVIDIA? Would they still work over time, with newer drivers (and updated GLSL compilers)?

    If not, is the use case more: build once when the software is installed, or rebuild when the shader binary load fails for any reason (updated drivers, updated graphics card, ...)?

  6. #26
    Advanced Member Frequent Contributor
    Join Date
    Apr 2009
    Posts
    578

    Re: Official feedback on OpenGL 4.0 thread

    Yes! That is the kind of diagram I am looking for. This looks familiar though; was it in a PowerPoint or something when GL4 was first released (back in March)?

    Though it is not exactly what I am begging for, it is so close that chances are I am getting greedy:

    What I would like, and the difference I am asking for is really, really tiny, is this [so tiny I feel almost ashamed to ask]:

    For each shader stage:
    explicit arrows (that PDF has this, in the long line broken into four parts on pages 2 and 3), but with the arrows marked as in, out, patch in, patch out, etc. The other bit (and that PDF has this too, to some extent) is "something" to indicate when one of the optional shader stages is not part of a GLSL program. Also, being so shader oriented, a diagram without the compatibility pipeline.

    I'd imagine the picture I am after might take more pages, though, and one can make a pretty strong case that what I am asking for is only a tiny (epsilon) difference from what you just gave. The core of what I am begging for is text on the arrows, where the text is what one writes in GLSL (and to a lesser extent GL).

  7. #27
    Member Regular Contributor
    Join Date
    Apr 2006
    Location
    Irvine CA
    Posts
    299

    Re: Official feedback on OpenGL 4.0 thread

    Quote Originally Posted by Groovounet
    One other thing I wonder: would it be reasonable to release a project with just the shader binaries? For example: if I build an OpenGL 3.3 application with GLSL 3.3 shaders, could we ship just one binary for AMD and one for NVIDIA? Would they still work over time, with newer drivers (and updated GLSL compilers)?

    If not, is the use case more: build once when the software is installed, or rebuild when the shader binary load fails for any reason (updated drivers, updated graphics card, ...)?
    The latter. The binary shader approach in GL 4.1 is not a distribution format. It enables you to cache a compiled shader for re-loading at a later time on the same machine. OpenGL is free to deny that request for any reason, in which case you would need to resubmit the source to compile the shader (and then you could re-query and re-save the binary).
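
    (A sketch of that cache-and-fallback flow, with hypothetical helper names; it assumes a current GL 4.1 context, and that the program object was linked before saving. If try_load_cached fails, the caller recompiles from source and re-saves the binary.)

```c
#include <stdlib.h>
/* GL types and entry points assumed to come from a loader. */

/* Feed a previously saved binary to the driver; the driver may reject
 * it (new driver, new GPU, ...), which shows up as a failed link. */
GLboolean try_load_cached(GLuint prog, GLenum format,
                          const void *blob, GLsizei length)
{
    GLint ok = GL_FALSE;
    glProgramBinary(prog, format, blob, length);
    glGetProgramiv(prog, GL_LINK_STATUS, &ok);
    return (GLboolean)ok;
}

/* Query the binary of a linked program for caching on disk. */
void save_binary(GLuint prog, void **blob, GLsizei *length, GLenum *format)
{
    glGetProgramiv(prog, GL_PROGRAM_BINARY_LENGTH, (GLint *)length);
    *blob = malloc((size_t)*length);
    glGetProgramBinary(prog, *length, NULL, format, *blob);
    /* Caller writes *blob to disk, ideally keyed by GL_VENDOR,
     * GL_RENDERER, and GL_VERSION so stale caches are detected early. */
}
```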

  8. #28
    Member Regular Contributor
    Join Date
    Nov 2003
    Location
    Germany
    Posts
    293

    Re: Official feedback on OpenGL 4.0 thread

    OK, I will be the first to call for updated header files on the registry page. I am itching to try some of the new features.

    -chris

  9. #29
    Member Regular Contributor CrazyButcher's Avatar
    Join Date
    Jan 2004
    Location
    Germany
    Posts
    401

    Re: Official feedback on OpenGL 4.0 thread

    Great work, with separate shaders and binary shaders finally around! Good to see the feature match with DX11 now; I guess the only major thing missing is threaded resource manipulation.

    Someone please update GLEW.

  10. #30
    Member Regular Contributor
    Join Date
    Mar 2001
    Posts
    468

    Re: Official feedback on OpenGL 4.0 thread

    I am building GLEW myself from the SVN. It lacks the 4.1 core functions, and I had to disable ARB_cl_event, as well as manually tweak the debug output callback. I'm fine with that, since I only care about the debug functionality :P

    Also, is there a way to find out whether a context was created with the debug bit set?

    It seems that glGetIntegerv(GL_CONTEXT_FLAGS) will only ever return GL_CONTEXT_FLAG_FORWARD_COMPATIBLE_BIT (0x01), which has a different value than WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB (0x02), so I cannot test for WGL_CONTEXT_DEBUG_BIT_ARB (0x01) either.

    I am currently hacking Qt to create a debug context; it uses GL_CONTEXT_FLAGS to check whether the created context matches what was requested.

    I think this might also be useful in cases where a middleware library has some debugging facilities but doesn't control context creation.

