View Full Version : Uniform block binding points



49er
08-04-2009, 04:20 PM
Hello,

Has anyone had any luck getting uniform block buffer binding points other than 0 to work? I'm mapping each block index to its own binding point via glUniformBlockBinding, with each binding point given its own backing buffer via glBindBufferBase. I've attempted both partial and complete buffer updates through glBufferSubData and glMapBuffer.

Tried with both the 190.38 (3.1) and 190.56 beta (3.1, 3.2) drivers on an NVIDIA G80.

Thanks

LangFox
08-04-2009, 07:49 PM
Yes, I do.



enum UNIFORM_BLOCK_BINDING
{
    UBB_Camera = 0,
    UBB_Light,
    UBB_Mesh,
};

uniformBlock = program->GetUniformBlock("Camera");
if (uniformBlock != NULL)
    program->UniformBlockBinding(uniformBlock->uniformBlockIndex, UBB_Camera);

uniformBlock = program->GetUniformBlock("Light");
if (uniformBlock != NULL)
    program->UniformBlockBinding(uniformBlock->uniformBlockIndex, UBB_Light);

uniformBlock = program->GetUniformBlock("Mesh");
if (uniformBlock != NULL)
    program->UniformBlockBinding(uniformBlock->uniformBlockIndex, UBB_Mesh);



The returned uniformBlock->uniformBlockIndex can be 0/1/2, running with 190.38 (3.1) on a G80. But it fails with 190.56 (3.2), because some function pointers (for example, glBindVertexArray) couldn't be loaded. I don't know why.

49er
08-05-2009, 01:53 AM
Thanks for the input.

Yes, I noticed trouble with VAO in the beta too.

With respect to UBOs, I can create, bind, and otherwise run the gamut of operations without errors of any kind. It's just that the data don't seem to show up in the shader at any block binding point other than 0 (I get either zeros or utter gibberish).
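In case it helps anyone compare notes, the whole path I'm describing amounts to roughly the sketch below. It's an illustration of the calls involved, not code lifted from my project; `prog`, `blockSize`, `data`, and the binding point value 1 are placeholders, and it assumes a live 3.1 context:

```c
GLuint blockIndex = glGetUniformBlockIndex(prog, "Light");
if (blockIndex != GL_INVALID_INDEX)
    glUniformBlockBinding(prog, blockIndex, 1); /* any point other than 0 is what fails */

GLuint ubo;
glGenBuffers(1, &ubo);
glBindBuffer(GL_UNIFORM_BUFFER, ubo);
glBufferData(GL_UNIFORM_BUFFER, blockSize, NULL, GL_DYNAMIC_DRAW);
glBindBufferBase(GL_UNIFORM_BUFFER, 1, ubo);    /* attach the buffer to binding point 1 */
glBufferSubData(GL_UNIFORM_BUFFER, 0, blockSize, data);  /* or glMapBuffer + memcpy */
```

With binding point 0 this all renders as expected; with 1 (or higher) the shader never sees the data, even though glGetError() stays clean throughout.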

sponeil
08-23-2009, 03:41 PM
I'm experiencing the exact same problem, and I have a bit more detailed info to add for anyone who might see something we've missed. I've tested this with a GTS250 on driver versions 190.38 and 190.62. I have wrappers around every GL call that throw an exception if glGetError() returns an error code, and no error code is being set anywhere in my program.

After linking any shader program, I run this code:


// Register and/or assign any uniform blocks this technique uses
int nActiveBlocks = 0;
gl.getProgramiv(this->getHandle(), GL_ACTIVE_UNIFORM_BLOCKS, &nActiveBlocks);
for (int nBlock = 0; nBlock < nActiveBlocks; nBlock++) {
    char szBlock[256];
    int nSize;
    this->getActiveUniformBlockName(nBlock, 256, NULL, szBlock);
    this->getActiveUniformBlockiv(nBlock, GL_UNIFORM_BLOCK_DATA_SIZE, &nSize);
    UniformBufferObject *pUniformBuffer = gl.getManager()->createUniformBlock(szBlock, nSize);
    this->uniformBlockBinding(nBlock, pUniformBuffer->getBindIndex());
}

Right now 3 shader programs get linked, and each runs through this loop. The first time the GL manager sees a uniform block with a specific name, it creates a new UniformBufferObject and automatically assigns it a unique bind index. On subsequent checks, it makes sure the size hasn't changed and returns the existing buffer object. Right now I'm only using one uniform block named "Transform", and I am trying to add another block. In my main game loop, I update the data in the "Transform" buffer and ensure that it is bound to its assigned index:


GL::Uniform::Transform t;
...
GL::UniformBufferObject *pTransform = gl.getManager()->getUniformBlock("Transform", sizeof(GL::Uniform::Transform));
pTransform->updateBlock(t);
pTransform->bindBlock();

When my shader program only declares one uniform block, everything works and renders perfectly. If I define a second uniform block in my shader programs (without using it), I get a blank screen. In the debugger, I see my "Transform" block being assigned index 1 instead of index 0, and this seems to be causing the problem. If I force "pTransform->bindBlock()" to use index 0, it starts working again (even though uniformBlockBinding told the driver to use index 1).

It seems to me like the driver is ignoring the numbers I pass to uniformBlockBinding and is always using binding 0 for all blocks. When I call glGetActiveUniformBlockiv with GL_UNIFORM_BLOCK_BINDING it returns 1, so I know I set it properly, but the shader will only use the values in my uniform buffer if I bind it to index 0.

Can anyone see anything I've missed, or does it seem like a driver bug? Named uniform blocks are useless if you can only use one of them. I wish I could post a small complete program duplicating the problem, but GL 3.1 requires way too much code for that.

Thanks,
Sean

sponeil
08-29-2009, 02:08 PM
I seem to have found the true nature of the problem, and it definitely seems to be a driver bug. I can even rig it so that I can get it to work, but it is kind of a pain (especially when it breaks unexpectedly). Here are the steps that led me to it:

1) I tried various combinations of setting up and using two blocks named Transform and Planet.
2) I always had Transform declared/included first because it's used by every shader. Planet is only used by some shaders.
3) I played around with the block names and order of declaration, and glGetActiveUniformBlockName/glGetUniformBlockIndex always gave me block indices arranged in alphabetical order.
4) However, glUniformBlockBinding() does not respect those indices. It only respects indices that follow the order in which the blocks are declared.
5) If I declare my uniform blocks in alphabetical order, everything works.

Anyone know where I can submit a bug report to the driver developers?
Thanks,
Sean

Dark Photon
08-30-2009, 05:23 PM
Anyone know where I can submit a bug report to the driver developers?
Well, they monitor here so they might even post a request for a repro program to you directly. But here's some info from the Linux driver README.txt:


______________________________________________________________________________

Chapter 29. NVIDIA Contact Info and Additional Resources
______________________________________________________________________________

There is an NVIDIA Linux Driver web forum. You can access it by going to
http://www.nvnews.net and following the "Forum" and "Linux Discussion Area"
links. This is the preferable tool for seeking help; users can post questions,
answer other users' questions, and search the archives of previous postings.

If all else fails, you can contact NVIDIA for support at:
linux-bugs@nvidia.com. But please, only send email to this address after you
have explored the Chapter 7 and Chapter 8 chapters of this document, and asked
for help on the nvnews.net web forum. When emailing linux-bugs@nvidia.com,
please include the 'nvidia-bug-report.log.gz' file generated by the
'nvidia-bug-report.sh' script (which is installed as part of driver
installation).


For Vista driver feedback, see this link (http://www.nvidia.com/object/driverqualityassurance.html).

barthold
08-31-2009, 01:59 PM
Yes, I do.


But it fails with 190.56 (3.2), because some function pointers (for example, glBindVertexArray) couldn't be loaded. I don't know why.

Can you provide me a repro case (source code) I can compile and run to try to reproduce this?

Thanks,
Barthold
with my NVIDIA hat on

barthold
08-31-2009, 02:00 PM
I seem to have found the true nature of the problem, and it definitely seems to be a driver bug. I can even rig it so that I can get it to work, but it is kind of a pain (especially when it breaks unexpectedly). Here are the steps that led me to it:

1) I tried various combinations of setting up and using two blocks named Transform and Planet.
2) I always had Transform declared/included first because it's used by every shader. Planet is only used by some shaders.
3) I played around with the block names and order of declaration, and glGetActiveUniformBlockName/glGetUniformBlockIndex always gave me block indices arranged in alphabetical order.
4) However, glUniformBlockBinding() does not respect those indices. It only respects indices that follow the order in which the blocks are declared.
5) If I declare my uniform blocks in alphabetical order, everything works.

Anyone know where I can submit a bug report to the driver developers?
Thanks,
Sean


Same question Sean. Can you provide me a repro case please?

Thanks,
Barthold
(with my NVIDIA hat on)

sponeil
08-31-2009, 05:03 PM
Same question Sean. Can you provide me a repro case please?

Thanks,
Barthold
(with my NVIDIA hat on)

I can give you full source code that reproduces it if you can point me to a clean OpenGL 3.1 template project I can add it to for submission. Right now my project has over 100 source files in it, and it is linked to customized versions of external libraries like GLEW (which I had to customize because it doesn't support GL 3.1 extensions yet). I'm not at a point where I can hand over the whole project, and it would take too long to just pull out the bare necessities needed to get a GL 3.1 project running.

If you just want to see the shader code, it's simple:


layout(std140) uniform Transform {
// Projection and view matrices
mat4 mProjection;
mat4 mOrtho;
mat4 mView;
};
layout(std140) uniform Planet {
vec4 vPlanetSize;
};


Add these to the top of a shader and try to bind buffers to them. It will not work. Swap the order the two are declared in, and both will work just fine.

ZbuffeR
09-01-2009, 02:00 AM
I can give you full source code that reproduces it if you can point me to a clean OpenGL 3.1 template project I can add it to for submission.

There are 3.0 and 3.1 context creation code samples on the wiki:
http://www.opengl.org/wiki/Tutorials#General
Does that help?

sponeil
09-01-2009, 10:56 AM
There are 3.0 and 3.1 context creation code samples on the wiki:
http://www.opengl.org/wiki/Tutorials#General
Does that help?

The "standalone" tutorials on there don't even mention extension loading, which means they will not compile and run on their own.

The SDL 1.3 tutorials may compile and run ok, but SDL 1.3 hasn't been released yet. In addition to the fact that I really don't like SDL, I don't see a beta or RC download for 1.3 on Windows. It would be a real pain for me to install cvs or svn, download the latest development source code, hunt down every library it's dependent on, get them all to build, and then make sure it actually works. That's never easy to do on Windows, and it would be a larger effort than pulling pieces of my own project out to make something new. ;-)

It would be a lot better if the nVidia SDK (or any web site, really) contained some sample projects using GL 3.1 features without requiring any third-party libraries. Then I could just tweak one and submit it.

Aleksandar
09-01-2009, 01:00 PM
The "standalone" tutorials on there don't even mention extension loading, which means they will not compile and run on their own.

A tutorial for GL 3.1 uses GLEW (the OpenGL Extension Wrangler Library) for extensions. The whole Visual Studio 2008 project can be downloaded from many URLs (the most convenient is probably http://sites.google.com/site/opengltutorialsbyaks/introduction-to-opengl-3-1---tutorial-01). Nothing needs to be installed; all library files are in the project's working folder.

Working with the extensions is not hard at all. The only problem is how to ensure that it will be supported on all target platforms, and what to do if it is not. ;)

P.S. Unfortunately, GLEW 1.5.1 supports only OpenGL 3.0. So, acquiring a pointer to a GL 3.1 function must be done manually (or the GLEW library has to be changed and rebuilt). But acquiring a pointer to a function is quite easy. Take a look at the Extensions section on this page: http://sites.google.com/site/opengltutorialsbyaks/introduction-to-opengl-3-2---tutorial-01
Maybe a sample project on this page will be more useful.

sponeil
09-01-2009, 04:19 PM
A tutorial for GL 3.1 uses GLEW (the OpenGL Extension Wrangler Library) for extensions. The whole Visual Studio 2008 project can be downloaded from many URLs (the most convenient is probably http://sites.google.com/site/opengltutorialsbyaks/introduction-to-opengl-3-1---tutorial-01). Nothing needs to be installed; all library files are in the project's working folder.

Working with the extensions is not hard at all. The only problem is how to ensure that it will be supported on all target platforms, and what to do if it is not. ;)

P.S. Unfortunately, GLEW 1.5.1 supports only OpenGL 3.0. So, acquiring a pointer to a GL 3.1 function must be done manually (or the GLEW library has to be changed and rebuilt). But acquiring a pointer to a function is quite easy. Take a look at the Extensions section on this page: http://sites.google.com/site/opengltutorialsbyaks/introduction-to-opengl-3-2---tutorial-01
Maybe a sample project on this page will be more useful.


Thanks for the link. I will take a look at it. I didn't see a link to the full source on the OpenGL wiki. I only saw code snippets and explanation without any mention of extensions (I skimmed it, so I may have missed it).

I know loading extensions is easy. I've already created my own extension to GLEW 1.5.1 for OpenGL 3.1. However, it is a bit time-consuming to create a new project from scratch and work your way up to a working sample with bound uniform buffers. I have two jobs (full-time commercial programming and part-time game programming) and two young kids at home, so I have practically 0 time to spend on things like this. I'll see what I can do, though.

sponeil
09-01-2009, 07:40 PM
Same question Sean. Can you provide me a repro case please?

Thanks,
Barthold
(with my NVIDIA hat on)

I just emailed a zip to Barthold and explained how to reproduce it. It turned out to be easier to pull apart my own project than to use any of the others. It took more time than I wanted to spend on it, but at least now I have a simple bare-bones OpenGL 3.x project I can use for things like this.

barthold
10-05-2009, 11:23 AM
Driver 190.58 (Windows) and 190.18.05 (Unix) should have fixed this. You can download them here:

http://developer.nvidia.com/object/opengl_3_driver.html

Regards,
Barthold
(with my NVIDIA hat on)

elFarto
10-05-2009, 02:37 PM
Thanks Barthold, it seems to have fixed a few issues I was having.

Regards
elFarto

49er
10-07-2009, 04:59 AM
Yes, thank you very much for the update.