Hi OpenGL forums, this is my first post here! This place has helped me quite a bit in the past via Google searches that land here, so thank you. I hope someone can help with this one. The big picture: I'm implementing Chris Wyman's Adaptive Caustic Mapping algorithm (his page here) as part of a larger project that will combine it with some real-time global illumination algorithms.
Right now, when I run my program, random pixels scattered across the entire screen change to random colors, and some of them keep changing over time. This happens both inside AND outside the actual program window (i.e., pixels in Firefox and even on the desktop itself change). The corrupted pixels persist until I force the window they appear in to redraw. The exact line of code that triggers this contains this call:
glGetQueryObjectiv(queryID, GL_QUERY_RESULT_AVAILABLE, &yesorno);
This call sits in a loop that spins until yesorno becomes true; once it does, I fetch the value I'm after with:
glGetQueryObjectiv(queryID, GL_QUERY_RESULT, &primCount);
Even if I comment out the loop and the availability check entirely, that second line by itself still causes the random pixels to show up (primCount holds the number of primitives written out by transform feedback). The transform feedback section, a few lines above the glGetQueryObjectiv call, is this:
glBeginQuery( GL_TRANSFORM_FEEDBACK_PRIMITIVES_WRITTEN, queryID );
glBeginTransformFeedback( GL_POINTS );
// Go ahead and draw
glDrawArrays( GL_POINTS, 0, inputPrims );
glEndTransformFeedback();
glEndQuery( GL_TRANSFORM_FEEDBACK_PRIMITIVES_WRITTEN );
inputPrims refers to a VBO holding a regular grid of x and y values between 0 and 1. It is created with this code:
glGenBuffers( 1, &genericTraversalStartBuffer );
int resolution = 64;
float *causticStartPoints = (float *)malloc( resolution * resolution * 4 * sizeof( float ) );
for (int i = 0; i < resolution * resolution; i++)
{
    int x = (i % resolution);
    int y = (i / resolution);
    causticStartPoints[4*i+0] = x/(float)resolution;
    causticStartPoints[4*i+1] = y/(float)resolution;
    causticStartPoints[4*i+2] = causticStartPoints[4*i+3] = 0;
}
glBindBuffer( GL_ARRAY_BUFFER, genericTraversalStartBuffer );
glBufferData( GL_ARRAY_BUFFER, resolution*resolution*4*sizeof( float ), causticStartPoints, GL_STATIC_DRAW );
free( causticStartPoints ); // CPU-side copy no longer needed once uploaded
glBindBuffer( GL_ARRAY_BUFFER, 0 );
That code is taken straight from Wyman's code download. inputPrims above is set to the ID of genericTraversalStartBuffer inside its own function; they refer to the same buffer.
primCount does end up containing a value. It is wrong for my purposes, but at least it is something; perhaps solving the random-pixel problem will also solve the incorrect count. Some possibly important information about my rig, a nearly new 17-inch MacBook Pro:
OS: Windows 7
Video Card: AMD Radeon HD 6750M
OpenGL version according to glGetString(GL_VERSION): 4.1.10428 Compatibility Profile Context
GLSL version found with the same method: 4.10
Dev Environment: Visual C++ 2005 Express
Windowing is done with SDL
Thanks for any help!