
Thread: glMemoryBarrier and glDispatchCompute crashes

  1. #1
    Newbie
    Join Date: Jun 2018
    Posts: 1

    glMemoryBarrier and glDispatchCompute crashes

    Hello everyone,

    I'm experimenting with compute shaders, and my program crashes when I call glMemoryBarrier() with an access violation at 0x00000000. I've searched a bit for what could cause this, but I can't figure it out. If I comment out the glMemoryBarrier() call, the same crash happens on glDispatchCompute() instead.

    I've created a compute shader class to test this.

    In my main I have the GLFW and glad initialisation, followed by the construction of my compute shader:

    Code :
    GLFWwindow * mainWindow = nullptr;
    glfwInit();
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 4);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 0);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
    glfwWindowHint(GLFW_RESIZABLE, GL_FALSE);
    mainWindow = glfwCreateWindow(800, 800, "OpenGL", nullptr, nullptr);

    gladLoadGL();

    // Check for Valid Context
    if (mainWindow == nullptr) {
        fprintf(stderr, "Failed to Create OpenGL Context");
        throw EXIT_FAILURE;
    }

    // Create Context and Load OpenGL Functions
    glfwMakeContextCurrent(mainWindow);

    cerr << "Testing compute shader class" << endl;

    ComputeShader ComputeS("ComputeShaderTest.glsl");
    ComputeS.Compute();
    cerr << "Was the execution successful? " << endl;
    string a;
    cin >> a;

    In my compute shader class, I compile and link the shader:

    Code :
    void ComputeShader::Init(string filepath)
    {
        // Final step
        if (!CompileAndLink(filepath)) // If the compilation goes wrong
        {
            cerr << "Program threw" << endl;
            string a;
            cin >> a;
            throw(-15);
        }

        // Generate buffers
        // Input buffer
        cerr << "Populating input buffer ... " << endl;
        for (unsigned int i = 0; i < 32; i++) _buff.inputBuffer[i] = i;
        cerr << "Input Buffer generating, binding and filling ..." << endl;

        glGenBuffers(1, &_information.inputBuffer);
        glBindBuffer(GL_SHADER_STORAGE_BUFFER, _information.inputBuffer);
        glBufferData(GL_SHADER_STORAGE_BUFFER, sizeof(unsigned) * _buff.inputBuffer.size(), &_buff.inputBuffer[0], GL_DYNAMIC_DRAW);

        // Output buffer
        cerr << "Output Buffer generating, binding and filling ..." << endl;
        glGenBuffers(1, &_information.outputBuffer);
        glBindBuffer(GL_SHADER_STORAGE_BUFFER, _information.outputBuffer);
        glBufferData(GL_SHADER_STORAGE_BUFFER, sizeof(unsigned) * _buff.outputBuffer.size(), NULL, GL_STREAM_DRAW); // Not sure about GL_STREAM_DRAW, more info at: https://www.khronos.org/registry/OpenGL-Refpages/gl4/html/glBufferData.xhtml

        cerr << "Choosing Memory Barrier..." << endl;
        glBindBuffer(GL_SHADER_STORAGE_BUFFER, 0); // <-- Not sure about this one, probably a mistake

        glMemoryBarrier(GL_ALL_BARRIER_BITS); // HERE IS WHERE IT CRASHES
    }


    bool ComputeShader::CompileAndLink(string filepath)
    {
        _information.shaderID = glCreateShader(GL_COMPUTE_SHADER);
        if (!_information.shaderID)
        {
            if (_debug)
                cerr << "Error creating shader" << endl;

            return false;
        }

        const GLchar * shaderProgram = fileToChar(filepath); // Load the shader source into memory
        glShaderSource(_information.shaderID, 1, &shaderProgram, NULL);
        glCompileShader(_information.shaderID);

        if (!CheckCompile())
            return false;

        // Creating the program
        _information.programID = glCreateProgram();
        glAttachShader(_information.programID, _information.shaderID);
        glLinkProgram(_information.programID);

        return true;
    }


    I can't spot my error, but I suspect it may be because I'm not compiling or linking the shader properly?
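    For what it's worth, I don't check the link status yet. Something like this is what I had in mind (just a sketch, reusing the same _information struct and assuming <vector> is included):

    Code :
    // Sketch: link-status check to add right after glLinkProgram().
    GLint linked = GL_FALSE;
    glGetProgramiv(_information.programID, GL_LINK_STATUS, &linked);
    if (linked != GL_TRUE)
    {
        GLint logLength = 0;
        glGetProgramiv(_information.programID, GL_INFO_LOG_LENGTH, &logLength);
        vector<GLchar> log(logLength > 1 ? logLength : 1);
        glGetProgramInfoLog(_information.programID, (GLsizei)log.size(), NULL, log.data());
        cerr << "Link failed: " << log.data() << endl;
    }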

    Any help would be appreciated!

    Thanks!

  2. #2
    Senior Member OpenGL Lord
    Join Date: May 2009
    Posts: 6,031
    Here's the thing. The `glMemoryBarrier` call, in that location, serves no purpose. You haven't actually performed any writes that require such a barrier, so it doesn't accomplish anything.
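
    For reference, the usual place for that kind of barrier is between the dispatch that writes an SSBO and whatever reads the results back, roughly like this (a sketch; the names programID, inputBuffer, outputBuffer, and results are illustrative, and the binding points must match your shader's layout qualifiers):

    Code :
    // Sketch: typical dispatch + barrier + readback ordering.
    glUseProgram(programID);
    glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 0, inputBuffer);  // binding = 0 in the shader
    glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 1, outputBuffer); // binding = 1 in the shader
    glDispatchCompute(32, 1, 1); // the shader writes outputBuffer

    // Make those shader writes visible to the buffer readback below:
    glMemoryBarrier(GL_SHADER_STORAGE_BARRIER_BIT);

    glBindBuffer(GL_SHADER_STORAGE_BUFFER, outputBuffer);
    glGetBufferSubData(GL_SHADER_STORAGE_BUFFER, 0, 32 * sizeof(unsigned), results);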

    But at the same time, that doesn't mean it should crash. It seems like something hasn't been properly initialized. Are you sure that all OpenGL functions have been properly loaded? `glMemoryBarrier` is the first OpenGL 4.2 function you call; do you have a 4.2-capable context? After all, you only require 4.0, even though your code calls 4.2 functions.
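
    A quick way to check, sticking with the GLFW + glad 1.x setup from your snippet (a sketch; note that the context has to be current before the loader runs, which is not the order your code uses):

    Code :
    // Sketch: load function pointers only after the context is current,
    // then see what version the driver actually gave you.
    glfwMakeContextCurrent(mainWindow);

    if (!gladLoadGL()) {
        fprintf(stderr, "Failed to load OpenGL functions\n");
        return EXIT_FAILURE;
    }

    GLint major = 0, minor = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);
    glGetIntegerv(GL_MINOR_VERSION, &minor);
    fprintf(stderr, "Got OpenGL %d.%d\n", major, minor);

    // glMemoryBarrier is core only in 4.2 (and compute shaders in 4.3),
    // so its loaded pointer can legitimately be null on an older context:
    if (glMemoryBarrier == NULL)
        fprintf(stderr, "glMemoryBarrier was not loaded\n");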
