
Thread: glDrawArrays() Seg Fault

  1. #11
    Senior Member OpenGL Pro
    Join Date
    Jan 2007
    Posts
    1,743
    Quote Originally Posted by whiack
    I tried to load glFinish() but I cannot load it with wglGetProcAddress or GetProcAddress(opengl32, procName). Where can I get glFinish()?
    Link to opengl32.lib and #include <gl/gl.h>; you do not need to use either wglGetProcAddress or GetProcAddress.

    From the way this thread is going, it really reads as though you're just making things difficult for yourself. Before you consider porting an OpenGL program to Windows, you should learn how OpenGL actually works on Windows. A tip: if your code crashes and you blame the OS, it's probably not the OS; it's probably your code.

    Some helpful links:
    https://www.opengl.org/wiki/Getting_Started
    https://www.opengl.org/wiki/Platform_specifics:_Windows

    For what it's worth, I suspect that you're trying to load function pointers manually but with the wrong DLL calling convention. But it's virtually impossible to help you because you're drip-feeding information: what you give us is incomplete and you seem to have your own wrapper around things.
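
    To illustrate, this is roughly what the two loading paths look like on Windows (just a sketch; the typedef and pointer names here are made up, but APIENTRY and wglGetProcAddress are the real pieces):
    Code :
    #include <windows.h>
    #include <GL/gl.h>     // OpenGL 1.1 declarations; link against opengl32.lib
     
    // Core 1.1 entry points (glFinish, glDrawArrays, ...) are exported directly by
    // opengl32.dll and are called like any other linked function -- wglGetProcAddress
    // returns NULL for them. Only post-1.1 / extension functions are loaded at runtime,
    // and the pointer typedef must use APIENTRY (__stdcall on 32-bit Windows); a wrong
    // calling convention corrupts the stack and tends to crash later, e.g. in a draw call.
     
    typedef void (APIENTRY *MY_PFNGLBINDBUFFERPROC)(GLenum target, GLuint buffer);
     
    static MY_PFNGLBINDBUFFERPROC my_glBindBuffer;   // illustrative name only
     
    static int LoadGLExtensions(void)   // must be called while a GL context is current
    {
        my_glBindBuffer = (MY_PFNGLBINDBUFFERPROC)wglGetProcAddress("glBindBuffer");
        return my_glBindBuffer != NULL;
    }
    If the Linux side loads everything through one mechanism (dlsym or glXGetProcAddress), the Windows port still has to special-case the 1.1 functions, because wglGetProcAddress generally returns NULL for those.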

  2. #12
    Junior Member Newbie
    Join Date
    May 2016
    Posts
    18
    No, I didn't use GL_ARRAY_BUFFER.

    I solved this problem by turning NVIDIA's "Threaded optimization" setting off.

  3. #13
    Junior Member Newbie
    Join Date
    May 2016
    Posts
    18
    Yes, the project has a lot of files and I'm responsible for the WGL part of it, so I need to use GetProcAddress since that's how the Linux version loads its functions. But thanks a lot.

  4. #14
    Senior Member OpenGL Pro
    Join Date
    Jan 2007
    Posts
    1,743
    No, you didn't solve the problem.

    You found a workaround that works for you right now. You have no guarantee that it will work for anybody else and the underlying cause is still in your code.

    Let me emphasise this: glDrawArrays, on NVIDIA, under Windows, when set up and used correctly, works just fine without any hacky workarounds.

    You still have a problem in your code, and it's going to blow up in your face at some point in the future.

  5. #15
    Junior Member Newbie
    Join Date
    May 2016
    Posts
    18
    But I'm still not sure where the problem is.

    These are the causes I've seen suggested in other people's answers:
    A VBO setup problem, but I'm not using VBOs.
    Out-of-bounds access; this might be the case, but I'm on Windows and can't use Valgrind. Which memory debugging tool is best on Windows?
    In glVertexAttribPointer(0, 2, DATA_FORMAT_FLOAT32, false, 2 * sizeof(pos[0]), data), data should be 0, but I have tried that; when data is 0, a GL_ARRAY_BUFFER is supposed to be bound. Then there is a loop, but the program is not using VBOs and there is no glBindBuffer().

  6. #16
    Member Regular Contributor
    Join Date
    May 2016
    Posts
    451
    Quote Originally Posted by whiack
    ... the program is not using VBOs and there is no glBindBuffer().
    If there is no glBindBuffer(...), how can glDrawArrays(...) know what array of data it should draw???
    That makes no sense! There must be a glBindBuffer() somewhere, at least when you set up your vertex array.

    example
    Code :
    GLuint vao, vbo, shader;	// handles, assumed to be declared at file scope
     
    void Init()
    {
    // create shader program here and store its handle in "shader"
     
    	float vertices[] = {
    		// x    y     z             u     v             normal
    		0.0f, 0.0f, 0.0f,		0.0f, 1.0f,		0.0f, 0.0f, 1.0f,
    		1.0f, 0.0f, 0.0f,		1.0f, 1.0f,		0.0f, 0.0f, 1.0f,
    		0.0f, 1.0f, 0.0f,		0.0f, 0.0f,		0.0f, 0.0f, 1.0f,
    	};
     
    	glGenVertexArrays(1, &vao);	// note: glGen* functions take the ADDRESS of the handle
    	glBindVertexArray(vao);
     
    	glGenBuffers(1, &vbo);
    	glBindBuffer(GL_ARRAY_BUFFER, vbo);
    	glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
     
    // BEFORE you specify any attribute pointers, there must be a buffer bound from which to read the data !!!
    // no GL_ARRAY_BUFFER, no data source
    	glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(float) * 8, (void*)(sizeof(float) * 0));
    	glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, sizeof(float) * 8, (void*)(sizeof(float) * 3));
    	glVertexAttribPointer(2, 3, GL_FLOAT, GL_FALSE, sizeof(float) * 8, (void*)(sizeof(float) * 5));
    // now all 3 attributes use the same buffer "vbo", because it was bound when the pointers were specified
     
    	glBindBuffer(GL_ARRAY_BUFFER, 0);
     
    	glBindVertexArray(0);
    }
     
     
    void render()
    {
    	glUseProgram(shader);
     
    	glBindVertexArray(vao);
     
    // when you enable an attribute, how does openGL "know" from where to get the data ???
    // answer: see above
    	glEnableVertexAttribArray(0);
    	glEnableVertexAttribArray(1);
    	glEnableVertexAttribArray(2);
     
    	glDrawArrays(GL_TRIANGLES, 0, 3);
     
    	glDisableVertexAttribArray(0);
    	glDisableVertexAttribArray(1);
    	glDisableVertexAttribArray(2);
    	glBindVertexArray(0);
     
    	glUseProgram(0);
    }

  7. #17
    Senior Member OpenGL Guru
    Join Date
    Jun 2013
    Posts
    2,574
    Quote Originally Posted by john_connor
    If there is no glBindBuffer(...), how can glDrawArrays(...) know what array of data it should draw???
    That makes no sense! There must be a glBindBuffer() somewhere, at least when you set up your vertex array.
    In the compatibility profile, attribute arrays can be stored either in buffer objects or in client memory. If no buffer is bound to GL_ARRAY_BUFFER, the data argument is treated as a pointer to client memory (that's why it's a void* rather than an integer).
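    Something like this, for example (just a sketch, assuming a compatibility profile with the default vertex array object; the names are illustrative):
    Code :
    // No buffer is bound to GL_ARRAY_BUFFER, so the last argument of
    // glVertexAttribPointer is interpreted as a pointer into client memory.
    static const float pos[] = {
        0.0f, 0.0f,
        1.0f, 0.0f,
        0.0f, 1.0f,
    };
     
    void drawClientSide()
    {
        glUseProgram(shader);                 // "shader" assumed to be set up elsewhere
     
        glBindBuffer(GL_ARRAY_BUFFER, 0);     // make sure no VBO is bound
        glEnableVertexAttribArray(0);
        glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 2 * sizeof(pos[0]), pos);
     
        glDrawArrays(GL_TRIANGLES, 0, 3);     // the data is read from "pos" at draw time
     
        glDisableVertexAttribArray(0);
        glUseProgram(0);
    }
    The pointer is dereferenced when you draw, so if it's invalid or the array is shorter than what the first/count arguments of glDrawArrays imply, the read goes out of bounds in your own address space and you get exactly this kind of segfault. In a core profile this client-memory path isn't available at all.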
