Transform existing CUDA OpenGL into normal OpenGL

Hi,
I’m very new to OpenGL. I’ve found a CUDA Mandelbrot set example where OpenGL is used to show the result on the display. I now want to write the same application, but without CUDA. The idea is to be able to switch between CPU and GPU at runtime.
But I don’t know how to transform the existing CUDA OpenGL calls so that I can use them for my CPU calculation.

The buffer and texture are initialised as follows:


GLint bsize;

glGenBuffers(1, &buffer);
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, buffer);
glBufferData(GL_PIXEL_UNPACK_BUFFER, Bpp * sizeof(unsigned char) * width * height, NULL, GL_STREAM_DRAW);
glGetBufferParameteriv(GL_PIXEL_UNPACK_BUFFER, GL_BUFFER_SIZE, &bsize);

if ((GLuint)bsize != (Bpp * sizeof(unsigned char) * width * height))
{
    printf("Buffer object (%d) has incorrect size (%d).\n", (unsigned)buffer, (unsigned)bsize);
    cudaThreadExit();
    exit(-1);
}

glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);

cudaGLRegisterBufferObject(buffer);


glGenTextures(1, &texid); 
glBindTexture(GL_TEXTURE_2D, texid);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height,  0, GL_RGB, GL_UNSIGNED_BYTE, NULL);  
glBindTexture(GL_TEXTURE_2D, 0);

glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glPixelStorei(GL_PACK_ALIGNMENT, 1);
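
(Not shown above, but I assume the display loop then copies the buffer into the texture each frame, roughly like this:)

glBindBuffer(GL_PIXEL_UNPACK_BUFFER, buffer);
glBindTexture(GL_TEXTURE_2D, texid);
// With a PBO bound to GL_PIXEL_UNPACK_BUFFER, the last argument of
// glTexSubImage2D is a byte offset into the buffer, not a host pointer.
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, 0);
glBindTexture(GL_TEXTURE_2D, 0);
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);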

CUDA then uses it in the following way:


cudaGLMapBufferObject((void**)&data, buffer);

// Call the CUDA host function
calculateHost( data, centerX, centerY, zoom, width, height );

cudaGLUnmapBufferObject(buffer);

data is a pointer to unsigned char.

I want to give my CPU function the same parameters as the CUDA host function (unsigned char* data, centerX, centerY, zoom, width, height).

But I don’t know how to code it. :frowning: As far as I can tell, the variable data is the problem.
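
The calculation itself would look roughly like this (just a sketch; I’m assuming Bpp is 3 to match the GL_RGB texture, and the coordinate mapping and coloring are placeholders, not the exact math from the CUDA kernel):

// CPU version of the Mandelbrot calculation, writing RGB bytes into data.
void calculateHost(unsigned char* data, float centerX, float centerY,
                   float zoom, int width, int height)
{
    const int maxIter = 256;
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            // Map the pixel to a point in the complex plane.
            float cr = centerX + (x - width  / 2.0f) * zoom;
            float ci = centerY + (y - height / 2.0f) * zoom;
            float zr = 0.0f, zi = 0.0f;
            int iter = 0;
            while (zr * zr + zi * zi < 4.0f && iter < maxIter) {
                float tmp = zr * zr - zi * zi + cr;
                zi = 2.0f * zr * zi + ci;
                zr = tmp;
                ++iter;
            }
            // Simple grayscale coloring from the escape count.
            unsigned char c = (unsigned char)(255 * iter / maxIter);
            unsigned char* p = data + (y * width + x) * 3;
            p[0] = c;  // R
            p[1] = c;  // G
            p[2] = c;  // B
        }
    }
}

What I’m missing is how to get a valid data pointer without CUDA.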

You need to replace cudaGLMapBufferObject with glMapBuffer.
This function will give you the address of the buffer in host memory.

Thank you very much!
But there’s a new problem: it doesn’t seem to work properly.

The new code is:


data = (unsigned char *) glMapBuffer( GL_PIXEL_UNPACK_BUFFER, GL_READ_WRITE );

calculateHost( data, centerX, centerY, zoom, width, height );

glUnmapBuffer( GL_PIXEL_UNPACK_BUFFER );

data now points to 0x00000000 and I don’t know why.

Did you bind the buffer before that?
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, buffer);
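
Putting it together, the whole CPU path would look roughly like this (untested sketch; GL_WRITE_ONLY should be enough since you only fill the buffer, and it can be faster than GL_READ_WRITE):

glBindBuffer(GL_PIXEL_UNPACK_BUFFER, buffer);

// Map the buffer into host memory; returns NULL on failure.
data = (unsigned char *) glMapBuffer(GL_PIXEL_UNPACK_BUFFER, GL_WRITE_ONLY);
if (data == NULL) {
    printf("glMapBuffer failed, GL error 0x%x\n", (unsigned)glGetError());
    exit(-1);
}

calculateHost(data, centerX, centerY, zoom, width, height);

glUnmapBuffer(GL_PIXEL_UNPACK_BUFFER);
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);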

Thank you so much! :slight_smile: Great!