Problems allocating a 1 GB 3D texture

Hi, everyone! I'm doing some work on volume rendering. I want to allocate a 3D luminance texture of 1024x1024x1024 uchar, but the allocation always fails.
By calling glGetError() after glTexImage3D(…), I get error code 1285 (0x0505), which means "out of memory".
However, my card is an NVIDIA Quadro 4800 with 1536 MB of memory, larger than the texture size above (1 GB). The driver version is 296.88, and GLEW is the latest version, 1.8.
The code that allocates the texture is as follows:


	glBindTexture(GL_TEXTURE_3D, texVoxels);
	glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
	glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
	glTexImage3D(GL_TEXTURE_3D, 0, GL_LUMINANCE, volumeSize.x, volumeSize.y, volumeSize.z, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, voxels);

	int err = glGetError();
	printf("%d
", err);

What could be the problem here?

What do you get for glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE, …)?

It’s important to realise here that video RAM allocations for resources don’t work quite the same way as a malloc call; the amount of memory you have is only one factor, and whether resource creation succeeds or fails can be influenced by hardware capabilities too.
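For reference, here is a minimal way to query that limit (not from the original post; the variable name is just for illustration):

	GLint maxSize3D = 0;
	glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE, &maxSize3D);
	printf("GL_MAX_3D_TEXTURE_SIZE = %d\n", maxSize3D);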

Thanks for the reply. :slight_smile:
glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE, dim) returns 2048.
BTW, I also do some work with CUDA, and there's no problem when I allocate a 2048³ uchar texture there.

Problem still unsolved…:sorrow::sorrow:

GL_LUMINANCE is rather vague. You are asking the driver to choose a format for you. Try GL_LUMINANCE8.

glTexImage3D(GL_TEXTURE_3D, 0, GL_LUMINANCE8, volumeSize.x, volumeSize.y, volumeSize.z, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, voxels);

I have seen this type of memory issue before. For example,

and a bunch of others that I can’t find right now.

Also, when you ask GL for the maximum texture size supported, it doesn't mean you can create a 2048 x 2048 x 2048 texture.
You could probably create a 2048 x 512 x 64 texture. Unfortunately, GL doesn't have a good error-reporting mechanism to tell us what the reason is.
Perhaps you can give the "debug" extensions a try:
GL_ARB_debug_output
GL_AMD_debug_output
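
Not from the original replies, but roughly what hooking up GL_ARB_debug_output looks like with GLEW; this is a sketch, the exact callback signature (const-ness of userParam) varies between header versions, and you generally need a debug context for messages to be delivered reliably:

	// File-scope callback; prints whatever the driver reports.
	void APIENTRY onGLDebugMessage(GLenum source, GLenum type, GLuint id, GLenum severity,
	                               GLsizei length, const GLchar *message, GLvoid *userParam)
	{
	    printf("GL debug: %s\n", message);
	}

	// After context creation and glewInit():
	if (GLEW_ARB_debug_output)
	{
	    glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS_ARB);   // report on the offending call
	    glDebugMessageCallbackARB((GLDEBUGPROCARB)onGLDebugMessage, NULL);
	}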

Thanks for the reply. :slight_smile:
I used GL_LUMINANCE8 instead, but it still doesn't work.

BUT!!! The link you posted does help a lot!
I used to compile the code in Win32 mode. When I switched to building and running in x64 mode, it finally WORKS!! And it works with either GL_LUMINANCE or GL_LUMINANCE8.
It seems the application can't address video memory larger than roughly 600 MB from a Win32 build.
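
Not part of the original thread, but the likely explanation is that a Win32 process only gets about 2 GB of user address space, and the driver has to stage the 1 GB of client data inside it, so the upload can fail long before video memory runs out. A quick sanity check for which mode a binary was built in:

	// 4 bytes means a 32-bit (Win32) build, 8 bytes means a 64-bit (x64) build.
	printf("pointer size: %u bytes\n", (unsigned)sizeof(void *));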

Anyway, thanks a lot. :slight_smile: