
glTexImage3D limit - alternative?



jacktca
11-17-2011, 03:28 PM
I inherited a program which uses glTexImage3D to render and manipulate a 3D image. It works fine. The only problem is that there is a limit of 512x512x512 (width x height x depth), roughly 1.35 GB. When I render an image that is 768x960 I have to reduce the depth. When I limit the depth, topographic lines appear that foul up the image, and when I exceed the limit by setting a depth closer to 512, a white box appears instead of the graphics.

My question is how do I get around this glTexImage3D limit? Is there an alternative that I can use to do 3D rendering and manipulating?

Alfonse Reinheart
11-17-2011, 04:31 PM
Only problem is that there is a limit of 512x512x512 (widthxheightxdepth) = roughly 1.35GB.

512*512*512 at 4 bytes per pixel should only be 512 MB or so, not 1.35 GB.


My question is how do I get around this glTexImage3D limit?

You can try rendering to multiple 3D textures. Though displaying this will be tricky, depending on how you render them.

mhagain
11-18-2011, 05:56 AM
The limit is imposed by your graphics hardware, not by OpenGL. If your hardware is physically incapable of handling a larger texture then there is no easy way around it - you'll just need to use smaller textures.

1.35 GB for a 512x512x512 texture sounds suspiciously like you're creating a 64-bit texture with mipmaps, so there may be scope for reducing the texture memory footprint there, but you won't be able to increase the max dimensions.

jacktca
11-22-2011, 12:38 PM
Right. The input is unsigned short, i.e. 2 bytes per component. Sorry I didn't mention that.

jacktca
11-22-2011, 12:52 PM
I have never heard of mipmaps. I'm feeding glTexImage3D an array of unsigned short values, see below.

unsigned short *data;

data is read in from a file...

glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA8, WIDTH, HEIGHT, depth, 0, GL_RGBA, GL_UNSIGNED_SHORT, data);

WIDTH=512
HEIGHT=512
depth=512

Anything more than 512 and a white cube comes up instead of an image.

This 512**3 limit exists on a computer with an NVIDIA Quadro 5600 with 1.5GB RAM as well as on an NVIDIA GeForce 560 with 2GB RAM, so the video card and its RAM do not appear to be the deciding factor.

So anyway are there any alternatives to glTexImage3D?

Alfonse Reinheart
11-22-2011, 01:34 PM
So anyway are there any alternatives to glTexImage3D?

*ahem*: "You can try rendering to multiple 3D textures. Though displaying this will be tricky, depending on how you render them."

That's about it. If your OpenGL driver doesn't let you make bigger textures, then you're not getting bigger textures. Your only alternative is to work around it by creating multiple textures.

jacktca
11-22-2011, 04:46 PM
Oh, wow, that's interesting. So I might be limited by my NVIDIA OpenGL driver? Can anyone suggest a manufacturer/card/driver which would not have the same limitation?

By the way, I changed the input to GL_BYTE instead of GL_UNSIGNED_SHORT. No difference. The 512**3 limit still applies.

Alfonse Reinheart
11-22-2011, 05:34 PM
an NVIDIA GeForce 560 with 2GB RAM.

Really? OpenGL 4.1 (which the 560 supports) is required to have a GL_MAX_3D_TEXTURE_SIZE of no less than 2048. What do you get when you query that value?

BionicBytes
11-23-2011, 03:35 AM
Both my NVIDIA GeForce 8600M (laptop chipset) and desktop Radeon 4800 have a GL_MAX_3D_TEXTURE_SIZE of 2048, as reported by the OpenGL 3.3 Compatibility profile.
Have you updated your drivers?

mobeen
11-23-2011, 05:06 AM
Does your data contain 4 bytes per voxel? In my experience, medical datasets usually have 1-2 bytes per voxel, and I usually load them with something like this:


glTexImage3D(GL_TEXTURE_3D, 0, GL_INTENSITY, WIDTH, HEIGHT, depth, 0, GL_LUMINANCE, GL_UNSIGNED_SHORT, data);

rather than a GL_RGBA internal format. Could you double-check your dataset? Maybe it is like this.

Another thing you should do is check the error bit before and after the call to glTexImage3D, like this:

GLenum error = glGetError();
glTexImage3D(GL_TEXTURE_3D, 0, GL_INTENSITY, WIDTH, HEIGHT, depth, 0, GL_LUMINANCE, GL_UNSIGNED_SHORT, data);
error = glGetError();
...

and tell us the error code that you get.

jacktca
11-23-2011, 04:23 PM
I tried mobeen's INTENSITY and LUMINANCE settings. Screen comes out black when I use them. Sorry.

My GL_MAX_3D_TEXTURE_SIZE, queried with glGet, is 2048.

However, does that mean 2048x2048x2048 (width x height x depth)?
I believe that limit only applies to width and height; according to the documentation it does not apply to depth. My images are 768x960 (width x height). When I specify a depth of anything larger than about 200, the white cube appears instead of a nice 3D image that can be manipulated.

My drivers are the OpenGL drivers that NVIDIA provides, compatible with OpenGL 4.

Alfonse Reinheart
11-23-2011, 05:09 PM
It does not apply to depth according to the documentation.

According to what documentation? Provide a link.

mobeen
11-24-2011, 12:15 AM
My images are 768x960 widthxheight

Hmm, this tells me that you do not have a typical dataset like the raw CT/MRI volumes we usually see dumped into a binary file. I suggest you round the width and height up to the nearest power of 2 (1024 in this case).

Some more things to ask you, jacktca:
1) What does glGetError return before and after the call to glTexImage3D?
2) Could you show us the texture parameters you are currently using for the 3D texture? Maybe you are specifying something wrong.
3) If you are using a shader, how are you passing this texture to it?