About the number of texture image units

Hi everyone,

I’m running Leopard 10.5.5. My application needs to use as many texture image units as possible. However, according to the information on Apple’s capabilities page http://developer.apple.com/graphicsimaging/opengl/capabilities/index.html , it seems they have “hard-coded” the number of texture image units (MAX_TEXTURE_IMAGE_UNITS_ARB) to 16. Is there any way to overcome this limitation? (I know for sure that my graphics card supports 32 texture image units.)

Best.

In the link you sent, there is:
"
Notes

* This data describes functionality only. Actual rendering results may differ across renderers with identical reported capabilities; always verify your results on the real hardware.

"

So, what do you actually get on your hardware?

Also, you’d better query GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS_ARB instead of GL_MAX_TEXTURE_IMAGE_UNITS_ARB:


GLint value;
glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS_ARB, &value); /* units across all shader stages */
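If it helps, here’s a complete little GLUT program that prints the fixed-function and shader limits side by side (just a sketch; header paths differ by platform, and the ARB enums share values with their GL 2.0 core equivalents):

#include <stdio.h>
#ifdef __APPLE__
#include <GLUT/glut.h>
#include <OpenGL/glext.h>
#else
#include <GL/glut.h>
#include <GL/glext.h>
#endif

int main(int argc, char **argv)
{
    GLint ff = 0, frag = 0, combined = 0;

    /* glGetIntegerv needs a current GL context, so make a throwaway window. */
    glutInit(&argc, argv);
    glutCreateWindow("texture unit limits");

    glGetIntegerv(GL_MAX_TEXTURE_UNITS, &ff);                          /* fixed function      */
    glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS_ARB, &frag);              /* fragment shaders    */
    glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS_ARB, &combined); /* all shader stages   */

    printf("GL_MAX_TEXTURE_UNITS                    = %d\n", (int)ff);
    printf("GL_MAX_TEXTURE_IMAGE_UNITS_ARB          = %d\n", (int)frag);
    printf("GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS_ARB = %d\n", (int)combined);
    return 0;
}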

Hi,

I checked GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS_ARB, GL_MAX_TEXTURE_IMAGE_UNITS, and GL_MAX_TEXTURE_IMAGE_UNITS_ARB. The result is the same for all three: 16. I’m using a GeForce 8800 GT 512MB. According to this page (http://www.gpureview.com/GeForce-8800-GT-card-544.html), the card is listed as having 56 (64) texture units. I installed Gentoo Linux with the NVIDIA 177.80 driver on the same machine, ran the same OpenGL code, and got 32. This is quite strange to me, since OpenGL is supposed to be deeply integrated into Mac OS (if I’m not mistaken). Do you have any suggestions?

Best.

http://bugreport.apple.com/

We ran into this too. The advertised limit is what you can reach with fixed-function texturing; I think you can get more if you are all-shader.

Rob, you are talking about GL_MAX_TEXTURE_UNITS, which applies to fixed-function texturing only.

cvision is talking about GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS and GL_MAX_TEXTURE_IMAGE_UNITS, which are the all-shader limits.

The current driver limit really is 16, even though the hardware can support 32. Please file a bug, as OneSadCookie suggested.

As for image units past 32: I don’t know where gpureview.com gets its numbers, but how do you propose binding an image? There are no GL enums defined past GL_TEXTURE31, so there is no way for you to bind a texture or set a sampler uniform to anything greater than 31.
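To make that concrete, the usual binding pattern looks something like this (a sketch; tex, sampler_loc, and unit_count are made-up names):

/* Bind unit_count textures to successive image units and point each
 * sampler uniform at its unit. */
for (int i = 0; i < unit_count; ++i) {
    glActiveTexture(GL_TEXTURE0 + i);     /* named enums stop at GL_TEXTURE31     */
    glBindTexture(GL_TEXTURE_3D, tex[i]);
    glUniform1i(sampler_loc[i], i);       /* a sampler uniform holds the unit index */
}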

Also, if you don’t mind my asking: what are you doing that needs more than 16 units?

I need to visualize multiple volumes at once, so a larger number of texture units helps a lot. Maybe I should wait a couple of months and then try again on the Mac.

Do you mind expanding on that a little bit?

Traditional volume visualization is usually done with 3D textures. Do you actually have more than 16 textures you need to sample from for a single fragment?

Yeah, I think you’re right. However, there are cases where you implement your own shading model; in my case, the number of textures used to produce the light map is higher than 16. That’s why I need the card’s full complement of texture units. At the moment I’m running the framework on Linux, where it needs more than 16 texture units for the task (maybe I should modify it to be more efficient about how many texture units it requires).

So, what happens when your algorithm grows and you need 100 textures? At some point, you need to decompose your rendering into multiple passes. Once you’ve done that, you can run on hardware with 32, 16, or even 2 units.
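As a rough sketch of what that decomposition could look like, assuming an additive accumulation model (tex, tex_count_loc, total_textures, and draw_fullscreen_quad are all made-up names):

/* Process the texture list in batches that fit the unit limit and
 * sum the per-pass results with additive blending. */
GLint max_units;
glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS_ARB, &max_units);

glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE);              /* accumulate pass results */

for (int first = 0; first < total_textures; first += max_units) {
    int count = total_textures - first;
    if (count > max_units)
        count = max_units;

    for (int i = 0; i < count; ++i) {
        glActiveTexture(GL_TEXTURE0 + i);
        glBindTexture(GL_TEXTURE_3D, tex[first + i]);
    }
    glUniform1i(tex_count_loc, count);    /* shader loops over 'count' samplers */
    draw_fullscreen_quad();
}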

I would suggest taking a look at 3D textures: you can upload as many layers as you want, but be warned that filtering will bleed between layers. If the Mac has support for texture arrays, use them; they don’t filter between layers and they just ROCK, plain and simple. Love using them.
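For reference, creating one looks roughly like this, assuming the driver exposes EXT_texture_array (width, height, layers, and pixels are placeholders):

/* Unlike slices of a 3D texture, the layers of a 2D array texture are
 * filtered independently. */
GLuint array_tex;
glGenTextures(1, &array_tex);
glBindTexture(GL_TEXTURE_2D_ARRAY_EXT, array_tex);
glTexParameteri(GL_TEXTURE_2D_ARRAY_EXT, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D_ARRAY_EXT, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
/* All layers go up in a single glTexImage3D call. */
glTexImage3D(GL_TEXTURE_2D_ARRAY_EXT, 0, GL_RGBA8,
             width, height, layers, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

On the GLSL side the texture is declared as a sampler2DArray (via the matching GLSL extension), and the third texture coordinate selects a layer with no interpolation between layers.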
