View Full Version : About number of texture image units



cvision
12-03-2008, 05:23 AM
Hi everyone,

I'm running Leopard 10.5.5. My application needs to use as many texture image units as possible. However, according to Apple's capabilities page (http://developer.apple.com/graphicsimaging/opengl/capabilities/index.html), the number of texture image units (MAX_TEXTURE_IMAGE_UNITS_ARB) appears to be hard-coded to 16. Is there any way to overcome this limitation? (I know for sure that my graphics card supports 32 texture image units.)

Bests.

overlay
12-03-2008, 07:51 AM
In the link you sent, there is:
"
Notes

* This data describes functionality only. Actual rendering results may differ across renderers with identical reported capabilities; always verify your results on the real hardware.
"

So, what do you have for real?

Also, you'd better check for GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS_ARB instead of GL_MAX_TEXTURE_IMAGE_UNITS_ARB.



GLint value;
glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS_ARB, &value);

cvision
12-03-2008, 03:39 PM
Hi,

I checked GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS_ARB, GL_MAX_TEXTURE_IMAGE_UNITS, and GL_MAX_TEXTURE_IMAGE_UNITS_ARB; all three return the same value: 16. I'm using a GeForce 8800 GT 512MB. According to this page (http://www.gpureview.com/GeForce-8800-GT-card-544.html), the card is listed with 56 (64) texture units. I installed Gentoo Linux with the NVIDIA 177.80 driver on the same machine, ran the same OpenGL code, and got 32. This seems strange to me, since Mac OS is supposed to support OpenGL from the inside out (if I'm not wrong). Do you have any suggestions?

Bests.

OneSadCookie
12-03-2008, 05:02 PM
http://bugreport.apple.com/

Rob Barris
12-03-2008, 11:40 PM
We ran into this too. The advertised limit is what you can get to with fixed function texturing. I think you can get to more if you are all-shader.

overlay
12-04-2008, 12:11 AM
Rob, you are talking about GL_MAX_TEXTURE_UNITS which is fixed function only.

cvision is talking about GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS and GL_MAX_TEXTURE_IMAGE_UNITS which are all-shader.

arekkusu
12-04-2008, 12:34 AM
The current driver limit really is 16, although the hardware can support 32. Please file a bug, like OneSadCookie suggested.

As for image units past 32, I don't know about gpureview.com, but how do you propose binding an image? There are no GL enums defined past GL_TEXTURE31, so there is no way to bind a texture or set a sampler uniform to a unit above 31.

Also, if you don't mind my asking-- what are you doing that needs more than 16 units?

cvision
12-05-2008, 07:19 AM
I need to visualize multiple volumes, so a larger number of texture units would help a lot. Maybe I should wait a couple of months and then try again on the Mac.

arekkusu
12-05-2008, 01:34 PM
Do you mind expanding on that a little bit?

Traditional volume visualization is usually done with 3D textures. Do you actually have more than 16 textures you need to sample from for a single fragment?

cvision
12-08-2008, 06:47 AM
Yeah, I think you're right. However, there are cases where you implement your own shading model; in my case, the number of textures used to produce the light map is higher than 16. That's why I need full support for the card's texture units. I'm currently running the framework on Linux, where it needs more than 16 texture units for this task (maybe I should modify it to be more efficient with the number of texture units it requires).

arekkusu
12-08-2008, 11:05 AM
So, what happens when your algorithm expands and you need 100 textures? At some point you need to decompose your rendering into multiple passes. Once you've done that, you can run on hardware with 32, 16, or even 2 units.

Mars_999
01-31-2009, 12:41 AM
I would suggest taking a look at 3D textures: you can upload as many layers as you want, but be warned that filtering will bleed between layers. If the Mac has support for texture arrays, use them; they don't filter between layers and just ROCK, plain and simple. Love using them.
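For reference, texture arrays come from the EXT_texture_array extension (check the extension string before relying on it). A rough allocation sketch, assuming a current GL context and that width, height, layers, and pixels are set up elsewhere:

```c
/* Sketch only: requires a GL context and EXT_texture_array support. */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D_ARRAY_EXT, tex);
/* The depth argument is the layer count; unlike a 3D texture,
   filtering never blends across layers. */
glTexImage3D(GL_TEXTURE_2D_ARRAY_EXT, 0, GL_RGBA8,
             width, height, layers, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);
glTexParameteri(GL_TEXTURE_2D_ARRAY_EXT, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D_ARRAY_EXT, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
```

In the shader the array is bound to a single image unit and sampled as a sampler2DArray, with the layer selected by the third texture coordinate, which is how one texture unit can serve many volumes.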