It stores color from one of the u_texDiffuse textures into gl_FragColor, depending on a mask texture (each u_texMask texture contains 1.0 in one channel and 0.0 in all others). But the actual result is very strange: it looks like some of the diffuse and mask textures are swapped.
Here is a ShaderDesigner project that demonstrates the problem (you can easily see which textures are bound to which units): ShaderDesigner project (22 KB)
Originally posted by execom_rt:
[b]
Also, I think that there are only 4 texture units on the GeForce FX (only ATI has 6 texture units, and even 8 for the FireGL series).
And it seems that you are trying to access 5 texture units, which is more than supported.[/b]
You are right, the FX has 4 texture units, but it has 8 texture image units, so from within a shader the developer can access up to 8 textures (but will only have 4 texture interpolators).
GeForce FX and GeForce 6 series GPUs have 8 texture coordinates and 16 image units. So you can fetch from up to 16 unique textures in a fragment shader/program.
grisha, are you setting the samplers to the correct texture units using glUniform1iARB?
Originally posted by jra101:
grisha, are you setting the samplers to the correct texture units using glUniform1iARB?
The sampler uniforms are in arrays, so I’m using glUniform1ivARB to set the entire array with one call:
Originally posted by jra101: GeForce FX and GeForce 6 series GPUs have 8 texture coordinates and 16 image units. So you can fetch from up to 16 unique textures in a fragment shader/program.
From reading this, it seems like I can do the following on my GFFX:
glCurrectActiveTexture( GL_TEXTURE6 );
glBindTexture( … )
Originally posted by jra101:
[b]GeForce FX and GeForce 6 series GPUs have 8 texture coordinates and 16 image units. So you can fetch from up to 16 unique textures in a fragment shader/program.
grisha, are you setting the samplers to the correct texture units using glUniform1iARB?[/b]
I was almost right. Well, I tried this shader on my machine (Wildcat Realizm 100) and I get a mesh with many numbers mapped (1,2,3,4,5,1,2,3,4,5, etc). Here is a screenshot to see if it is the desired effect: http://www.typhoonlabs.com/~ffelagund/2.png (I think so)
It is strange, the only reason for this message is that the Shader Designer is getting the OpenGL MS 1.1 implementation or is unable to create the OpenGL context. Perhaps there is something wrong in your drivers, but if you have MSN we can trace the error (if you want, send your msn mail to jacobo.rodriguez ‘at’ typhoonlabs.com)
Originally posted by Ffelagund: here is a screenshot to see if it is the desired effect: http://www.typhoonlabs.com/~ffelagund/2.png (I think so)
Yes, this is the correct image.
Originally posted by Ffelagund: It is strange, the only reason for this message is that the Shader Designer is getting the OpenGL MS 1.1 implementation or is unable to create the OpenGL context. Perhaps there is something wrong in your drivers, but if you have MSN we can trace the error (if you want, send your msn mail to jacobo.rodriguez ‘at’ typhoonlabs.com)
Shader Designer doesn’t appear to be parsing the version string properly to handle version 2.0.
When I’ve seen this problem before, the software was checking only the digit after the decimal point and doing the wrong thing.
Originally posted by esw:
[b][QUOTE]Originally posted by Ffelagund: It is strange, the only reason for this message is that the Shader Designer is getting the OpenGL MS 1.1 implementation or is unable to create the OpenGL context. Perhaps there is something wrong in your drivers, but if you have MSN we can trace the error (if you want, send your msn mail to jacobo.rodriguez ‘at’ typhoonlabs.com)[/QUOTE]
Shader Designer doesn’t appear to be parsing the version string properly to handle version 2.0.
When I’ve seen the problem before, the software was just checking the digit after the decimal and doing the wrong thing.[/b]
Shader Designer doesn’t check the OpenGL version at all; it checks the needed extensions individually. If it reports that multitexture is not supported, it is because it can’t find that extension in the extensions string. So either multitexture really is missing from the extensions string, or it is getting the MS implementation, or it is unable to create a context. In the last case, an error other than “no multitexture support” should be shown. With our GF6800, Shader Designer works correctly. In fact, Shader Designer uses GLEW to access extensions; maybe GLEW does not handle OpenGL 2.0 correctly. I’ll check that.
Originally posted by jra101:
Yes, you can do that (function is called glActiveTexture though).
By unique I meant you could potentially bind up to 16 totally different textures if you wanted to.
Wow, that opens up a lot of interesting possibilities. I assumed that since I’ve only got 4 texture units I can only bind 4 textures at a time. Hmmm, I’m still not 100% sure how it works, so now I can bind 16 textures and draw them in one pass! Surely something else is going on behind the scenes.
[QUOTE](function is called glActiveTexture though).[/QUOTE]
I’ve been using my own GL wrapper for so long, I’ve forgotten a lot of the original GL syntax.
[QUOTE]I assumed since I’ve only got 4 texture units I can only bind 4 textures at a time[/QUOTE]
If you are referring to the value returned by querying GL_MAX_TEXTURE_UNITS on GeForce FX and GeForce 6 series GPUs, that only applies to fixed function rendering.
When using fragment programs or fragment shaders, you can bind up to GL_MAX_TEXTURE_IMAGE_UNITS_ARB textures which is 16 on GeForce FX and GeForce 6 GPUs.
Originally posted by grisha: Isn’t it strange - changing single constant changes the way textures are sampled?
If you use 5.0 in both operands, the compiler optimizes the code, so it is not just a “single constant change”: the assembly code generated for the shaders is different too. But this still doesn’t explain why setting the uniform array values all at once doesn’t work…
This did turn out to be a bug in our drivers when using glUniform1ivARB to set an array of samplers. The bug has been fixed internally and the fix will be available in a future driver release.
The workaround is to do something like this:
GLint loc = glGetUniformLocationARB(programObject, "u_texMask");
glUniform1iARB(loc, 0);      /* mask sampler 0 on unit 0 */
glUniform1iARB(loc + 1, 1);  /* mask sampler 1 on unit 1 */

loc = glGetUniformLocationARB(programObject, "u_texDiffuse");
for (int i = 0; i < 6; i++)
    glUniform1iARB(loc + i, i + 2);  /* diffuse samplers on units 2..7 */