I seem to have come across an extremely odd problem in GLSL. I’ve managed to simplify the problem to multiplying 2 textures together. Here are some screenshots to help explain (note the gl_FragColor line):
Both texture1 and texture2 are fine by themselves, but multiplying them together, they go completely transparent (and it's nothing to do with the alpha component, as the original error occurred when doing texture1.rgb * texture2.rgb, with the alpha component set to 1.0).
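A minimal fragment shader reproducing the setup described above might look like the following. The sampler names and texture coordinates are assumptions for illustration; the original shader isn't shown here.

```glsl
// GLSL 1.10-era fragment shader, matching the gl_FragColor style
// used in this thread. Sampler names are illustrative.
uniform sampler2D texture1;
uniform sampler2D texture2;

void main()
{
    vec4 t1 = texture2D(texture1, gl_TexCoord[0].st);
    vec4 t2 = texture2D(texture2, gl_TexCoord[0].st);
    // Multiply the two colours; alpha forced to 1.0 as described above.
    gl_FragColor = vec4(t1.rgb * t2.rgb, 1.0);
}
```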
Does anyone have any ideas why this is occurring? I'm running a GF6800GT, driver version 91.47 (although I've seen it on earlier versions), Windows 2000.
I originally asked the question over at GameDev, tried a few things, but nothing seemed to work.
I actually had an issue with a Quadro FX4500 where a simple ARBfp program (texture fetch and output, maybe one other instruction) was somehow enabling what could only be described as alpha testing, despite explicitly disabling anything that could result in a fragment being killed immediately prior to the draw call. Setting alpha output to any value greater than zero in the fragment program caused the geometry to appear; anything else caused nothing to be rendered. This seemed to involve a particular texture/texture unit fetch, as it worked when sampling from a different texture unit. I don't remember the driver number, but it was pretty recent, maybe the same as elFarto's.
I tried Korval's suggestion, but to no avail. It's definitely something to do with texture1 (which comes from the cubemap), as any operation (-, +, / or *) that involves texture1 causes the problem. The odd thing is, texture1 looks fine by itself.
The outline in the final screenshot is from a different render call.
Here is the assembler output from the shader in the final screenshot, dumped using NVemulate:
Originally posted by Komat: Do you set correct values to the sampler uniforms (they are set to zero initially)?
Yes, the texture shows fine by itself (see the first image in my initial post); it's a bit hard to see though.
Originally posted by elFarto: Yes, the texture shows fine by itself (see the first image in my initial post), it's a bit hard to see though.
Since all samplers are initially set to zero, and one texture unit can simultaneously have several textures of different types (2D, cube, 3D, …) bound to it, the shader might work with one texture even if the samplers were not set correctly. However, it is an error if one texture unit is addressed through different texture types (2D and cube) from a single shader, and the offending rendering will fail with an INVALID_OPERATION error.
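To illustrate Komat's point: the following shader is legal GLSL on its own, but if both sampler uniforms are left at their default value of 0, both address texture unit 0 through different targets, which is exactly the error case described. The sampler names are assumptions based on this thread.

```glsl
// Two samplers of different types in one shader. Legal by itself,
// but if both uniforms keep their default value 0, both address
// texture unit 0 with different targets (CUBE and 2D), and the draw
// call fails with GL_INVALID_OPERATION. The host code must assign
// distinct units, e.g.:
//   glUniform1i(glGetUniformLocation(prog, "texture1"), 0);
//   glUniform1i(glGetUniformLocation(prog, "texture2"), 1);
uniform samplerCube texture1;   // names assumed from the thread
uniform sampler2D   texture2;

void main()
{
    vec4 t1 = textureCube(texture1, gl_TexCoord[0].stp);
    vec4 t2 = texture2D(texture2, gl_TexCoord[0].st);
    gl_FragColor = vec4(t1.rgb * t2.rgb, 1.0);
}
```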
Ok, thanks for all your help. I have discovered the problem, and it’s completely unrelated to the texture unit.
The program I'm working on is to display the models from a game (EVE Online). To do this, it extracts the shader information from the game's files and converts it from DirectX texture stage operations into a GLSL shader.
The shader I posted above isn't the shader it tries to compile initially, as my code was generating an invalid shader (something like foo = vec4(1.0, 1.0)). I had stripped out all the lines I thought were irrelevant. After I fixed my application, the problem went away.
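For reference, this is the kind of constructor error described: a GLSL vec4 must be built from exactly four components (or a single scalar that is replicated), so a two-argument constructor like the generated one fails to compile.

```glsl
vec4 foo = vec4(1.0, 1.0);           // error: too few components for vec4
vec4 bar = vec4(1.0, 1.0, 1.0, 1.0); // ok: four components
vec4 baz = vec4(1.0);                // ok: scalar replicated to all four
```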
Thanks for all your help, even if the problem was nothing to do with OpenGL/drivers.