So slowly but surely I'm running out of ideas. I developed an application to view DTI datasets using 3D textures. Since I developed on my 8800 GTX, I took certain things for granted which later turned out to be NVIDIA- or at least DX10-specific. Now I'm trying to get my application running on ATI hardware again. This time it's a laptop with a Mobility Radeon HD 3470, which I think is fairly recent.
I narrowed the problem down to this line:
col1 = texture3D(tex, gl_TexCoord[0].xyz).rgb;
The shader compiler complained here that the dot operator isn't available for array access, so I changed it.
That version compiles without warning, but now the application simply crashes hard. I'm running out of ideas. What am I doing wrong here, or is it simply that texture3D doesn't work on these cards? Even that would help me: I could stop trying and tell my customers to get an NVIDIA card.
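For reference, here is roughly what I ended up with. This is a hypothetical reconstruction of the fragment shader, since the exact line I changed it to isn't shown above; the idea is to copy the built-in array element into a local vec3 first, so no swizzle is ever applied directly to gl_TexCoord[0]:

```glsl
// Fragment shader sketch (GLSL 1.10 style). 'tex' holds the DTI dataset.
uniform sampler3D tex;

void main()
{
    // Construct the coordinate via vec3() instead of swizzling the
    // array element, which the ATI compiler rejected.
    vec3 tc = vec3(gl_TexCoord[0]);

    vec3 col1 = texture3D(tex, tc).rgb;

    gl_FragColor = vec4(col1, 1.0);
}
```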
Perhaps it is a precision qualifier problem. I had problems with that in the past: no compiler warnings, but hard crashes. See this topic for more details:
OK, now every variable has a precision qualifier, which they shouldn't need, since as brolingstanz pointed out in that other thread it's a compatibility thing for ES. But I'm at the point where I try everything… and it still crashes.
Did you also use precision qualifiers in the vertex shader for the out variables you use as in variables in the fragment shader? (That was the problem in my case.)
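To be concrete, what I mean is that the qualifier has to match on both sides of the vertex/fragment interface. A sketch of the pattern (the variable name is made up for illustration):

```glsl
// --- Vertex shader ---
// Declare the varying with an explicit precision qualifier...
varying highp vec3 volumeCoord;

// --- Fragment shader ---
// ...and declare it with the SAME qualifier here. In my case a
// mismatch produced no compiler warning, just a hard crash at runtime.
varying highp vec3 volumeCoord;
```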
It's just a hunch; otherwise I have no idea what could cause the problem.
I meant warnings, but only because I had similar problems when trying to get an application to work on ATI. In that case it was something as simple as using texture1D instead of tex1D. I don't even think the compiler warnings showed this, but I was just curious.
I've not had a problem with texture3D on ATI, and I use it extensively in some fairly complex shaders. Most of mine have a vec3() inside them, wrapped around various sets of values to generate the texcoords.
So just some other suggestions…
Try wrapping a vec3() around your texcoord.xyz, or creating it as a vec3 up front and passing it in without the .xyz.
Also try taking a vec4 or vec3 result from the lookup without the .rgb swizzle.
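Both suggestions combined would look something like this (a sketch, with a placeholder sampler name, not your actual shader):

```glsl
uniform sampler3D tex;

void main()
{
    // Suggestion 1: build the coordinate explicitly with vec3()
    // rather than swizzling gl_TexCoord[0] directly.
    vec3 tc = vec3(gl_TexCoord[0]);

    // Suggestion 2: keep the full vec4 result -- no .rgb tacked
    // straight onto the texture3D() call.
    vec4 texel = texture3D(tex, tc);

    gl_FragColor = vec4(texel.rgb, 1.0);
}
```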
If that does not work, then perhaps try downloading Gremedy's gDEBugger (there's a 7-day trial) and see what it has to say about your shaders and textures.