View Full Version : casting types in shaders

02-10-2014, 10:47 AM

I am running into some problems doing type conversion in a compute shader (GLSL 4.3).

There seems to be no way to do implicit conversions as in C++.
For example, in C++:
unsigned int a=134217728; // a is 134217728
float b=a; // b is 1.3421773e+008

unsigned int a=1; // a is 1
float b=a; // b is 1.0000000

whereas inside the compute shader, if I have uint a = 1; and I try to convert it into a float using uintBitsToFloat(a), I get 1.401e-045#DEN, which is a weird representation. What does it mean? I would have expected something like 1.0000000.
If I instead try a simple C-style cast, as in (float)mynumber, and then verify it with (float)mynumber > 0.0f, I never enter the loop. So I am not sure the (float) cast is even working.

What's the best way to cast within shaders?


02-10-2014, 12:39 PM
GLSL does not allow C-style casts; constructors are used for conversion instead.
For example: instead of (float)a, write float(a).
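A minimal compute-shader sketch of the constructor syntax (the buffer layout here is just an assumed example, not from the original post):

```glsl
#version 430
layout(local_size_x = 1) in;

layout(std430, binding = 0) buffer Result { float result; };

void main() {
    uint a = 1u;

    float b = float(a);           // value conversion: b == 1.0
    float c = uintBitsToFloat(a); // bit reinterpretation: c is a denormal, ~1.4e-45

    if (b > 0.0) {
        result = b;               // this branch is taken: b is 1.0
    }
}
```

The same constructor syntax works for the other scalar and vector types, e.g. int(x), uint(x), vec3(v).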