NVIDIA's drivers have traditionally displayed a warning when an int literal is used where a float is called for. However, I've seen at least one newer driver (NVIDIA's driver for Mac OS X 10.6) that now flags these as errors rather than warnings. For example:
float x = 5; // Error, but used to be a warning
float y = 5.0; // OK
I'm having trouble figuring out why this is an error. The GLSL spec states pretty clearly that an implementation isn't even required to actually support ints. It also states that implicit conversion between types is done when necessary, and specifically cites conversion from int to float (section 4.1.10).
So my real question is: does anyone know an easy way to disable this error, or an easy way to convert hundreds of int literals across dozens of files to float literals?
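In case it helps frame answers: one approach I've considered is a mechanical rewrite with a script. Below is a naive sketch (my own, not tested against real shaders) that appends `.0` to bare integer literals appearing in simple `float` declarations. The regex and the `fix_line` helper are my inventions; it deliberately ignores harder cases like vector constructors, array indices, loop counters, and legitimate `int` declarations, so it would need extension before being run over a whole codebase.

```python
import re

# Naive sketch: match "float <name> = <int literal>;" and append ".0"
# to the literal. Lines that are already floats, or that declare ints,
# are left untouched. This does NOT handle vec2/vec3 constructors,
# function arguments, or expressions -- extend the pattern as needed.
FLOAT_DECL = re.compile(r'\b(float\s+\w+\s*=\s*)(\d+)(\s*;)')

def fix_line(line: str) -> str:
    """Rewrite simple float declarations with int literal initializers."""
    return FLOAT_DECL.sub(r'\1\2.0\3', line)

print(fix_line("float x = 5;"))    # -> float x = 5.0;
print(fix_line("float y = 5.0;"))  # unchanged
print(fix_line("int i = 5;"))      # unchanged
```

Run over files, this could be wrapped in a loop that reads each shader, applies `fix_line` per line, and writes the result back, but I'd want it reviewed before trusting it on hundreds of literals.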