Bitwise operations on GPUs



dimensionX
02-08-2005, 09:13 AM
Has anyone tried to emulate bitwise operations on the GPU?

I know that GLSL and Cg do not support bitwise operations, so how can I get around this problem?

Can we use the preprocessor, since preprocessor expressions do allow bitwise operators? Has anyone tried this?

Thanks!

daydreamer
02-10-2005, 03:14 PM
I'm sure I saw an implementation of John Conway's Game of Life running in Cg or OGSL.

But if you want to implement bitwise operations, you could use the following:

AND  = v1 * v2
OR   = min(v1 + v2, 1)
XOR  = v1*(1 - v2) + v2*(1 - v1)

NOT  = 1 - v1
NAND = 1 - v1*v2
NOR  = 1 - min(v1 + v2, 1)
XNOR = 1 - (v1*(1 - v2) + v2*(1 - v1))

Assuming v1 and v2 are exactly 0.0 or 1.0.
But it's a bit wasteful using 8 bits to represent 1 bit.
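
As a minimal GLSL sketch of those formulas (the function names bitAnd/bitOr/bitXor/bitNot and the maskA/maskB textures are just illustrative, and the inputs are assumed to hold exactly 0.0 or 1.0):

// Emulated 1-bit logic ops; a and b must be exactly 0.0 or 1.0.
float bitAnd(float a, float b) { return a * b; }
float bitOr (float a, float b) { return min(a + b, 1.0); }
float bitXor(float a, float b) { return a * (1.0 - b) + b * (1.0 - a); }
float bitNot(float a)          { return 1.0 - a; }

// Example fragment shader: XOR two single-channel "bit mask" textures.
uniform sampler2D maskA;   // hypothetical inputs storing 0.0 or 1.0 per texel
uniform sampler2D maskB;
varying vec2 texCoord;

void main()
{
    float a = texture2D(maskA, texCoord).r;
    float b = texture2D(maskB, texCoord).r;
    gl_FragColor = vec4(bitXor(a, b));   // 1.0 where exactly one input bit is set
}

Each channel still carries only one emulated bit this way; packing several bits into one channel would need extra arithmetic (divides and floor()) to extract them.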