Sorry for the naive newbie question, but is it possible to get an array of data into a GLSL vertex shader in some form other than as a sampler2D/image? I ask because I'd like to move vertices around using luminance information from an input image, but I understand that sampler2D is only supported in the vertex shader on certain hardware (i.e. hardware with vertex texture fetch support).
I'm therefore wondering if it would be possible to create an array of some description from, say, the pixels of a single line of an image, and then somehow pass that into the vertex shader.
Ultimately, I'd like to generate a line of vertices displaced on, say, the Y-axis by the luminance of one row of the input image, and textured with that same row. I'd then iterate over the same shaders for each row of the image.
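To make it a bit more concrete, here's a rough sketch of the kind of vertex shader I'm imagining (untested, old-style GLSL 1.20; the `luminance` uniform, its size, and the `vertexIndex` attribute are just names and numbers I've made up — the array would presumably be uploaded per row with something like `glUniform1fv`, and its maximum size is bounded by `GL_MAX_VERTEX_UNIFORM_COMPONENTS`):

```glsl
// One row of precomputed luminance values, uploaded from the CPU.
// 256 is an arbitrary row width for this sketch.
uniform float luminance[256];

// Which pixel of the row this vertex corresponds to.
attribute float vertexIndex;

void main()
{
    vec4 pos = gl_Vertex;

    // Displace the vertex along Y by the luminance of its pixel.
    pos.y += luminance[int(vertexIndex)];

    gl_Position = gl_ModelViewProjectionMatrix * pos;

    // Pass the texcoords through so the line can be textured
    // with the same row of the image in the fragment shader.
    gl_TexCoord[0] = gl_MultiTexCoord0;
}
```

Does that look like a sane approach, or am I off in the wrong direction entirely?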
This is all very sketchy at the moment, and I have almost no experience with GLSL or OpenGL. I'm really just testing the water, but I'm wondering if anyone with more experience could tell me whether my plan is feasible, and give me some hints on how I might go about achieving the effect I'm after.
Cheers guys,
alx