I am working on a C++/OpenSceneGraph/GLSL integration and I need to handle a dynamic array in a shader.
My dynamic array of vec3 was converted into a 1D texture so it can be passed to the fragment shader as a uniform (I'm using GLSL 1.3), as follows:
osg::ref_ptr<osg::Image> image = new osg::Image;
// myVec3Array is an osg::ref_ptr<osg::Vec3Array>; three floats per texel
image->setImage(myVec3Array->size(), 1, 1, GL_RGBA8, GL_RGB, GL_FLOAT,
                (unsigned char*)&(*myVec3Array)[0], osg::Image::NO_DELETE);
// Wrap the image in a texture and pass it to GLSL as a sampler uniform
osg::ref_ptr<osg::Texture2D> texture = new osg::Texture2D(image);
osg::StateSet* ss = scene->getOrCreateStateSet();
ss->setTextureAttributeAndModes(0, texture.get());
ss->addUniform(new osg::Uniform("vertexMap", 0)); // sampler reads texture unit 0
For now, I would like to retrieve my raw array of vec3 in the fragment shader. How can I do this? Does the texture2D() function only return normalized values?
If you want to store floating-point values directly, the texture needs to have a floating-point internal format (e.g. GL_RGB32F). Floating-point textures require OpenGL 3.0 (which you should already have if you’re using GLSL 1.3) or the ARB_texture_float extension.
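Concretely, on the OSG side that means changing the internal format in the setImage() call from the question (a sketch; in older GL headers the enum may be spelled GL_RGB32F_ARB). Nearest filtering is also worth setting so that lookups never interpolate between adjacent array entries:

```cpp
// Same call as in the question, but with a floating-point internal format
image->setImage(myVec3Array->size(), 1, 1,
                GL_RGB32F,          // store the floats unmodified
                GL_RGB, GL_FLOAT,   // three floats per texel
                (unsigned char*)&(*myVec3Array)[0],
                osg::Image::NO_DELETE);

// Disable filtering: each texel is one array element
texture->setFilter(osg::Texture::MIN_FILTER, osg::Texture::NEAREST);
texture->setFilter(osg::Texture::MAG_FILTER, osg::Texture::NEAREST);
```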
If you don’t need the full range and precision of IEEE-754 single-precision floats, you can use half-precision floats (e.g. GL_RGB16F), which have a 5-bit exponent and 10-bit significand.
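To see what that 5/10 bit split means for precision, here is a minimal sketch of the normal-range float-to-half conversion (hypothetical helper names; NaN, infinity, subnormals, and rounding are deliberately ignored). The point is that the relative round-trip error is bounded by 2^-10, i.e. roughly three decimal digits:

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>
#include <cstring>

// Pack a float into IEEE-754 half layout: 1 sign, 5 exponent, 10 mantissa bits.
// Simplified: valid only for values in the normal half-precision range.
uint16_t floatToHalf(float f) {
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof bits);
    uint16_t sign = (bits >> 16) & 0x8000u;
    int32_t  exp  = int32_t((bits >> 23) & 0xFFu) - 127 + 15; // rebias 8->5 bit exponent
    uint16_t mant = (bits >> 13) & 0x3FFu;                    // keep top 10 mantissa bits
    return sign | uint16_t(exp << 10) | mant;
}

float halfToFloat(uint16_t h) {
    uint32_t sign = (uint32_t(h) & 0x8000u) << 16;
    int32_t  exp  = int32_t((h >> 10) & 0x1Fu) - 15 + 127;    // rebias 5->8 bit exponent
    uint32_t mant = (uint32_t(h) & 0x3FFu) << 13;
    uint32_t bits = sign | (uint32_t(exp) << 23) | mant;
    float f;
    std::memcpy(&f, &bits, sizeof f);
    return f;
}
```

Values whose mantissa fits in 10 bits (such as -2.5) survive the round trip exactly; everything else loses at most one part in 1024.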
If you want constant absolute error (rather than relative error), then you can use a normalised format or an integer format and perform the scaling in the fragment shader.
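The scaling in that last option is just an affine map between your data range and [0, 1]. A sketch, assuming a known range (kMin, kMax are placeholders for whatever bounds your data has, passed to the shader as uniforms); the CPU side quantizes exactly the way GL does for a 16-bit normalized format, and the decode line is what the fragment shader would do with the value returned by texture():

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

// Assumed data range; in practice you would upload these as uniforms.
const float kMin = -10.0f, kMax = 10.0f;

// CPU side: map v into [0,1] and quantize to 16-bit normalized
// (equivalent to uploading to a GL_RGB16 texture).
uint16_t encode(float v) {
    float t = (v - kMin) / (kMax - kMin);
    return uint16_t(std::lround(t * 65535.0f));
}

// Shader side: the sampler returns t in [0,1]; undo the scaling.
float decode(uint16_t q) {
    float t = float(q) / 65535.0f;
    return kMin + t * (kMax - kMin);
}
```

The absolute error is at most half a quantization step, (kMax - kMin) / (2 * 65535), regardless of where in the range the value falls; that is what "constant absolute error" buys you over a float format.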
I can handle the floating-point textures on the C++/OSG side, but I still have a question: how can I access these floating-point values on the shader side? Could you give me an example?
I’m facing a texture-coordinate problem. I passed the floating-point matrix of vec3 to the shader as a texture, but I would like to retrieve those values there. Could you help me?
If you’re using a texture as an array, you should probably use texelFetch() to retrieve values. That avoids filtering and the need to convert indices to normalised texture coordinates.
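A minimal fragment shader sketch (texelFetch() is available from GLSL 1.30; the uniform name vertexMap matches the question, the index is a made-up example):

```glsl
#version 130

uniform sampler2D vertexMap;  // the 1-row GL_RGB32F texture holding the vec3 array

void main()
{
    int index = 5;  // hypothetical array index to look up
    // texelFetch takes integer texel coordinates plus a mipmap level:
    // no filtering, no [0,1] coordinate conversion, and with a float
    // internal format no normalisation -- you get the stored vec3 back.
    vec3 value = texelFetch(vertexMap, ivec2(index, 0), 0).rgb;
    gl_FragColor = vec4(value, 1.0);
}
```

With a normalized internal format such as GL_RGBA8 the same lookup would return values clamped and scaled to [0, 1], which is why the float internal format matters here.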