Transform Feedback

Hi,

I am trying to use Transform Feedback with GL_VERSION 3.2.0 NVIDIA 195.36.31. When I try to load a function pointer, for example:

glGenTransformFeedbacks = (PFNGLGENTRANSFORMFEEDBACKSPROC) glXGetProcAddress ((const GLubyte*)"glGenTransformFeedbacks");

I get a null pointer back in the glGenTransformFeedbacks variable.
What am I doing wrong, or what did I forget to do?

For transform feedback objects, you need either OpenGL 4.0 or GL_ARB_transform_feedback2 support. 195.36.31 is a very old driver: it only exposes GL 3.2 and predates the transform feedback object specification. Try updating your driver to a more recent version, such as the 306 series.

Okay, another question.
How can I find out whether my GTS 250 supports this feature or not?

Only NVIDIA’s DX11-class hardware supports ARB_transform_feedback2, so you can’t use feedback objects. You can check these sorts of things with the OpenGL Extensions Viewer.

Well, you can install a newer driver and check for the extension, or you can use a utility like GLview (OpenGL Extensions Viewer | realtech VR) and look your card up in its database (spoiler: I checked the GTS 250, and GL_ARB_transform_feedback2 is not there).
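
If you prefer checking in code, here is a minimal sketch, assuming a GL 3.0+ context and that glGetStringi has been loaded the same way as your other entry points (strcmp comes from <string.h>):

GLint numExtensions = 0;
glGetIntegerv(GL_NUM_EXTENSIONS, &numExtensions);

GLboolean hasTF2 = GL_FALSE;
for (GLint i = 0; i < numExtensions; ++i) {
	// glGetStringi is the core GL 3.0+ way to enumerate extensions
	const char *ext = (const char *) glGetStringi(GL_EXTENSIONS, i);
	if (strcmp(ext, "GL_ARB_transform_feedback2") == 0) {
		hasTF2 = GL_TRUE;
		break;
	}
}

If hasTF2 stays GL_FALSE, you can expect the glGenTransformFeedbacks lookup to fail exactly as it did above.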

Thanks, I installed a new driver and it works, as far as I can see. But now I have another question.
How do I divide the output stream from my shader into multiple buffers? I want the transform feedback feature to write the vec3 vVerteces variables to one VBO and the vec3 vNormals variables to another.

I tried to interleave my two VBOs in this form: [vertex normal vertex normal …], but

glVertexPointer (3,GL_FLOAT,3*sizeof(GLfloat),0);
glDrawArrays(GL_LINE_STRIP, 0,(m_Dimension + 1)*(m_Dimension - 1)*2);

uses the normals as vertices too. Is this code wrong, do I misunderstand what the 3rd parameter of glVertexPointer does, or is my mistake somewhere else?

[QUOTE=ein_shved;1245080]Thanks, I installed a new driver and it works, as far as I can see. But now I have another question.
How do I divide the output stream from my shader into multiple buffers? I want the transform feedback feature to write the vec3 vVerteces variables to one VBO and the vec3 vNormals variables to another.

I tried to interleave my two VBOs in this form: [vertex normal vertex normal …] but … uses the normals as vertices too. Is this code wrong, do I misunderstand what the 3rd parameter of glVertexPointer does…[/QUOTE]

Yeah, the 3rd parameter (stride) is for exactly this case. It tells the driver/GPU the distance in bytes from the start of one vertex’s data to the start of the next. If you pass 0, it assumes there isn’t anything in between values for this attribute, and that they’re tightly packed one right after another.

If you’re interleaving vec3 positions and vec3 normals, one full record is six floats, so change the stride to 6*sizeof(GLfloat). Your 3*sizeof(GLfloat) is just the size of the position itself, which is the same as tightly packed, so every normal gets read as the next position.
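
For completeness, here’s a minimal sketch of the interleaved draw setup, assuming a single VBO laid out as [vertex normal vertex normal …] (vbo and vertexCount are placeholder names):

GLsizei stride = 6 * sizeof(GLfloat); // one record = vec3 position + vec3 normal

glBindBuffer(GL_ARRAY_BUFFER, vbo);
glVertexPointer(3, GL_FLOAT, stride, (const GLvoid *) 0);                  // positions at byte 0 of each record
glNormalPointer(GL_FLOAT, stride, (const GLvoid *) (3 * sizeof(GLfloat))); // normals 3 floats in
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY);
glDrawArrays(GL_LINE_STRIP, 0, vertexCount);

And if you’d rather keep the two separate VBOs you originally asked about, transform feedback can also record each output varying into its own buffer by using GL_SEPARATE_ATTRIBS instead of GL_INTERLEAVED_ATTRIBS. A sketch, assuming the Vertex and Normal outputs from a shader like yours (program, vboVertices and vboNormals are placeholder names):

const GLchar *varyings[] = { "Vertex", "Normal" };
glTransformFeedbackVaryings(program, 2, varyings, GL_SEPARATE_ATTRIBS);
glLinkProgram(program); // the varying setup only takes effect at link time

// one buffer per varying, bound to consecutive indexed binding points
glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 0, vboVertices);
glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 1, vboNormals);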

Ok, I already figured it out. Thanks :)

Another question. I have three 2D buffers packed into one texture:

vector<GLfloat> vBuffers (m_Dimension * m_Dimension * 3, m_Position.z); // m_Position.z != 0
m_pPerfectWater->tBindingMutex.lock();
glBindTexture(GL_TEXTURE_2D, idBuffers);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, m_Dimension,m_Dimension,
		      0, GL_RGB,GL_FLOAT, &vBuffers[0]);

Then I try to pass it to my shader program:

lcUniformPvCrNt = glGetUniformLocation(m_ShaderProgramHeights, "bPvCrNt");
if ( lcUniformPvCrNt == -1 ) {
	throw runtime_error ("Can not get location of uniform bPvCrNt");
}
//.........
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, idBuffers);
glUniform1i(lcUniformPvCrNt, 0); 

The program is:

#version 330

out vec3 Vertex;
out vec3 Normal;

layout (location = 0) in vec2 vIndex;
uniform sampler2D bPvCrNt;
uniform float xFactor;
uniform float zFactor;
uniform float omega;
uniform vec3 m_Position;

flat out ivec2 index;

/*
 * x,r - next
 * y,g - current
 * z,b - previous
 */

void calcNormal(void);
void main(void) {

	index = ivec2(vIndex);
	Vertex.y = texture (bPvCrNt, index).g;
	Vertex.x = m_Position.x + xFactor*vIndex.x;
	Vertex.z = m_Position.z + zFactor*vIndex.y;

	gl_Position = vec4(Vertex,1);
	calcNormal();
}

but every Vertex.y coordinate comes out as 0. What am I doing wrong?

Try creating your texture with an internal format of GL_RGB32F, not just GL_RGB. The unsized GL_RGB probably defaults the components to 8-bit unsigned normalized (i.e. GL_RGB8).
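
Something like this, as a sketch of the upload with a sized float internal format; the filter state matters too if you haven’t set it elsewhere, because the default MIN_FILTER expects mipmaps, and an incomplete texture also samples as zero:

glBindTexture(GL_TEXTURE_2D, idBuffers);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB32F, m_Dimension, m_Dimension,
	      0, GL_RGB, GL_FLOAT, &vBuffers[0]);
// no mipmaps are uploaded, so don’t leave the default mipmapping MIN_FILTER
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);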

Yeah, I found the problem! Thanks :)