Trouble with bit-shifted 32-bit uints and transform feedback.

Hello, I have a post going on the NVidia boards too, but I was recommended to ask here as well:

I have a transform feedback set up, and I want to store some values inside a 32-bit uint or int, so quite naturally I checked the specification to see whether GLSL supports bitwise operators. It does, since GLSL 1.30 (OpenGL 3.0), which my current NVidia card driver supports.
So I pack my 6 values into the 32-bit int using a geometry shader that streams out to the transform feedback buffer, and a few bit shift operators. Nothing funny, right? Well, apparently the card driver crashes EVERY time I try to stream bit-shifted 32-bit ints to the transform feedback buffer! How is that even possible? The only thing that should be able to happen is that I get a different value than I expected when I unpack the int; it should not crash. It is still a 32-bit int, no matter how much I bit shift it.

How am I supposed to be able to do ANYTHING if I can't trust the driver to match the specification? I would rather have a big “UNSUPPORTED” box over all these features NVidia claims to support according to the Khronos specification than keep running into these maddening situations. It is incredibly frustrating.
Now, it could be that I am just doing something stupid (it has been proven on several occasions) and I made a mistake somewhere, but I still don't see how streaming bit-shifted ints to a transform feedback buffer should be able to crash the driver?!

Here is me setting up the shader:

this->terrainShaderData.triangleStreamerShaderId = loadShaderProgram("src/shaders/triangleStreamer.vert", "src/shaders/triangleStreamer.geom", "src/shaders/terrainGenerator.frag");

glProgramParameteriEXT(this->terrainShaderData.triangleStreamerShaderId, GL_GEOMETRY_INPUT_TYPE_EXT, GL_POINTS);
glProgramParameteriEXT(this->terrainShaderData.triangleStreamerShaderId, GL_GEOMETRY_OUTPUT_TYPE_EXT, GL_POINTS);

int temp;
glGetIntegerv(GL_MAX_GEOMETRY_OUTPUT_VERTICES_EXT, &temp);
glProgramParameteriEXT(this->terrainShaderData.triangleStreamerShaderId, GL_GEOMETRY_VERTICES_OUT_EXT, temp);

glBindAttribLocation(this->terrainShaderData.triangleStreamerShaderId, 0, "voxelPosition");
glBindFragDataLocation(this->terrainShaderData.triangleStreamerShaderId, 0, "fragmentColor");
glActiveVaryingNV(this->terrainShaderData.triangleStreamerShaderId, "triangleInfo"); // called before linking, as NV_transform_feedback requires
linkShaderProgram(this->terrainShaderData.triangleStreamerShaderId);
glUseProgram( this->terrainShaderData.triangleStreamerShaderId );
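
(For reference, here is a minimal sketch of how the link status and info log could be dumped right after linkShaderProgram; nothing project-specific in it beyond the program id, and it needs <vector> and <cstdio>:)

GLint linked = GL_FALSE;
glGetProgramiv(this->terrainShaderData.triangleStreamerShaderId, GL_LINK_STATUS, &linked);

GLint logLen = 0;
glGetProgramiv(this->terrainShaderData.triangleStreamerShaderId, GL_INFO_LOG_LENGTH, &logLen);
if (logLen > 1)
{
    std::vector<char> log(logLen);
    glGetProgramInfoLog(this->terrainShaderData.triangleStreamerShaderId, logLen, NULL, log.data());
    printf("link %s: %s\n", linked == GL_TRUE ? "ok" : "FAILED", log.data());
}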

Here is me setting up the buffer:

glGenVertexArrays(1, &this->intermediateTriangleStreamVAOB);
glBindVertexArray(this->intermediateTriangleStreamVAOB);

glGenBuffers(1, &this->intermediateTriangleStreamInfos);
glBindBuffer(GL_ARRAY_BUFFER, this->intermediateTriangleStreamInfos);
glBufferData(GL_ARRAY_BUFFER, sizeof(float) * 5 * 32 * 32 * 32, NULL, GL_DYNAMIC_COPY); //GL_STATIC_DRAW
glVertexAttribPointer(0, 1, GL_INT, false, 0, 0);
glEnableVertexAttribArray(0);

//VAOB input locations
glUseProgram(shader.triangleStreamerShaderId);
int triangleStreamLoc[] =
{
    glGetVaryingLocationNV(shader.triangleStreamerShaderId, "triangleInfo"),
};

glTransformFeedbackVaryingsNV(shader.triangleStreamerShaderId, 1, triangleStreamLoc, GL_SEPARATE_ATTRIBS_NV);
this->checkGLErrors();
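
(One sanity check worth sketching here: glGetVaryingLocationNV returns -1 when the name is not an active varying in the linked program, for example if glActiveVaryingNV was called after linking or the output got optimized away, so the location could be tested before handing it to glTransformFeedbackVaryingsNV:)

// Sketch: bail out early if "triangleInfo" is not an active varying.
GLint triangleInfoLoc = glGetVaryingLocationNV(shader.triangleStreamerShaderId, "triangleInfo");
if (triangleInfoLoc == -1)
{
    printf("triangleInfo is not an active varying, transform feedback setup will not work\n");
}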

Here is me starting transform feedback into the buffer, with a primitives-written query around the draw:

glBindBufferBaseNV(GL_TRANSFORM_FEEDBACK_BUFFER_NV, 0, this->intermediateTriangleStreamInfos);

glBeginTransformFeedbackNV(feedback_prim);
glEnable(GL_RASTERIZER_DISCARD_NV);
glBeginQuery(GL_TRANSFORM_FEEDBACK_PRIMITIVES_WRITTEN_NV, query);
glBindVertexArray(this->voxelVaob);

glDrawArrays(GL_POINTS, 0, (GLsizei)this->voxelCorners.size());

glEndQuery(GL_TRANSFORM_FEEDBACK_PRIMITIVES_WRITTEN_NV);
glDisable(GL_RASTERIZER_DISCARD_NV);
glEndTransformFeedbackNV();
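
(For completeness, a minimal sketch of how the streamed values could be read back on the CPU afterwards, by fetching the query result and mapping the buffer:)

GLuint primitivesWritten = 0;
glGetQueryObjectuiv(query, GL_QUERY_RESULT, &primitivesWritten);

// Bind the feedback buffer to any readable target and map it to inspect the packed ints.
glBindBuffer(GL_ARRAY_BUFFER, this->intermediateTriangleStreamInfos);
const GLint* packedValues = (const GLint*)glMapBuffer(GL_ARRAY_BUFFER, GL_READ_ONLY);
if (packedValues != NULL)
{
    for (GLuint i = 0; i < primitivesWritten; ++i)
        printf("triangleInfo[%u] = 0x%08X\n", i, (unsigned)packedValues[i]);
    glUnmapBuffer(GL_ARRAY_BUFFER);
}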

My geometry shader has one out varying, called triangleInfo:
out int triangleInfo;

and we set it here:

triangleInfo = voxelX << 26 | voxelY << 20 | voxelZ << 12 | edge1 << 9 | edge2 << 6 | edge3;

voxelX, Y and Z are clamped to the range 0-31; edge1, edge2 and edge3 are clamped to the range 0-11.
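
(For reference, here is a CPU-side sketch of the same kind of round trip: six small values packed into one 32-bit int with shifts and ORs, then unpacked with shifts and masks. The function names and field widths are just for this sketch, 5 bits per voxel coordinate and 4 bits per edge index, so the shift amounts differ slightly from the shader line above in order to keep the six fields from overlapping:)

#include <cstdint>
#include <cstdio>

// Assumed layout for the sketch: x, y, z get 5 bits each (0-31), e1, e2, e3 get 4 bits each (0-11).
int32_t packTriangleInfo(uint32_t x, uint32_t y, uint32_t z,
                         uint32_t e1, uint32_t e2, uint32_t e3)
{
    return (int32_t)((x << 22) | (y << 17) | (z << 12) | (e1 << 8) | (e2 << 4) | e3);
}

void unpackTriangleInfo(int32_t packed)
{
    uint32_t v = (uint32_t)packed;
    printf("x=%u y=%u z=%u e1=%u e2=%u e3=%u\n",
           (v >> 22) & 0x1Fu, (v >> 17) & 0x1Fu, (v >> 12) & 0x1Fu,
           (v >> 8) & 0x0Fu, (v >> 4) & 0x0Fu, v & 0x0Fu);
}

int main()
{
    unpackTriangleInfo(packTriangleInfo(31, 0, 17, 11, 3, 7)); // prints x=31 y=0 z=17 e1=11 e2=3 e3=7
}

(As far as I can tell, the shifts in the shader line let edge1 << 9 reach into the bits of voxelZ << 12 whenever edge1 is 8 or larger, but that would only scramble the unpacked values; it still should not crash anything.)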

No funny business, right? It is just some bit shifts. (I did a sanity check bit shifting constants, and those at least seemed to stream out fine.)
Am I missing something crucial here?

I'm using a GeForce 9500 GT, OpenGL 3.3 and GLSL 330. Driver version: 8.17.12.5896.

Thanks for your time, and I hope you prove the mistake is mine, because it would be really awesome if I could indeed use bit shift operators.

Here is the URL to the thread on NVidia if you want to check it out:

http://developer.nvidia.com/forums/index.php?showtopic=5459

This trouble has made me a sad panda indeed…