View Full Version : Question about passing arbitrary data to the graphics card

CJ Clark
02-28-2006, 01:56 PM
Hello, I just joined and am posting because I'm in a bit of a bind. I have a set of data that I've been passing to my graphics card via glInterleavedArrays, using the GL_T4F_C4F_N3F_V4F format to pass as much data as possible. I'm using it for skinning. That format gives me 2 texture coordinates, 3 binormal components, 3 normal components, 3 position components, and then 4 floats remaining, which I use for 2 bones and 2 weights. The problem is, I need up to 7 bones.

My solutions appear to be either figuring out how to compress the bone data and then decompress it in my shader (written in Cg), or finding out whether I can pass an arbitrary structure to the graphics card and pick the data apart as it arrives in the shader. What would be really nice is if I could do something like glInterleavedArrays(sizeof(MyStruct), 0, pointer). What I've found so far is glVertexAttribPointerARB(); however, I really have no idea how to use it and haven't found enough sample code to make any sense of it. If that's the right function to use, how do I use it? I know the obvious things like what the parameters do, but not enough to actually do anything with them.

Anyway, if anyone knows how to pass a structure that I can define arbitrarily (anywhere from 12 bytes to 80 bytes for example, where 80 would let me hold all my data) I would REALLY appreciate the help. Details help as well if anyone knows them. Thank you for your time and I hope to hear back from one of you!

02-28-2006, 02:10 PM
Don't use glInterleavedArrays.

It's just not worthwhile. You can do the interleaving manually by defining proper offsets for the attribute pointers when you bind them. Plus, you're no longer restricted to the formats exposed by glInterleavedArrays.

glInterleavedArrays is one of those APIs that the ARB thought was a good idea at some point, but it was never actually necessary.
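A minimal sketch of the manual interleaving this describes. The struct name and field layout are invented for illustration, and the GL entry points are replaced by stand-in stubs (with the real signatures) only so the sketch compiles on its own; a real program would include <GL/gl.h> and link against the driver instead.

```c
#include <assert.h>
#include <stddef.h>

/* Stand-ins for the GL types and entry points, only so this sketch is
 * self-contained. The signatures match the real API; the stubs just record
 * the stride so the layout math can be checked. */
typedef int          GLint;
typedef int          GLsizei;
typedef unsigned int GLenum;
typedef float        GLfloat;
#define GL_FLOAT 0x1406
static GLsizei g_last_stride;  /* demo bookkeeping only */
static void glVertexPointer(GLint size, GLenum type, GLsizei stride, const void *p)
{ (void)size; (void)type; (void)p; g_last_stride = stride; }
static void glNormalPointer(GLenum type, GLsizei stride, const void *p)
{ (void)type; (void)p; g_last_stride = stride; }
static void glTexCoordPointer(GLint size, GLenum type, GLsizei stride, const void *p)
{ (void)size; (void)type; (void)p; g_last_stride = stride; }

/* One vertex of the skinning format from the question; the struct and field
 * names are hypothetical. All-float fields mean no padding surprises. */
struct SkinnedVertex {
    GLfloat pos[3];
    GLfloat normal[3];
    GLfloat binormal[3];
    GLfloat uv[2];
    GLfloat bones[7];    /* bone indices stored as floats */
    GLfloat weights[7];  /* one weight per bone */
};

/* Manual interleaving: every pointer gets stride = sizeof(struct), and each
 * array starts at its field's offset inside the first vertex. */
static void bind_arrays(const struct SkinnedVertex *v)
{
    const char   *base   = (const char *)v;
    const GLsizei stride = (GLsizei)sizeof(struct SkinnedVertex);

    glVertexPointer(3, GL_FLOAT, stride, base + offsetof(struct SkinnedVertex, pos));
    glNormalPointer(GL_FLOAT, stride, base + offsetof(struct SkinnedVertex, normal));
    glTexCoordPointer(2, GL_FLOAT, stride, base + offsetof(struct SkinnedVertex, uv));
    /* bones and weights can go through spare texture coordinate sets or
     * generic vertex attributes; you're no longer tied to the fixed
     * glInterleavedArrays formats. */
}
```

The struct can then grow or shrink freely (the 12 to 80 bytes from the question, or more) without touching anything except sizeof and the offsets.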

02-28-2006, 11:33 PM
glVertexAttribPointer is for GLSL.

You can define an arbitrary "attribute" variable and pass data to the vertex shader with glVertexAttribPointer. For that, get the index of the attribute using glGetAttribLocation and pass it as the first parameter of glVertexAttribPointer. The remaining parameters work like those of the other vertex array calls.
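A sketch of that call pattern, assuming a linked GLSL program. One generic attribute holds at most 4 components, so the 7 bone indices from the question would be split across two shader attributes; the attribute names "boneIdx0"/"boneIdx1" are made up here, and the GL entry points are stand-in stubs (real signatures) only so the sketch compiles on its own — real code includes <GL/gl.h>/<GL/glext.h> and talks to the driver.

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Stand-ins so the sketch is self-contained; the stubs record the component
 * count per attribute slot so the split can be checked. */
typedef int           GLint;
typedef int           GLsizei;
typedef unsigned int  GLuint;
typedef unsigned int  GLenum;
typedef unsigned char GLboolean;
typedef float         GLfloat;
#define GL_FLOAT 0x1406
#define GL_FALSE 0
static GLint g_sizes[16];  /* demo bookkeeping only */
static GLint glGetAttribLocation(GLuint prog, const char *name)
{   /* stub: a real driver assigns the slot when the program links */
    (void)prog;
    if (strcmp(name, "boneIdx0") == 0) return 1;
    if (strcmp(name, "boneIdx1") == 0) return 2;
    return -1;
}
static void glEnableVertexAttribArray(GLuint index) { (void)index; }
static void glVertexAttribPointer(GLuint index, GLint size, GLenum type,
                                  GLboolean norm, GLsizei stride, const void *p)
{ (void)type; (void)norm; (void)stride; (void)p; g_sizes[index] = size; }

/* Seven bone indices split as vec4 + vec3, matching e.g.
 * "attribute vec4 boneIdx0; attribute vec3 boneIdx1;" in the shader. */
static void bind_bone_indices(GLuint program, const GLfloat *bones, GLsizei stride)
{
    GLint b0 = glGetAttribLocation(program, "boneIdx0");
    GLint b1 = glGetAttribLocation(program, "boneIdx1");
    glEnableVertexAttribArray((GLuint)b0);
    glEnableVertexAttribArray((GLuint)b1);
    glVertexAttribPointer((GLuint)b0, 4, GL_FLOAT, GL_FALSE, stride, bones);
    glVertexAttribPointer((GLuint)b1, 3, GL_FLOAT, GL_FALSE, stride, bones + 4);
}
```

The 7 weights would be bound the same way through two further attributes.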

I'm sure there is similar functionality for Cg.

Alternatively, you can pass arbitrary data in the texture coordinates; afaik there are at least 8 sets on recent graphics boards. You can query the exact number with glGetIntegerv(GL_MAX_TEXTURE_COORDS, &n).

Switch to a different texture coordinate set using glClientActiveTexture and specify the vertex array using glTexCoordPointer...
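A sketch of that approach: unit 0 keeps the real UVs while units 1 and 2 smuggle in four bone indices and four weights each (the array layout is hypothetical). As before, the GL entry points are stand-in stubs with the real signatures so the sketch compiles on its own; real code includes <GL/gl.h> (glClientActiveTexture needs GL 1.3 or the ARB_multitexture entry point).

```c
#include <assert.h>

/* Stand-ins so the sketch is self-contained; the stubs record which
 * component count was bound to which texture unit. */
typedef int          GLint;
typedef int          GLsizei;
typedef unsigned int GLenum;
typedef float        GLfloat;
#define GL_FLOAT               0x1406
#define GL_TEXTURE0            0x84C0
#define GL_TEXTURE1            0x84C1
#define GL_TEXTURE2            0x84C2
#define GL_TEXTURE_COORD_ARRAY 0x8078
static GLenum g_unit;              /* demo bookkeeping only */
static GLint  g_size_per_unit[8];
static void glClientActiveTexture(GLenum unit) { g_unit = unit; }
static void glEnableClientState(GLenum cap) { (void)cap; }
static void glTexCoordPointer(GLint size, GLenum type, GLsizei stride, const void *p)
{ (void)type; (void)stride; (void)p; g_size_per_unit[g_unit - GL_TEXTURE0] = size; }

/* Each glTexCoordPointer call applies to whichever set was last selected
 * with glClientActiveTexture; the shader reads the extra sets back as
 * ordinary texture coordinate inputs (e.g. TEXCOORD1/TEXCOORD2 in Cg). */
static void bind_extra_data(const GLfloat *uv, const GLfloat *bones,
                            const GLfloat *weights)
{
    glClientActiveTexture(GL_TEXTURE0);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glTexCoordPointer(2, GL_FLOAT, 0, uv);       /* the real UVs */

    glClientActiveTexture(GL_TEXTURE1);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glTexCoordPointer(4, GL_FLOAT, 0, bones);    /* 4 of the 7 bone indices */

    glClientActiveTexture(GL_TEXTURE2);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glTexCoordPointer(4, GL_FLOAT, 0, weights);  /* 4 of the 7 weights */
}
```

Further sets (up to the GL_MAX_TEXTURE_COORDS limit) would carry the remaining bones and weights the same way.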

CJ Clark
03-01-2006, 11:20 AM
Thank you both for the tips. I'll remove the use of glInterleavedArrays for this application and go back to the original implementation with a separate pointer for each attribute. Thank you also for the tip that glVertexAttribPointer is for GLSL; I'll pay attention to that in the future. I'll also check whether there's similar functionality for Cg and look into multiple texture coordinate sets to hold the rest of my data. Thanks, I appreciate the help!

03-01-2006, 12:58 PM
hmmm...a programmer from redmond, eh? :eek:

CJ Clark
03-01-2006, 04:36 PM
Yup. One of the many.

Munich rocks BTW, good beer. :)

03-01-2006, 10:12 PM

Munich Rocks and Cologne hypes.
