omdown

10-01-2010, 03:38 PM

Some minor confusion going on that perhaps someone can help me with...

I'm working on some shaders where I need to render objects differently based on their velocity. Right now I'm just testing the concept and I'm running into a little bit of trouble. I'll try to give a rundown.

(First I'll confess that my top layer is OpenSceneGraph, but given it's just a wrapper over OpenGL I don't *think* it's the issue... I think it's dumb math or a misunderstanding of the shaders on my part.)

Alright, so. As I said, I'm rendering objects differently based on their velocity vector. The velocity vector is tracked in the main app, and each object has a uniform that's updated with its velocity every frame. Now, I should be able to transform the velocity vector with the ModelViewMatrix the same way I'd transform a vertex normal (if the normal and the velocity start out pointing the same way, transforming both should leave them pointing the same way, and so on). So in my vertex shader I have:

relativeVelocity = gl_ModelViewMatrix * vec4( velocity.xyz, 0.0 ); // w = 0: a direction, so the translation part is ignored
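For what it's worth, the w = 0 trick can be sanity-checked off the GPU. Here's a minimal Python sketch (plain lists, no GL, and a made-up translation of (5, 0, 0)) showing that a vector with w = 0 ignores the translation column while a point with w = 1 picks it up:

```python
# Column-major 4x4 matrix times a column vector, OpenGL-style:
# m[c][r] is column c, row r, so result[r] = sum over c of m[c][r] * v[c].
def mat_mul_vec(m, v):
    return [sum(m[c][r] * v[c] for c in range(4)) for r in range(4)]

# Hypothetical modelview: identity rotation plus a translation of (5, 0, 0).
mv = [[1, 0, 0, 0],
      [0, 1, 0, 0],
      [0, 0, 1, 0],
      [5, 0, 0, 1]]  # the fourth column holds the translation

point     = mat_mul_vec(mv, [1, 2, 3, 1])  # w = 1: translation applied
direction = mat_mul_vec(mv, [1, 2, 3, 0])  # w = 0: translation dropped

print(point)      # [6, 2, 3, 1]
print(direction)  # [1, 2, 3, 0]
```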

I've printed the ModelViewMatrix out from the main application, and when I position the camera at (-100, 0, 0) and look at (1, 0, 0) I get:

[ 0   0   -1   0 ]
[ 0   1    0   0 ]
[ 1   0    0   0 ]
[ 0   0 -100   1 ]

Since OpenGL stores matrices in column-major order, that looks correct to me: a 90-degree rotation about Y, with the -100 camera offset along X landing in the translation column.
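If it helps anyone check my reading, that exact dump can be verified numerically. Treating each printed line as one stored column (column-major), the matrix should send the eye point (-100, 0, 0) to the origin and the world +X view direction to eye-space -Z. A quick Python sketch:

```python
# The four printed lines, read as the four stored columns (column-major).
cols = [[0, 0,   -1, 0],
        [0, 1,    0, 0],
        [1, 0,    0, 0],
        [0, 0, -100, 1]]

def transform(v):
    # result[r] = sum over columns c of cols[c][r] * v[c]
    return [sum(cols[c][r] * v[c] for c in range(4)) for r in range(4)]

print(transform([-100, 0, 0, 1]))  # eye position -> origin: [0, 0, 0, 1]
print(transform([1, 0, 0, 0]))     # world +X (view dir) -> [0, 0, -1, 0], i.e. eye -Z
print(transform([-1, 0, 0, 0]))    # world -X (toward the camera) -> [0, 0, 1, 0], eye +Z
```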

I'm having a bit of trouble from here. As a test, I set up a simple red/green/blue check on an object that just moves straight forward and back, to see whether the velocity vector flips the colors accordingly (Z positive when it's moving toward us, Z negative when it's moving away). As I understand it, the camera ends up looking down the negative Z axis after all transformations, so when the velocity vector points toward Z+ it should be heading toward us, yes?

So what I was doing in the fragment shader:

if ( abs(relativeVelocity.z) < 1e-10 )          // effectively zero, for rounding errors
    gl_FragColor = vec4( 0.0, 1.0, 0.0, 1.0 );  // green: no Z motion
else if ( relativeVelocity.z > 0.0 )
    gl_FragColor = vec4( 1.0, 0.0, 0.0, 1.0 );  // red: toward the camera
else
    gl_FragColor = vec4( 0.0, 0.0, 1.0, 1.0 );  // blue: away from the camera
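That classification can also be mocked up off-GPU. Here's a tiny Python stand-in for the branch (checking the near-zero case first, so a value like 1e-12 counts as green rather than red), just to confirm which sign maps to which color:

```python
# Hypothetical stand-in for the fragment-shader branch: map an eye-space
# velocity z component to the debug color the shader would output.
RED, GREEN, BLUE = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)

def debug_color(z, eps=1e-10):
    if abs(z) < eps:   # effectively zero (rounding error)
        return GREEN
    if z > 0.0:        # eye space looks down -Z, so +Z is toward the camera
        return RED
    return BLUE        # moving away

print(debug_color(2.5))   # RED
print(debug_color(0.0))   # GREEN
print(debug_color(-2.5))  # BLUE
```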

However, I'm seeing the opposite of what I expected. So, is my math wrong, my understanding of the orientation of the camera, both, or something else? :S
