I’ve implemented planar shadows and I’d like to calculate the four vertices of the shadow polygon.
I’ve created a shadowMatrix that contains the values that transform object vertices into points on the ground plane. The visual (OpenGL) part of this works as it should.
I need to calculate the shadow’s vertices myself. I thought I could do this by multiplying each of the object’s vertices by the shadowMatrix (isn’t this what OpenGL does?). Unfortunately I’m not getting the correct results, especially for points on the ground plane, which should be coincident with the actual vertices that cast them. Here’s what I’ve tried:
Your calculations are only right if the resulting w component is 1, which generally won’t be the case for a projective shadow matrix. You need to divide each transformed vertex component by the w that the matrix produces: w = v.x * shadowMatrix[3] + v.y * shadowMatrix[7] + v.z * shadowMatrix[11] + shadowMatrix[15] (using OpenGL’s column-major indexing).
Note that if you’re using a matrix that was originally destined for OpenGL (column-major storage), you need to transpose it before multiplying vertices on the right. You could equivalently multiply your vertices on the left.
Now, two of the points (those in the ground plane) are in the correct locations; however, the other two are on the wrong side of the object casting the shadow.
It was mentioned that
if you’re using a matrix that was originally destined for OpenGL, you need to transpose it before multiplying vertices on the right. You could equivalently multiply your vertices on the left.
I’m not sure what is meant by vertices “on the right”.
I suggest you familiarize yourself with basic matrix operations. It will make your life (and indeed everyone else’s) a lot easier. I hear the math & algos forum is good for questions along these lines. Better still, get yourself a decent book on the subject (Google for introductory matrix/linear algebra books).
This stuff is fundamental, and without it you’re pretty much up the creek for any non-trivial graphics coding; you’ll end up relying on the sympathy of others to plod through this remedial math with you.
It’s one thing if the math is incidental to the subject and new ground is being covered, but this is basic matrix multiplication, and without it you don’t have a leg to stand on.
Why can’t I use the matrix that gets passed to OpenGL? After all, the shadow is being drawn correctly; it’s only when I try to do the same operation myself that I get the wrong result. Is this explained anywhere (the Red Book?)?
And I’m still not sure what is meant by multiplying vertices “on the right”. On the right of what?