OpenGL Row-Major mode?

Yes, I do understand that OpenGL uses column-major matrices for its transformations. (OK, yes, I know, not 'technically' column-major, since it's really a storage convention… but I believe you understand what I am getting at.)

I heard a rumor at my college that OpenGL has an option to toggle a "row-major" mode for transformations, instead of column-major.

I have searched the internet, so far to no avail, to find out whether it is true. I personally prefer row-major transformations, simply because it makes more logical sense to me to write the transformations in the order they are applied (as opposed to what I've done in OpenGL so far, which involves putting my first transformation at the very bottom, just before beginning a primitive, and the last transformation that affects it at the very top of the transformation list… which feels very much like reverse order).

Thankfully, my current project handles its matrices outside of OpenGL, and only uses something like glMultMatrix to import them when things need to be drawn. They are row-major matrices as is. So, if there is no "row-major" mode, could I simply call glMultMatrix on the transpose of my row-major matrix and have it work as expected?

Sorry, you guys probably get this question a lot.

-Serge

http://www.opengl.org/sdk/docs/man/xhtml/glUniform.xml

void glUniformMatrix4x2fv(GLint location, GLsizei count, GLboolean transpose, const GLfloat * value);

P.S. or just do
gl_Position = gl_Vertex * mvp;
instead of
gl_Position = mvp * gl_Vertex;

(and have your uploaded data be in row-major, without transpose=true)
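To see why that works: GLSL reads a mat4 out of 16 floats in column-major order, so uploading row-major data without transposing hands the shader the transpose of your matrix, and multiplying with the vector on the left undoes that transpose. A minimal sketch in C (the function names here are made up for illustration):

```c
/* What "gl_Vertex * mvp" computes: a row vector times a matrix whose
 * column-major storage is data[], i.e. element (i,j) sits at data[j*4+i]. */
static void vec_times_mat(const float v[4], const float data[16], float out[4])
{
    for (int j = 0; j < 4; ++j) {
        out[j] = 0.0f;
        for (int i = 0; i < 4; ++i)
            out[j] += v[i] * data[j * 4 + i];
    }
}

/* The intended M * v, with data[] holding M in row-major order,
 * i.e. element (i,j) at data[i*4+j]. */
static void mat_times_vec(const float data[16], const float v[4], float out[4])
{
    for (int i = 0; i < 4; ++i) {
        out[i] = 0.0f;
        for (int j = 0; j < 4; ++j)
            out[i] += data[i * 4 + j] * v[j];
    }
}
```

Both functions produce identical results for any data[] and v, which is exactly the point: with row-major data uploaded as-is (transpose = GL_FALSE), gl_Vertex * mvp computes the same thing mvp * gl_Vertex would compute with properly column-major data.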

The difference between "row major" and "column major" is only in what order the numbers are stored in memory. It has no effect on the matrix mathematics. That means the only situations where you have to care about the order are when you transfer data between different systems, or when you design your own matrix manipulation functions.
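To make that concrete, here is a sketch (the function name is invented for illustration): element (row, col) lives at row*4 + col in row-major storage and at col*4 + row in column-major storage, so transposing the flat array converts one layout to the other.

```c
/* Convert a 4x4 matrix between row-major and column-major storage:
 * in[r*4 + c] (row-major) becomes out[c*4 + r] (column-major).
 * This is what you would do before handing a row-major matrix to a
 * column-major API such as glLoadMatrixf/glMultMatrixf. */
static void transpose4(const float in[16], float out[16])
{
    for (int r = 0; r < 4; ++r)
        for (int c = 0; c < 4; ++c)
            out[c * 4 + r] = in[r * 4 + c];
}
```

For example, a row-major translation matrix keeps (tx, ty, tz) at indices 3, 7 and 11; after transposing they land at indices 12, 13 and 14, which is where OpenGL's column-major storage expects a translation to be.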

Also, consider moving away from the deprecated fixed-function pipeline and using shaders instead. glMultMatrix() is a deprecated function.

Maybe have a look at glm, a highly optimized matrix library specifically designed for use with OpenGL.

Alternatively, and if you need to stick with the old calls, use glMultTransposeMatrix: http://www.opengl.org/sdk/docs/man/xhtml/glMultTransposeMatrix.xml

Another approach, which avoids the transpose entirely (if you do your own matrix math), is the one shown above: upload your row-major data without transposing and multiply with the vertex on the left (gl_Position = gl_Vertex * mvp).

Can you point me in the direction of “Shaders”?
I’ve worked in HLSL once, in the past; however, I have never had the chance to learn OpenGL shaders.

Any good guides?

[QUOTE=IAmSerge;1238444]Can you point me in the direction of “Shaders”?
I’ve worked in HLSL once, in the past; however, I have never had the chance to learn OpenGL shaders.

Any good guides?[/QUOTE]
I would recommend: http://www.arcsynthesis.org/gltut/Basics/Tut01%20Following%20the%20Data.html
You may want to read that tutorial from the beginning.