My orthographic projection matrix causes everything to go to the center of the screen

So my goal is simple. I am trying to get my coordinate space set up so that the origin is at the bottom left of the screen and the top-right corner is at (screen.width, screen.height).

Also, this is a COMPLETELY 2D engine, so no 3D stuff is needed. I just need those coordinates to work.

Right now I am trying to plot a couple of points on the screen, mostly at places like (0, 0), (width, height), (width / 2, height / 2), etc., so I can see if things are working right.

Unfortunately my efforts so far have been in vain: instead of multiple points I get a single one in the dead center of the screen (obviously they are all overlapping).
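A quick way to sanity-check the matrix itself, independent of the GPU, is to push those test points through it on the CPU and confirm they come out at (-1, -1), (0, 0) and (1, 1). Here is a sketch, assuming the row-major 16-element array that is built further down:

func transform(_ m: [GLfloat], _ x: GLfloat, _ y: GLfloat) -> (GLfloat, GLfloat) {
    // Multiply (x, y, 0, 1) by a ROW-major 4x4 matrix, keeping only x and y.
    let cx = m[0] * x + m[1] * y + m[3]   // row 0 dot (x, y, 0, 1)
    let cy = m[4] * x + m[5] * y + m[7]   // row 1 dot (x, y, 0, 1)
    return (cx, cy)
}

// Expected: transform(matrix, 0, 0)                                    == (-1, -1)
//           transform(matrix, GLfloat(width) / 2, GLfloat(height) / 2) == (0, 0)
//           transform(matrix, GLfloat(width), GLfloat(height))         == (1, 1)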

So here is my code. What exactly am I doing wrong?

Vertex Shader


uniform vec4 color;
uniform float pointSize;
uniform mat4 orthoMatrix;

attribute vec3 position;

varying vec4 outColor;
varying vec3 center;

void main() {
    center = position;
    outColor = color;
    gl_PointSize = pointSize;
    gl_Position = vec4(position, 1) * orthoMatrix;
}

And here is how I make the matrix. I am using GLKit, so it should theoretically be making the orthographic matrix for me. However, if you have a custom function that you think would do this better, that is fine! I can use it too.

var width:Int32 = 0
var height:Int32 = 0
var matrix:[GLfloat] = []
func onload()
{
   width = Int32(self.view.bounds.size.width)
   height = Int32(self.view.bounds.size.height)
   glViewport(0, 0, GLsizei(height), GLsizei(width))
   matrix = loadOrthoMatrix(0, width, 0, height, -1024, 1024);
}
- (void)loadOrthoMatrix:(GLfloat *)matrix left:(GLfloat)left right:(GLfloat)right bottom:(GLfloat)bottom top:(GLfloat)top near:(GLfloat)near far:(GLfloat)far;
{
    GLfloat r_l = right - left;
    GLfloat t_b = top - bottom;
    GLfloat f_n = far - near;
    GLfloat tx = - (right + left) / (right - left);
    GLfloat ty = - (top + bottom) / (top - bottom);
    GLfloat tz = - (far + near) / (far - near);

    matrix[0] = 2.0f / r_l;
    matrix[1] = 0.0f;
    matrix[2] = 0.0f;
    matrix[3] = tx;

    matrix[4] = 0.0f;
    matrix[5] = 2.0f / t_b;
    matrix[6] = 0.0f;
    matrix[7] = ty;

    matrix[8] = 0.0f;
    matrix[9] = 0.0f;
    matrix[10] = 2.0f / f_n;
    matrix[11] = tz;

    matrix[12] = 0.0f;
    matrix[13] = 0.0f;
    matrix[14] = 0.0f;
    matrix[15] = 1.0f;
}
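For comparison, GLKit can produce this kind of matrix directly with GLKMatrix4MakeOrtho. A sketch of that route, reusing the width, height and program variables from above (GLKMatrix4 keeps its 16 floats in column-major order, so it is uploaded with the transpose flag set to GL_FALSE):

import GLKit

var ortho = GLKMatrix4MakeOrtho(0, Float(width), 0, Float(height), -1024, 1024)

let loc = glGetUniformLocation(program, "orthoMatrix")
withUnsafePointer(to: &ortho.m) { ptr in
    // GLKMatrix4.m is a 16-float tuple; rebind it so glUniformMatrix4fv can read it.
    ptr.withMemoryRebound(to: GLfloat.self, capacity: 16) { floats in
        glUniformMatrix4fv(loc, 1, GLboolean(GL_FALSE), floats)
    }
}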

Passing it over to the shader

func draw()
{
        //Setting up shader for use
        let loc3 = glGetUniformLocation(program, "orthoMatrix")
        if (loc3 != -1)
        {
            glUniformMatrix4fv(loc3, 1, GLboolean(GL_FALSE), &matrix[0])
        }
        //Passing points and extra data

}
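The part where the points themselves are submitted is omitted above, so here is a purely hypothetical sketch of what that could look like with GL_POINTS (the attribute name matches the position attribute in the vertex shader):

let positionSlot = glGetAttribLocation(program, "position")

let points: [GLfloat] = [
    0,                  0,                   0,   // bottom left
    GLfloat(width) / 2, GLfloat(height) / 2, 0,   // center
    GLfloat(width),     GLfloat(height),     0    // top right
]

glEnableVertexAttribArray(GLuint(positionSlot))
points.withUnsafeBufferPointer { buffer in
    // Client-side vertex array: the pointer must stay valid through glDrawArrays.
    glVertexAttribPointer(GLuint(positionSlot), 3, GLenum(GL_FLOAT),
                          GLboolean(GL_FALSE), 0, UnsafeRawPointer(buffer.baseAddress!))
    glDrawArrays(GLenum(GL_POINTS), 0, GLsizei(points.count / 3))
}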

Note: If you remove the multiplication by the matrix in the vertex shader, the points show up; however, most of them are obviously off screen, because without the projection the positions are interpreted directly as clip-space coordinates, which only span -1 to 1.

Also: I have tried using this function rather than GLKit's method, with the same results. Perhaps I am not passing the right things into the matrix-making function, or maybe I'm not getting it to the shader properly.

I believe this is because OpenGL expects matrices in column-major order.

The quick fix is to change that line to:
glUniformMatrix4fv(loc3, 1, GLboolean(GL_TRUE), &matrix[0])

But the “right” way to do this is to reorder your matrix so that each column is contiguous in memory, i.e. rather than

00, 01, 02, 03
04, 05, 06, 07
08, 09, 10, 11
12, 13, 14, 15

do

00, 04, 08, 12
01, 05, 09, 13
02, 06, 10, 14
03, 07, 11, 15
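Applied to the loadOrthoMatrix above, a column-major version might look like this (a sketch in Swift that returns the array, keeping the original 2 / f_n term):

func loadOrthoMatrix(_ left: GLfloat, _ right: GLfloat, _ bottom: GLfloat,
                     _ top: GLfloat, _ near: GLfloat, _ far: GLfloat) -> [GLfloat]
{
    let tx = -(right + left) / (right - left)
    let ty = -(top + bottom) / (top - bottom)
    let tz = -(far + near) / (far - near)

    // Each group of four values is one COLUMN of the matrix, which is the
    // layout glUniformMatrix4fv expects when transpose is GL_FALSE.
    return [
        2.0 / (right - left), 0.0, 0.0, 0.0,   // column 0
        0.0, 2.0 / (top - bottom), 0.0, 0.0,   // column 1
        0.0, 0.0, 2.0 / (far - near), 0.0,     // column 2 (classic glOrtho uses -2 here)
        tx, ty, tz, 1.0                        // column 3: translation lives in 12..14
    ]
}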