View Full Version : gluOrtho2D and OpenGL 3.1 newbie problem

12-05-2009, 06:17 PM
I'm working on converting my little OpenGL game from 1.* to 3.1, and it has mostly been going fine, except that gluOrtho2D/glOrtho seem to have no effect on the coordinates.

I'm trying to use the line "gluOrtho2D(-10.0, 10.0, -10.0, 10.0);", and I've tried putting it in all kinds of places in the code, just to make sure it wasn't in the wrong spot. But the coordinate system is still -1 to 1. I've tried calling gluOrtho2D before setting up the shaders, before/after glBindBuffer, etc., and before glBindVertexArray/glDrawArrays, and it makes no difference.

Obviously I am an OpenGL newbie and I've only been using 3.1 for about a day, so I'm probably missing something very basic.

I'd post the code, but it's not very compact, so I don't know if I should. So my question is: where is the gluOrtho2D call meant to go in a 3.1/3.2 application that's trying to avoid deprecated functions?

Alfonse Reinheart
12-05-2009, 06:32 PM
Are your shaders actually using the standard matrices? If not, then gluOrtho (and any other matrix functions) will do nothing for you.
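To illustrate: the only shaders that glOrtho/gluOrtho2D can affect are ones that read the built-in matrix state. A minimal sketch of such a vertex shader (compatibility profile only; I'm assuming in_Position is the vec2 attribute from your glVertexAttribPointer call):

```glsl
#version 120
// Compatibility profile: gl_ProjectionMatrix is the matrix that
// glOrtho/gluOrtho2D modify. It does not exist in a 3.1 core context.
attribute vec2 in_Position;

void main()
{
    gl_Position = gl_ProjectionMatrix * vec4(in_Position, 0.0, 1.0);
}
```

Note that in a genuine 3.1 core context the built-in matrix stack is removed entirely, so this only works if you're on a compatibility profile.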

12-05-2009, 07:25 PM
How can I make sure they're using the standard matrices, other than making sure I don't call glOrtho until after I've loaded the shaders? It sounds like this might be my problem. Ordinarily I would google this stuff, but I have a hard time finding 3.* tutorials.

12-05-2009, 09:26 PM
I assume I should be calling glOrtho either right before the following:

glVertexAttribPointer((GLuint)0, 2, GL_FLOAT, GL_FALSE, 0, 0);

Or before the following:

glDrawArrays(GL_TRIANGLES, 0, 3);

It doesn't work though, no matter where I place it. Should I just scale the vertex positions by the values I want before passing them to the shader, perhaps? That doesn't sound like a great way of handling this, but I don't even know whether it's possible to get glOrtho working with shaders at all.

12-05-2009, 10:04 PM
Okay, I figured out I have to do it in the shader, something like:
gl_Position = gl_ProjectionMatrix * vec4(in_Position, 1.0);

Unfortunately, this gives me a really weird result, but I guess I shall keep experimenting. :)

12-05-2009, 10:15 PM
Alright, I solved it. I apologize for the thread, it was a pretty dumb newbie mistake. Thanks!