I’m attempting to convert some code I’ve got to OpenGL 3.3, core profile, with little success.
Currently, I’m trying to render a 2D grid of triangles. Just that; nothing fancy, no texture-mapping, nothing. I’ve got a vertex shader that does nothing but write a vertex attribute array’s vertices to gl_Position, and a fragment shader that then goes on to write red to gl_FragColor.[1]
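For reference, the shaders are essentially the following (retyped from memory, so the attribute name is illustrative):

    // vertex shader: pass the 2D attribute straight through
    #version 330 core
    in vec2 position;
    void main() {
        gl_Position = vec4(position, 0.0, 1.0);
    }

    // fragment shader: no #version line, so it compiles as legacy GLSL,
    // which is presumably why gl_FragColor still works here [1]
    void main() {
        gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
    }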
The vertex coordinates range from (0,0) to (1,1); rendered properly, they should cover the upper right quarter of the screen.
Problem, though… the screen coordinates, instead of ranging from (-1,-1) to (1,1) as I’d expect, range from approximately (-0.000000000000000000021,-0.000000000000000000021) to (0.000000000000000000021,0.000000000000000000021) - that is, roughly ±2.1e-20 on each axis. (0,0) is still in the center, thankfully.
I’ve established this range by experiment - adding and subtracting Very Tiny Values to and from the input to gl_Position in the vertex shader (see the probe below). The value probably has a meaning of some kind - epsilon times the glViewport parameters? I don’t know.
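Concretely, the probing amounted to recompiling variants of the vertex shader like this (the constant is approximate):

    #version 330 core
    in vec2 position;
    void main() {
        // nudge the output by a tiny constant; at roughly 2.1e-20 the
        // geometry visibly slides across the whole window, which is how
        // I arrived at the range above
        gl_Position = vec4(position - vec2(2.1e-20), 0.0, 1.0);
    }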
To be honest, I don’t know where to start debugging this. I’ve been perturbing the program for the last hour, with less than impressive results… the next step would be to try a minimal test-case, but I’m not sure what a minimal test-case /is/, for core 3.3.
So… what’s the smallest amount of code required to put a 2D triangle on-screen, with some trivial shaders involved? (A perspective projection is irrelevant; it’s all 2D.)
And do my symptoms look familiar to anyone?
[1] Yes, gl_FragColor is deprecated. I don’t understand the replacement yet, but at least that part works as-is.
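From what I’ve read, the core-profile replacement is a user-declared output variable, something like the following - though I haven’t switched over yet, so treat it as untested:

    #version 330 core
    // with a single user-declared output, it’s automatically bound to
    // color buffer 0, so no layout qualifier should be needed
    out vec4 fragColor;
    void main() {
        fragColor = vec4(1.0, 0.0, 0.0, 1.0);
    }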