
Drawing a textured screen-sized triangle



_jkot_
03-17-2014, 05:06 AM
Hi,

I'm trying to draw an array of pixel colors to the screen. It seems the best way to do this is to draw a screen-sized triangle (one that covers the entire screen) and apply a texture to it that stores the color values (if there is a better way, please tell me). Drawing the triangle with a single color works, but when I try to use the texture I just get a black screen. I don't know whether the problem is in the shader or in some other part of the code.
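As an aside on the "better way" question: a single triangle can cover the whole screen without any vertex buffer at all, by generating oversized clip-space positions from gl_VertexID. This is a common trick, not what my code below does; a sketch of such a vertex shader:

```glsl
#version 330

void main(){
    // gl_VertexID 0,1,2 produce clip-space corners (-1,-1), (3,-1), (-1,3),
    // an oversized triangle whose visible part exactly covers the viewport.
    vec2 pos = vec2((gl_VertexID << 1) & 2, gl_VertexID & 2) * 2.0 - 1.0;
    gl_Position = vec4(pos, 0.0, 1.0);
}
```

With this, glDrawArrays(GL_TRIANGLES, 0, 3) needs only an empty (but bound) VAO.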

Here is the fragment shader where I try to use the texture:

#version 330

out vec3 outputColor;

uniform sampler2D myTextureSampler;

void main(){
    // gl_FragCoord is in window coordinates, so dividing by the
    // window size maps it into the [0,1] texture coordinate range.
    vec2 texCoord;
    texCoord.x = gl_FragCoord.x / 1280.0;
    texCoord.y = gl_FragCoord.y / 960.0;

    outputColor = texture(myTextureSampler, texCoord).rgb;
}
How I think this should work: because texture coordinates are in the range [0,1], I divide gl_FragCoord, which is in window coordinates, by the window size (1280*960 in this case) to get the corresponding texture coordinates (the texture is also 1280*960).
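The hardcoded window size could also be avoided by querying the texture's own dimensions with textureSize(); a sketch, assuming the same sampler name:

```glsl
#version 330

out vec3 outputColor;

uniform sampler2D myTextureSampler;

void main(){
    // textureSize() returns the dimensions of mip level 0 as an ivec2,
    // so the shader keeps working if the resolution changes.
    vec2 texCoord = gl_FragCoord.xy / vec2(textureSize(myTextureSampler, 0));
    outputColor = texture(myTextureSampler, texCoord).rgb;
}
```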

This is where I copy the pixel color values to the texture:

textures.push_back(0);
glGenTextures(1, &textures[0]);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, textures[0]);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0, GL_RGB, GL_FLOAT, &pixels[0]);
glBindTexture(GL_TEXTURE_2D, 0);

And here is the rendering part:

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

glUseProgram(shaderPrograms[0].program);
glBindTexture(GL_TEXTURE_2D, textures[0]);

glBindVertexArray(vaos[0]);
glDrawArrays(GL_TRIANGLES, 0, 3);
glBindVertexArray(0);

glBindTexture(GL_TEXTURE_2D, 0);
glUseProgram(0);

window.display();

pixels is a std::vector&lt;Vector3&gt; (Vector3 has float members x, y, z) and its size is 1280*960, which is also the size of the window. I know pixels contains the right values, because when I save them as a .bmp I get the image I want. I'm using SFML to create the OpenGL window; the OpenGL version is 4.2.

Dan Bartlett
03-17-2014, 10:45 AM
By default, OpenGL uses a mipmapped minification filter for most texture types (except rectangle textures (https://www.opengl.org/wiki/Rectangle_Texture)), so it expects you to provide mipmaps; since you don't, your texture isn't mipmap-complete, and sampling an incomplete texture returns black. One way to tell OpenGL that you don't want to use mipmaps is to include:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

In my opinion, mipmapped filtering shouldn't have been the default value, since it's an additional obstacle to learning OpenGL and it complicates the rules for what the initial parameters are when a newly generated texture is first bound to a particular texture target, but it's too late for that to change now.
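Applied to the texture-creation code from the first post, the setup might look like this (a sketch; the GL_LINEAR magnification filter and GL_CLAMP_TO_EDGE wrap modes are extra assumptions, not something the original code set):

```cpp
// Create the texture and disable mipmapped minification, so the
// texture is complete with only mip level 0 uploaded.
GLuint tex;
glGenTextures(1, &tex);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
// Assumed extras: clamp to avoid sampling artifacts at the screen edges.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
             GL_RGB, GL_FLOAT, &pixels[0]);
glBindTexture(GL_TEXTURE_2D, 0);
```

(These calls need a current OpenGL context, so this is only a fragment to slot into the existing setup code.)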

_jkot_
03-17-2014, 11:50 AM
Thank you very much Dan, now it works correctly!