PDA

View Full Version : Textures and glsl



Gabba123XXL
11-03-2011, 01:21 PM
Hello,
I am trying to use a simple 2D texture with OpenGL 3.1 and a shader program. I tried to map it onto a sphere, but it does not work. The sphere is black.

My steps were:

1) To initialize the texture


glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

glTexImage2D(GL_TEXTURE_2D,
0,
GL_RGB,
128,
128,
0,
GL_RGB,
GL_UNSIGNED_BYTE,
pointer);


2) Using a shader program with an input attribute for the texture coordinates.


// vertex shader
#version 140

in vec3 MCVertex;
in vec3 MCNormal;
in vec2 TexCoord0;
...

// fragment shader
#version 140

in float LightIntensity;
in vec2 TexCoord;

out vec4 FragColor;

uniform sampler2D SampleTexture;

void main()
{
vec3 lightColor = vec3(texture(SampleTexture, TexCoord));
FragColor = vec4(lightColor * LightIntensity, 1.0);
}


3) Before I start rendering the geometry, I bind the texture:


glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, _id);


What have I forgotten? Or is the order of the commands wrong?

Thanks in advance!

Alfonse Reinheart
11-03-2011, 02:20 PM
Have you tried texturing something simpler first, like a quad? That way we can determine whether the problem is the texture coordinates or the texture itself.

Gabba123XXL
11-03-2011, 02:25 PM
Sorry no! But I could try. In OpenGL there are no Quads!

Alfonse Reinheart
11-03-2011, 02:30 PM
In OpenGL there are no Quads!

There are no spheres either, but you managed to render one of those, didn't you? Just make a mesh that's a quad instead of a sphere.

Gabba123XXL
11-03-2011, 02:56 PM
So far I have used simple boxes and generated texels.
I used a triangle strip to draw each face of the box. (Quads are deprecated.)

Gabba123XXL
11-03-2011, 03:04 PM
Sorry... I forgot to say... the result is the same... it is black...

The texture coordinates are 2D and in the interval [0,1].

Gabba123XXL
11-04-2011, 01:06 AM
For example, for the front face of the box I used the following vertices and texture coordinates:

vertices: (-1,1,1),(-1,-1,1),(1,-1,1),(1,1,1)
texels: (0,1), (0,0), (1,0), (1,1)

The image data is RGB and the values are in the range [0,255].

BionicBytes
11-04-2011, 02:33 AM
There are two obvious reasons why the output may be black (I'm not saying this is the cause, however).



// fragment shader
#version 140

in float LightIntensity;
in vec2 TexCoord;

out vec4 FragColor;

uniform sampler2D SampleTexture;

void main()
{
vec3 lightColor = vec3(texture(SampleTexture, TexCoord));
FragColor = vec4(lightColor * LightIntensity, 1.0);
}

First, remove the "LightIntensity" factor from the shader and just output the texture colour,
e.g. FragColor = vec4(lightColor, 1.0);

If that fails too, emit a solid red colour, e.g. FragColor = vec4(1.0, 0.0, 0.0, 1.0);

Secondly, what have you sent to GL as the uniform value to link the active texture unit to the shader's uniform "SampleTexture"?
In other words, have you set up the uniform parameter containing the value 0 and sent it to OpenGL using glUniform1i(locationOfUniform, 0)? How did you determine the location of the uniform "SampleTexture" - did you query for it, or set it explicitly?
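The complete hookup described here would look roughly like the following (a sketch reusing the variable names from the first post; it needs a live GL context and is not the poster's actual code):

```cpp
// Link texture unit 0 to the sampler uniform. The program must be
// in use before glUniform1i is called.
glUseProgram(program);

glActiveTexture(GL_TEXTURE0);       // select texture unit 0
glBindTexture(GL_TEXTURE_2D, tex);  // bind the texture object to that unit

GLint loc = glGetUniformLocation(program, "SampleTexture");
glUniform1i(loc, 0);                // the sampler reads from unit 0
```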

Gabba123XXL
11-04-2011, 02:47 AM
Hello,
LightIntensity is not the problem, for sure... I did not list the complete vertex shader code; it has LightIntensity as an output variable.

So far... I assume there could be a configuration problem. Regarding the question about "SampleTexture":
I determined the location of the uniform variable "SampleTexture" and then used "glUniform1iv(texLoc, 1, data)". The data is a GLint array with one element.

Ahh... how do I have to handle the texture coordinate input for the attribute "TexCoord0"? I did it like the vertices: in a buffer object with GL_STATIC_DRAW.

BionicBytes
11-04-2011, 03:42 AM
LightIntensity is not the problem, for sure... I had not listed the complete code of the vertex shader, there is a LightIntensity as an output variable.

I realize there is an output variable, but I wanted you to eliminate the possibility that its value is 0.
Perhaps emit vec4(LightIntensity) as FragColor to test that part of the code.
There are only two ways FragColor can come out black:
1. The LightIntensity value is 0.
2. The texture is black, or not bound correctly to the shader.

These are the reasons why you need to eliminate each possibility one at a time. If all else fails, emit a solid red colour to check that you are actually getting some valid output.
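The elimination steps amount to three one-line variants of the fragment shader's last statement, tried one at a time (a sketch against the shader quoted above):

```glsl
// In main(), enable each of these alone:
FragColor = vec4(vec3(texture(SampleTexture, TexCoord)), 1.0); // texture only
FragColor = vec4(vec3(LightIntensity), 1.0);                   // lighting only
FragColor = vec4(1.0, 0.0, 0.0, 1.0);                          // solid red
```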

Gabba123XXL
11-04-2011, 04:03 AM
I built a check image of 128x128, black and white. The data I upload is 128*128*3 bytes in total.

The data is arranged in row-major order. But I don't think the image is the problem; I already changed it to completely white, and the result was still black.

I will check later the possibility if the shader produces the black fragments.

Maybe I made a mistake in the initialization... Do you have any idea what could be wrong?

Gabba123XXL
11-04-2011, 08:10 AM
Now I have tested the LightIntensity... it was not the reason.
I think it is a problem with the configuration.

BionicBytes
11-04-2011, 08:58 AM
You'll have to post code showing how you obtain the uniform location and how you send the uniform values,
e.g. your shader compilation and object rendering code.

Gabba123XXL
11-04-2011, 10:45 AM
This is image creation:


for (unsigned int j=0; j<imgSize; ++j) {
    for (unsigned int i=0; i<imgSize; ++i) {
        // write all three channels of pixel (i,j)
        GLubyte c = (((i+j) % tfs) < fs) ? (GLubyte) 0 : (GLubyte) 255;
        checkImage[i*3 + j*imgSize*3 + 0] = c;
        checkImage[i*3 + j*imgSize*3 + 1] = c;
        checkImage[i*3 + j*imgSize*3 + 2] = c;
    }
}


After that: Creation of the texture:


glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glGenTextures(1, &_id);
glBindTexture(GL_TEXTURE_2D, _id);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

glTexImage2D(GL_TEXTURE_2D,
0,
GL_RGB,
128,
128,
0,
GL_RGB,
GL_UNSIGNED_BYTE,
checkImage);
glBindTexture(GL_TEXTURE_2D, 0);


Now the rendering (the shader program is in use):


glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, _id);

int texunit = 0;
GLint loc = glGetUniformLocation(program, "SampleTexture");
glUniform1iv(loc, uniform._count, (GLint*) &texunit);

// render


Here are the details of the texture coordinate setup for the attribute...


glGenBuffers(1, &_buffers);
glBindBuffer(GL_ARRAY_BUFFER, _buffers);
GLfloat *texelBuffer = new GLfloat[texels.size()];
std::copy(texels.begin(), texels.end(), texelBuffer);
glBufferData(GL_ARRAY_BUFFER, texels.size()*sizeof(GLfloat), texelBuffer, GL_STATIC_DRAW);
delete[] texelBuffer; // the buffer object owns a copy now
glVertexAttribPointer((GLuint)3, 2, GL_FLOAT, GL_FALSE, 0, 0);

// At rendering time
glBindVertexArray(_vao);
...
glEnableVertexAttribArray(3);
glDrawElements(...)


That's it! Thanks a lot for your help.

Gabba123XXL
11-04-2011, 11:54 AM
so any idea???

Alfonse Reinheart
11-04-2011, 12:36 PM
glUniform1iv(loc, uniform._count, (GLint*) &texunit);

Several things.

1: Why are you uploading that as an array? It's not a sampler array; it's just a single sampler. Use `glUniform1i` instead.

2: Most important of all, did you call `glUseProgram(program)` before this? If not, then this call should generate a GL error (unless some other program was bound, in which case you likely did something really bad).


so any idea???

You waited a whole hour to bump the thread.

Gabba123XXL
11-04-2011, 01:11 PM
First, it doesn't matter whether I load a single element to the uniform or an array of size one. Why I do it that way comes down to design decisions in the project I am developing. I even changed it to use the simpler function 'glUniform1i' for the single-element case, but the result is the same: black geometry!


Most important of all, did you call `glUseProgram(program)` before this? If not, then this call should generate a GL error (unless some other program was bound, in which case you likely did something really bad).

Of course!!! If I didn't enable the program, my whole scene would be black. I made a simple scene, and I am using more than one shader program (by the way). The geometry affected by the texture shader is black; everything else renders fine.

So far, do you see anything that is wrong or in the wrong order? What about glActiveTexture? Do I need 'glEnable(GL_TEXTURE_2D)'?

Gabba123XXL
11-04-2011, 03:19 PM
OK!
I fixed the problem. It was the "InternalFormat" and "Format" of "glTexImage". Both use the same flags, but the types are different:
InternalFormat -> GLint
Format -> GLenum
My mistake was that I passed nothing to internalFormat because my storage management was a little bit confused!

Thank you for your help!

Dark Photon
11-05-2011, 02:06 PM
It was the "InternalFormat" and Format of "glTexImage". Both use the same flags but the types are different.
No, they don't accept all the same flags (enumeration defines), but for legacy reasons there are a few enumeration defines that can be provided to both.

Format only specifies which components you're providing (if you're even providing data) for instance, GL_RED, GL_RGB, GL_RGBA, etc.

Whereas internal format typically specifies a precise internal format (including type and number of components and bits per component), for instance GL_R16F, GL_RGB8, GL_RGB10_A2, etc.

What is sometimes confusing is that you can provide "generic" formats such as GL_RGB as the internal format, which tells OpenGL to just pick a specific internal format that it thinks is good enough for you.

Also regarding the types: in practice, internal format is a GLenum. However, I think the history of internal format being a GLint rather than a GLenum is that in days long ago you could provide 1, 2, 3, and 4 as internal formats and, again, that told OpenGL to just pick some format that it thought might be good enough for you. Generally speaking, you should tell it precisely the one you want, with a specific internal format enum symbol (e.g. GL_RGB8).
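Applied to the glTexImage2D call from earlier in the thread, that advice would look like this (same data, only the internal format pinned down to a sized enum):

```cpp
glTexImage2D(GL_TEXTURE_2D,
             0,                // mipmap level
             GL_RGB8,          // sized internal format: 8 bits per component
             128, 128,         // width, height
             0,                // border (must be 0)
             GL_RGB,           // format of the client data
             GL_UNSIGNED_BYTE, // type of the client data
             checkImage);
```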

Gabba123XXL
11-07-2011, 01:57 PM
Thank you... this helped me very much!