View Full Version : Shading in OpenGL



someoney
02-18-2011, 01:37 PM
So, after borrowing the shader code from http://www.cse.ohio-state.edu/~kerwin/refraction.html (I asked for permission already), I attempted to implement it via OpenGL.

Bam! I hit snag 1. I cannot figure out how to get OpenGL to render correctly. I can use the shader code in rendermonkey just fine, which tells me that the problem is at my end. I'm probably passing the textures incorrectly.

my init:


GL2 gl = drawable.getGL().getGL2();
gl.glShadeModel(GL2.GL_SMOOTH);
gl.glEnable(GL2.GL_DEPTH_TEST);

gl.glClearColor(0, 0, 0, 1);

glu = new GLU();
glut = new GLUT();

textures = new Texture[6];
TextureData[] data = new TextureData[6];
try {
for (int i = 0; i < textures.length; i++) {
data[i] = TextureIO.newTextureData(drawable.getGLProfile(), new URL(base + textnames[i]), false, "jpg");
textures[i] = TextureIO.newTexture(data[i]);
}

int[] temp = new int[1];
gl.glGenTextures(1, temp, 0);
texname = temp[0];
gl.glBindTexture(GL2.GL_TEXTURE_CUBE_MAP, texname);

gl.glTexParameteri(GL2.GL_TEXTURE_CUBE_MAP, GL2.GL_TEXTURE_WRAP_S, GL2.GL_REPEAT);
gl.glTexParameteri(GL2.GL_TEXTURE_CUBE_MAP, GL2.GL_TEXTURE_WRAP_T, GL2.GL_REPEAT);
gl.glTexParameteri(GL2.GL_TEXTURE_CUBE_MAP, GL2.GL_TEXTURE_WRAP_R, GL2.GL_REPEAT);
gl.glTexParameteri(GL2.GL_TEXTURE_CUBE_MAP, GL2.GL_TEXTURE_MAG_FILTER, GL2.GL_NEAREST);
gl.glTexParameteri(GL2.GL_TEXTURE_CUBE_MAP, GL2.GL_TEXTURE_MIN_FILTER, GL2.GL_NEAREST);

int[] type = { GL2.GL_TEXTURE_CUBE_MAP_POSITIVE_X, GL2.GL_TEXTURE_CUBE_MAP_NEGATIVE_X,
GL2.GL_TEXTURE_CUBE_MAP_POSITIVE_Y, GL2.GL_TEXTURE_CUBE_MAP_NEGATIVE_Y,
GL2.GL_TEXTURE_CUBE_MAP_POSITIVE_Z, GL2.GL_TEXTURE_CUBE_MAP_NEGATIVE_Z
};

for (int i = 0; i < data.length; i++)
gl.glTexImage2D(type[i], 0, data[i].getInternalFormat(), data[i].getWidth(),
data[i].getHeight(), 0, data[i].getPixelFormat(), data[i].getPixelType(),
data[i].getBuffer());

gl.glTexGeni(GL2.GL_S, GL2.GL_TEXTURE_GEN_MODE, GL2.GL_NORMAL_MAP);
gl.glTexGeni(GL2.GL_T, GL2.GL_TEXTURE_GEN_MODE, GL2.GL_NORMAL_MAP);
gl.glTexGeni(GL2.GL_R, GL2.GL_TEXTURE_GEN_MODE, GL2.GL_NORMAL_MAP);

gl.glTexEnvf(GL2.GL_TEXTURE_ENV, GL2.GL_TEXTURE_ENV_MODE, GL2.GL_MODULATE);
gl.glGenerateMipmap(GL2.GL_TEXTURE_CUBE_MAP);

} catch (IOException exc) {
textures = null;
exc.printStackTrace();
}

int v = gl.glCreateShader(GL2.GL_VERTEX_SHADER);
int f = gl.glCreateShader(GL2.GL_FRAGMENT_SHADER);
try {
TextureData text = TextureIO.newTextureData(drawable.getGLProfile(), new File("glassbump.png"), false, "png");
bumpMap = TextureIO.newTexture(text);
BufferedReader brv = new BufferedReader(new FileReader("water.vert.txt"));
String vsrc = "";
String line;
while ((line=brv.readLine()) != null) {
vsrc += line + "\n";
}
String[] src = new String[1];
src[0] = vsrc;
gl.glShaderSource(v, 1, src, null);
gl.glCompileShader(v);

BufferedReader brf = new BufferedReader(new FileReader("regular_water.frag.txt"));
String fsrc = "";
while ((line=brf.readLine()) != null) {
fsrc += line + "\n";
}
src = new String[1];
src[0] = fsrc;
gl.glShaderSource(f, 1, src, null);
gl.glCompileShader(f);
checkLogInfo(gl, shaderprogram);

shaderprogram = gl.glCreateProgram();
gl.glAttachShader(shaderprogram, v);
gl.glAttachShader(shaderprogram, f);
gl.glLinkProgram(shaderprogram);

} catch (Exception e) {

System.out.println(e);
e.printStackTrace();

}
//enviro = new Cube(textures);

gl.glUseProgram(shaderprogram);
gl.glActiveTexture(GL2.GL_TEXTURE0);
int loc = gl.glGetUniformLocation(shaderprogram, "Texture");
gl.glBindTexture(GL2.GL_TEXTURE_2D, bumpMap.getTextureObject());
gl.glUniform1i(loc, 0);
gl.glActiveTexture(GL2.GL_TEXTURE1);
gl.glBindTexture(GL2.GL_TEXTURE_2D, textures[0].getTextureObject());
loc = gl.glGetUniformLocation(shaderprogram, "Environment");
gl.glUniform1i(loc, 1);
gl.glUniform(new GLUniformData("refraction_index", (float)1.4));

gl.glValidateProgram(shaderprogram);
checkLogInfo(gl, shaderprogram);
gl.glUseProgram(shaderprogram);


My drawing func:


GL2 gl = drawable.getGL().getGL2();

gl.glMatrixMode(GL2.GL_PROJECTION);
gl.glLoadIdentity();
glu.gluPerspective(90, width/height, near, far);
glu.gluLookAt(object_pos*java.lang.Math.sin(theta) , 0, object_pos*java.lang.Math.cos(theta), 0, 0, 0, 0, 1, 0);
gl.glMatrixMode(GL2.GL_MODELVIEW);

gl.glClear(GL2.GL_COLOR_BUFFER_BIT | GL2.GL_DEPTH_BUFFER_BIT);

glut.glutSolidSphere(1, 50, 50);

gl.glFlush();
drawable.swapBuffers();
gl.glFinish();


The Environment variable is of type samplerCube and the Texture variable is of type sampler2D.

What I have done so far:
1) I tested the textures: that is, I checked whether the textures were read into OpenGL correctly by texturing some simple objects. The textures were fine.
2) I checked whether the shader files were read in correctly by modifying the frag shader to always output red. It worked, so the shaders were read in and compiled fine.

I'm at an impasse. I'm fairly certain that I either initialized the shader incorrectly or mapped the texture incorrectly or something to that effect. I just can't seem to figure that quite out.

Also, I'm fairly confident that to return to the fixed pipeline, I specify gl.glUseProgram(0). But that still disables texturing for objects even when I try gl.glEnable(GL2.GL_TEXTURE_2D) again!

Is there a step I'm missing?

I'm sorry for the long post, and would be grateful for any help you may be able to provide.
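As an aside, the line-by-line concatenation I use to read the shader sources works, but it can be wrapped in a small helper (just a sketch; the class and method names here are made up):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Collectors;

public class ShaderLoader {
    // Reads an entire shader source file into one string, keeping the
    // newline on every line so GLSL compiler error messages report
    // usable line numbers.
    public static String readSource(String fileName) throws IOException {
        return Files.readAllLines(Path.of(fileName)).stream()
                .map(l -> l + "\n")
                .collect(Collectors.joining());
    }
}
```

The result can be handed straight to glShaderSource as the single element of a String[].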

carsten neumann
02-18-2011, 01:54 PM
gl.glShaderSource(f, 1, src, null);
gl.glCompileShader(f);

checkLogInfo(gl, shaderprogram);
shaderprogram = gl.glCreateProgram();


you check the info log for a program object you've not created yet...



The Environment variable is of type samplerCube


yet you bind a GL_TEXTURE_2D to texture unit 1, which is the one that samplerCube reads from...

someoney
02-18-2011, 02:33 PM
Oops, I must have reordered the code while I was pulling my hair out. What the heck was I thinking~?

I revised the code to include the corrections you mentioned:



GL2 gl = drawable.getGL().getGL2();
gl.glShadeModel(GL2.GL_SMOOTH);
gl.glEnable(GL2.GL_DEPTH_TEST);

gl.glClearColor(0, 0, 0, 1);

glu = new GLU();
glut = new GLUT();

textures = new Texture[6];
TextureData[] data = new TextureData[6];
try {
for (int i = 0; i < textures.length; i++) {
data[i] = TextureIO.newTextureData(drawable.getGLProfile(), new URL(base + textnames[i]), false, "jpg");
textures[i] = TextureIO.newTexture(data[i]);
}

int[] temp = new int[1];
gl.glGenTextures(1, temp, 0);
texname = temp[0];
gl.glBindTexture(GL2.GL_TEXTURE_CUBE_MAP, texname);

gl.glTexParameteri(GL2.GL_TEXTURE_CUBE_MAP, GL2.GL_TEXTURE_WRAP_S, GL2.GL_REPEAT);
gl.glTexParameteri(GL2.GL_TEXTURE_CUBE_MAP, GL2.GL_TEXTURE_WRAP_T, GL2.GL_REPEAT);
gl.glTexParameteri(GL2.GL_TEXTURE_CUBE_MAP, GL2.GL_TEXTURE_WRAP_R, GL2.GL_REPEAT);
gl.glTexParameteri(GL2.GL_TEXTURE_CUBE_MAP, GL2.GL_TEXTURE_MAG_FILTER, GL2.GL_NEAREST);
gl.glTexParameteri(GL2.GL_TEXTURE_CUBE_MAP, GL2.GL_TEXTURE_MIN_FILTER, GL2.GL_NEAREST);

int[] type = { GL2.GL_TEXTURE_CUBE_MAP_POSITIVE_X, GL2.GL_TEXTURE_CUBE_MAP_NEGATIVE_X,
GL2.GL_TEXTURE_CUBE_MAP_POSITIVE_Y, GL2.GL_TEXTURE_CUBE_MAP_NEGATIVE_Y,
GL2.GL_TEXTURE_CUBE_MAP_POSITIVE_Z, GL2.GL_TEXTURE_CUBE_MAP_NEGATIVE_Z
};

for (int i = 0; i < data.length; i++)
gl.glTexImage2D(type[i], 0, data[i].getInternalFormat(), data[i].getWidth(),
data[i].getHeight(), 0, data[i].getPixelFormat(), data[i].getPixelType(),
data[i].getBuffer());

gl.glTexGeni(GL2.GL_S, GL2.GL_TEXTURE_GEN_MODE, GL2.GL_NORMAL_MAP);
gl.glTexGeni(GL2.GL_T, GL2.GL_TEXTURE_GEN_MODE, GL2.GL_NORMAL_MAP);
gl.glTexGeni(GL2.GL_R, GL2.GL_TEXTURE_GEN_MODE, GL2.GL_NORMAL_MAP);

gl.glTexEnvf(GL2.GL_TEXTURE_ENV, GL2.GL_TEXTURE_ENV_MODE, GL2.GL_MODULATE);
gl.glGenerateMipmap(GL2.GL_TEXTURE_CUBE_MAP);

} catch (IOException exc) {
textures = null;
exc.printStackTrace();
}

int v = gl.glCreateShader(GL2.GL_VERTEX_SHADER);
int f = gl.glCreateShader(GL2.GL_FRAGMENT_SHADER);
try {
TextureData text = TextureIO.newTextureData(drawable.getGLProfile(), new File("glassbump.png"), false, "png");
bumpMap = TextureIO.newTexture(text);
BufferedReader brv = new BufferedReader(new FileReader("water.vert.txt"));
String vsrc = "";
String line;
while ((line=brv.readLine()) != null) {
vsrc += line + "\n";
}
String[] src = new String[1];
src[0] = vsrc;
gl.glShaderSource(v, 1, src, null);
gl.glCompileShader(v);

BufferedReader brf = new BufferedReader(new FileReader("regular_water.frag.txt"));
String fsrc = "";
while ((line=brf.readLine()) != null) {
fsrc += line + "\n";
}
src = new String[1];
src[0] = fsrc;
gl.glShaderSource(f, 1, src, null);
gl.glCompileShader(f);

shaderprogram = gl.glCreateProgram();
gl.glAttachShader(shaderprogram, v);
gl.glAttachShader(shaderprogram, f);
gl.glLinkProgram(shaderprogram);
checkLogInfo(gl, shaderprogram);

} catch (Exception e) {

System.out.println(e);
e.printStackTrace();

}

gl.glUseProgram(shaderprogram);
gl.glActiveTexture(GL2.GL_TEXTURE0);
int loc = gl.glGetUniformLocation(shaderprogram, "Texture");
gl.glBindTexture(GL2.GL_TEXTURE_2D, bumpMap.getTextureObject());
gl.glUniform1i(loc, 0);
gl.glActiveTexture(GL2.GL_TEXTURE1);
gl.glBindTexture(GL2.GL_TEXTURE_CUBE_MAP, texname);
loc = gl.glGetUniformLocation(shaderprogram, "Environment");
gl.glUniform1i(loc, 1);
gl.glUniform(new GLUniformData("refraction_index", (float)1.4));

gl.glValidateProgram(shaderprogram);
checkLogInfo(gl, shaderprogram);



Now the info log says some nice things about the vertex and fragment shader linking :). But, the image is still pitch black. I checked the fragment shader by setting the frag color to blue and the sphere is there for sure. The shaders are working. I'm just not sure about anything else.

Would it be easier if I posted pictures?

Edit: Here's some sample pics--

Look how beautiful rendermonkey makes it:
http://img26.imageshack.us/img26/1283/rendermonkey.png (http://img26.imageshack.us/i/rendermonkey.png/)

Look how deformed and ugly I make it:

http://img822.imageshack.us/img822/6192/28414948.png (http://img822.imageshack.us/i/28414948.png/)

Makes me ashamed.

someoney
02-19-2011, 08:47 AM
Perhaps my code is hard on the eyes?

bcthund
02-19-2011, 03:51 PM
I would say the problem is likely with the texture at some stage. It may not be loading properly; you can do some printf debugging to check this.

It could also be that the texture information isn't being passed to your shader, or through all the shader stages properly. Check your "uniform" and "in" bindings and make sure they are all correct. I see you don't use glBindAttribLocation() anywhere, are you forgetting to bind something?

someoney
02-20-2011, 12:03 PM
Yes! A response! There's hope yet~

Would you elaborate regarding the use of printf debugging with textures? I confirmed that the cube map texture is working via cube mapping a sphere and that the texture map is working via texture mapping a rectangle. Did you mean that something went wrong when going to the shader?

Regarding the glBindAttribLocation() call, the shader code I'm working with doesn't have any attribute variables. Are there any hidden attribute variables to be set?

bcthund
02-20-2011, 12:28 PM
For printf debugging: for example, I use SDL_image.h to load textures, and I can test that a texture was actually loaded like this:


SDL_Surface * sdlImage;
sdlImage = IMG_Load( file );
if (sdlImage == NULL) { printf("Failed, %s\n\n", SDL_GetError()); return false; }
else printf("Success.\n");


However, according to your post your texturing is working with other objects, so yes, I would think that something is going wrong when the texture is passed to the shader.

The glBindAttribLocation is used for binding "in" variables such as texture coord arrays and index arrays. Can you post your shader code? That would help to determine the problem.

someoney
02-20-2011, 02:26 PM
For printf debugging: for example, I use SDL_image.h to load textures, and I can test that a texture was actually loaded like this:


SDL_Surface * sdlImage;
sdlImage = IMG_Load( file );
if (sdlImage == NULL) { printf("Failed, %s\n\n", SDL_GetError()); return false; }
else printf("Success.\n");


However, according to your post your texturing is working with other objects, so yes, I would think that something is going wrong when the texture is passed to the shader.

The glBindAttribLocation is used for binding "in" variables such as texture coord arrays and index arrays. Can you post your shader code? That would help to determine the problem.

Not my code (it belongs to... forgot...).

Vertex Shader:



varying vec3 N;
varying vec3 View;
varying vec3 ScreenPos;
varying vec3 ecPosition3;


void main() {

ecPosition3 = vec3(gl_ModelViewMatrix * gl_Vertex);
View = normalize(-ecPosition3);
N = normalize( gl_NormalMatrix * gl_Normal);
gl_TexCoord[0] = 0.6 * vec4(gl_Normal,1);
gl_Position = ftransform();
ScreenPos = vec3(gl_Position);
}


Fragment Shader:


varying vec3 N;
varying vec3 View;

uniform sampler2D Texture;
uniform samplerCube Environment;
uniform float refraction_index;

void myRefract(in vec3 incom, in vec3 normal, in float index_external, in float index_internal,
out vec3 reflection, out vec3 refraction,
out float reflectance, out float transmittance)
{

float eta = index_external/index_internal;
float cos_theta1 = dot(incom, normal);
float cos_theta2 = sqrt(1.0 - ((eta * eta) * ( 1.0 - (cos_theta1 * cos_theta1))));
reflection = incom - 2.0 * cos_theta1 * normal;
refraction = (eta * incom) + (cos_theta2 - eta * cos_theta1) * normal;

float fresnel_rs = (index_external * cos_theta1 - index_internal * cos_theta2 ) /
(index_external * cos_theta1 + index_internal * cos_theta2);


float fresnel_rp = (index_internal * cos_theta1 - index_external * cos_theta2 ) /
(index_internal * cos_theta1 + index_external * cos_theta2);

reflectance = (fresnel_rs * fresnel_rs + fresnel_rp * fresnel_rp) / 2.0;
transmittance = ((1.0-fresnel_rs) * (1.0-fresnel_rs) + (1.0-fresnel_rp) * (1.0-fresnel_rp)) / 2.0;
}

void main() {

vec3 nN = 0.5 * (N + 2.0*(texture2D(Texture, gl_TexCoord[0].xy).rgb - 0.5));

vec3 refraction_ray, reflection_ray;
float fresnel_R, fresnel_T;

myRefract(View, nN, 1.0, refraction_index,
reflection_ray, refraction_ray, fresnel_R, fresnel_T);

refraction_ray = -(gl_ModelViewMatrixTranspose * vec4(refraction_ray,0.0)).xyz;
reflection_ray = -(gl_ModelViewMatrixTranspose * vec4(reflection_ray,0.0)).xyz;

vec4 reflect_color = textureCube(Environment, reflection_ray);
vec4 refract_color = textureCube(Environment, refraction_ray);

fresnel_T = fresnel_T * 0.5;
fresnel_R = fresnel_R * 0.5;

gl_FragColor = reflect_color * fresnel_R + refract_color * fresnel_T;
//gl_FragColor = vec4(0.0, 0.0, 1.0, 1.0);
}


The code works in render monkey though, so I assume the problem is on my end (w/ my OpenGL code).
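(Side note: the Fresnel arithmetic in myRefract can be sanity-checked on the CPU. The following Java port of the shader's reflectance computation — a sketch reusing the shader's variable names — should reproduce the textbook normal-incidence value ((n1 - n2)/(n1 + n2))^2, which is 0.04 for air-to-glass at 1.0 to 1.5.)

```java
public class FresnelCheck {
    // Port of the reflectance part of the shader's myRefract().
    // cosTheta1 = dot(incom, normal), with both vectors unit-length;
    // nExt / nInt are the external and internal indices of refraction.
    public static double reflectance(double cosTheta1, double nExt, double nInt) {
        double eta = nExt / nInt;
        double cosTheta2 = Math.sqrt(1.0 - eta * eta * (1.0 - cosTheta1 * cosTheta1));
        double rs = (nExt * cosTheta1 - nInt * cosTheta2)
                  / (nExt * cosTheta1 + nInt * cosTheta2);
        double rp = (nInt * cosTheta1 - nExt * cosTheta2)
                  / (nInt * cosTheta1 + nExt * cosTheta2);
        // Unpolarized reflectance: average of the s- and p-polarized terms.
        return (rs * rs + rp * rp) / 2.0;
    }
}
```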

bcthund
02-20-2011, 03:03 PM
What happens if you bypass all the myRefract() stuff and apply the texture directly to the sphere, replace:


gl_FragColor = reflect_color * fresnel_R + refract_color * fresnel_T;


With:


gl_FragColor = texture2D(Texture, gl_TexCoord[0].xy);


You could also try to test the textureCube as well:


gl_FragColor = textureCube(Environment, gl_TexCoord[0].xyz);


This should help determine that the textures are in fact getting into the shaders, as long as the texture coords are working. I'm only familiar with GLSL 3.30, so I am not familiar with gl_TexCoord.

someoney
02-20-2011, 07:42 PM
What happens if you bypass all the myRefract() stuff and apply the texture directly to the sphere, replace:


gl_FragColor = reflect_color * fresnel_R + refract_color * fresnel_T;


With:


gl_FragColor = texture2D(Texture, gl_TexCoord[0].xy);


You could also try to test the textureCube as well:


gl_FragColor = textureCube(Environment, gl_TexCoord[0].xyz);


This should help determine that the textures are in fact getting into the shaders, as long as the texture coords are working. I'm only familiar with GLSL 3.30, so I am not familiar with gl_TexCoord.

That's brilliant. I didn't even think of testing it that way.

So, I went ahead and made the necessary changes:

Texture:

http://img254.imageshack.us/img254/7619/result1.png (http://img254.imageshack.us/i/result1.png/)

Cube Map:

http://img811.imageshack.us/img811/6544/result2.png (http://img811.imageshack.us/i/result2.png/)

What's even more interesting is that I could have implemented cube mapping much more easily in the fragment shader than in OpenGL >.>.

Well, the textures are being passed to the shader all right. Ah, perhaps it's the normals?

Perhaps I should auto-generate the normals? No, I tested that by turning on GL_AUTO_NORMAL, and it did nothing.

I'm really grasping at straws right now.

bcthund
02-20-2011, 08:29 PM
Good, we know the textures are getting there. You can now easily test whether it's the normals that are wrong, using the same method as for the textures. Your normals should change across the surface of the sphere, which should give you a rainbow-type effect across its surface. Pass your gl_Normal from the vertex shader into the fragment shader and then use the code below:

Vertex Shader


varying vec3 N;
varying vec3 View;
varying vec3 ScreenPos;
varying vec3 ecPosition3;

out vec3 vNormal; // <--- ADD THIS
void main() {
vNormal = gl_Normal; // <--- ADD THIS
ecPosition3 = vec3(gl_ModelViewMatrix * gl_Vertex);
View = normalize(-ecPosition3);
N = normalize( gl_NormalMatrix * gl_Normal);
gl_TexCoord[0] = 0.6 * vec4(gl_Normal,1);
gl_Position = ftransform();
ScreenPos = vec3(gl_Position);
}



Add an input to the fragment shader:

in vec3 vNormal;

and change the gl_FragColor assignment:

gl_FragColor = vec4(vNormal.xyz, 1.0);

someoney
02-20-2011, 08:43 PM
http://img838.imageshack.us/img838/6676/testlsr.png (http://img838.imageshack.us/i/testlsr.png/)

Wow! How pretty, now I know what my next project will be.

But, as you mentioned, judging by the rainbow the normals seem to be correct.

Hmmm... what else...

bcthund
02-20-2011, 08:51 PM
This can only mean there is something going wrong somewhere in the shaders. It could be hard to track down, but you'll have to add the stages of the shader back one at a time until you find the part that produces incorrect results. Process of elimination in the shader is really all you can do now.

Try setting all the original colors one at a time and see if any of them work to start:


gl_FragColor = reflect_color;




gl_FragColor = vec4(fresnel_R);




gl_FragColor = refract_color;




gl_FragColor = vec4(fresnel_T);

someoney
02-21-2011, 12:56 PM
Huh, I tested it as you suggested.

And you know what?

The shader works. Granted it looks like cube mapping, but...

I tested a bunch of other variables as well to determine why it's so dark, and it seems:

http://img211.imageshack.us/img211/3872/blahg.jpg (http://img211.imageshack.us/i/blahg.jpg/)

That is the image generated when setting the view vector as the color.

The sphere is centered at the origin with a radius of 1. The camera is at (0, 0, 2). I feel that the coloring is incorrect even in world space coordinates. For instance, since the up vector is +y, shouldn't the green be on the top?

Meanwhile, render monkey renders:

http://img140.imageshack.us/img140/9977/rendermonkey.jpg (http://img140.imageshack.us/i/rendermonkey.jpg/)

The coloring is that same way no matter how I rotate the camera. I'm not sure how to explain the blue... I suppose it could be a result of the viewing transformation.

In that case, would that mean that the view vector is not correctly calculated?

Wait I can test that. Just move the object~

Huh, interesting. When I move it left, the sphere turns red. Right, and it turns half green, half blue.

I think this means that the view vector is being calculated incorrectly; rather, all it's doing is using the vertex position as the color instead of the view direction.

So the logical question would be: how do I set up the shader so that modelview matrix x vertex position yields a correct view vector?

Why is it giving the incorrect view vector?

bcthund
02-21-2011, 01:13 PM
I haven't been using linear algebra very long, but from looking up some information on view vectors, it seems that the math to get the view vector is:


viewVector = modelView - modelPosition;
Normalize(viewVector);


If this is the code for your view vector, then it is wrong, assuming the above formula is correct:


ecPosition3 = vec3(gl_ModelViewMatrix * gl_Vertex);
View = normalize(-ecPosition3);


Try changing this to the below to test this:


ecPosition3 = vec3(gl_ModelViewMatrix - gl_Vertex);
View = normalize(ecPosition3);


Ignore this, I found the proper code showing that your view vector is being calculated correctly, and as Alfonse pointed out the math doesn't work anyway.

Alfonse Reinheart
02-21-2011, 01:31 PM
You can't subtract a vector from a matrix.


Not my code (it belongs to... forgot...).

You should know how the shaders you use work. These shaders have some problems.

First, the interface between them is very strange. It passes varyings that are never used (ecPosition3 and ScreenPos). I have no idea how it links.

Second, it interpolates the view direction. Since you're going to interpolate three values anyway, you may as well interpolate the eye-space position (ie: ScreenPos) and compute the direction, normalizing as needed. You have to normalize the view direction after interpolating anyway, so it doesn't even cost performance. Plus, you get to do things like take distances for attenuation and so forth.

Third, it uses this function "myRefract" instead of the actual standard GLSL function "refract". I don't know why, and I haven't looked long enough to know whether or not it implements it correctly. But it's generally bad form to re-implement GLSL standard functions.
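For reference, the built-in refract(I, N, eta) computes, per the GLSL spec: k = 1.0 - eta^2 * (1.0 - dot(N, I)^2); if k < 0.0 the result is the zero vector (total internal reflection), otherwise eta*I - (eta*dot(N, I) + sqrt(k))*N. A plain Java sketch of the same formula (I and N must be normalized, with I pointing toward the surface):

```java
public class Refract {
    // GLSL-spec refract(I, N, eta): returns the refracted direction,
    // or the zero vector on total internal reflection (k < 0).
    public static double[] refract(double[] i, double[] n, double eta) {
        double dotNI = n[0] * i[0] + n[1] * i[1] + n[2] * i[2];
        double k = 1.0 - eta * eta * (1.0 - dotNI * dotNI);
        if (k < 0.0) return new double[] {0.0, 0.0, 0.0};
        double s = eta * dotNI + Math.sqrt(k);
        return new double[] {eta * i[0] - s * n[0],
                             eta * i[1] - s * n[1],
                             eta * i[2] - s * n[2]};
    }
}
```

At eta = 1.0 the ray passes straight through, and a ray hitting a glass-to-air boundary at a grazing angle comes back as the zero vector.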

someoney
02-21-2011, 02:11 PM
Huh, fascinating!

As you suggested, explicitly setting the camera position works.

Result:
http://img546.imageshack.us/img546/1363/refract.png (http://img546.imageshack.us/i/refract.png/)

About there~

That still doesn't explain the reason why the model view matrix x the vertex coordinate doesn't work. Perhaps I set up the matrix incorrectly?

Secondly, shouldn't disabling the shaders via glUseProgram(0) result in fixed pipeline? Afterwards, shouldn't I be able to re-enable textures and texture map the cube normally?

Alfonse Reinheart
02-21-2011, 02:34 PM
Secondly, shouldn't disabling the shaders via glUseProgram(0) result in fixed pipeline? Afterwards, shouldn't I be able to re-enable textures and texture map the cube normally?

If you're talking about multipassing (rendering the sphere again), then that requires that you use ftransform(), which is invariant with the fixed-function pipeline. Otherwise, there is no guarantee of invariance.

someoney
02-21-2011, 02:58 PM
Nah, I meant for the cube. I render the sphere using the shader, then return to the fixed pipeline for the cube.

The cube isn't supposed to be just green, after all~

someoney
02-22-2011, 08:07 AM
Huh, maybe this problem isn't so simple? Perhaps it's an API issue rather than an issue with my code?

enjoycrf
02-22-2011, 04:10 PM
i am also having problems with shaders
they make my quake3 level look like mario cart
wthhhhhhhh

someoney
02-22-2011, 04:40 PM
i am also having problems with shaders
they make my quake3 level look like mario cart
wthhhhhhhh

If that was a request for help, I'll comment that I'm nowhere near qualified to answer questions on shaders, especially since I'm having problems of my own.

Nonetheless, did you already create a topic for your problem?

enjoycrf
02-22-2011, 04:53 PM
hahaha we suck lol
im jus jumping around my level
i have some bumpy collision too lol
gona work on importing a model for now

someoney
02-22-2011, 07:24 PM
You can't subtract a vector from a matrix.


Not my code (it belongs to... forgot...).

You should know how the shaders you use work. These shaders have some problems.

First, the interface between them is very strange. It passes varyings that are never used (ecPosition3 and ScreenPos). I have no idea how it links.

Second, it interpolates the view direction. Since you're going to interpolate three values anyway, you may as well interpolate the eye-space position (ie: ScreenPos) and compute the direction, normalizing as needed. You have to normalize the view direction after interpolating anyway, so it doesn't even cost performance. Plus, you get to do things like take distances for attenuation and so forth.

Third, it uses this function "myRefract" instead of the actual standard GLSL function "refract". I don't know why, and I haven't looked long enough to know whether or not it implements it correctly. But it's generally bad form to re-implement GLSL standard functions.

I apologize. I completely missed your response! True, I should know how the shaders work, considering I'm attempting to implement them (I'll admit, I was trying to avoid that).

The varying variables ecPosition3 and ScreenPos are actually used in a different fragment shader. They aren't used in the current one, though, so they may safely be removed from this vertex shader.

I am not familiar with the refract function call; is there a GLSL reference to consult? I have read on another forum that the refract function is not well supported on ATI cards. Since I have an ATI card, it's a good opportunity to find out.

Edit: I found the manual. I'll try rewriting the code in my own terms.

Edit:

Vertex shader:


varying vec3 Pos;
varying vec3 N;

void main() {

Pos = vec3(gl_Vertex);
N = normalize(gl_NormalMatrix * gl_Normal);
gl_TexCoord[0] = 0.6 * vec4(gl_Normal,1);
gl_Position = ftransform();

}


Fragment shader:


varying vec3 Pos;
varying vec3 N;

uniform sampler2D Texture;
uniform samplerCube Environment;
uniform float refraction_index;
uniform vec3 camera;

void main() {

vec3 nView = normalize(camera - Pos);
vec3 nN = 0.5 * (N + 2.0*(texture2D(Texture, gl_TexCoord[0].xy).rgb - 0.5));
vec3 refract_ray = refract(nView, nN, refraction_index);
gl_FragColor = textureCube(Environment, refract_ray);
}


Result?

http://img97.imageshack.us/img97/9113/wrooong.png (http://img97.imageshack.us/i/wrooong.png/)

That can't be right.

someoney
02-23-2011, 05:05 PM
Oh, I see.

The incident vector points toward the object, thus position - camera. Secondly, the refract function takes eta: the ratio of the outside index of refraction to the inside one.

Works~

http://img97.imageshack.us/img97/1363/refract.png (http://img97.imageshack.us/i/refract.png/)

Now I just have to figure out:

1) Why doesn't multiplying the modelview matrix by the vertex produce the view vector?
2) Why does returning to the fixed pipeline not allow for regular texture mapping?

Alfonse Reinheart
02-23-2011, 05:21 PM
1) Why doesn't multiplying the modelview matrix by the vertex produce the view vector?

It does. However:


vec3 nView = normalize(camera - Pos);

The "camera" and "Pos" are in two different spaces. Pos is derived directly from gl_Vertex, which is in model space (the space of positions before multiplying them with the modelview matrix). "camera" is in whatever space you put it in. I'm guessing that this is not model space.
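To make the mismatch concrete: multiplying gl_Vertex by the modelview matrix yields an eye-space position, and the view vector is that position negated and normalized. Subtracting a fixed camera point from raw gl_Vertex only coincides with this when the model transform is identity. A minimal Java sketch of the eye-space transform (column-major, as OpenGL stores matrices; the matrix value below is a made-up example):

```java
public class SpaceCheck {
    // Multiplies a column-major 4x4 matrix (OpenGL layout) by the
    // point (x, y, z, 1) and returns the transformed xyz.
    public static double[] xform(double[] m, double x, double y, double z) {
        return new double[] {
            m[0] * x + m[4] * y + m[8]  * z + m[12],
            m[1] * x + m[5] * y + m[9]  * z + m[13],
            m[2] * x + m[6] * y + m[10] * z + m[14],
        };
    }
}
```

With a modelview that only translates by (0, 0, -2) — camera at (0, 0, 2), no model transform — the model-space point (0, 0, 1) maps to eye-space (0, 0, -1), so normalize(-ecPosition3) gives (0, 0, 1), the same as camera - Pos in that special case. That is why the shortcut appears to work; add any model rotation or translation and the two diverge.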

Also, what's with the multiplication of the normal by 0.6? What purpose does that serve?

someoney
02-23-2011, 06:40 PM
The "camera" and "Pos" are in two different spaces. Pos is derived directly from gl_Vertex, which is in model space (the space of positions before multiplying them with the modelview matrix). "camera" is in whatever space you put it in. I'm guessing that this is not model space.

Ironically, it works out because there were no transformations on the sphere. But, as you said, it isn't correct: as soon as I move the sphere (or apply any transformation), the image stays the same.

I believed that one of the inputs to the view vector is the camera position. In that case, as I rotate around the sphere, the image should change. But it did not, which led me to suspect that I was setting something up incorrectly. As I could not figure out the "proper" setup, I elected to take a shortcut. Was I wrong?



Also, what's with the multiplication of the normal by 0.6? What purpose does that serve?

No idea. I took it out, and it's still the same.