Hello,
I have some issues with cube maps that I don't fully understand.
First, there is a simple skybox: a large cube that is texture-mapped using OpenGL's cube-mapping capabilities. The 3D texture coordinates are taken from the cube's vertex coordinates and interpolated between the vertex and fragment shader.
The scene renders fine, but I think there is something wrong with the coordinate system.
The camera (view matrix) is located at the center and looks toward (0,0,-1), matching OpenGL's right-handed coordinate system. But from that position I see the image bound to the cube map's TextureCubeMapPositiveZ. I would expect to see the negative-z image, because I am looking down the negative z-axis.
X and Y are fine: positive x on my right, negative x on the left, positive y at the top, negative y at the bottom.
Is that orientation correct, or is something wrong?
There is a second, related problem. In the center of the skybox I'm rendering a small sphere that uses the cube map for static reflections. At first I got weird reflections that didn't look correct: the reflected parts of the cube map were not (!) mirrored. After some trial and error I finally found a setting that works: I simply inverted the z-coordinate in the fragment shader, and suddenly the reflection behaves as expected:
vFragColor *= texture(cubeMap, vec3(vVaryingCubeTexCoords.x, vVaryingCubeTexCoords.y, -vVaryingCubeTexCoords.z));
Again an issue with the z-coordinate that irritates me, but it seems to be related to the first one.
Maybe someone can bring some enlightenment into this,
thanks.