Hi,
I created a sphere by normalizing all vertices of a subdivided cube. I put the data in two arrays, one for the vertices and the other for the indices. To draw it I create the buffers once and call a drawing function every time it’s necessary.
const unsigned int num_vertices = 1538;
const unsigned int num_faces = 3072;
GLfloat vertices[] = {
1.000000, -1.000000, -1.000000,
…
-0.875000, 0.875000, -1.000000
};
GLuint faces[] = {
465, 1537, 449,
…
393, 413, 1154
};
GLuint vbuffer_id, ibuffer_id; // buffer object names, shared with drawCube()

void createCube() {
    // Upload the vertex positions once.
    glGenBuffers(1, &vbuffer_id);
    glBindBuffer(GL_ARRAY_BUFFER, vbuffer_id);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices),
                 vertices, GL_STATIC_DRAW);

    // Upload the triangle indices once.
    glGenBuffers(1, &ibuffer_id);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibuffer_id);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(faces),
                 faces, GL_STATIC_DRAW);
}
void drawCube()
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glBindBuffer(GL_ARRAY_BUFFER, vbuffer_id);
    glVertexPointer(3, GL_FLOAT, 0, 0); // 3 floats per vertex, tightly packed
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibuffer_id);
    glDrawElements(GL_TRIANGLES, 3 * num_faces,
                   GL_UNSIGNED_INT, 0);
    glDisableClientState(GL_VERTEX_ARRAY); // leave client state clean
}
I use a vertex shader to generate texture coordinates on the fly from the inverse parametric equations of the sphere, which works mostly fine, as you can see in the attached image.
gl_TexCoord[0].s = 0.5 + atan(v.z, -v.x)/(2.0*pi);
gl_TexCoord[0].t = 0.5 + asin(v.y)/pi;
The problem with this approach is that the vertices along the seam meridian are shared between the first and last faces of each row, so each of them gets a single s coordinate, even though it should be 0.0 for one face and 1.0 for the other. That's the cause of the mess you see at the meridian edge.
With a geometry shader I could tell whether a triangle is the westmost or the eastmost one in its row and correct the texture coordinate there, but I'm almost sure there's a simpler solution using only vertex and fragment shaders.
Any idea please?