Hello everybody, I know this question has been asked quite a few times, but so far I haven't found a proper answer.
I have a program that loads a mesh from a Wavefront (.obj) file exported from Blender. The program can already render the mesh, including the texture, but the texture is not rendered correctly because of how I pass the UV coordinates to OpenGL through a GL_ARRAY_BUFFER. The logic itself works in one special case: as long as there are exactly as many UV coordinates as vertices, and both arrays are ordered so that the same index fetches a matching vertex/UV pair, the texture renders correctly. But those two conditions rarely hold. Most of the time I have a different number of UV coordinates than vertices, and I have no idea how Blender orders the vertices, but they are definitely not in the same order as the UV coordinates.
So far I have an array of vertices, an array of UV coordinates, and two arrays of indices: one for the vertices and another for the UV coordinates. Is there a way to hand OpenGL the UV coordinates together with their own indices, in the same fashion as I do with the vertices when doing indexed drawing (using one buffer for the UV coordinates and another for the UV indices)?