generating texture coordinates

I’m having a problem figuring out how to generate texture coordinates, because all the texture info I find on OpenGL tends to be a single quad with coordinates like (0,1), (1,1), (1,0), (0,0)… which is not hard to understand, but I’m trying to generate texture coordinates for a heightmap terrain. How can I calculate these texture coords so I can test if my formula works?

Edit:
I figured that if I just used (x / image width) and (z / image height) I could generate texture coordinates that would stretch the texture over the entire terrain, but it’s not drawing the texture at all.
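
For reference, a minimal sketch of that mapping. One detail worth noting (my observation, not from the original code): dividing by the width maps the last column to (w-1)/w rather than exactly 1.0, so if the texture should reach exactly the far edge you would divide by (width - 1) instead:

	// Per-vertex texture coordinates as a fraction of the grid size.
	// Dividing by (w - 1) and (h - 1) puts the last column/row exactly at 1.0.
	float u = (float)x / (img->w - 1);
	float v = (float)z / (img->h - 1);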

This is how I store my data array: vertices, normals, texture coords, and indices.
img is an SDL_Surface holding a grayscale image I use as a heightmap. (I’m going to change this to a different format soon so I can access the RGB values more quickly.)


	for(int z = 0; z < img->h; z++)
	{
		for(int x = 0; x < img->w; x++)
		{
			//Extract the red channel of this pixel and scale it to 0.0-1.0
			pixel = ((Uint32*)img->pixels)[z * img->pitch / 4 + x];
			temp = pixel & fmt->Rmask;
			temp = temp >> fmt->Rshift;
			temp = temp << fmt->Rloss;
			r = (Uint8)temp / 255.0;

			vec.x = x;
			vec.y = r;
			vec.z = z;
			
			//Vertex data
			heightMap.push_back(vec.x * tileSize);
			heightMap.push_back(vec.y * maxHeight);
			heightMap.push_back(vec.z * tileSize);

			//Normals (note: this normalizes the position vector, which is only
			//a rough stand-in for a true surface normal)
			vec.normalize();
			heightMap.push_back(vec.x);
			heightMap.push_back(vec.y);
			heightMap.push_back(vec.z);

			//Texture coordinates
			heightMap.push_back((float)x / img->w);
			heightMap.push_back((float)z / img->h);

			//index
			if(x < img->w-1 && z < img->h-1)
			{
				int top = z * img->w + x;
				int bottom = top + img->w; // same as (z+1)*img->w + x
 
				index.push_back(top);
				index.push_back(bottom);
				index.push_back(top+1);
 
				index.push_back(bottom);
				index.push_back(bottom+1);
				index.push_back(top+1);
			}
		}
	}

This is how I generate my texture:

unsigned int loadTexture(const char* name)
{
	SDL_Surface* img = IMG_Load(name);
	SDL_PixelFormat form={NULL,32,4,0,0,0,0,8,8,8,8,0xff000000,0x00ff0000,0x0000ff00,0x000000ff,0,255};
	SDL_Surface* img2 = SDL_ConvertSurface(img,&form,SDL_SWSURFACE);
	unsigned int texture;

	glGenTextures(1,&texture);
	glBindTexture(GL_TEXTURE_2D,texture);
	glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, img2->w, img2->h, 0, GL_RGBA, GL_UNSIGNED_INT_8_8_8_8, img2->pixels);
	glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);
	glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR);
	glBindTexture(GL_TEXTURE_2D, 0);
	SDL_FreeSurface(img);
	SDL_FreeSurface(img2);
	return texture;
}
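
One defensive tweak worth adding here (my suggestion, not part of the original code): IMG_Load() returns NULL on failure, so checking it before SDL_ConvertSurface() turns a crash into a readable error message. Something like:

	SDL_Surface* img = IMG_Load(name);
	if (img == NULL)
	{
		// Needs <cstdio>; IMG_GetError() reports why the load failed.
		fprintf(stderr, "IMG_Load(%s) failed: %s\n", name, IMG_GetError());
		return 0; // 0 is never a valid OpenGL texture name
	}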

I just call loadTexture when I create my terrain VBO and IBO. (I believe a GL_ELEMENT_ARRAY_BUFFER is called an IBO (index buffer object); correct me if I’m wrong.)

	glGenBuffers(1, &vboModel);
	glBindBuffer(GL_ARRAY_BUFFER, vboModel);
	glBufferData(GL_ARRAY_BUFFER, heightMap.size()*sizeof(float), heightMap.data(), GL_STATIC_DRAW);
	glBindBuffer(GL_ARRAY_BUFFER, 0);

	glGenBuffers(1, &vboModelInd);
	glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vboModelInd);
	glBufferData(GL_ELEMENT_ARRAY_BUFFER, index.size()*sizeof(unsigned int), index.data(), GL_STATIC_DRAW);
	glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);

	textureId = loadTexture("grass.bmp"); // textureId is an unsigned int member variable of my heightmap class

This is how I draw the terrain. The vertices and normals display fine, but the texture does not show up.

	glBindBuffer(GL_ARRAY_BUFFER, vboModel);
	glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vboModelInd);

		glEnableClientState(GL_VERTEX_ARRAY);
		glVertexPointer(3, GL_FLOAT, 8*sizeof(float), NULL);

		glEnableClientState(GL_NORMAL_ARRAY);
		glNormalPointer(GL_FLOAT, 8*sizeof(float), (void*)(3*sizeof(float)));

		glEnableClientState(GL_TEXTURE_2D_ARRAY);
		glTexCoordPointer(2, GL_FLOAT, 8*sizeof(float), (void*)(6*sizeof(float)));

		glBindBuffer(GL_TEXTURE_2D, textureId);
		glDrawElements(GL_TRIANGLES, index.size(), GL_UNSIGNED_INT, (void*)0);

		glDisableClientState(GL_VERTEX_ARRAY);
		glDisableClientState(GL_NORMAL_ARRAY);
		glDisableClientState(GL_TEXTURE_2D_ARRAY);

	glBindBuffer(GL_ARRAY_BUFFER, 0);
	glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
	glBindBuffer(GL_TEXTURE_2D, 0);

Does this look about right to you guys? I’m displaying this to the screen with a grass texture now, but the texture just shows up as an off-green color.

//vertex + texture coordinate array
float vertex[] = { 
	//first 3 = vertex, other 2 are texture coordinates
	0, 0, 0,	0.0, 0.0, 
	1, 0, 0,	0.5, 0.0,
	2, 0, 0,	1.0, 0.0,

	0, 0, 1,	0.0, 0.5,
	1, 0, 1,	0.5, 0.5,
	2, 0, 1,	1.0, 0.5,

	0, 0, 2,	0.0, 1.0,
	1, 0, 2,	0.5, 1.0,
	2, 0, 2,	1.0, 1.0
};

//Index for the vertex array
unsigned int index[] = { 
	0, 3, 1,
	3, 4, 1,
	1, 4, 2,
	4, 5, 2,
	
	3, 6, 4,
	6, 7, 4,
	4, 7, 5,
	7, 8, 5

};

Texture coordinates specify where on the image to sample. They have a value between 0 and 1 regardless of the image size: values outside this range are wrapped (effectively taken modulo 1) if wrapping is selected, or clamped (values < 0 become 0, values > 1 become 1) if clamp-to-edge is selected. The value represents a fraction of the image dimension.
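
To make that concrete, the wrap behaviour is chosen per texture with glTexParameteri(); a short illustration (you would pick one mode per axis, not both):

	// Repeat: a coordinate of 2.3 samples the same texel as 0.3.
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);

	// Clamp to edge: anything below 0 or above 1 samples the edge texel.
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);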

The code style you are using is very old. Also, I don’t think it is a good idea to call glDisableClientState before you unbind the buffer.

I understand what they are, but when dealing with so many vertices it’s a bit confusing at first. I’m pretty sure the way I have the texture coords now will work once I get the correct way to draw it, but I think it will be upside down.
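
If it does come out upside down (image rows are usually stored top-to-bottom, while OpenGL’s t axis runs bottom-to-top), a common fix, sketched here against the generation code from earlier in the thread, is to flip v when generating:

	//Texture coordinates, with v flipped so image row 0 maps to t = 1.0
	heightMap.push_back((float)x / img->w);
	heightMap.push_back(1.0f - (float)z / img->h);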

I will try what you think may be the issue tomorrow. As far as the old style goes, I planned on using OpenGL 2.1 with the fixed pipeline so I can just learn all the buffer objects and such, but I think I may switch to 3.3 soon as it seems to be well supported.

Unbinding the buffer before disabling the client state didn’t change the outcome. I’ve tried a lot of different things, but I’m lost as to why this isn’t working.

[QUOTE=Exempt;1254053]Does this look about right to you guys? I’m displaying this to the screen with a grass texture now, but the texture just shows up as an off-green color.

//vertex + texture coordinate array
float vertex[] = { 
	//first 3 = vertex, other 2 are texture coordinates
	0, 0, 0,	0.0, 0.0, 

[/QUOTE]
Did you adjust the strides and offsets in the gl*Pointer calls to allow for the lack of normals?

Yeah, that was just a small test; my code does have normals for the terrain, but I’ve still not managed to get this working. If you have time, would you please draw a 3x3 grid and show me how you would define the texture coords for each vertex? I’m probably just going about it wrong.

The numbers for the vertex and index arrays are correct. One problem with the code is:


	glBindBuffer(GL_TEXTURE_2D, textureId);

and


	glBindBuffer(GL_TEXTURE_2D, 0);

which should be using glBindTexture() rather than glBindBuffer().
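
In other words, the draw path would use (a sketch of just the corrected calls):

	glBindTexture(GL_TEXTURE_2D, textureId); // bind the texture before drawing
	glDrawElements(GL_TRIANGLES, index.size(), GL_UNSIGNED_INT, (void*)0);
	glBindTexture(GL_TEXTURE_2D, 0); // and unbind it afterwards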

I have that replaced now but sadly it’s still not working right. :( Hmmm.

Also:


		glEnableClientState(GL_TEXTURE_2D_ARRAY);


		glDisableClientState(GL_TEXTURE_2D_ARRAY);

These should be GL_TEXTURE_COORD_ARRAY.

GL_TEXTURE_2D_ARRAY is a texture target (i.e. the target parameter to glTexImage3D() etc) for array textures, not a vertex array.
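
With both corrections applied, the texture-related part of the draw code would look like this (a sketch combining the glBindTexture() fix from above):

	glEnableClientState(GL_TEXTURE_COORD_ARRAY);
	glTexCoordPointer(2, GL_FLOAT, 8*sizeof(float), (void*)(6*sizeof(float)));

	glBindTexture(GL_TEXTURE_2D, textureId);
	glDrawElements(GL_TRIANGLES, index.size(), GL_UNSIGNED_INT, (void*)0);
	glBindTexture(GL_TEXTURE_2D, 0);

	glDisableClientState(GL_TEXTURE_COORD_ARRAY);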

Are you using an IDE with auto-completion?

Ah, I didn’t see that in the docs.

I’m using VS 2010 and I don’t have auto-complete, if that’s even possible.

I haven’t got it working on the heightmap yet, but in the small test I set up it’s showing a texture now, though the color seems inverted somehow. I’m sure it’s because I’m using a BMP, but I’m having issues loading JPG or PNG into textures with SDL at the moment. Now that I can at least get it on a small 3x3 grid, I should be able to port this to my heightmap after I solve the color issue. Thanks for taking the time to find these problems; without compiler errors or runtime errors it’s freakin’ hard to find these small mistakes.

edit:
tada! lol. Took forever but got it working with a “bit” of help.

For anyone else using SDL_image 1.2: on Windows at least, it doesn’t seem to load the different file-format .dlls automatically; they have to be added to the directory of the .exe. It was allowing me to load BMP because that one always works for whatever reason, but the rest didn’t, which was causing weird issues and crashing / unhandled access violation errors.
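
A way to detect this up front, sketched under the assumption that you are on SDL_image 1.2.10 or later (where IMG_Init() exists): request the loaders you need and compare the flags that actually come back:

	#include <SDL_image.h>
	#include <cstdio>

	int want = IMG_INIT_JPG | IMG_INIT_PNG;
	int got = IMG_Init(want);
	if ((got & want) != want)
	{
		// A missing libpng/libjpeg/zlib DLL shows up here instead of as a
		// crash or silent failure on the first IMG_Load() call.
		fprintf(stderr, "IMG_Init: missing loaders: %s\n", IMG_GetError());
	}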

The reason is either that BMP is Windows’ native bitmap format, so SDL is using the loader built into the OS, or that the BMP format is so simple (a couple of structs followed by raw pixel data) that it’s built into SDL_image rather than needing a separate library. The downside of it being that simple is that there’s no compression, so BMPs are huge compared to an equivalent PNG or JPEG file.
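
For the curious, a rough sketch of those two structs (the field names are my own; the layout follows the Windows BITMAPFILEHEADER / BITMAPINFOHEADER definitions):

	#include <cstdint>

	#pragma pack(push, 1) // the on-disk headers are byte-packed
	struct BmpFileHeader {
		char     magic[2];      // "BM"
		uint32_t fileSize;
		uint16_t reserved1, reserved2;
		uint32_t pixelOffset;   // where the raw pixel data starts
	};
	struct BmpInfoHeader {
		uint32_t headerSize;    // 40 for the classic BITMAPINFOHEADER
		int32_t  width, height; // positive height = rows stored bottom-up
		uint16_t planes, bitsPerPixel;
		uint32_t compression;   // 0 = uncompressed
		uint32_t imageSize;
		int32_t  xPixelsPerMeter, yPixelsPerMeter;
		uint32_t colorsUsed, colorsImportant;
	};
	#pragma pack(pop)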

PNG is what I plan to use because it’s small and still looks good. Working on multi-texturing the terrain now.