Resizing a texture?

I have a problem: I need to be able to resize a texture in OpenGL without using the gluScaleImage() function. Right now I have access to the image height, width and bpp, and also the huge array of pixels (an array of height * width * bpp/8 bytes). How could I resize the texture using this information?

Dear Soum,

Please read some previous threads on the same topic. Do a search for "Resizing Texture".

or try this (this is taken from one of those threads):

GLuint newimage[newheight*newwidth];
GLuint oldimage[height*width];
int i;

for(i=0; i<newheight*newwidth; i++)
{
	/* You must define your operation here, say
	   minifying or magnifying the texture;
	   for minifying you can specify something like pixel<<1 */

	newimage[i] = oldimage[operation];
}

Hope this helps you.
Rajesh.R
IRIS,CAIR
Bangalore

Originally posted by Soum:
I have a problem: I need to be able to resize a texture in OpenGL without using the gluScaleImage() function. Right now I have access to the image height, width and bpp, and also the huge array of pixels (an array of height * width * bpp/8 bytes). How could I resize the texture using this information?
Use gluBuild2DMipmaps() and it will resize the texture for you.
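For example, assuming your loader has produced a 24-bit RGB buffer (pixels, width and height are illustrative names for your own variables), something like:

/* A sketch: GLU rescales the data to power-of-two sizes internally
   and uploads all mipmap levels of the currently bound texture. */
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, width, height,
                  GL_RGB, GL_UNSIGNED_BYTE, pixels);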

Thx to both for the quick answers. As for the gluBuild2DMipmaps() solution, it doesn’t really work for me, since I need to modify the original texture array…to eventually be able to save the new resized texture to another TGA file. For the other answer, the code you wrote is helpful, but I pretty much already knew that…what I need to find is the algorithm that goes between the {} to do the actual resizing. Again, thx for the help…I’ll continue my search now.

Soum

Ok, so I checked the Titan engine as suggested in a previous thread, and I found this function:

void ScaleImage(GLenum format, INT inwidth, INT inheight, const BYTE *in,
				 INT outwidth, INT outheight, BYTE *out ) {

	INT			i, j;
	const BYTE	*inrow;
	DWORD		frac, fracstep;

	/* Horizontal step through the source image, in 16.16 fixed point:
	   each output column advances inwidth/outwidth source pixels. */
	fracstep = inwidth*0x10000/outwidth;

	switch(format){
		case GL_RGB:			/* 3 bytes per pixel */
			for(i=0; i<outheight; i++, out+=outwidth*3) {
				/* Pick the nearest source row for this output row. */
				inrow = in + 3*inwidth*(i*inheight/outheight);
				frac = fracstep >> 1;
				for(j=0; j<outwidth*3; j+=3) {
					/* frac>>16 is the nearest source column. */
					out[j]   = inrow[(frac>>16)*3];
					out[j+1] = inrow[(frac>>16)*3+1];
					out[j+2] = inrow[(frac>>16)*3+2];
					frac += fracstep;
				}
			}
			break;
		case GL_RGBA:			/* 4 bytes per pixel */
			for(i=0; i<outheight; i++, out+=outwidth*4) {
				inrow = in + 4*inwidth*(i*inheight/outheight);
				frac = fracstep >> 1;
				for (j=0 ; j<outwidth*4 ; j+=4) {
					out[j]   = inrow[(frac>>16)*4  ];
					out[j+1] = inrow[(frac>>16)*4+1];
					out[j+2] = inrow[(frac>>16)*4+2];
					out[j+3] = inrow[(frac>>16)*4+3];
					frac += fracstep;
				}
			}
			break;
		case GL_LUMINANCE:
		case GL_ALPHA:			/* 1 byte per pixel */
			for(i=0; i<outheight; i++, out+=outwidth) {
				inrow = in + inwidth*(i*inheight/outheight);
				frac = fracstep >> 1;
				for (j=0; j<outwidth; j++) {
					out[j] = inrow[frac>>16];
					frac += fracstep;
				}
			}
			break;
		case GL_LUMINANCE_ALPHA:	/* 2 bytes per pixel */
			for(i=0; i<outheight; i++, out+=outwidth*2) {
				inrow = in + 2*inwidth*(i*inheight/outheight);
				frac = fracstep >> 1;
				for(j=0; j<outwidth*2; j+=2) {
					out[j]   = inrow[(frac>>16)*2  ];
					out[j+1] = inrow[(frac>>16)*2+1];
					frac += fracstep;
				}
			}
			break;
	}
}

Now I need some help to understand it. The in height and width are easy, I already have a variable for each for the original texture. The problem is more the “in” array; I guess it’s the array containing the RGB values for each pixel. The thing is, my TGA loader loads the data into an unsigned char array. I tried using the function above with the unsigned char array, but it didn’t work. I then tried to create a BYTE buffer and copy everything from the unsigned char one to the BYTE one (using a for loop or simply an =). Now the app compiles, but it’s doing some really “funky” things with the textures, and the result is nowhere near the original texture. Anybody have an idea that could help me?

Thx a lot!

Why don’t you just take the function you posted and replace every occurrence of “BYTE” with “unsigned char”? :wink:
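Or, if it’s just those Windows typedefs your compiler doesn’t know, the usual mappings are as below (a sketch, assuming a typical Win32 setup; texData and newData are illustrative names for your TGA buffer and a caller-allocated output buffer):

typedef unsigned char	BYTE;	/* 8-bit */
typedef unsigned long	DWORD;	/* 32-bit on Win32 */
typedef int				INT;

/* e.g. shrinking a 512x512 24-bit image to 256x256: */
ScaleImage(GL_RGB, 512, 512, texData, 256, 256, newData);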

Thx for the reply…but yeah, I tried that too. It compiles, but the original 512x512 texture is a simple white texture with the words “Texture 24bits” on it…and it’s 24 bits, of course. The result is a texture of the right size (256x256, for example) but with black and violet vertical stripes! :mad: Argh…I’ve been trying to get that function working for a week now.

You need to understand the basic concepts of image resizing. The simplest is linear interpolation: when two (or three or more, depending on the resize factor) source pixels map onto a single destination pixel, you take their average to make the new one.
There’s also bilinear interpolation, where you take the four source pixels surrounding the sample position (in both dimensions) and blend them, weighted by distance, to create the new pixel.

This is not really a GL question, though. But if you need it, gluScaleImage should definitely give pretty good results without any headaches.
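To make that concrete, here is a minimal sketch of a bilinear resize for a tightly packed 24-bit RGB buffer like yours (assuming 3 bytes per pixel and no row padding; ResizeRGBBilinear is just an illustrative name):

/* Minimal bilinear resize sketch for packed 24-bit RGB data. */
void ResizeRGBBilinear(const unsigned char *in, int inw, int inh,
                       unsigned char *out, int outw, int outh)
{
	int x, y, c;
	for (y = 0; y < outh; y++) {
		/* Map this output row back into the source image. */
		float fy = (outh > 1) ? (float)y * (inh - 1) / (outh - 1) : 0.0f;
		int   y0 = (int)fy;
		int   y1 = (y0 + 1 < inh) ? y0 + 1 : y0;
		float wy = fy - y0;

		for (x = 0; x < outw; x++) {
			float fx = (outw > 1) ? (float)x * (inw - 1) / (outw - 1) : 0.0f;
			int   x0 = (int)fx;
			int   x1 = (x0 + 1 < inw) ? x0 + 1 : x0;
			float wx = fx - x0;

			for (c = 0; c < 3; c++) {
				/* Blend the four surrounding source pixels,
				   weighted by their distance to the sample point. */
				float p00 = in[(y0 * inw + x0) * 3 + c];
				float p10 = in[(y0 * inw + x1) * 3 + c];
				float p01 = in[(y1 * inw + x0) * 3 + c];
				float p11 = in[(y1 * inw + x1) * 3 + c];
				float top = p00 + (p10 - p00) * wx;
				float bot = p01 + (p11 - p01) * wx;
				out[(y * outw + x) * 3 + c] =
					(unsigned char)(top + (bot - top) * wy + 0.5f);
			}
		}
	}
}

Note that for big shrink factors bilinear still only looks at four neighbours, so ideally you would average whole blocks of source pixels (a box filter) instead.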