Slow performance with non power of 2 texture

Hi All,

Why am I getting such low FPS when I try to draw a non-power-of-2 texture? I am using a Radeon Mobility X700, which supports OpenGL 2.0.

I initialise OpenGL and the texture from a bitmap I already have loaded in memory:

glGenTextures( 1, &this->texId );
glBindTexture( GL_TEXTURE_2D, this->texId );
glTexImage2D(GL_TEXTURE_2D, 0, 3, imgWidth(), imgHeight(), 0, GL_RGB, GL_UNSIGNED_BYTE, imgBuffer );
glEnable(GL_TEXTURE_2D);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

glClearColor( 0.0f, 0.0f, 0.0f, 0.0f );
glShadeModel(GL_FLAT);

this->object = glGenLists( 1 );

glNewList( this->object, GL_COMPILE );

glTranslatef(-1.5f,0.0f,-15.0f);	
glBegin(GL_QUADS);					
	glTexCoord2f(0.0f, 0.0f);
	glVertex3f(-1.0f, 1.0f, 0.0f);				// Top Left
	glTexCoord2f(1.0f, 0.0f);
	glVertex3f( 1.0f, 1.0f, 0.0f);				// Top Right
	glTexCoord2f(1.0f, 1.0f);
	glVertex3f( 1.0f,-1.0f, 0.0f);				// Bottom Right
	glTexCoord2f(0.0f, 1.0f);
	glVertex3f(-1.0f,-1.0f, 0.0f);				// Bottom Left
glEnd();							
glEndList();

I then run a timer which updates the scene:


glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity();
glCallList( this->object );

If I use an image with a resolution of 512x512 then everything runs nice and fast, but if I use an image of 500x500 it slows down to around 5 FPS!

What am I doing wrong?

Thanks in advance
Andy

Sounds like you’re hitting a software path. Check your extension string for NPOT support?

Just tried delphi3d.net for support on that card, but they seem to be experiencing difficulties. Anyone know if that site is still being maintained?

Yes, the Radeon Mobility X700 supports OpenGL 2.0, but NPOT texturing is not done in hardware … so it's not your fault.
Check this :
http://www.opengl.org/wiki/NPOT_Textures
If you can live with GL_CLAMP_TO_EDGE and no mipmaps, you can use NPOT textures in hardware, much as with the RECTANGLE extension.
Otherwise, pad your texture up to POT, and adjust texture coordinates accordingly.
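
For example, something like this should stay on the hardware path on that card (a rough sketch only, not tested on an X700, reusing the imgWidth()/imgHeight()/imgBuffer from your post):

glBindTexture( GL_TEXTURE_2D, texId );
// Clamp instead of repeat: REPEAT wrapping on an NPOT GL_TEXTURE_2D is what drops you to software here
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE );
// No mipmaps: keep the min filter at GL_LINEAR or GL_NEAREST, not a *_MIPMAP_* mode
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB8, imgWidth(), imgHeight(), 0, GL_RGB, GL_UNSIGNED_BYTE, imgBuffer );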

Looks like the server has been down for days, maybe weeks (months, who knows?) … it was already down the last time I visited the site.

Useful utilities for me are glxinfo on Linux, or realtech glView on Windows and Mac.

I miss the hw database they have - mighty nice if you’re on a limited budget :wink:

Yes, that was really useful… although I have found the direct link to the hardware registry, which is finally accessible again! But it does not look like it is still maintained… too bad.

I have seen on the realtech site that glView should run on Linux under Wine. Just for fun I tried to install it on Ubuntu 8.04. It requires the .NET Framework, whose installer literally crashes… oh no! That is very surprising! Anyway, I should not have to bother with that. :slight_smile:

Who has the time and energy to whip up something like this in the wiki? Is there a way to make submission of hardware configs and capabilities really easy and readily available to everyone?

I’m too busy picking the leaves out of my tropical tea.

Thanks for the clarification. How does one detect whether NPOT is supported in hardware rather than implemented in software? Do I simply time a frame and make an assumption based on that?

If your hardware isn’t doing what you think it ought to be doing, please tell your vendor about it - only way to change anything.

I am intending to stream video into the texture, so padding or resizing the image is something I want to avoid. I intend to support OpenGL 2.0 upwards, but I want to avoid running into a software path, as I have with the current card I am working with. Is there a way to detect at runtime whether NPOT textures are being rendered in software or hardware? I am assuming I can fall back to the RECTANGLE extension when it hits a software path.

GLEW? Or sorry if I'm wrong :slight_smile:

It should run in hw mode if you are following the code example on the wiki. GL_TEXTURE_RECTANGLE will run in hw mode on all GPUs.
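
Roughly like this, if memory serves (just a sketch using the GL_ARB_texture_rectangle enums and the texId/imgBuffer from the first post; note that the texture coordinates are in pixels, not 0..1):

glEnable( GL_TEXTURE_RECTANGLE_ARB );
glBindTexture( GL_TEXTURE_RECTANGLE_ARB, texId );
// Rectangle textures: no mipmaps, and wrap modes are limited to clamping
glTexParameteri( GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE );
glTexParameteri( GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE );
glTexParameteri( GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
glTexParameteri( GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
glTexImage2D( GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGB8, 500, 500, 0, GL_RGB, GL_UNSIGNED_BYTE, imgBuffer );

glBegin( GL_QUADS );
	glTexCoord2f(   0.0f,   0.0f ); glVertex3f( -1.0f,  1.0f, 0.0f );	// Top Left
	glTexCoord2f( 500.0f,   0.0f ); glVertex3f(  1.0f,  1.0f, 0.0f );	// Top Right
	glTexCoord2f( 500.0f, 500.0f ); glVertex3f(  1.0f, -1.0f, 0.0f );	// Bottom Right
	glTexCoord2f(   0.0f, 500.0f ); glVertex3f( -1.0f, -1.0f, 0.0f );	// Bottom Left
glEnd();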

To see at runtime whether the hardware supports NPOT textures you can call glGetString(GL_EXTENSIONS), which returns a space-separated list of supported extensions. See if GL_ARB_texture_non_power_of_two or GL_ARB_texture_rectangle is in there. If neither is, your hardware does not support NPOT textures at all.
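
Something along these lines (just a sketch; with GLEW you would test the GLEW_ARB_texture_non_power_of_two / GLEW_ARB_texture_rectangle flags instead):

#include <cstring>   // std::strstr

// Needs a current GL context. A plain substring test is good enough for these two names.
static bool hasExtension( const char* name )
{
	const char* ext = reinterpret_cast<const char*>( glGetString( GL_EXTENSIONS ) );
	return ext != 0 && std::strstr( ext, name ) != 0;
}

// bool npot = hasExtension( "GL_ARB_texture_non_power_of_two" );
// bool rect = hasExtension( "GL_ARB_texture_rectangle" );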

If there is only GL_ARB_texture_rectangle, NPOT textures are only supported through the GL_TEXTURE_RECTANGLE target. There are also some limitations on filtering and mipmapping.

With GL_ARB_texture_non_power_of_two NPOT texture support is totally transparent.

If your hardware does not support NPOT textures, you can also use some hacks: split the texture (quite difficult, and it can lead to a fair number of textures), or simply use the closest POT texture that can contain your NPOT texture.
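
For the "closest POT" hack, a rough sketch (reusing imgWidth()/imgHeight()/imgBuffer from the first post):

// Round up to the next power of two, e.g. 500 -> 512.
static int nextPow2( int v )
{
	int p = 1;
	while ( p < v ) p <<= 1;
	return p;
}

int potW = nextPow2( imgWidth() );    // 512 for a 500-pixel-wide image
int potH = nextPow2( imgHeight() );

glBindTexture( GL_TEXTURE_2D, texId );
glPixelStorei( GL_UNPACK_ALIGNMENT, 1 );   // be safe with odd row sizes
// Allocate the padded POT texture, then upload the NPOT image into its lower-left corner.
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB8, potW, potH, 0, GL_RGB, GL_UNSIGNED_BYTE, NULL );
glTexSubImage2D( GL_TEXTURE_2D, 0, 0, 0, imgWidth(), imgHeight(), GL_RGB, GL_UNSIGNED_BYTE, imgBuffer );

// Texture coordinates then run from 0 to width/potW and 0 to height/potH instead of 0..1.
float maxS = (float)imgWidth()  / potW;    // 500/512
float maxT = (float)imgHeight() / potH;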

Where is the code example on the wiki?

The older non-.NET version works just as well. Realtechvr