Problem rendering floating-point textures

Could anyone figure out what is wrong with the code supplied below and why my floating-point texture doesn’t work? Running this simple program only shows a plain white quad. I have tried everything I can think of, but this problem beats me. I have an ATI Mobility Radeon X600 graphics card (if that helps).

Thanks!

  
#ifdef WIN32 
#include <windows.h>
#endif

#include <stdlib.h>
#include <stdio.h>
#include <GL/glew.h>
#include <GL/glu.h>
#include <GL/glut.h>

using namespace std;


    GLuint tex;	
    int texSize = 2;
    float* data = (float*)malloc(4*texSize*texSize*sizeof(float));

void update(void)
{
	glutPostRedisplay();
}


void display(void) 
{
	glClear(GL_COLOR_BUFFER_BIT);
	
	glEnable(GL_TEXTURE_RECTANGLE_ARB);
	glBindTexture(GL_TEXTURE_RECTANGLE_ARB,tex);
	glBegin(GL_TRIANGLES);
	glTexCoord2d(0.0, 0.0);
	glVertex3d(-0.5, -0.5, 0.0);
	glTexCoord2d(1.0, 0.0);
	glVertex3d(0.5, -0.5, 0.0);
	glTexCoord2d(1.0, 1.0);
	glVertex3d(0.5, 0.5, 0.0);
	
	glTexCoord2d(0.0, 0.0);
	glVertex3d(-0.5, -0.5, 0.0);
	glTexCoord2d(1.0, 1.0);
	glVertex3d(0.5, 0.5, 0.0);
	glTexCoord2d(0.0, 1.0);
	glVertex3d(-0.5, 0.5, 0.0);
	glEnd();
	glutSwapBuffers();
	glDisable(GL_TEXTURE_RECTANGLE_ARB);
}


void reshape(int w, int h) 
{
}


void createTextures() 
{
    for (int i=0; i<texSize*texSize*4; i++)
        data[i] = i+1.0;
  

    glGenTextures (1, &tex);

    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, tex);
	
    glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, 
                    GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, 
                    GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, 
                    GL_TEXTURE_WRAP_S, GL_CLAMP);
    glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, 
                    GL_TEXTURE_WRAP_T, GL_CLAMP);
  
    glTexImage2D(GL_TEXTURE_RECTANGLE_ARB,0,GL_RGBA32F_ARB,
                 texSize,texSize,0,GL_RGBA, GL_FLOAT, data);
  
	
}


void keyboard(unsigned char key, int x, int y)
{}


void main(int argc, char **argv) {
	
	glutInit(&argc, argv);
		
	glutInitDisplayMode( GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);

	glutInitWindowSize(400, 400);
	glutInitWindowPosition(100, 100);
	glutCreateWindow("OpenGL Test 1.0");

	glewInit();

	createTextures();

	glutDisplayFunc(display);
	glutReshapeFunc(reshape);
	glutKeyboardFunc(keyboard);
	glutIdleFunc(update);
	glutMainLoop();
}

Multiple problems:

  • Problem 1: for (int i=0; i<texSize*texSize*4; i++) data[i] = i+1.0;
    This means all color components have values bigger than 1.0. Rendering this to a fixed-point color buffer results in “whiter-than-white” colors clamped to white.
  • Problem 2: Texture rectangles need unnormalized texture coordinates in [0, width] x [0, height], not the [0, 1] range used for GL_TEXTURE_2D. If the upload succeeded, you textured your quad with only the bottom-left texel (see the sketch after this list).
  • Add a glGetError check after the glTexImage2D call. Did it succeed?
  • Maybe texture rectangles don’t support RGBA32F on ATI. Check whether you have ARB_texture_non_power_of_two support, try GL_TEXTURE_2D, and leave the texcoords in the [0, 1] range.
  • You should use the floating point variants of glVertex and glTexCoord instead of doubles for performance reasons.
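A minimal sketch of the error check and the texel-space coordinates from the points above; it reuses the tex, texSize and data globals from the posted code, so treat it as an illustration rather than a drop-in fix:

// In createTextures(): check whether the float upload actually succeeded.
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGBA32F_ARB,
             texSize, texSize, 0, GL_RGBA, GL_FLOAT, data);
GLenum err = glGetError();
if (err != GL_NO_ERROR)
    printf("glTexImage2D failed: 0x%x\n", err);  // e.g. GL_INVALID_ENUM, GL_INVALID_VALUE

// In display(): rectangle textures are addressed in texels, not [0, 1].
glTexCoord2f(0.0f, 0.0f);                      glVertex3f(-0.5f, -0.5f, 0.0f);
glTexCoord2f((float)texSize, 0.0f);            glVertex3f( 0.5f, -0.5f, 0.0f);
glTexCoord2f((float)texSize, (float)texSize);  glVertex3f( 0.5f,  0.5f, 0.0f);
// (second triangle analogous)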

If you are using texture rectangles, then your texture coordinates should not be in the 0…1 range. With rectangles you specify coordinates in texels.

Besides, GL_ARB_texture_rectangle is not (at least not officially) supported by any Radeon.

I suggest using just GL_TEXTURE_2D - Radeon supports non-power-of-two textures (with some limitations), and GeForce supports them fully since the GeForce 6.
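If you go that route the change is small. A sketch, assuming the same tex/texSize/data globals as the posted code (note that a 2x2 texture is power-of-two anyway, so ARB_texture_non_power_of_two only matters once the size is odd):

// createTextures() with the regular 2D target instead of the rectangle one.
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F_ARB,
             texSize, texSize, 0, GL_RGBA, GL_FLOAT, data);

// In display(): enable/bind GL_TEXTURE_2D and keep the texcoords in [0, 1].
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, tex);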

If you want compatibility with the GeForce FX you will need to use EXT_texture_rectangle or NV_texture_rectangle. They are identical extensions, but NVIDIA GPUs only report NV_texture_rectangle and ATI GPUs only EXT_texture_rectangle.

Unfortunately, if you want to use pixel shaders, only ARB_texture_rectangle defines how to access such a texture from a shader.

“If you want compatibility with the GeForce FX you will need to use EXT_texture_rectangle or NV_texture_rectangle. They are identical extensions, but NVIDIA GPUs only report NV_texture_rectangle and ATI GPUs only EXT_texture_rectangle.”

Not true, NVIDIA reports ARB_texture_rectangle and NV_texture_rectangle.

You should use the ARB version.

And does ARB_texture_rectangle really work on ATI cards? I don’t have any ATI card here at the moment so I can’t test it, but from what I tried in the past and from what I have heard, it is still not supported.

Hi Relic. Thanks for the little correction. :)

GeForce FX:
    ARB_texture_rectangle
    NV_texture_rectangle
GeForce 6:
    ARB_texture_rectangle
    NV_texture_rectangle
    ARB_texture_non_power_of_two
Radeon:
    EXT_texture_rectangle
    (ARB_texture_non_power_of_two)

So the only option for rectangular textures compatible with GeForce FX and Radeon is using NV / EXT_texture_rectangle.

If you want to use rectangular textures plus shaders, then you can use NPOT textures on all GPUs except the GeForce FX - in that case ARB_texture_rectangle must be used.

All the texture_rectangle extensions have identical tokens, so it doesn’t matter which one is used - it’s just a question of which one to look for in the extension string and whether you can access the rectangular texture from a shader.
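So in practice the extension check can look something like this (a sketch using the GLEW flags, since the original program already initializes GLEW; the enum itself is shared by all three extensions):

// GL_TEXTURE_RECTANGLE_ARB/EXT/NV are all the same token (0x84F5);
// only the extension name you have to look for differs.
bool haveRect = GLEW_ARB_texture_rectangle   // GeForce FX / GeForce 6
             || GLEW_EXT_texture_rectangle   // Radeon
             || GLEW_NV_texture_rectangle;   // older NVIDIA drivers

if (haveRect)
    glEnable(GL_TEXTURE_RECTANGLE_ARB);      // the ARB enum works for all three variants
else if (GLEW_ARB_texture_non_power_of_two)
    glEnable(GL_TEXTURE_2D);                 // NPOT through the normal 2D target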

You can have texture values outside the [0.0, 1.0] range, but if you want to draw something and keep those values, you need to render into a framebuffer object defined with GL_RGB32F_ARB as the internal format.

I use this to do scalar products on big volumetric data and it works fine.

After drawing into your framebuffer you can call glGetTexImage to read the values back.

But if you call that command after drawing into the common (window) framebuffer, which is RGBA8, your values get clamped to [0.0, 1.0].
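A rough sketch of that setup, assuming EXT_framebuffer_object is available (check GLEW_EXT_framebuffer_object first) and using the RGBA variant of the float format to match the original program:

// Create a float texture to render into (not clamped like the RGBA8 window buffer).
GLuint fbo, target;
glGenTextures(1, &target);
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, target);
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGBA32F_ARB,
             texSize, texSize, 0, GL_RGBA, GL_FLOAT, NULL);

// Attach it to a framebuffer object and render into that instead of the window.
glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                          GL_TEXTURE_RECTANGLE_ARB, target, 0);

// ... draw here; the results keep their full floating-point range ...

// Read the unclamped values back.
float* result = (float*)malloc(4 * texSize * texSize * sizeof(float));
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, target);
glGetTexImage(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGBA, GL_FLOAT, result);

glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);  // back to the window framebuffer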

sorry for my english
