Error on compressed 1D texture

Hello, everybody!
When I generate a compressed texture with glTexImage1D, an error appears. Is my process for compressing a 1D texture missing some step, or does my card (8800GT) not support 1D texture compression? My code is as follows:


#define WidthP 64
GLubyte image1D_rgba_1[WidthP][4];
GLubyte image1D_rgba_com_1[WidthP][4];
GLint compressed, compressedSize;
int i, c;

// Build a black-and-white stripe pattern as the source image
for (i = 0; i < WidthP; i++) {
    c = ((i & 0x8) == 0) * 255;
    image1D_rgba_1[i][0] = (GLubyte) c;
    image1D_rgba_1[i][1] = (GLubyte) c;
    image1D_rgba_1[i][2] = (GLubyte) c;
    image1D_rgba_1[i][3] = (GLubyte) 255;
}

// Generate a compressed 1D texture
glTexImage1D(GL_TEXTURE_1D, 0, GL_COMPRESSED_RGBA_S3TC_DXT5_EXT, WidthP, 0, GL_RGBA, GL_UNSIGNED_BYTE, image1D_rgba_1);

// If compression succeeded this returns GL_TRUE, but it returns GL_FALSE here
glGetTexLevelParameteriv(GL_TEXTURE_1D, 0, GL_TEXTURE_COMPRESSED, &compressed);
if (compressed == GL_TRUE) {
    // Get the size of the compressed texture
    glGetTexLevelParameteriv(GL_TEXTURE_1D, 0, GL_TEXTURE_COMPRESSED_IMAGE_SIZE, &compressedSize);

    // Read back the compressed texture data
    glGetCompressedTexImage(GL_TEXTURE_1D, 0, image1D_rgba_com_1);
} else {
    printf("Texture compression failed.\n");
}

// Load the compressed texture
glCompressedTexImage1D(GL_TEXTURE_1D, 0, GL_COMPRESSED_RGBA_S3TC_DXT5_EXT, WidthP, 0, compressedSize, image1D_rgba_com_1);
 
 

Any help would be appreciated, thanks in advance!

I am not sure about what the 8800GT does and doesn’t support, but…

Is the source texture already compressed?

And/or is its format one that is accepted on your platform?

I am not aware of any on-card 1D texture compression algorithm. All such algorithms I know of are for 2D textures and work on blocks of pixels.

What Zengar said (these algorithms work on blocks of 4x4 texels). Instead, you should just create a 2D texture that is 4 texels high, and it should work.
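For example (just a sketch reusing the image1D_rgba_1 data from the first post; tex2D is assumed to be a texture name created with glGenTextures):

GLubyte image2D_rgba[4][WidthP][4];
int x, y, k;
// Replicate the 1D stripe pattern into four rows so it fills whole 4x4 DXT blocks
for (y = 0; y < 4; y++)
    for (x = 0; x < WidthP; x++)
        for (k = 0; k < 4; k++)
            image2D_rgba[y][x][k] = image1D_rgba_1[x][k];

glBindTexture(GL_TEXTURE_2D, tex2D);
glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
             WidthP, 4, 0, GL_RGBA, GL_UNSIGNED_BYTE, image2D_rgba);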

You’re explicitly specifying GL_COMPRESSED_RGBA_S3TC_DXT5_EXT, which does not support 1D textures. Even if your card supports some compression scheme for 1D textures, your code won’t work.

If your card and driver do support some compression scheme for 1D textures, specifying the generic compressed format COMPRESSED_RGBA_ARB instead could work.
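Something like this (just a sketch; only the internalformat changes from your code, and GL_TEXTURE_COMPRESSED_ARB reports whether the driver actually compressed it):

glTexImage1D(GL_TEXTURE_1D, 0, GL_COMPRESSED_RGBA_ARB, WidthP, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, image1D_rgba_1);
glGetTexLevelParameteriv(GL_TEXTURE_1D, 0, GL_TEXTURE_COMPRESSED_ARB, &compressed);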

Philipp

scratt,
The source texture is already compressed; you can see that in my code. And its format is one that is accepted on my platform.
PkK,
I’ve tried specifying the format COMPRESSED_RGBA_ARB, but it didn’t work. Is there any other problem in my code?
Zengar and Nicolai de Haan Brogger,
I am writing a hardware-test program to exercise almost every OpenGL API, so I have to test 1D/2D/3D compressed textures even if they don’t work. Are there any official docs about the texture compression algorithms that work on blocks of 4x4 texels? Could you give me some detailed information about that? Thanks in advance!

From what I have read in the docs about this, pre-compressed textures are sometimes a problem. I am by no means an expert on compressed textures, but I have recently been reading the docs in quite some depth as I implement them in my engine.

If you can, give it a try with the same format, but not compressed.

Obviously also pay heed to what others have said above about 1D textures, and look at the option of making a 2D texture which is effectively the 1D texture repeated on more horizontal lines.

Thank you for the explanation! I’ve tried 1D/2D/3D textures and 1D/2D/3D compressed textures with the same format.
1D/2D/3D textures without compression don’t have any problems in my engine.
2D compressed textures have no problems either. 3D compressed textures show obvious image quality loss (I think that is quite normal). 1D compressed textures return FALSE as I stated above and don’t work at all (they render nothing). Given the texture data I set up, the expected result for the 1D texture is a pattern of black-and-white stripes.
You said that you have recently been reading the docs about this; could you recommend some docs on compressed textures?
As others have said, something about “algorithms that work on 4x4 texel blocks”?

This is where I started…
It actually implies fairly early on that when using glTexImageXX you should pass in uncompressed images.

http://developer.nvidia.com/object/texture_compression_OpenGL.html

Here is the OpenGL spec…
http://www.opengl.org/registry/specs/EXT/texture_compression_s3tc.txt

COMPRESSED_RGBA_S3TC_DXT1_EXT:  Each 4x4 block of texels consists of 64
bits of RGB image data and minimal alpha information.  The RGB components
of a texel are extracted in the same way as COMPRESSED_RGB_S3TC_DXT1_EXT.

Looking at your code above, you are actually loading the compressed texture at one point using glCompressedTexImage1D, and you also used GL_COMPRESSED_RGBA_S3TC_DXT5_EXT earlier in glTexImage1D.

When I have done this I have simply used the generic forms GL_COMPRESSED_RGBA etc. and the driver has decided what to use itself, normally showing in the profiler as GL_COMPRESSED_RGBA_S3TC_DXT5_EXT or something similar.

Have you tried it that way?
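Roughly like this, in case it helps (just a sketch; width, height and pixels stand in for whatever image you are loading):

GLint chosenFormat = 0;
glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);
// Ask the driver what internal format it actually stored the texture in
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &chosenFormat);
// In my case this comes back as GL_COMPRESSED_RGBA_S3TC_DXT5_EXT or something similar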

Thanks very much! From the OpenGL spec, I’ve seen that
“The S3TC texture compression algorithm supports only 2D images without borders. CompressedTexImage1DARB and CompressedTexImage3DARB produce an INVALID_ENUM error if <internalformat> is an S3TC format”.
But I’ve also tried the generic form GL_COMPRESSED_RGBA, and it did not work either.
You said that “when using glTexImageXX you should pass in uncompressed images”; I don’t quite understand what that means. When I called glTexImage1D as shown above, I passed in image1D_rgba_1 as the texture data, which I think is an “uncompressed image”. Do you mean something else?
I think my code is simple enough, but is it too simple…
Or could it be related to texture settings such as GL_REPEAT/GL_NEAREST (glTexParameter)?
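For reference, this is roughly how I check which error the driver raises right after the upload (a minimal check):

glTexImage1D(GL_TEXTURE_1D, 0, GL_COMPRESSED_RGBA_S3TC_DXT5_EXT, WidthP, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, image1D_rgba_1);
GLenum err = glGetError();
if (err != GL_NO_ERROR)
    printf("glTexImage1D raised error 0x%x\n", err);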

My understanding, and I may be wrong here as I have not actually tested this both ways round for failures, is as follows…

If you have a normal uncompressed image you use glTexImagexx to load it, and specify a format that is either GL_COMPRESSED_XX or GL_XX. The driver handles everything for you. I know this works as it’s what I do, and when I profile I can see that the image is stored using a compressed format.

If you have pre-compressed an image according to one of the accepted S3TC formats, then you use glCompressedTexImageXX, which has slightly different arguments and expects a properly compressed image. I have tried to load a normal image this way, when I didn’t fully understand the command, and simply got an empty NULL texture. My assumption is if I had a pre-compressed image it would load properly this way if it is in the right format.
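Roughly, I would expect that pre-compressed path to look like this (a sketch, with dxt5Data, width and height as placeholders; DXT5 stores 16 bytes per 4x4 block of texels):

int blocksWide = (width + 3) / 4;
int blocksHigh = (height + 3) / 4;
GLsizei imageSize = blocksWide * blocksHigh * 16;  /* 16 bytes per 4x4 DXT5 block */
glCompressedTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
                       width, height, 0, imageSize, dxt5Data);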

From the spec it says the following…

Undefined results, including abnormal program termination, are generated if data is not encoded in a manner consistent with the extension specification defining the internal compression format.

…which seems to bear out what I am saying.

I have read that one way to get compressed images is to use glTexImageXX to make them, and then read them back and save them… That also makes sense as it would guarantee that the format you get back is one supported by your hardware.
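As I understand it, that read-back route would look roughly like this (a sketch, assuming a texture that was just uploaded with a generic compressed internal format as in the snippet above):

GLint isCompressed = GL_FALSE, compSize = 0, compFormat = 0;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_COMPRESSED, &isCompressed);
if (isCompressed) {
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_COMPRESSED_IMAGE_SIZE, &compSize);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &compFormat);
    void *blob = malloc(compSize);              /* needs <stdlib.h> */
    glGetCompressedTexImage(GL_TEXTURE_2D, 0, blob);
    /* write compFormat, width, height, compSize and the blob bytes to disk;
       later they can be reloaded with glCompressedTexImage2D */
    free(blob);
}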

Nvidia and ATI both have tools to produce high-quality compressed textures. Your load times will certainly be faster, and you will likely get better visual results with them than by compressing through the glTexImageXX calls.

Nvidia Texture Tools 2 (a library to include) : http://developer.nvidia.com/object/texture_tools.html
ATI Compressonator (full gui) :
http://developer.amd.com/gpu/compressonator/Pages/default.aspx

I didn’t know about those, bertgp. Thanks for linking to them.

Although I have to say I am reasonably happy so far with what the drivers produce. But better is always more desirable! :)