PDA

View Full Version : glEnable(GL_TEXTURE_2D) and INVALID_ENUM



lamovoz
04-23-2009, 07:37 AM
Hi.

I'm trying to create a simple OpenGL 3.0 application. I have an ATI HD 3870 card and everything works fine, but I get a strange error on nVidia cards with the latest drivers, 182.50. After some tests I located the place where glGetError() returns error 1280 (INVALID_ENUM): it occurs right after glEnable(GL_TEXTURE_2D), at the very beginning of the program. The question is: what does it mean?

Ah, and the problem is: on the nVidia card my object in the scene (just a cube) doesn't have any texture. But on ATI it works perfectly, without any errors.

lamovoz
04-23-2009, 08:22 AM
Some new facts from the frontline :)

I commented out glEnable(GL_TEXTURE_2D), since someone told me it is not required. And I found a second place where error 1280 is raised on nVidia; it is this line:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0, GL_BGR, GL_UNSIGNED_BYTE, buffer);

And just as a test I removed the OpenGL 3.0 forward-compatible context creation code, edited my shaders down to GLSL 1.2, and guess what? There are no errors and the application runs fine on both ATI and nVidia. x_x

So I want to ask: what is wrong with this glTexImage2D call with respect to OpenGL 3.0?

dletozeun
04-23-2009, 08:55 AM
Maybe it is in the way you are using it.
Could you post the code related to your texture setup?

lamovoz
04-23-2009, 09:04 AM
Hm, I just changed GL_BGR to GL_RGB in glTexImage2D, and the application with the OpenGL 3.0 context now runs normally on nVidia too. But GL_BGR works on ATI. Strange issue :(

lamovoz
04-23-2009, 09:06 AM
Maybe it is in the way you are using it.
Could you post the code related to your texture setup?
Yes, the old code is below (I removed the glGetError() checks from it):


GLuint create_texture_from_tga(const char *file_name)
{
    struct tga_header header;
    int32_t size, bpp, pixels, i, j;
    uint8_t *buffer, rgba[4], temp, pack;
    FILE *image;
    GLuint texture;

    image = fopen(file_name, "rb");
    if (NULL == image)
    {
        LOG_ERROR("Can't open file %s\n", file_name);
        return 0;
    }

    if (fread(&header, 1, sizeof(header), image) != sizeof(header))
    {
        LOG_ERROR("Can't read TGA header from file %s\n", file_name);
        fclose(image);
        return 0;
    }

    /* accept only 24/32-bit, uncompressed (2) or RLE (10) images */
    if ((header.bitperpel != 24 && header.bitperpel != 32) ||
        (header.datatype != 2 && header.datatype != 10))
    {
        LOG_ERROR("Wrong TGA format %s\n", file_name);
        fclose(image);
        return 0;
    }

    if (header.idlength)
        fseek(image, header.idlength, SEEK_CUR);

    size = (int32_t)header.width * (int32_t)header.height;
    bpp = header.bitperpel / 8;
    buffer = (uint8_t*)malloc(size * bpp);

    if (header.datatype == 2)
    {
        fread(buffer, size, bpp, image);
    }
    else if (header.datatype == 10)
    {
        /* RLE: each packet starts with a count byte; bit 7 set means a run */
        for (i = 0, j = 0; i < size;)
        {
            fread(&temp, 1, 1, image);
            pack = temp & 128;
            pixels = (int32_t)(temp & 127);
            if (pack)
            {
                /* run-length packet: one pixel value repeated pixels + 1 times */
                fread(rgba, 1, bpp, image);
                while (pixels-- >= 0)
                {
                    memcpy(buffer + j, rgba, bpp);
                    j += bpp;
                    ++i;
                }
            }
            else
            {
                /* raw packet: pixels + 1 literal pixels */
                fread(buffer + j, pixels + 1, bpp, image);
                j += (pixels + 1) * bpp;
                i += pixels + 1;
            }
        }
    }

    fclose(image);

    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);

    glTexImage2D(GL_TEXTURE_2D, 0, bpp == 3 ? GL_RGB8 : GL_RGBA8, header.width,
                 header.height, 0, bpp == 3 ? GL_BGR : GL_BGRA, GL_UNSIGNED_BYTE, buffer);
    // as I said earlier, changing GL_BGR to GL_RGB here resolved the problem

    free(buffer);

    return texture;
}

lamovoz
04-23-2009, 09:25 AM
Any ideas?

dletozeun
04-23-2009, 12:33 PM
I looked into your code and I see nothing wrong.
But there is something I don't understand: is the loaded data's pixel format BGR or RGB? Because switching from one to the other, you should see weird colors, shouldn't you?

Jan
04-23-2009, 01:25 PM
http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=256171

The early nVidia GL 3.0 drivers are a bit buggy, probably not your fault.

Jan.

lamovoz
04-23-2009, 03:34 PM
I looked into your code and I see nothing wrong.
But there is something I don't understand: is the loaded data's pixel format BGR or RGB? Because switching from one to the other, you should see weird colors, shouldn't you?


Yes, I do. I added swapping from BGR to RGB to the image loading code, and this works for me.


http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=256171

The early nVidia GL 3.0 drivers are a bit buggy, probably not your fault.

Jan.

I think so.

Anyway, thanks for replies! :)

dletozeun
04-23-2009, 04:06 PM
Yes, I do. I added swapping from BGR to RGB to the image loading code, and this works for me.


Thanks, now I understand better; it must be a driver bug, as Jan said.