OpenGL error 1281 caused by glTexSubImage2D

I have a problem with glTexSubImage2D.
I originally used SDL for all graphics output, but I have since switched to OpenGL (with SDL still creating the context). However, I still want to use parts of SDL, for example SDL_ttf.
When I run this code:


// This function already worked fine back when I rendered with SDL
SDL_Surface *pFontSurface = Font.RenderTextSurface(sText, TextColor, RenderMode, ShadedBGColor);

// SDL_ttf returns the rendered text in ARGB, so convert it to RGBA.
// This conversion can't be the culprit: when I remove it, the call still fails
// (the pixel data is merely in the wrong format then)
pFontSurface = ConvertSDLSurface->ARGB_To_RGBA(pFontSurface);
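// (Rough sketch of what that conversion amounts to, assuming 0xAARRGGBB
//  packed pixels and a tightly packed surface; my real helper may differ:
//
//      Uint32 *p = (Uint32 *)pFontSurface->pixels;
//      for (int i = 0; i < pFontSurface->w * pFontSurface->h; ++i)
//          p[i] = (p[i] << 8) | (p[i] >> 24); // ARGB -> RGBA
// )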

glEnable(GL_TEXTURE_2D);

// m_pImage->Texture (a GLuint) is the texture the rendered font image gets uploaded into
glBindTexture(GL_TEXTURE_2D, m_pImage->Texture);

glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, pFontSurface->w, pFontSurface->h, GL_RGBA, GL_UNSIGNED_BYTE, pFontSurface->pixels);

cerr << "OpenGL error: " << glGetError() << endl;

SDL_FreeSurface(pFontSurface);

glDisable(GL_TEXTURE_2D);

cerr prints “OpenGL error: 1281” on the console.
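
For reference, 1281 is GL_INVALID_VALUE. A tiny helper like this (just a sketch for readable logging, not part of my actual code) translates the numeric codes:


// Map glGetError() codes to their symbolic names.
const char *GLErrorName(GLenum Error)
{
	switch (Error)
	{
		case GL_NO_ERROR:          return "GL_NO_ERROR";          // 0
		case GL_INVALID_ENUM:      return "GL_INVALID_ENUM";      // 1280
		case GL_INVALID_VALUE:     return "GL_INVALID_VALUE";     // 1281
		case GL_INVALID_OPERATION: return "GL_INVALID_OPERATION"; // 1282
		case GL_OUT_OF_MEMORY:     return "GL_OUT_OF_MEMORY";     // 1285
		default:                   return "unknown error";
	}
}

cerr << "OpenGL error: " << GLErrorName(glGetError()) << endl;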

The mistake can’t be an unpack-alignment problem (GL_UNPACK_ALIGNMENT defaulting to 4), because I set the alignment to 1 when I create the texture:


SDL_Surface *pSDLSurface = SDL_CreateRGBSurface(SDL_SWSURFACE | SDL_SRCALPHA,
	MyMath->power_of_two(GetTextureW()),
	MyMath->power_of_two(GetTextureH()),
	32,
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
	0xff000000,
	0x00ff0000,
	0x0000ff00,
	0x000000ff
#else
	0x000000ff,
	0x0000ff00,
	0x00ff0000,
	0xff000000
#endif
);
SDL_FillRect(pSDLSurface, NULL, CColor(0, 0, 0, 0).GetColorRGBA());
SDL_SetAlpha(pSDLSurface, SDL_SRCALPHA, 0);
glGenTextures(1, &m_GraphicsImage.Texture);
glBindTexture(GL_TEXTURE_2D, m_GraphicsImage.Texture);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glPixelStorei(GL_PACK_ALIGNMENT, 1);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D,
             0,
             GL_RGBA,
             MyMath->power_of_two(GetTextureW()),
             MyMath->power_of_two(GetTextureH()),
             0,
             GL_RGBA,
             GL_UNSIGNED_BYTE,
             pSDLSurface->pixels);
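
(MyMath->power_of_two just rounds a size up to the next power of two; roughly this, though my exact helper may differ:)


int power_of_two(int n)
{
	// Round n up to the next power of two, e.g. 100 -> 128.
	int p = 1;
	while (p < n)
		p <<= 1;
	return p;
}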

Information about my Laptop:

  • OS: Ubuntu Linux 10.10
  • OpenGL version used with SDL (SDL 1.2): OpenGL 2.1
  • CPU: Intel Core 2 Duo
  • GPU: ATI Mobility Radeon HD 4650
  • GPU driver: radeon (open source)
  • Laptop bought: end of 2009

Additional information:


printf( "Vendor     : %s
", glGetString( GL_VENDOR ) );
printf( "Renderer   : %s
", glGetString( GL_RENDERER ) );
printf( "Version    : %s
", glGetString( GL_VERSION ) );

Vendor : Advanced Micro Devices, Inc.
Renderer : Mesa DRI R600 (RV730 9480) 20090101 x86/MMX/SSE2 TCL DRI2
Version : 2.1 Mesa 7.9-devel

So my question is: why does glTexSubImage2D raise OpenGL error 1281?

Thanks in advance!


Answer from kyle_:


glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, pFontSurface->w, pFontSurface->h, GL_RGBA, GL_UNSIGNED_BYTE, pFontSurface->pixels);

glTexImage2D(GL_TEXTURE_2D,
             0,
             GL_RGBA,
             MyMath->power_of_two(GetTextureW()),
             MyMath->power_of_two(GetTextureH()),
             0,
             GL_RGBA,
             GL_UNSIGNED_BYTE,
             pSDLSurface->pixels);

Are you sure ‘pFontSurface->w’ and ‘MyMath->power_of_two(GetTextureW())’ match (have you logged them)?
You are getting GL_INVALID_VALUE, which only occurs in a handful of cases with this function:
http://www.opengl.org/sdk/docs/man/xhtml/glTexSubImage2D.xml
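
To verify without a debugger, you could query the real texture size right before the upload, along these lines (untested sketch):


GLint TexW = 0, TexH = 0;
glBindTexture(GL_TEXTURE_2D, m_pImage->Texture);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH,  &TexW);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, &TexH);
cerr << "texture: " << TexW << "x" << TexH
     << ", surface: " << pFontSurface->w << "x" << pFontSurface->h << endl;
// glTexSubImage2D raises GL_INVALID_VALUE when xoffset + width exceeds
// the texture width or yoffset + height exceeds the texture height.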

kyle_: Thank you for your answer!
You were right: ‘pFontSurface->w’ and ‘MyMath->power_of_two(GetTextureW())’ didn’t match.
GetTextureW() returned texture coordinates, not the pixel size, so the region I passed to glTexSubImage2D didn’t fit inside the texture.
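
For anyone who runs into the same thing: independent of the actual fix (using the real pixel size), a defensive upload could clamp the region to the texture, roughly like this sketch (not my exact code; GL_UNPACK_ROW_LENGTH keeps the source rows correct when the surface is wider than the uploaded region):


#include <algorithm> // std::min

GLint TexW = 0, TexH = 0;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH,  &TexW);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, &TexH);

// Never upload a region larger than the texture itself.
int w = std::min<int>(pFontSurface->w, TexW);
int h = std::min<int>(pFontSurface->h, TexH);

// Tell GL how long a full source row is (in pixels), so a clamped
// width still reads the right bytes from every row of the surface.
glPixelStorei(GL_UNPACK_ROW_LENGTH, pFontSurface->pitch / 4); // 32 bpp = 4 bytes/pixel
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, pFontSurface->pixels);
glPixelStorei(GL_UNPACK_ROW_LENGTH, 0); // back to the default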