can't use gluScaleImage with GL_UNSIGNED_SHORT_5_6_5



jean-bobby
07-06-2004, 08:39 AM
Hello,

I use the gluScaleImage function to scale an image before loading it as a texture. It works with 32 bpp, but with 16 bpp my texture is totally black. The buffer for the scaled image is properly allocated (by SDL); moreover, I fill it with red, and I think I pass the correct parameters to the function:

gluScaleImage(GL_RGB,
              image->w, image->h,
              GL_UNSIGNED_SHORT_5_6_5, image->pixels,
              rescaledImage->w, rescaledImage->h,
              GL_UNSIGNED_SHORT_5_6_5, rescaledImage->pixels);
...

glTexImage2D(GL_TEXTURE_2D,
             0,
             GL_RGB,   /* internal format; the legacy "3" means the same thing */
             rescaledImage->w, rescaledImage->h,
             0,
             GL_RGB,
             GL_UNSIGNED_SHORT_5_6_5,
             rescaledImage->pixels);

There are no GL errors, and I can display my texture if I don't scale its source image.
Have I forgotten to do something for 16 bits mode?

Thanks in advance

CrazyButcher
07-06-2004, 12:07 PM
I might be wrong, but to my knowledge gluScaleImage is just a plain C function; nothing goes through GL, so no GL error is set.
And since GLU is quite old, such packed formats didn't exist yet, I think.
According to the reference, typeIn/typeOut can only be:
GL_UNSIGNED_BYTE, GL_BYTE, GL_BITMAP, GL_UNSIGNED_SHORT, GL_SHORT, GL_UNSIGNED_INT, GL_INT, or GL_FLOAT.

jean-bobby
07-06-2004, 10:11 PM
Thank you for the insight, CrazyButcher.
Since glGetError reports nothing after GLU calls, I tested the return value of gluScaleImage itself and got GLU_INVALID_ENUM ("enum not valid"). So GL_UNSIGNED_SHORT_5_6_5 is not a valid parameter for the function.
Although, according to this reference page (http://pyopengl.sourceforge.net/documentation/manual/gluScaleImage.3G.xml), it should be. Too bad...
So I will scale a 32-bit image and then use SDL to convert it to a 16-bit image; that should work.
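If you'd rather avoid creating a second SDL surface, another route is to unpack the 5-6-5 texels to 8-bit channels yourself, scale with gluScaleImage using GL_UNSIGNED_BYTE (which GLU does accept), and repack afterwards. A minimal sketch of just the packing helpers, with the buffer management and the gluScaleImage call left out:

```c
#include <stdint.h>

/* Unpack one RGB565 texel into three 8-bit channels.
   The high bits are replicated into the low bits so that
   the maximum 5-bit value 0x1F maps to 255, not 248. */
static void rgb565_to_rgb888(uint16_t p, uint8_t *r, uint8_t *g, uint8_t *b)
{
    uint8_t r5 = (p >> 11) & 0x1F;
    uint8_t g6 = (p >> 5)  & 0x3F;
    uint8_t b5 =  p        & 0x1F;
    *r = (uint8_t)((r5 << 3) | (r5 >> 2));
    *g = (uint8_t)((g6 << 2) | (g6 >> 4));
    *b = (uint8_t)((b5 << 3) | (b5 >> 2));
}

/* Pack three 8-bit channels back into one RGB565 texel. */
static uint16_t rgb888_to_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}
```

After unpacking the whole image into a GL_UNSIGNED_BYTE buffer, gluScaleImage(GL_RGB, w, h, GL_UNSIGNED_BYTE, ...) should succeed, and you can either repack to 5-6-5 or upload the 8-bit buffer directly.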

CrazyButcher
07-06-2004, 10:54 PM
Well, it may well be my mistake; as I said, I'm not so sure about it.

Ariel
07-07-2004, 05:22 AM
When I use the stencil buffer at 16-bit depth I get 9 fps, but I am only drawing 600 polygons.