glDrawPixels affected by texturing?

I have observed that, on some OpenGL drivers, the pixel-level draw functions (glDrawPixels, glBitmap, and probably others) are textured with the active texture when texturing is enabled.

The code below draws a blue rectangle, regardless of the pixel data passed to glDrawPixels.

// set the first texel to blue
texture[0][0][0] = 0;
texture[0][0][1] = 0;
texture[0][0][2] = 255;

glBindTexture(GL_TEXTURE_2D, 1);
glEnable(GL_TEXTURE_2D);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 256, 0,
             GL_RGB, GL_UNSIGNED_BYTE, texture);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);

glDrawPixels(256, 256, GL_RGB, GL_UNSIGNED_BYTE, bitmap);

However, this behaviour is not consistent across drivers.
On the Windows platform:
Intel 965 draws both glBitmap and glDrawPixels as textured,
Mesa (software renderer) draws only glBitmap as textured,
ATI 4850 draws neither of them as textured.

I thought only primitives were textured in OpenGL, but I could not find any documentation about this behaviour. It does not seem to be intentional functionality; it may be a bug.

Is it possible that these pixel-level functions implicitly use textures to transfer data? Or,
is there any practical use for textured pixels in the field?

Do you use the glColor() function somewhere? It affects every color drawn afterwards. Before drawing your objects, try setting it to glColor3f(1.0f, 1.0f, 1.0f);

See the DrawPixels man page, where it says:

“These pixel fragments are then treated just like the fragments generated by rasterizing points, lines, or polygons. Texture mapping, fog, and all the fragment operations are applied before the fragments are written to the frame buffer.”
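That pipeline behaviour accounts for the all-blue output in the code above: GL_DECAL with an RGB texture replaces the fragment color with the texel, and since glDrawPixels emits no per-fragment texture coordinates, every fragment samples at the current raster texture coordinate (default (0, 0)), which with GL_NEAREST is the blue texel. A pure-C sketch of that per-fragment step (my illustration only, not actual driver code):

```c
#include <assert.h>

/* Hypothetical model of the fixed-function fragment stage for
 * glDrawPixels with GL_TEXTURE_2D enabled and GL_DECAL mode. */
typedef struct { unsigned char r, g, b; } Rgb;

/* GL_DECAL with an RGB texture: the texel replaces the fragment color. */
static Rgb decal_rgb(Rgb fragment, Rgb texel) {
    (void)fragment;              /* incoming pixel color is discarded */
    return texel;
}

/* Every glDrawPixels fragment shares the single raster texture
 * coordinate, so they all receive the same texel. */
static void simulate_draw_pixels(const Rgb *pixels, Rgb *out, int count,
                                 Rgb raster_texel) {
    for (int i = 0; i < count; ++i)
        out[i] = decal_rgb(pixels[i], raster_texel);
}
```

Whatever `pixels` contains, `out` ends up uniformly the raster texel's color, matching the all-blue rectangle described above.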

NK47,
I use glColor, but I also use the GL_DECAL texture environment mode, so the texture color is not affected by the current color.

arrekusu,
Yeah, I know that is what the Blue Book says.
However, it still doesn't make sense to me. If the pixels drawn by glDrawPixels can be textured, how can you map a texture onto them with the correct texture coordinates? You cannot change the texture coordinates inside the glDrawPixels call (you cannot map the edge coordinates of a 2D texture).

Therefore, I just want to understand whether there is a practical use for textured glDrawPixels.

Does anyone have a suggestion for using textured glDrawPixels in practice?

OpenGL is a state machine, so the last texture coordinate set is the one used for every fragment; there is no interpolation.

Maybe the pixels of glDrawPixels are used as the texture. In that case, no texture coordinates would be needed.

In practice, no, I don’t think this is very useful at all.

I can see people using fog, and raster ops like depth test with drawpixel’d labels in 3D graphing applications.

The state machine allows you to leave shaders and texturing enabled, so you could use a constant texel value to, e.g., tint, or maybe animate a glow over the drawpixel'd data, but that's stretching it-- I've never seen an app use textured DrawPixels on purpose :wink:
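The tint idea corresponds to GL_MODULATE, where the (constant) texel scales each incoming pixel color component-wise. A small sketch of that arithmetic (my illustration; note the code earlier in the thread uses GL_DECAL, not GL_MODULATE):

```c
#include <assert.h>

typedef struct { unsigned char r, g, b; } Rgb;

/* GL_MODULATE with an RGB texture: component-wise product, with
 * colors treated as values in [0, 1] (hence the division by 255). */
static Rgb modulate_rgb(Rgb fragment, Rgb texel) {
    Rgb out;
    out.r = (unsigned char)((fragment.r * texel.r) / 255);
    out.g = (unsigned char)((fragment.g * texel.g) / 255);
    out.b = (unsigned char)((fragment.b * texel.b) / 255);
    return out;
}
```

A white texel (255, 255, 255) leaves the DrawPixels data untouched, while e.g. a red texel tints it, which is the effect described above.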

And you can certainly do this much more efficiently with regular textured quads.

I figured out that glDrawPixels is implemented in our driver (an embedded-system ATI driver) by drawing a set of point vertices at the given coordinates. Therefore, this behaviour makes sense even though it has no intended use; it is a side effect.
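A driver doing that would, in effect, expand the pixel rectangle into one point primitive per pixel at the current raster position, and those points then pass through the normal fragment pipeline, texturing included. A simplified, hypothetical sketch of the expansion:

```c
#include <assert.h>

typedef struct { unsigned char r, g, b; } Rgb;
typedef struct { int x, y; Rgb color; } PointVertex;

/* Hypothetical glDrawPixels emulation: one point per pixel, placed
 * relative to the current raster position (rx, ry).  Rows grow upward,
 * matching OpenGL's bottom-left pixel origin. */
static int expand_to_points(const Rgb *pixels, int w, int h,
                            int rx, int ry, PointVertex *out) {
    int n = 0;
    for (int j = 0; j < h; ++j)
        for (int i = 0; i < w; ++i) {
            out[n].x = rx + i;
            out[n].y = ry + j;
            out[n].color = pixels[j * w + i];
            ++n;
        }
    return n;    /* number of point vertices generated */
}
```

Because each point is an ordinary primitive, any enabled texture (sampled at the constant raster texture coordinate) is applied to it, which is exactly the side effect observed in this thread.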

Texturing should certainly apply to glDrawPixels; I'm not so sure about glBitmap. Basically, glDrawPixels takes the pixel data and generates a fragment for each incoming pixel, and that pixel's color becomes the fragment's gl_Color.