Catalyst 3.8 bug - glDrawPixels

I was doing some testing and I noticed this weird behavior with glDrawPixels.

render()
{
    /* modelview set to identity */
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    /* projection set to ortho (winW/winH, imgW/imgH, pixels are placeholders) */
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, winW, 0, winH, -1, 1);
    glRasterPos2i(0, 0);
    glDrawPixels(imgW, imgH, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}

It renders fine when all the pixels fall inside the window, but when I resize the window and make it smaller, instead of the pixels that fall outside the viewport getting clipped, the whole DrawPixels rectangle is scaled so that it fits in the window.

Have you tried the new 3.9 hotfix drivers? They have an updated OpenGL version.

That’s weird. I suspect they’ve internally switched to texture rectangles (which are worth considering as a generic replacement for glDrawPixels, btw).

Maybe you can work around the problem by calling glPixelZoom(1, 1) after glViewport (in the reshape handler?). I didn’t test it, but it could work …
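
A rough sketch of what I mean (untested; assumes a GLUT-style reshape callback, and the names here are just placeholders):

reshape(int width, int height)
{
    glViewport(0, 0, width, height);
    glPixelZoom(1.0f, 1.0f);   /* explicitly restore 1:1 pixel zoom after the viewport changes */
}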

I haven’t tried 3.9 because things are going fine with 3.8.

The Dawn demo doesn’t run anymore with the hacked opengl32.dll.
Something about NV_point_sprite missing.

I assumed DrawPixels is using a textured quad. It’s hard to tell, because each call to DrawPixels would cause a texture update, which would still be slow.

DrawPixels() cannot use a textured quad, because DrawPixels() generates fragments that pass through the fragment pipeline (if I recall correctly).

Anyway, if your image shrinks when you resize the window, isn’t that because your viewport shrinks, and all the GL matrix operations are relative to the viewport?

Originally posted by jwatte:

Anyway, if your image shrinks when you resize the window, isn’t that because your viewport shrinks, and all the GL matrix operations are relative to the viewport?

Pixel transfers are not affected by any matrices (except the color matrix). So the image should stay the same if the viewport and/or window is changed. The raster position is affected by the matrices, but that only affects the position of the image, not its shape.

Originally posted by jwatte:
DrawPixels() cannot use a textured quad, because DrawPixels() generates fragments that pass through the fragment pipeline (if I recall correctly).

Yes, according to the pipeline diagram, but that’s SGI’s reference.

I believe you are free to implement it the way you want. The important thing is the end result (the buffers, the stacks, and so on).

With a textured quad, NEAREST filtering, and the right texture coordinates, it should be possible to reproduce DrawPixels perfectly.

If NVidia says they render lines with 2 triangles, then I figure anything is possible.

Me too

Originally posted by jwatte:
DrawPixels() cannot use a textured quad, because DrawPixels() generates fragments that pass through the fragment pipeline (if I recall correctly).
It works, under certain constraints.
You need at least one unused texture unit, and enough free texture memory to buffer the incoming pixel data.
… umm … nothing else AFAICS.

Then you just replace all references to primary color with the texture sampler (especially easy if you have a texture crossbar), render a screen-aligned quad with the proper tex coords, and voilà: DrawPixels fully emulated.
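
For illustration, a rough sketch of that emulation (my own example, not from any actual driver; emuTex is a hypothetical pre-created texture object, and it assumes the projection maps vertex coordinates to window pixels and that the hardware accepts the texture size, e.g. via texture rectangles or power-of-two padding):

static void emulateDrawPixels(GLint x, GLint y, GLsizei w, GLsizei h, const GLvoid *pixels)
{
    /* buffer the incoming pixel data in the spare texture */
    glBindTexture(GL_TEXTURE_2D, emuTex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    /* screen-aligned quad, one texel per pixel */
    glEnable(GL_TEXTURE_2D);
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2i(x,     y);
    glTexCoord2f(1, 0); glVertex2i(x + w, y);
    glTexCoord2f(1, 1); glVertex2i(x + w, y + h);
    glTexCoord2f(0, 1); glVertex2i(x,     y + h);
    glEnd();
    glDisable(GL_TEXTURE_2D);
}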

Anecdotal evidence: a certain rather smallish IHV told me something very interesting when I asked why DrawPixels starts to fail when I exceed 1024 pixels in width and/or height