I wanted to see if anyone else has seen this behavior with OpenGL under Windows.
I have been reading RGBA pixels from the back buffer using the glReadPixels function. When my app runs on a machine that uses the generic GDI OpenGL implementation, the alpha value always comes back as 255.
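For reference, the readback indexing I'm using looks roughly like this (coordinates and names are illustrative; a stub buffer stands in for the actual GL read so the layout is clear):

```c
/* In the real app the buffer is filled by something like:
 *     glReadBuffer(GL_BACK);
 *     glReadPixels(x, y, w, h, GL_RGBA, GL_UNSIGNED_BYTE, buf);
 * GL_RGBA / GL_UNSIGNED_BYTE packs 4 bytes per pixel, rows bottom-up,
 * with alpha as the 4th component of each pixel.
 */
unsigned char alpha_of(const unsigned char *rgba, int w, int px, int py)
{
    /* Index pixel (px, py) in a w-pixels-wide RGBA buffer; +3 selects alpha. */
    return rgba[(py * w + px) * 4 + 3];
}
```

On machines using the generic GDI implementation, that `[... + 3]` component always comes back 255 for me; on other implementations it holds the expected alpha.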
However, when I run the app on a machine that does not use the generic GDI OpenGL implementation, the same glReadPixels call comes back with the correct alpha value.
This is what is confusing: I know that alpha blending works in my app regardless of which OpenGL implementation the machine uses (generic GDI or otherwise). How is alpha blending working on machines that use the generic GDI implementation?