In the past on NVidia GL drivers (Linux and Windows), I have used DXT5 as the internal format of textures (2D and 2D Array), and have both 1) subloaded pre-compressed DXT5 to these textures and (many years ago) 2) subloaded uncompressed RGBA8 to them, letting the driver compress the data on-the-fly. Both paths worked fine, and rendering with these textures came out correct. The latter path (subloading uncompressed texels) was of course very slow because of the long stall while the driver did the compression on-the-fly; you're not really supposed to upload texel data that way if you want fast performance and good compression quality.
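For reference, here's a minimal sketch of what I mean by those two paths, assuming the texture has already been created with the GL_COMPRESSED_RGBA_S3TC_DXT5_EXT internal format and is bound to GL_TEXTURE_2D (the helper names here are just illustrative):

#define GL_GLEXT_PROTOTYPES   /* on Linux, exposes the GL 1.3+ prototypes */
#include <GL/gl.h>
#include <GL/glext.h>         /* for GL_COMPRESSED_RGBA_S3TC_DXT5_EXT */

/* DXT5/BC3 stores each 4x4 texel block in 16 bytes. */
static GLsizei Dxt5Size( GLsizei w, GLsizei h )
{
    return (( w + 3 ) / 4) * (( h + 3 ) / 4) * 16;
}

/* Path 1: subload pre-compressed DXT5 blocks. */
void SubloadCompressed( GLsizei w, GLsizei h, const void *blocks )
{
    glCompressedTexSubImage2D( GL_TEXTURE_2D, 0, 0, 0, w, h,
                               GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
                               Dxt5Size( w, h ), blocks );
}

/* Path 2: subload uncompressed RGBA8 and let the driver compress
   on-the-fly (slow; expect a long stall inside the driver). */
void SubloadUncompressed( GLsizei w, GLsizei h, const void *texels )
{
    glTexSubImage2D( GL_TEXTURE_2D, 0, 0, 0, w, h,
                     GL_RGBA, GL_UNSIGNED_BYTE, texels );
}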
Also, there are a number of Linux projects out there (open source and commercial) that would be continually verifying that DXT5 support in NVidia GL drivers is solid. Beyond that, NVidia has a pretty extensive test suite that they run their drivers through, and I'd bet DXT5 is covered.
Finally, NVidia uses the same driver core across both Linux and Windows, so platform-specific regressions in their OpenGL support for specific features are less likely. That doesn't mean there can't be a bug, but historically NVidia OpenGL drivers are solid, and it's very rare that I'll find a bug in them (Linux or Windows).
The routine I am using works flawlessly on Windows, but on Linux the same code works only with DXT1 and DXT3. For DXT5 (BC3) I get pure black on all color channels, with only the alpha channel working as expected.
That’s interesting.
Are both paths running 64-bit OSs on 64-bit CPUs with 64-bit executables? Or is there a difference between the platforms here?
Is there any difference in the platform-specific code w.r.t. what GL calls are made? For instance, is glPixelStorei( GL_UNPACK_ALIGNMENT, 1 ) being called on both paths?
Are you subloading to the DXT5 textures the same way on both platforms?
Is the data you are subloading to the textures exactly the same across both platforms?
What about the context creation (which is going to be different between the platforms if you’re using GLX and WGL). Are you creating a context with exactly the same capabilities (same GL version, both are either compatibility, or they’re both core, etc.)?
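One quick way to compare is to dump what each context actually reports. Something like the following sketch (using the pre-3.0 / compatibility-profile string queries, which match the GLUT shell below), run on both platforms, would tell you whether the GL versions differ and whether S3TC is even advertised:

#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

/* Call this with a GL context current on each platform and diff the output. */
void DumpContextInfo( void )
{
    const char *ext = (const char *) glGetString( GL_EXTENSIONS );

    printf( "GL_VENDOR   : %s\n", (const char *) glGetString( GL_VENDOR ) );
    printf( "GL_RENDERER : %s\n", (const char *) glGetString( GL_RENDERER ) );
    printf( "GL_VERSION  : %s\n", (const char *) glGetString( GL_VERSION ) );
    printf( "S3TC        : %s\n",
            ext && strstr( ext, "GL_EXT_texture_compression_s3tc" )
                ? "supported" : "NOT advertised" );
}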
I am having difficulty finding any piece of software demonstrating DXT5 functionality on Linux to verify this.
You could cook one up pretty quickly based on GLUT. In fact, I was going to suggest that you whittle down your failure case to a short, stand-alone test program that reproduces the problem, and then post it here. With that, folks could help point out any possible errors, make suggestions, and even try it locally to give you feedback on both Linux and Windows.
Here’s a shell test program you can plug your failure logic into:
#include <stdlib.h>     // for exit()
#include <GL/gl.h>
#include <GL/glut.h>

//============================================================================
// One-time GL state setup; put your texture creation/subload code here.
void Setup()
{
    glClearColor( 0.27, 0.50, 0.70, 1.0 );
}

//============================================================================
void reshape( GLsizei w, GLsizei h )
{
    h = ( h == 0 ? 1 : h );
    glViewport( 0, 0, w, h );
    glutPostRedisplay();
}

//============================================================================
// Per-frame draw; put your textured-quad rendering here.
void display( void )
{
    glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
    glutSwapBuffers();
}

//============================================================================
void keyboard( unsigned char key, int x, int y )
{
    // Key Bindings
    switch ( key )
    {
        case 27 : exit( 0 ); break;   // ESC quits
    }
    glutPostRedisplay();
}

//============================================================================
int main( int argc, char *argv[] )
{
    glutInit( &argc, argv );
    glutInitDisplayMode( GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH );
    glutInitWindowPosition( 0, 0 );
    glutInitWindowSize( 500, 500 );
    glutCreateWindow( "Texture Test" );

    glutReshapeFunc( reshape );
    glutDisplayFunc( display );
    glutKeyboardFunc( keyboard );

    // Not needed
    //glutReshapeWindow( 500, 500 );

    Setup();
    glutMainLoop();
    return 0;
}
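On a typical Linux box with freeglut installed, that should build with something like (the file name is just a placeholder): gcc texture_test.c -o texture_test -lglut -lGL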