S3TC DXT5 with nVidia drivers

Hello,

Could anyone verify whether S3TC DXT5 is working on Linux with the NVIDIA drivers? The routine I am using works flawlessly on Windows, but on Linux the same code works only with DXT1 and DXT3; with DXT5 (BC3) I get pure black on all color channels, with only the alpha channel working as expected.

I am having difficulties finding any piece of software that demonstrates DXT5 on Linux so I can verify this. I found one set of tutorials where DXT5 is used, but the displayed result is the same as with my code, so I am inclined to believe this is a driver issue. I am on Ubuntu, my GPU is an NVIDIA GeForce 1060, and the driver is v384. I also tried switching to the Nouveau drivers, but I didn't get any video at all, so I had to switch back.

Thank you very much in advance.

In the past on NVIDIA GL drivers (Linux and Windows), I have used DXT5 as the internal format of textures (2D and 2D array), and I have both 1) subloaded pre-compressed DXT5 to these textures and (many years ago) 2) subloaded uncompressed RGBA8 to these textures and let the driver compress the data on the fly. Both paths worked fine, and rendering with these textures behaved properly. The latter path (subloading uncompressed texels) was of course very slow because of the long delay while the driver compressed on the fly; you're not really supposed to upload texel data that way if you want fast performance and good compression quality.
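For what it's worth, the size arithmetic in the pre-compressed path is easy to get wrong, so here's a minimal sketch (the helper name is mine, not from any real loader): DXT5/BC3 stores each 4x4 texel block in 16 bytes, and the byte count you pass to glCompressedTexImage2D must match that layout exactly, or the upload fails.

```c
/* DXT5/BC3 packs each 4x4 texel block into 16 bytes (8 bytes of alpha
   data plus 8 bytes of color data). Dimensions that are not multiples
   of 4 still round up to whole blocks. */
unsigned dxt5_image_size(unsigned w, unsigned h)
{
    return ((w + 3) / 4) * ((h + 3) / 4) * 16;
}

/* The two upload paths would then look roughly like this (assuming a
 * bound 2D texture and EXT_texture_compression_s3tc support):
 *
 *   glCompressedTexImage2D(GL_TEXTURE_2D, 0,
 *                          GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
 *                          w, h, 0, dxt5_image_size(w, h), blocks);
 *
 * or, letting the driver compress uncompressed RGBA8 on the fly (slow):
 *
 *   glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
 *                w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, texels);
 */
```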

Also, there are a number of Linux projects out there (open source and commercial) that would be continually verifying that DXT5 support in NVIDIA GL drivers is solid. Beyond that, NVIDIA has a pretty extensive test suite that they run their drivers through, and I'd bet DXT5 is tested.

Finally, NVIDIA uses the same driver core across both Linux and Windows, so platform-specific regressions in their OpenGL support for specific features are less likely. That doesn't mean there can't be a bug, but historically NVIDIA OpenGL drivers are solid, and it's very rare that I find a bug in them (Linux or Windows).

[quote]The routine I am using works flawlessly on Windows, but on Linux the same code works only with DXT1 and DXT3; with DXT5 (BC3) I get pure black on all color channels, with only the alpha channel working as expected.[/quote]

That’s interesting.

Are both paths running 64-bit OSs on 64-bit CPUs with 64-bit executables? Or is there a difference between the platforms here?

Is there any difference in the platform-specific code w.r.t. what GL calls are made? For instance, is glPixelStorei( GL_UNPACK_ALIGNMENT, 1 ) being called on both paths?

Are you subloading to the DXT5 textures the same way on both platforms?

Is the data you are subloading to the textures exactly the same across both platforms?

What about context creation (which is going to be different between the platforms if you're using GLX and WGL)? Are you creating a context with exactly the same capabilities (same GL version, both compatibility or both core, etc.)?

[quote]I am having difficulties finding any piece of software demonstrating the DXT5 functionality on Linux to verify this.[/quote]

You could cook one up pretty quickly that’s based on GLUT. In fact, I was going to suggest that you whittle down a failure case to a short, stand-alone test program that you could repro this problem with, and then you could post it here. With that, folks could help point out any possible errors, make suggestions, and even try it locally to give you feedback both on Linux and Windows.

Here’s a shell test program you can plug your failure logic into:


#include <stdlib.h>   // for exit()

#include <GL/gl.h>
#include <GL/glut.h>

//============================================================================

void Setup()
{
  glClearColor( 0.27, 0.50, 0.70, 1.0 );
}

//============================================================================

void reshape(GLsizei w, GLsizei h)
{
  h = (h == 0 ? 1 : h );

  glViewport(0, 0, w, h);

  glutPostRedisplay();
}

//============================================================================

void display(void)
{
  glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
  glutSwapBuffers();
}

//============================================================================

void keyboard( unsigned char key, int x, int y )
{
  // Key Bindings
  switch( key )
  {
    case 27 : exit(0);                    break;
  }
  glutPostRedisplay();
}

//============================================================================

int main(int argc, char *argv[])
{
  glutInit(&argc, argv);
  glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
  glutInitWindowPosition(0, 0);
  glutInitWindowSize(500,500);
  glutCreateWindow("Texture Test");

  glutReshapeFunc(reshape);
  glutDisplayFunc(display);
  glutKeyboardFunc(keyboard);

  // Not needed
  //glutReshapeWindow(500,500);

  Setup();

  glutMainLoop();
}

Thanks for the answer. I will try to isolate my code in the upcoming days. For now I think it will be faster like this; this code exhibits the same problem:

https://github.com/opengl-tutorials/ogl/archive/master.zip (from the Download page at http://www.opengl-tutorial.org/download/)
It is tutorial05_textured_cube.

All that is needed is to follow these installation notes:

[ul]
[li]cd into ~/Projects/OpenGLTutorials/ and enter the following commands:[/li]
[li]mkdir build[/li]
[li]cd build[/li]
[li]cmake ..[/li]
[li]A Makefile has been created in the build/ directory.[/li]
[li]Type "make all". Every tutorial and dependency will be compiled. Each executable will also be copied back into ~/Projects/OpenGLTutorials/. Hopefully no error occurs.[/li]
[/ul]

Finally, replace the .dds file with one that uses DXT5 compression.

My results:

DXT3:
[ATTACH=CONFIG]1645[/ATTACH]
DXT5:
[ATTACH=CONFIG]1646[/ATTACH]

OK, I was able to load and display a DXT5 texture successfully using an Ogre3D sample, so a driver bug is no longer a possibility. Now for the tedious part of pinpointing the bug in my code…

Fixed. By coincidence, for handling the DDS files I was using the same classes as the aforementioned tutorial (http://www.opengl-tutorial.org/download/). Those classes contain a bug in the vertical image flipping for DXT5 (flip_dxt5_alpha), where the nonportable type unsigned long is used (4 bytes on Windows, 8 bytes on 64-bit Linux). Using uint32_t fixed it. I think I had already replaced unsigned long elsewhere in those classes some time ago, so beware.
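For reference, here is a minimal sketch of what a portable version of that alpha-block flip can look like. This is my own reconstruction of the idea, not the tutorial's exact code: a DXT5 block starts with 8 bytes of alpha data (2 endpoint bytes followed by 48 bits of 3-bit indices, four 12-bit rows), and flipping the block vertically means reversing the row order. The point is to do the bit manipulation in a fixed-width uint32_t rather than unsigned long, which silently changes size between Windows and 64-bit Linux.

```c
#include <stdint.h>

/* Vertically flip the alpha-index portion of one DXT5 (BC3) block.
   block points at the 8-byte alpha part: 2 endpoint bytes, then 48 bits
   of 3-bit indices stored little-endian as four 12-bit rows. */
static void flip_dxt5_alpha(uint8_t *block)
{
    /* Gather the 48 index bits into two 24-bit halves:
       rows 0-1 and rows 2-3. */
    uint32_t line01 = block[2] | (block[3] << 8) | ((uint32_t)block[4] << 16);
    uint32_t line23 = block[5] | (block[6] << 8) | ((uint32_t)block[7] << 16);

    /* Swap the two 12-bit rows within each half, then swap the halves,
       so the row order 0,1,2,3 becomes 3,2,1,0. */
    uint32_t flipped01 = ((line23 & 0x000FFF) << 12) | ((line23 & 0xFFF000) >> 12);
    uint32_t flipped23 = ((line01 & 0x000FFF) << 12) | ((line01 & 0xFFF000) >> 12);

    /* Write the flipped bits back, little-endian. */
    block[2] = (uint8_t)(flipped01      );
    block[3] = (uint8_t)(flipped01 >>  8);
    block[4] = (uint8_t)(flipped01 >> 16);
    block[5] = (uint8_t)(flipped23      );
    block[6] = (uint8_t)(flipped23 >>  8);
    block[7] = (uint8_t)(flipped23 >> 16);
}
```

With an 8-byte unsigned long, naive loads of the 48 index bits read past the data that belongs to the block, which is exactly the kind of error that shows up only on one platform.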
