
Thread: 32bit colours=fast! 16bit colours=very slow??

  1. #1
    Junior Member Newbie
    Join Date: Jul 2003
    Location: Melbourne, Australia.
    Posts: 14

    32bit colours=fast! 16bit colours=very slow??

    Howdy

    I've got a problem. I've been reading through similar past posts, but I can't work out a solution.

    I'm testing an OpenGL app on a box with a GF4 440 card. The problem is that if I run it in a 32bit colour depth video mode it runs fine, but if I run it in a 16bit colour depth it slows to about 1fps! I thought 16bit was supposed to be faster than 32bit?

    Is there a common reason for this? I'm not doing anything special, and the card can obviously handle what I'm drawing because it's fine in 32bit. I've tried loading the textures with a GL_RGB5 internal format to force 16bit, but it made no difference.
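
    For reference, here's roughly what the texture loading looks like (a sketch; width, height and pixels are placeholders, and the query just confirms which internal format the driver actually allocated):

    Code :
        /* Sketch: upload a texture with a 16-bit internal format, then
           ask the driver which internal format it actually chose.
           width/height/pixels are placeholders. */
        GLuint tex;
        GLint actualFormat;

        glGenTextures (1, &tex);
        glBindTexture (GL_TEXTURE_2D, tex);

        /* Request 5:5:5 storage; the driver may still substitute something else */
        glTexImage2D (GL_TEXTURE_2D, 0, GL_RGB5, width, height, 0,
                      GL_RGB, GL_UNSIGNED_BYTE, pixels);

        glGetTexLevelParameteriv (GL_TEXTURE_2D, 0,
                                  GL_TEXTURE_INTERNAL_FORMAT, &actualFormat);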

    Any suggestions would be very much appreciated.

    Thanks!

    Rob.

  2. #2
    Junior Member Newbie
    Join Date: Sep 2002
    Posts: 4

    Re: 32bit colours=fast! 16bit colours=very slow??

    32bit is faster than 16bit on nvidia cards.

  3. #3
    Junior Member Regular Contributor
    Join Date: Jan 2001
    Location: Shanghai, China
    Posts: 128

    Re: 32bit colours=fast! 16bit colours=very slow??

    I remember nvidia cards are optimized for 32bit textures. Maybe the app slows down because you're forcing it to use an unoptimized 16bit internal format. But 1fps seems too slow... maybe someone more familiar with nvidia cards can answer that question.
    End of transmission...

  4. #4
    Junior Member Newbie
    Join Date: Jul 2003
    Location: Melbourne, Australia.
    Posts: 14

    Re: 32bit colours=fast! 16bit colours=very slow??

    Ahh I see. I didn't know that. So there's nothing that can be done?

    It's such a massive drop in speed that it seems unnatural.

    Thanks!

    Rob.

  5. #5
    Intern Contributor
    Join Date: Dec 2001
    Location: Berlin, Germany
    Posts: 63

    Re: 32bit colours=fast! 16bit colours=very slow??

    Your application is probably using the stencil buffer, which is supported in hardware only in 32bit modes. In 16bit, the graphics driver falls back to software rendering.
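
    One way to confirm that is to ask the current context how many stencil bits it actually got (a sketch, assuming a current OpenGL context):

    Code :
        /* Sketch: query how many stencil bits the current context has.
           A non-zero count in a 16bit mode may mean a software fallback. */
        GLint stencilBits = 0;
        glGetIntegerv (GL_STENCIL_BITS, &stencilBits);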

  6. #6
    Junior Member Newbie
    Join Date: Jul 2003
    Location: Melbourne, Australia.
    Posts: 14

    Re: 32bit colours=fast! 16bit colours=very slow??

    You're right, I am using the stencil buffer, but I already tried turning that off. It does speed things up, but it still doesn't run anywhere near as fast as 32bit does, with or without the stencil buffer.

    So I'm still surprised by the huge drop. It must be card related, though. I've tried it with an ATI card and it's fine under both depths.

    Rob.

  7. #7
    Senior Member OpenGL Guru
    Join Date: Mar 2001
    Posts: 3,575

    Re: 32bit colours=fast! 16bit colours=very slow??

    You should also use a 16-bit depth buffer when using a 16-bit color buffer.
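
    In the pixel format descriptor that means pairing the two fields (a sketch, assuming pfd is your PIXELFORMATDESCRIPTOR; only the relevant fields shown):

    Code :
        /* Sketch: match the depth buffer depth to the color buffer depth */
        pfd.cColorBits = 16;  /* 16-bit color buffer... */
        pfd.cDepthBits = 16;  /* ...paired with a 16-bit depth buffer */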

  8. #8
    Junior Member Newbie
    Join Date: May 2004
    Location: Florida
    Posts: 7

    Re: 32bit colours=fast! 16bit colours=very slow??

    ATI cards don't support 16-bit depth buffers, BTW...

    You may very well be running a non-ICD pixel format.

    There are only a limited number of 16-bit pixel formats available on my Radeon 9800, and most of them aren't ICD accelerated... If you don't check the actual pixel format for "generic" or "generic accelerated", you could be running in a software or MCD format.

    You might also want to check whether you're even getting a 16-bit format on the ATI card. The pixel format descriptor you pass to ChoosePixelFormat is merely a hint; the driver's free to return anything it deems "closest" to what you asked for.

    Case in point: requesting a 16-bit or 32-bit depth buffer on ATI hardware will ALWAYS return a 24-bit buffer. If you run through ALL of the pixel formats reported by the driver, you'll notice they're ALL based on a 24-bit depth buffer.

    This is especially the case if the desktop is set to 32-bit color and you request a 16-bit color pixel format.

    Code :
        /* Set the display to 16-bit */
        DEVMODE dm;
        ZeroMemory (&dm, sizeof (DEVMODE));

        dm.dmSize       = sizeof (DEVMODE);
        dm.dmFields     = DM_BITSPERPEL;
        dm.dmBitsPerPel = 16;

        /* CDS_FULLSCREEN is misleadingly named: it just means the change is
           temporary, and the original settings are restored when the
           application exits. */
        ChangeDisplaySettings (&dm, CDS_FULLSCREEN);
    Code :
        /* pfd is the PIXELFORMATDESCRIPTOR filled in by DescribePixelFormat */

        /* Check what kind of pixel format we actually got. An ICD (full
           hardware) format has the PFD_GENERIC_FORMAT flag CLEAR. */
        if (! (pfd.dwFlags & PFD_GENERIC_FORMAT)) {
          /* This is an ICD format */
        } else if (pfd.dwFlags & PFD_GENERIC_ACCELERATED) {
          /* This is an MCD (partially accelerated) format */
        } else {
          /* This is the generic software (reference) format -- expect poor performance */
        }
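
    To see what's actually on offer, and to verify what you ended up with, you can walk every format the driver reports (a sketch; hdc is assumed to be a valid device context):

    Code :
        /* Sketch: enumerate every pixel format the driver exposes, then
           describe the one that's actually set. hdc is assumed valid. */
        PIXELFORMATDESCRIPTOR desc;
        int i, count, current;

        /* DescribePixelFormat returns the highest pixel format index */
        count = DescribePixelFormat (hdc, 1, sizeof (PIXELFORMATDESCRIPTOR), &desc);
        for (i = 1; i <= count; i++) {
          DescribePixelFormat (hdc, i, sizeof (PIXELFORMATDESCRIPTOR), &desc);
          printf ("format %d: %d-bit color, %d-bit depth, %d-bit stencil\n",
                  i, desc.cColorBits, desc.cDepthBits, desc.cStencilBits);
        }

        /* Check what was actually set, not what was asked for */
        current = GetPixelFormat (hdc);
        DescribePixelFormat (hdc, current, sizeof (PIXELFORMATDESCRIPTOR), &desc);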

  9. #9
    Junior Member Newbie
    Join Date: Jul 2003
    Location: Melbourne, Australia.
    Posts: 14

    Re: 32bit colours=fast! 16bit colours=very slow??

    Thanks for the info. I am actually checking whether it's accelerated or not, but it's the nVidia card that's having the problem. The ATI runs fine.

    However, even in the super slow 16bit mode on the nVidia, it's still reporting that it's accelerated. I'm still surprised that nVidia cards just don't do 16bit fast. It doesn't seem right when 32bit is fine.

    Rob.

  10. #10
    Junior Member Newbie
    Join Date: Jul 2003
    Location: Melbourne, Australia.
    Posts: 14

    Re: 32bit colours=fast! 16bit colours=very slow??

    I've just been doing some fiddling with some of my other programs, and I think I was right: the cause has to be in the program itself, because others I've written run fine under 16bit. So it can't just be the nVidia card (someone above suggested they're generally slow at 16bit).

    So there must be something I'm doing wrong when creating the window or choosing the pixel format?
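
    Based on the advice above, here's roughly the setup I'll re-test with (a sketch; hdc is assumed and error handling is omitted):

    Code :
        /* Sketch: request a 16bit format following the advice in this
           thread: 16-bit color, 16-bit depth, no stencil. hdc is assumed. */
        PIXELFORMATDESCRIPTOR pfd;
        int format;

        ZeroMemory (&pfd, sizeof (PIXELFORMATDESCRIPTOR));
        pfd.nSize        = sizeof (PIXELFORMATDESCRIPTOR);
        pfd.nVersion     = 1;
        pfd.dwFlags      = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
        pfd.iPixelType   = PFD_TYPE_RGBA;
        pfd.cColorBits   = 16;
        pfd.cDepthBits   = 16;
        pfd.cStencilBits = 0;   /* avoid the possible software stencil path */

        format = ChoosePixelFormat (hdc, &pfd);
        SetPixelFormat (hdc, format, &pfd);

        /* Then verify the result with DescribePixelFormat, as shown above */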

    Rob.
