
Thread: My own GL buffers

  1. #1
    Junior Member Regular Contributor
    Join Date: Dec 2000
    Location: montreal
    Posts: 105

    My own GL buffers

    This, I think, could be useful to some people, especially those seeking high-quality graphics.

    How about telling GL what we want our buffers to be like? I'm thinking about the z-buffer specifically, where 16, 24, or 32 bits may not be enough. What if there were a function with which we could ask GL to make a 64-bit or even higher-precision z-buffer?

    It could be done in software, and in that case we shouldn't care about performance.

    The same could be done for the stencil buffer.

    V-man

  2. #2
    Senior Member OpenGL Guru
    Join Date: Feb 2000
    Location: Sweden
    Posts: 2,982

    Re: My own GL buffers

    This is an implementation/platform-dependent issue. You don't ask OpenGL for a specific pixel format; you ask the platform for one, which OpenGL then uses.

    On Windows, for example, you use Win32 API functions to choose a pixel format (this is where the depth buffer is specified), then you create a context using wgl functions, which are platform-dependent and Windows-only. When this is done, you start using OpenGL.
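    A minimal sketch of that setup, assuming you already have a valid window HDC (the helper name is made up, and error handling is omitted):

        /* Ask Windows (not OpenGL) for a pixel format with a 24-bit depth
         * buffer and an 8-bit stencil, then create a GL context with wgl.
         * The bit counts are requests, not guarantees -- the driver picks
         * the closest format it actually supports. */
        #include <windows.h>
        #include <GL/gl.h>

        HGLRC create_context(HDC hdc)
        {
            PIXELFORMATDESCRIPTOR pfd = {0};
            pfd.nSize        = sizeof(pfd);
            pfd.nVersion     = 1;
            pfd.dwFlags      = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
            pfd.iPixelType   = PFD_TYPE_RGBA;
            pfd.cColorBits   = 32;
            pfd.cDepthBits   = 24;   /* depth buffer size is decided here, by the platform */
            pfd.cStencilBits = 8;

            int format = ChoosePixelFormat(hdc, &pfd);   /* Win32 API */
            SetPixelFormat(hdc, format, &pfd);

            HGLRC ctx = wglCreateContext(hdc);           /* wgl, Windows-only */
            wglMakeCurrent(hdc, ctx);
            return ctx;
        }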

  3. #3
    Senior Member OpenGL Pro
    Join Date: Sep 2000
    Location: Santa Clara, CA
    Posts: 1,096

    Re: My own GL buffers

    Also, think before you say that you want a 64-bit depth buffer, or even a 32-bit depth buffer. When vertices are specified as floats (23 bits of mantissa), and vertex computations are performed on floats, more than 24 bits of depth precision is virtually useless. Several graphics products that have claimed to have 32-bit depth buffers actually can't use their 32 bits to any advantage, due to this limitation.
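    A quick way to see that limit (an illustrative sketch only, assuming IEEE 754 single-precision floats):

        /* A float carries 24 significant bits (23 stored + 1 implicit), so
         * near 1.0 the gap between adjacent representable values is 2^-23.
         * Depth differences smaller than that are lost before the depth
         * buffer ever sees them, no matter how many bits the buffer stores. */
        #include <stdio.h>
        #include <float.h>

        int main(void)
        {
            float z1 = 1.0f;
            float z2 = 1.0f + 1.0f / (1 << 25);   /* well below the 2^-23 spacing */

            printf("z1 == z2 ? %s\n", z1 == z2 ? "yes" : "no");      /* prints "yes" */
            printf("spacing of floats near 1.0: %g\n", FLT_EPSILON); /* ~1.19e-07 */
            return 0;
        }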

    - Matt

  4. #4
    Junior Member Regular Contributor
    Join Date: Dec 2000
    Location: montreal
    Posts: 105

    Re: My own GL buffers

    I'll try to reply to both here.

    Yes, I realize that GL avoids doing the pixel-format business and leaves that to the OS, but I'm sure it could be made possible to have GL create its own buffers and ignore the z-buffer that is probably present in video memory (maybe that could be deallocated).

    I did not think about the float precision here, but even with 24- or 32-bit precision there could be z-fighting. How about computing the z-buffer-related numbers in 64-bit or 80-bit precision on the CPU (x86), or even higher on other special hardware? There's not much point in having it done on the GPU anyway.
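    Just to make the idea concrete, here is a rough, hypothetical sketch of a software depth buffer kept entirely in 64-bit doubles on the CPU; nothing like this exists in the OpenGL API, it only illustrates the idea (no interpolation or clipping, and all names are made up):

        #include <stdlib.h>

        typedef struct {
            int     width, height;
            double *depth;            /* one 64-bit depth value per pixel */
        } SoftDepthBuffer;

        SoftDepthBuffer *sdb_create(int width, int height)
        {
            SoftDepthBuffer *b = malloc(sizeof *b);
            b->width  = width;
            b->height = height;
            b->depth  = malloc((size_t)width * height * sizeof *b->depth);
            for (int i = 0; i < width * height; ++i)
                b->depth[i] = 1.0;    /* far plane, like glClearDepth(1.0) */
            return b;
        }

        /* LESS depth test: returns 1 and stores z if the fragment at (x, y)
         * is closer than what is already there, 0 if it is occluded. */
        int sdb_test_and_write(SoftDepthBuffer *b, int x, int y, double z)
        {
            double *stored = &b->depth[y * b->width + x];
            if (z < *stored) {
                *stored = z;
                return 1;
            }
            return 0;
        }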

    Any GPU chip makers thinking about doing the float computation in 64 bit in the future?

    V-man
