
Thread: framebuffer texturebuffer creation question

  1. #1
    Junior Member Newbie
    Join Date: Feb 2017
    Posts: 10

    framebuffer texturebuffer creation question

    So I have a piece of code that decides whether it should create a renderbuffer or a texture, and my question focuses on this part of it:

    Code :
    glTexImage2D(GL_TEXTURE_2D, 0, m_internalFormat, m_width, m_height, 0, format(), type(), nullptr);

    My question is: does specifying a `format`/`type` do anything if the data is NULL and I'm basically just mallocing some storage? I already found out that I can't use 0L for them, so it seems like they must do something.

    I thought they just had to match the internal format, but the wiki page says GL_R11F_G11F_B10F is required to work for both renderbuffers and textures, and there's no obvious corresponding `type` for that. Does the type not really matter, as long as the format matches? Or what's the deal?

    This, for example:
    Code :
    glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, m_width, m_height, 0, GL_RED, GL_UNSIGNED_BYTE, nullptr);

    will generate an error, because it can't do the conversion. So it seems like I can't always use the same values for all internal formats; or is it safe to just ignore that error code?
    Last edited by GeatMaster; 06-10-2018 at 05:07 PM.

  2. #2
    Senior Member OpenGL Guru Dark Photon
    Join Date: Oct 2004
    Location: Druidia
    Posts: 4,394
    Quote Originally Posted by GeatMaster
    So I have a piece of code that decides whether it should create a renderbuffer or a texture, and my question focuses on this part of it:

    Code :
    glTexImage2D(GL_TEXTURE_2D, 0, m_internalFormat, m_width, m_height, 0, format(), type(), nullptr);

    My question is: does specifying a `format`/`type` do anything if the data is NULL and I'm basically just mallocing some storage?
    No. They describe the format of the data block you provide, and if you're not providing one, they're not really used for anything. They do still need to be a valid combination that's compatible with the internal format though (for error checking), as you found out.
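
    For example (a sketch for illustration rather than code from this thread; it reuses the m_width/m_height members from the original post, and the specific internal formats are just examples), the format/type pair still has to be a legal combination for the internal format even though the data pointer is null:

    Code cpp:
      // Color texture: the data pointer is null, but GL_RGBA / GL_UNSIGNED_BYTE
      // is still a legal description for an RGBA8 internal format.
      glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, m_width, m_height, 0,
                   GL_RGBA, GL_UNSIGNED_BYTE, nullptr);

      // Depth texture: the format must be GL_DEPTH_COMPONENT here; GL_RED would
      // generate GL_INVALID_OPERATION even though no data is supplied.
      glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, m_width, m_height, 0,
                   GL_DEPTH_COMPONENT, GL_FLOAT, nullptr);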

    I thought it just had to match internal_format
    No. The original idea was so you could have your texture be one format on the GPU (the "internal format"), potentially populate it with texel data of another format (given by format + type), and the driver in some cases would do the conversion on the fly (e.g. populating GPU-compressed textures with uncompressed texel data). However, if you're writing a high-performance app that strives for maximum visual quality, you'll almost never want to let the driver do a runtime conversion. You'd do the conversion beforehand using high-quality conversion/compression code and just load the data in the GPU-native format.
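
    As a rough illustration (my sketch, not code from this thread; `pixels` is a hypothetical pointer to client memory laid out as each call describes), the internal format fixes the GPU-side storage while format/type only describe whatever you hand the driver:

    Code cpp:
      // Both calls end up with RGBA8 storage on the GPU; only the description
      // of the client data (format + type) differs.

      // No runtime conversion: the source data already matches the internal format.
      glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, m_width, m_height, 0,
                   GL_RGBA, GL_UNSIGNED_BYTE, pixels);

      // Conversion at upload time: BGR float texels get repacked into RGBA8 by the driver.
      glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, m_width, m_height, 0,
                   GL_BGR, GL_FLOAT, pixels);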

    Will generate an error, because it can't do the conversion. So it seems like I can't always use the same values for all internal formats; or is it safe to just ignore that error code?
    I wouldn't get in the habit of ignoring error codes. It'll bite you further down the road. Just make the API call happy by providing a reasonable format/type.
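
    For what it's worth, a minimal way to surface those errors while debugging (my sketch; it needs <cstdio>, and on GL 4.3+ or with KHR_debug you could install a callback via glDebugMessageCallback instead):

    Code cpp:
      // Drain the GL error queue after a suspect call; glGetError() keeps
      // returning queued errors until it reports GL_NO_ERROR.
      for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
          fprintf(stderr, "GL error 0x%04X\n", err);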

    If you don't want to have to just "know" a reasonable format/type for every internal format, you can query what the driver thinks is a good format/type to provide using:

    Code cpp:
      GLint format = 0, type = 0;
      glGetInternalformativ( target, int_format, GL_TEXTURE_IMAGE_FORMAT, 1, &format );
      glGetInternalformativ( target, int_format, GL_TEXTURE_IMAGE_TYPE,   1, &type );

    You might want to browse the glGetInternalformat reference page to see what other goodies you can query. Some of these can be pretty useful.
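
    For example, a few of the other queries from the same family (again a sketch; `target` and `int_format` are the same assumed variables as above):

    Code cpp:
      GLint supported = GL_FALSE, preferred = 0, max_samples = 0;
      // Is this internal format supported at all for this target?
      glGetInternalformativ( target, int_format, GL_INTERNALFORMAT_SUPPORTED, 1, &supported );
      // Which internal format would the driver actually prefer to use for it?
      glGetInternalformativ( target, int_format, GL_INTERNALFORMAT_PREFERRED, 1, &preferred );
      // Highest supported sample count for a multisampled renderbuffer of this format.
      glGetInternalformativ( GL_RENDERBUFFER, int_format, GL_SAMPLES, 1, &max_samples );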
    Last edited by Dark Photon; 06-11-2018 at 05:23 AM.

  3. #3
    Junior Member Newbie
    Join Date: Feb 2017
    Posts: 10
    Quote Originally Posted by Dark Photon
    If you don't want to have to just "know" a reasonable format/type for every internal format, you can query what the driver thinks is a good format/type to provide using:

    Code cpp:
      GLint format = 0, type = 0;
      glGetInternalformativ( target, int_format, GL_TEXTURE_IMAGE_FORMAT, 1, &format );
      glGetInternalformativ( target, int_format, GL_TEXTURE_IMAGE_TYPE,   1, &type );
    That's perfect, thanks!!!

    Unless my program is on a card that doesn't support OpenGL 4.2, that is. I dunno how probable that is. Do you know?
    Last edited by GeatMaster; 06-11-2018 at 07:23 AM.

  4. #4
    Advanced Member Frequent Contributor arekkusu
    Join Date: Nov 2003
    Posts: 881
    If you want to allocate storage for a texture, and not supply any pixel data, then use glTexStorage.
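
    For reference, a minimal sketch of that approach (my example rather than arekkusu's code; it reuses the m_* members from the original post and assumes GL 4.2+ or ARB_texture_storage, plus a sized internal format):

    Code cpp:
      // Immutable storage: no format/type/data arguments at all, just a *sized*
      // internal format (e.g. GL_RGBA8, not GL_RGBA) and the dimensions.
      glTexStorage2D(GL_TEXTURE_2D, 1, m_internalFormat, m_width, m_height);

      // If you later want to upload real pixels, glTexSubImage2D takes the usual
      // format/type pair describing the client data:
      // glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, m_width, m_height,
      //                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);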

  5. #5
    Senior Member OpenGL Guru Dark Photon
    Join Date: Oct 2004
    Location: Druidia
    Posts: 4,394
    Quote Originally Posted by GeatMaster
    Unless my program is on a card that doesn't support OpenGL 4.2, that is. I dunno how probable that is. Do you know?
    Just do what arekkusu said. He's got the better solution.

    To your question though, you can look at recent reports here: https://opengl.gpuinfo.org/ But I suspect many of the submitters are developers, not end users.
