GL_EXT_texture_sRGB_decode

New spec!!

http://www.opengl.org/registry/specs/EXT/texture_sRGB_decode.txt

Thanks for sharing it.

However, I have one comment on the extension:
Why do we need the DECODE_EXT and SKIP_DECODE_EXT tokens? I think a simple TRUE/FALSE value, with TRUE as the default, would have been enough.

Ahah I had exactly the same thought!

Another example of lack of consistency in my opinion.

That was the original idea, but after some deliberation we thought it was clearer to have the explicit tokens. Plus we could possibly add more decode modes in the future, though it’s doubtful that will happen (or what that would even mean).

And having a texparam take TRUE/FALSE isn’t necessarily more consistent - almost all take a specific enum; only GenMipmap takes TRUE/FALSE.
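
For reference, this is roughly what the per-texture toggle looks like in practice (a minimal sketch; it assumes a context where EXT_texture_sRGB_decode is available and the GL_TEXTURE_SRGB_DECODE_EXT / GL_DECODE_EXT / GL_SKIP_DECODE_EXT tokens are defined by your headers or extension loader):

```c
/* Bound sRGB texture (e.g. GL_SRGB8_ALPHA8). Default state is GL_DECODE_EXT. */

/* Skip decoding: the shader sees the raw, still-encoded texel values. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SRGB_DECODE_EXT, GL_SKIP_DECODE_EXT);

/* Restore the default: decode sRGB to linear when sampling. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SRGB_DECODE_EXT, GL_DECODE_EXT);
```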

Given this extension and two internal formats that differ only in sRGB-ness (e.g. GL_RGBA8 and GL_SRGB8_ALPHA8), is there any reason to ever use the non-sRGB format, when you can always use the sRGB one and just disable the sRGB decoding (or encoding, when rendering into it) as needed?

If there is no performance or other reason, then we come back to the question of why we needed separate sRGB formats in the first place, when sRGB support could just be a (queryable) property of the format: does it support sRGB decode/encode or not?

AFAIK sRGB conversion is not free on NVIDIA cards (like it is on AMD cards), except for a few common formats (like SRGB8_ALPHA8).

If there is no performance or other reason, then we come back to the question of why we needed separate sRGB formats in the first place, when sRGB support could just be a (queryable) property of the format: does it support sRGB decode/encode or not?

It needs to be part of the format because only the 8-bit per-channel formats support sRGB decoding. So only those could offer it.

Also, it allows for render targets in sRGB formats. Note that the extension is about sRGB decoding, not sRGB encoding.

AFAIK sRGB conversion is not free on NVIDIA cards

It’s not? Do you have a link to information on that?

OK, only 8-bit formats support sRGB decoding, so only they would have this “queryable property” set. Maybe a new GL query function could be used here, e.g. glGetInternalFormatParameteriv(GLint internalformat, GLenum pname, GLint *params), where pname could be GL_SUPPORT_SRGB_DECODING. For formats with GL_SUPPORT_SRGB_DECODING equal to FALSE, enabling sRGB decoding would have no effect.
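
To illustrate, here is roughly how that would look in use (purely hypothetical; neither glGetInternalFormatParameteriv nor GL_SUPPORT_SRGB_DECODING exists in GL, they are just the proposal above):

```c
GLint supportsDecode = GL_FALSE;

/* Hypothetical query: does this internal format support sRGB decoding at all? */
glGetInternalFormatParameteriv(GL_RGBA16F, GL_SUPPORT_SRGB_DECODING, &supportsDecode);

if (supportsDecode == GL_FALSE) {
    /* Enabling sRGB decoding on textures of this format would simply be a no-op. */
}
```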

Also, it allows for render targets in sRGB formats. Note that the extension is about sRGB decoding, not sRGB encoding.

We already have a separate queryable property for whether sRGB encoding is supported when rendering into a format: FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING. We could also use the new query function to check the format before we have created/attached any textures/renderbuffers, e.g. with GL_SUPPORT_SRGB_ENCODING.
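
That existing query looks like this (sketch; assumes a framebuffer object with a color attachment at GL_COLOR_ATTACHMENT0 is currently bound):

```c
GLint encoding = GL_LINEAR;
glGetFramebufferAttachmentParameteriv(GL_DRAW_FRAMEBUFFER,
                                      GL_COLOR_ATTACHMENT0,
                                      GL_FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING,
                                      &encoding);
/* encoding is GL_SRGB for sRGB-capable attachments, GL_LINEAR otherwise. */
```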

Or we could just use the API as it is now. There’s nothing particularly wrong with the way it works now.

What is wrong with the API as it is now?
a) If we don’t count this new extension, there is the problem that we cannot dynamically change the sRGB decoding of a texture even though the hardware is able to do it. This is actually needed and used by various D3D9 games, where it is supported.
b) If we count the new extension, then the functionality is present, but the API is now somewhat more bloated and less clean. The non-sRGB variants of the sRGB formats are redundant (they are actually the sRGB formats, only with sRGB decoding/encoding forcibly disabled). Also there is a logical inconsistency: the sRGB formats, despite their names, are not actually [exclusive] sRGB formats; instead they are formats which merely support sRGB. They can function as either sRGB or non-sRGB depending on switches.
Prior to this extension they were indeed exclusively sRGB, but not anymore. This adds another bit of confusion and strangeness to the API. Of course, this is my subjective POV.

If we don’t count this new extension, there is the problem that we cannot dynamically change the sRGB decoding of a texture even though the hardware is able to do it.

Yes, but your suggestion cannot miraculously go back in time and add this ability to the specification before now. It would have to go through an extension process just like everything else. So we would still be unable to do it until your suggestion passed.

Retroactively saying that X would have been better if we had done it as Y does little to solve the problem today. However much you may think that sRGB_decode doesn’t make sense, it does in fact work.

The non-sRGB variants of the sRGB formats are redundant (they are actually the sRGB formats, only with sRGB decoding/encoding forcibly disabled).

Except for the issue I pointed out. Namely, formats that aren’t allowed to do sRGB decoding at all. Your “solution” involves adding a clunky API for testing whether a format can do sRGB decoding.

Also there is a logical inconsistency: the sRGB formats, despite their names, are not actually [exclusive] sRGB formats; instead they are formats which merely support sRGB. They can function as either sRGB or non-sRGB depending on switches.

How is this logically inconsistent? The terminology simply changed. Before, “sRGB” formats meant, “This format always does sRGB decoding.” Now, it means “This format may do sRGB decoding.” There is still a strict separation between formats that can do decoding and those that cannot. All this extension does is change “always” to “may”.

Prior to this extension they were indeed exclusively sRGB, but not anymore.

And prior to sampler objects, texture parameters came only from texture objects.

That’s just how OpenGL works.

This adds another bit of confusion and strangeness to the API.

First, I don’t see how this is confusing or strange. The format says whether sRGB decoding is possible, since only certain formats can support it. The decode flag says whether we will currently do decoding or not.

Also, if you’re going to argue on the basis of “confusion and strange-ness” in the API, this is not where to start. Really, there are dozens of other places that are in far greater need of a makeover. We still use glVertexAttribPointer in core, even though core doesn’t allow it to take pointers. There are far greater crimes against clean APIs committed in OpenGL than this.

It just seems bizarre that you single out sRGB decoding when there are far worse things going on.

I’m not underestimating the other crimes at all :slight_smile:
I was just talking about this particular one because the thread is about it.

My “suggestion” actually does have some practical benefits beyond the subjective domain of “confusion and strangeness” (a rough sketch of the idea follows the list):
a) It allows decoupling of sRGB decoding (sampling from an sRGB texture) and sRGB encoding (rendering into an sRGB texture): a format can have each of the two properties independently of the other. Imagine hardware that can render with sRGB encoding into some format but cannot sample with sRGB decoding from it. As it is now, the API does not support this.
b) It is a more general and flexible mechanism that could be used for other format properties in the future. For example, some (older) GPUs support only nearest filtering for floating-point textures; this too could use such a query mechanism.
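
Purely as an illustration, reusing the hypothetical query function from earlier in the thread (none of these pnames exist; GL_SUPPORT_SRGB_ENCODING and GL_SUPPORT_LINEAR_FILTERING are made up for the example):

```c
GLint srgbWrite = GL_FALSE, srgbRead = GL_FALSE, linearFilter = GL_FALSE;

/* Hypothetical per-format capability queries, each reported independently. */
glGetInternalFormatParameteriv(GL_RGBA16F, GL_SUPPORT_SRGB_ENCODING,    &srgbWrite);
glGetInternalFormatParameteriv(GL_RGBA16F, GL_SUPPORT_SRGB_DECODING,    &srgbRead);
glGetInternalFormatParameteriv(GL_RGBA16F, GL_SUPPORT_LINEAR_FILTERING, &linearFilter);
```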

It allows decoupling of sRGB decoding (sampling from an sRGB texture) and sRGB encoding (rendering into an sRGB texture): a format can have each of the two properties independently of the other.

Again, we already have this. In order to get sRGB encoding, you must glEnable(GL_FRAMEBUFFER_SRGB) (which is ignored when rendering to non-sRGB formats). Otherwise, you don’t get sRGB encoding. With EXT_texture_sRGB_decode, we now have the ability to turn decoding on and off.
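
That is, something along these lines (sketch; assumes an sRGB-capable color attachment is bound):

```c
/* Encoding is controlled independently of decoding: it only applies when
   writing to sRGB-capable color buffers. */
glEnable(GL_FRAMEBUFFER_SRGB);    /* linear -> sRGB encoding on write */
/* ... draw into the sRGB render target ... */
glDisable(GL_FRAMEBUFFER_SRGB);   /* write values untouched */
```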

Imagine hardware that can render with sRGB encoding into some format but cannot sample with sRGB decoding from it. As it is now, the API does not support this.

Such hardware does not actually exist. And it won’t exist in the future. Hardware gets better with time, not worse.

It is a more general and flexible mechanism that could be used for other format properties in the future. For example, some (older) GPUs support only nearest filtering for floating-point textures; this too could use such a query mechanism.

So you want to add a feature that will only matter to 5+ year old hardware?

I think I read it on some slides presented by AMD and NVIDIA at one of the recent conferences. There was a slide about which texture formats AMD and NVIDIA cards prefer, but I cannot recall the title of the presentation. I’ll try to find it (of course, it may turn out that I am wrong; my memory is not perfect :D).
