View Full Version : RGBE8 texture format



Bert
09-28-2003, 07:17 AM
Where could I find information about the RGBE8 texture format? I found it on page 37 of this NVIDIA presentation: http://developer.nvidia.com/docs/IO/8374/CEDEC2003_PipelinePerformance.pdf

Sounds interesting. I guess the fourth component (E) is the exponent for the RGB components. Still only 8 bits resolution, but over a very wide range.

Will there be an extension exposing this? What is the enum value? What hardware supports this? Any more information as to what operations can be used on these textures?

roffe
09-28-2003, 08:26 AM
RGBE info

Sample code: http://www.graphics.cornell.edu/online/formats/rgbe/

Format:
Real Pixels, Greg Ward, Graphics Gems II

Bert
09-28-2003, 10:42 AM
Thank you! I faintly remember having read something about this. So Gems 2 it was :)

Any idea how it is supported in OpenGL?

davepermen
09-28-2003, 11:23 AM
well, for GL_NEAREST filtering, you can just abuse alpha for e and sample like that. you can then extract them easily with a fragment program. compressing a float3 to rgbe in a fragment program is a bit more work, but for hdr-range colour buffers this can be faster than using a float4 rendertarget texture, for example..

roffe
09-28-2003, 12:21 PM
Sample code from the above link; it should convert to fragment shader code pretty easily. frexp() and ldexp() can be emulated. Cg does this.




static INLINE void
float2rgbe(unsigned char rgbe[4], float red, float green, float blue)
{
    float v;
    int e;

    v = red;
    if (green > v) v = green;
    if (blue > v) v = blue;
    if (v < 1e-32) {
        rgbe[0] = rgbe[1] = rgbe[2] = rgbe[3] = 0;
    }
    else {
        /* shared exponent comes from the largest component */
        v = frexp(v, &e) * 256.0 / v;
        rgbe[0] = (unsigned char) (red * v);
        rgbe[1] = (unsigned char) (green * v);
        rgbe[2] = (unsigned char) (blue * v);
        rgbe[3] = (unsigned char) (e + 128);
    }
}

static INLINE void
rgbe2float(float *red, float *green, float *blue, unsigned char rgbe[4])
{
    float f;

    if (rgbe[3]) { /* nonzero pixel */
        /* undo the +128 exponent bias and the 8-bit mantissa scaling */
        f = ldexp(1.0, rgbe[3] - (int)(128 + 8));
        *red   = rgbe[0] * f;
        *green = rgbe[1] * f;
        *blue  = rgbe[2] * f;
    }
    else
        *red = *green = *blue = 0.0;
}



[This message has been edited by roffe (edited 09-28-2003).]

Bert
09-28-2003, 01:52 PM
Thanks guys, but that was not what I wanted to know. That's trivial. We're in the advanced forum, right? ;)

Actually I got the impression RGBE was usable as an internal texture format. Why would NVIDIA list it if not? It wouldn't make much sense if this were a normal RGBA format and I had to do all the work myself in a fragment program. From the presentation, RGBE shares all the limitations of the NV_float_buffer texture formats (no blending or filtering).

So my best guess would be this is an extension of NV_float_buffer formats providing significant (and hence interesting) compression. Perhaps if I sample such a texture it will just show up in my fragment program as float? But to test this I at least need the right value for GL_RGBE8.

Of course, the presentation was all about DirectX, but if the hardware can handle it, I'd like to see it exposed in OpenGL :)

SirKnight
09-28-2003, 06:00 PM
RGBE... interesting. I never heard of this format before. Of course I never got to read the Graphics Gems books, so that's probably why. :)

I have a couple questions though.

1) The float2rgbe function. Would that take say a HDR image and convert it to the RGBE format?

2) What are the differences between frexp() and ldexp()?


-SirKnight

davepermen
09-28-2003, 11:44 PM
sirknight. you could simply google for them.. that's the way i got how they work and what they do, too :D

roffe
09-29-2003, 01:16 AM
Originally posted by Bert:
Actually I got the impression RGBE was usable as internal texture format. Why would NVIDIA list it if not?

Maybe they want to know if developers have an interest in this format? I wouldn't be surprised if it worked under DX, though. Do NVIDIA drivers expose floating point texture support in DX nowadays?



It wouldn't make much sense if this was a normal RGBA format and I had to do all the work myself in a fragment program.

I'm fairly sure deciding on what gets hw accelerated in the next generation chip takes quite some time. Only the most popular features get implemented. How popular is the RGBE format compared to other HDRI formats? How often is it used except by Radiance users? Maybe someone knows? I guess since it is a 32-bit format it should have some great potential for hw implementation.

But since using fp textures requires fragment programs and the rgbe to float conversion only translates into a few instructions, I doubt we will see hw support for RGBE any time soon. But what do I know.



Of course, the presentation was all about DirectX, but if the hardware can handle it, I'd like to see it exposed in OpenGL :)

Wouldn't we all.

SirKnight
09-29-2003, 04:50 AM
Originally posted by davepermen:
sirknight. you could simply google for them.. that's the way i got how they work and what they do, too :D

Yes I know, but since I was in the middle of writing my post I figured I'd just include that. But seeing that I got no answer for it, I'll just go google now. :D


-SirKnight