GLSL textureRect / samplerRect on ATI

I’m trying to find a rectangular texture solution that works on both NVidia and ATI graphics hardware - preferably one that works on cards that are NOT the newest, most expensive ones. I have a GeforceFX 5900, and use textureRect / samplerRect very successfully in my GLSL shaders.

ATI seems to not support the rectangle texture target in GLSL, giving compiler errors for any use of textureRect / samplerRect.

Has ATI made any final decisions or plans about whether they will support texturing from the rectangle texture target in GLSL, and what hardware will or won’t support it, and when?

Currently it seems like the only way to achieve rectangular texture mapping in GLSL on both NVidia and ATI is to force textures to use GL_TEXTURE_2D, increasing the required size up to the nearest power-of-two. If ARB_texture_non_power_of_two is available, this is a fine solution and the textures don’t need to be oversized (wasting memory), but obviously only the newest generation of cards supports that extension.

I’d really like to avoid forcing the use of TEXTURE_2D (on NVidia especially, where TEXTURE_RECTANGLE works well), as my application very much depends on rendering to rectangular textures in both RGBA8 and RGBA float formats, and then texturing from them, often at fairly high resolutions. If I run at 1280 x 720 and size my FBOs/PBuffers using TEXTURE_2D, the resulting 2048x1024 textures eat memory very fast.
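
(For rough numbers, assuming full 32-bit float RGBA at 16 bytes per texel: 2048 x 1024 x 16 bytes = 32 MB per buffer, versus about 14 MB for the native 1280 x 720 - more than double, for each buffer.)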

Anybody have a good solution for rendering to rectangular-sized buffers and texturing from them with GLSL on GeforceFX 59xx series and ATI Radeon 9800+ series cards? Am I going to have to write or auto-generate two sets of shaders, one using textureRect for NVidia and another using texture2D for ATI?

This is a bit of an unfortunate situation.

here’s an example of the kind of stuff I do -
http://www.treyharrison.com/glslWater.jpg
http://www.treyharrison.com/glslWater.mov

I realize that in this example I am actually rendering to a power-of-two sized buffer (256x256), but my question still stands. I have a video mixing setup that renders DV/DVD resolution (720x480) with pixel shader effects, and it definitely needs some kind of rectangular texture support. That screenshot would take several posts though =)

trey

Just a guess but if you actually read the ARB_texture_rectangle spec:
http://oss.sgi.com/projects/ogl-sample/registry/ARB/texture_rectangle.txt

It states:
Add the following (previously reserved) keywords to the first part of
section 3.6 on page 14:

    sampler2DRect
    sampler2DRectShadow

Add to section 8.7 "Texture Lookup Functions"

Syntax:

    vec4 texture2DRect(sampler2DRect sampler, vec2 coord)
    vec4 texture2DRectProj(sampler2DRect sampler, vec3 coord)
    vec4 texture2DRectProj(sampler2DRect sampler, vec4 coord)

So try texture2DRect?
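
For reference, a minimal fragment shader using the spec’s names might look something like this (an untested sketch; the uniform name is just a placeholder, and the #extension line may or may not be required depending on the driver):

    #extension GL_ARB_texture_rectangle : enable

    uniform sampler2DRect tex;   // bound to a GL_TEXTURE_RECTANGLE_ARB texture

    void main()
    {
        // rectangle texture coordinates are in texels (0..width, 0..height),
        // not normalized 0..1
        gl_FragColor = texture2DRect(tex, gl_TexCoord[0].st);
    }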

You may use TEXTURE_2D on ATI cards, without resizing to the nearest POT, as long as the driver supports GL 2.0 and you don’t violate the ARB_texture_rectangle limitations (no mips, etc).

You’d still need to support two texture sampling modes though (normalized and non-normalized). Not very difficult IMHO.
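
One way to keep a single shader source is to hide the difference behind the preprocessor. A sketch (the macro and uniform names are mine; the application would prepend “#define USE_RECT” on hardware with ARB_texture_rectangle, and set texelScale to 1/size on the 2D path):

    #ifdef USE_RECT
        #extension GL_ARB_texture_rectangle : enable
        #define SAMPLER        sampler2DRect
        #define SAMPLE(s, uv)  texture2DRect(s, uv)
    #else
        #define SAMPLER        sampler2D
        #define SAMPLE(s, uv)  texture2D(s, (uv) * texelScale)
    #endif

    uniform SAMPLER tex;
    uniform vec2 texelScale;   // vec2(1.0/width, 1.0/height) for the 2D path; unused for RECT

    void main()
    {
        // the shader body always works in pixel (texel) coordinates
        gl_FragColor = SAMPLE(tex, gl_TexCoord[0].st);
    }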

sqrt, textureRect/texture2DRect both generate compiler errors on the ATI X800 I tested with. Both texture2DRect and textureRect compile and run on my NVidia card, but texture2DRect issues a warning saying that the spec does not define texture2DRect.

Naming it either way, ATI doesn’t support it and generates a compiler error.

spasi, writing a shader that uses normalized coordinates for ATI and having another shader that uses RECT-style coordinates for NVidia is… a technically correct solution, but certainly not an elegant one. My software is (in part) a shader design tool, and I don’t want myself or my customers to have to write two shaders for every effect. We shouldn’t have to - the whole point of having standards is to reduce the amount of code required to support devices from different manufacturers.

I may end up having to write some automatic shader code converters to compensate for the shortcomings of the ATI cards, but before I go down the path of doing all that work, I’d really like to get an official word from ATI about this.

Does anyone know a reliable person to contact there who can give an accurate answer?

Is ATI content to just have their GLSL compiler ignore the textureRect/texture2DRect portion of the spec? What is being done, or what will be done about it?

trey

There is a very good reason for ATI not supporting textureRect, and it’s not technical - I think it’s political. You see, I ran into the exact same problem while doing post-processing effects for my engine. The R3xx provided good support for NPOT textures with good enough performance to be useful. On the other hand, NV3x came down on its knees when NPOT textures were used. So I think this was a makeshift solution that nVidia came up with until they could provide good enough NPOT texture support in their successive generations. It actually came from Cg (correct me if I am wrong), since it had texture rects before the ARB extension.

Coming back to the problem at hand, texture rectangles are not a good solution, since they are misfits amongst all the other types of textures, which use normalized texture coordinates! So they are actually the problem! Had there been good NPOT support from the very start, they probably would never have made it to where they are now. I see them as creating more confusion than solving problems. Believe me, this is probably the tenth time I am answering questions relating to texture rects in the past 4 weeks or so.

So you actually have three options.

  1. Write separate codepaths for hardware that supports texture rects and hardware that doesn’t - this is a big fuss. I did this, so I can tell.
  2. Use 2D textures, rounded up to the nearest power of 2 - too much wasted memory.
  3. Use NPOT textures everywhere. Sure, performance will be crippled on NV3x hardware, but ATI cards and NV4x and G7x series cards take much less of a performance hit. This is more future-proof and easy to use, and NV3x is the only misfit in between. I think NV3x will be pretty much history within another year or so. Plus, you can actually use the 2D texture technique for NV3x hardware and NPOT textures for everything else (see the sketch below).
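
For option 3, the shader side stays plain texture2D; the only thing to watch is that the coordinates are normalized, so any per-texel offsets have to be scaled by 1/size. A minimal sketch (the uniform names are made up for the example):

    uniform sampler2D tex;    // an NPOT texture, e.g. 720x480, bound as GL_TEXTURE_2D
    uniform vec2 texelSize;   // set from the app: vec2(1.0/width, 1.0/height)

    void main()
    {
        // sample the current texel and its right-hand neighbour,
        // one texel away in normalized coordinates
        vec4 here  = texture2D(tex, gl_TexCoord[0].st);
        vec4 right = texture2D(tex, gl_TexCoord[0].st + vec2(texelSize.x, 0.0));
        gl_FragColor = 0.5 * (here + right);
    }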

@Zulfiqar Malik
Texture rectangle has been around since Geforce 3 days. Support for it seems to extend back to at least Geforce 2.

Originally posted by Trey:
We shouldn’t have to - the whole point of having standards is to reduce the amount of code required to support devices from different manufacturers.
My point is this: Almost all engines eventually end up with a separate codepath with specialized shaders for NV30s anyway (for performance reasons). So, using rectangle textures on NV30s shouldn’t add too much extra work and you can use NPOT tex2Ds on everything else.

Though, I can see how this situation can be awkward in a non-game engine scenario.

@sqrt[-1]: You may be right, but hasn’t Cg been around for approximately the same time? If I am not mistaken, the 9700 Pro (R300) came out in 2002 and was the first card to support DX9, which required conditional NPOT support.
Anyway, that’s beside the point.

Not trying to start a flame war, but:

Geforce 3: ~late 2000 (paper), early 2001
Radeon 9700 and Cg: late 2002

So my point was that Nvidia would probably not have invented an extension 2 years (eons in graphics) prior to the R300 because they “somehow” knew that their future cards could not compete on NPOT textures.