Determining what extensions have hardware support

Hi,

I have a question regarding hardware support.

There are a couple of extensions I’m curious about trying: one is paletted textures, the other is texture compression.

My question is this: when you query for an extension and it comes back as valid/usable, does this mean the card supports it in hardware? Are the drivers responsible for returning that information?
I guess this is really a more general question: when you query for pixel formats, extensions, etc., are you querying what your card supports, or OpenGL in general (e.g. 1.1, etc.)?

Thanks

elroy

It all comes from the card, or more precisely from the driver, which reports what resources it has. So if you get a good value back, that means your card supports it, unless there is a big fat ugly bug in their driver!
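
For example, the whole check amounts to something like this (glGetString just hands you one big space-separated list; naive substring matching like strstr can give false positives when one extension name is a prefix of another, but it shows the idea):

#include <string.h>
#include <GL/gl.h>

/* Returns 1 if 'name' appears in the driver's extension string.
 * Note: this only proves the driver EXPOSES the extension, it says
 * nothing about whether it runs in hardware. */
int has_extension(const char *name)
{
    const char *ext = (const char *) glGetString(GL_EXTENSIONS);
    return (ext != NULL && strstr(ext, name) != NULL);
}

/* e.g. has_extension("GL_EXT_paletted_texture") */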

What you’re getting from the query is what the driver knows about. But it doesn’t say whether it knows of a h/w solution or not; all it’ll tell you is that it won’t throw its proverbial hands up in horror if you ask for that extension.

Even SGI concedes (in the doc files included with IRIX 6.5) that the ONLY way of getting an idea of whether something is done in hardware is to profile the machine. This may seem like a hack… but it makes tonnes of sense, since the information you’re REALLY after is whether the OpenGL system can adequately support some extension, and NOT whether it can support it in hardware (since adequately supporting something in s/w is more desirable than inadequately supporting it in h/w).
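
As a rough sketch, that kind of profiling doesn’t have to be fancy; something like this is enough to compare two code paths on the same scene (glFinish makes sure the driver has actually done the work before you stop the clock; a real test would also want to defeat any caching and use a better timer than clock()):

#include <time.h>
#include <GL/gl.h>

/* Time 'frames' repetitions of whatever draw_scene() renders.
 * Run it once with the extension in use and once without, and
 * compare the two numbers. */
double time_path(void (*draw_scene)(void), int frames)
{
    clock_t start, end;
    int i;

    glFinish();                     /* flush any pending work first */
    start = clock();
    for (i = 0; i < frames; i++)
        draw_scene();
    glFinish();                     /* wait until it has really finished */
    end = clock();

    return (double)(end - start) / CLOCKS_PER_SEC;
}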

hope this helps!
cheers
John

Say a company makes a new super-cool accelerator based on the upcoming GeForce2 (to be extreme), and they decide to write OpenGL drivers for it. If they say it’s OpenGL 1.1, they HAVE to support functions x and y. And if they call it OpenGL 1.2, they MUST support function z too. You find x, y and z in the OpenGL spec.
BUT!!! They have to support it, but there is nothing that says it’s supposed to be hardware accelerated. The reason I chose the GeForce2 is that it may be a cool chip, but there is nothing that says the card manufacturers have to make their drivers use the chip’s features in hardware (of course, who would buy a GeForce2 with drivers that don’t take advantage of all its features in hardware? but it’s not an impossible scenario). This applies to extensions as well.
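
If you just want to see which core version a driver claims, the version string is defined to start with “major.minor”, so something like this works (again, it tells you nothing about hardware acceleration):

#include <stdio.h>
#include <GL/gl.h>

void report_gl_version(void)
{
    int major = 0, minor = 0;
    const char *ver = (const char *) glGetString(GL_VERSION);

    /* e.g. "1.2.1 <vendor info>" -- everything required by that core
     * version must be exposed by the driver, accelerated or not. */
    if (ver != NULL && sscanf(ver, "%d.%d", &major, &minor) == 2)
        printf("Driver claims OpenGL %d.%d (full string: %s)\n",
               major, minor, ver);
}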

Thank you everyone for all the input. I did the extension query on the machine I’m using and unfortunately it doesn’t support it (PALETTED_TEXTURE). It would have provided nice savings from a texture memory standpoint… I’ll probably check a few other cards and see if it is even worth the trouble. If none of the cards support it, there’s no point in implementing it, I suppose… I have to say it is kind of ironic: I finally find something that would solve quite a few of my problems and then discover that, well, no, I probably can’t count on it. But I won’t give up.

Thanks again for your help

I don’t think paletted textures are very widely supported… They just aren’t that easy to implement in hardware. S3TC has pretty good support, though.

Well, the Matrox G200 doesn’t report the paletted texture extension in its extension string. But if you try to get the function, it works - go figure.

With that in mind, I always ignore the EXT string and just check whether the function pointer stays NULL…
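
On Win32 that check looks roughly like this - glColorTableEXT is the entry point that GL_EXT_paletted_texture adds, and wglGetProcAddress only works once a rendering context is current:

#include <windows.h>
#include <GL/gl.h>

typedef void (APIENTRY *PFNGLCOLORTABLEEXTPROC)(GLenum target,
        GLenum internalFormat, GLsizei width, GLenum format,
        GLenum type, const GLvoid *data);

PFNGLCOLORTABLEEXTPROC glColorTableEXT_ptr = NULL;

/* Returns 1 if the driver exports the function, 0 if the pointer
 * stays NULL. */
int load_paletted_texture_entry_point(void)
{
    glColorTableEXT_ptr = (PFNGLCOLORTABLEEXTPROC)
            wglGetProcAddress("glColorTableEXT");
    return (glColorTableEXT_ptr != NULL);
}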

Paul Groves
pauls opengl page
pauls opengl message board

If your intention was to save texture memory, have a look at the “internalformat” parameter of glTexImage2D in the OpenGL 1.2 specs. There is more to it than true color or paletted textures. You might find GL_RGB5_A1 or GL_RGBA4 useful. Perhaps your card supports one of those.
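
For example, a request for a 16-bit format looks like this; the internalformat is only a request, and the driver still picks the closest thing it actually supports:

#include <GL/gl.h>

/* 'pixels' is ordinary 8-bit RGBA data; only the card-side storage
 * is asked to be 5/5/5/1. */
void upload_16bit_texture(int width, int height, const unsigned char *pixels)
{
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5_A1, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}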

I have tried some of the other internal formats, RGBA4444 and RGB555 in particular. I tried these before heading down the paletted texture path… They work and do save video memory, and I have started using them, but I still need to save more. I have also tried R3_G3_B2, which for my needs would work almost as well as paletted textures, but it seems to be ignored on some platforms, the Matrox G400 in particular. Does anyone have any ideas on how to determine which internal texture formats are supported? That would be helpful as well.
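
Would the proxy texture mechanism from the spec be the right way to do this? Something like the sketch below, though I don’t know how honestly drivers fill in the internal format they actually pick:

#include <GL/gl.h>

/* Do the upload against the proxy target and ask what the driver
 * settled on.  A width of 0 means the request failed outright; the
 * reported internal format is only a hint, since drivers aren't
 * forced to admit when they quietly convert to something else. */
int check_internal_format(GLint requested, int width, int height)
{
    GLint chosen = 0, got_width = 0;

    glTexImage2D(GL_PROXY_TEXTURE_2D, 0, requested, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                             GL_TEXTURE_WIDTH, &got_width);
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                             GL_TEXTURE_INTERNAL_FORMAT, &chosen);

    return (got_width != 0 && chosen == requested);
}

/* e.g. check_internal_format(GL_R3_G3_B2, 256, 256) */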

From the earlier posts it seems that getting the extension string isn’t always accurate for determining support. Is this common on most cards?

Thanks again for all the help/answers