Here is what I have for the reported limits as of Catalyst 8.9, if anyone is interested. This is on a Radeon HD 3650:
Depth Buffer Bits = 24
Maximum Texture Coordinates = 8
Maximum Texture Units = 16
Maximum Vertex Texture Units = 16
Maximum Vertex Attributes = 16
Max Texture units on this card = 8
Max Texture size on this card = 8192
Max Renderbuffer size on this card = 8192
Max 3D texture size on this card = 8192
Max Cubemap texture size on this card = 8192
Max Uniform Variables on this card = 512
Max Varying Variables on this card = 68
Max Vertices on this card = 2147483647
Max Indices on this card = 16777215
Max ViewPort width size = 8192
Max ViewPort height size = 8192
MultiSample Buffers = 0
MultiSample Samples = 0
Vendor ATI Technologies Inc.
Renderer ATI Radeon HD 3600 Series
OpenGL version = 2.1.7976 Release
That’s a joke, but a really bad one. When I ran into this 512-uniform limit some months ago on a Radeon 4xxx, I dropped ATI support altogether. It’s really disappointing: D3D10 already requires 4096 constants, so this is clearly an artificial limitation.
My GeForce 9 reports 4096 too. And my Radeon Mobility X1600 had 4096 as well, if I’m not entirely mistaken. That was the main reason I lost all trust in ATI. They are completely bullshitting us.
Maybe this is a limitation only for ARB shaders? Or maybe it’s a forgotten constant in their driver code? Testing with a simple GLSL shader that uses 1024–4000 vec4 uniforms should show the truth, IMHO.
I did exactly that; that is how I actually discovered the limitation. I assumed 4096 uniforms were standard, so my GLSL shader used that many. When it failed on the more recent ATI cards, I checked how many uniforms were supported and was very surprised…
I actually posted about that in a thread here, and some ATI PR guy responded with PR blah blah. You know you’re not being taken seriously when they send the PR department. They can fool consumers, but trying to fool developers is simply rude.
Funny how nVidia’s 9500 can handle this, when ATI’s 9500 can’t.
(sorry, I couldn’t resist taking a stab at it)
Anyway, an 8192 limit for 3D textures is cocky, to say the least. 8192 = 2^13, so 2^(13*3) = 2^39 texels, and even at one byte per texel that’s 512GB! Do they really expose such a large address space (considering even recent x64 Intel CPUs only expose 36, or was it 39, bits of physical address lines, IIRC)?
Also, 2 giga-vertices, that’s a helluva lot of memory. Assuming xyz floats at 12 bytes per vertex, that’s 24GB. I fear I have to question whether it can really handle this…
The 24-bit index count, though, that was… interesting.
An 8192 limit for 3D textures means you can have 3D textures with one or two dimensions being up to 8192. That can be useful in many areas. Of course, an 8192^3 3D texture would almost certainly fail due to memory/address constraints.
[quote]Max Vertices on this card = 2147483647
Max Indices on this card = 16777215[/quote]
Well, those values are IMO labeled incorrectly / misleadingly. I assume they are the values introduced with the glDrawRangeElements extension (GL_MAX_ELEMENTS_VERTICES and GL_MAX_ELEMENTS_INDICES). ATI has always returned those values, even on ancient hardware, which simply means they don’t care to return anything useful. The values are supposed to indicate how many vertices/indices PER DRAW CALL are optimal for performance, and they are intended as a guideline. E.g. a renderer could use them to set the maximum size of a queue in which it caches indices to render. nVidia returns something like 4096, at least they did a few years ago.
I don’t know what you mean, where is that mentioned?
What’s the deal here? I thought 8.9 had the full OpenGL 3.0 shebang.
They very much never promised that for 8.9. They had a list of extensions they would be supporting in 8.9 as the first steps towards 3.0. But full 3.0 won’t show up until Q1 2009, which they already said.
When ATI exposes the bindable uniform buffer extension, you’ll see a different set of limits for that extension, apart from the reported max vertex/fragment uniform components. My hunch is that the “max uniform variables” figure is something different, probably just a limit on the default uniform buffer available to the shader(s).
[quote]When Ati exposes the uniform buffer extension you’ll see a different set of limits for that extension, apart from the reported max vertex/fragment uniform components.[/quote]
That is entirely orthogonal to the uniform limits. Even if your hunch is correct, that’s an internal implementation problem: they don’t have to expose uniform buffers to the user in order to use them internally.
It’s simply yet another piece of evidence that suggests that ATi will do exactly and only the bare minimum to make currently existing games run under GL.
Oh, and I wouldn’t put that “when” in bold just yet. Uniform buffers are neither an ARB extension nor core. So don’t expect ATi to lift a finger to expose them in GL.