rumors: new hardware (shader model 4.0) geometry shaders and bitwise ops

hello everyone,

i’ve been out of the hardware-accelerated loop for a little while… recently i was googling to find out if there was any talk about bitwise ops in shaders. i ran into a lot of crazy stuff that sounded like science fiction to my ears, geometry shaders and such…

so i was hoping someone might be kind enough to fill me in on the news, point me in the right direction.

in particular, what hardware supports this stuff (the xbox360 seems to) …and what the hardware realities are versus the specs.

yes, i realize this has likely been heavily discussed here, but if it’s not a super boring topic, i thought i might try to fire something up… and of course if there is relevant ongoing discussion please share, and my infinite apologies for my laziness.

oh, and patient people only, please. i’m bad about inspiring melodrama, as some may remember.

sincerely,

michael

No idea… There is no such hardware yet, but you may like to download the DirectX December SDK (I hope I’m not mistaken about the name) and read the DX10 documentation.
But there is no point in starting such a discussion because there is actually nothing to discuss :-/ We can all “guess”. Maybe you can ask on beyond3d, they will surely give you a hell of a lot of info about guessing :wink:

that’s good to hear… it’s hard enough to keep up with this stuff.

here is an article mentioning said developments btw:
http://www.gamedev.net/reference/articles/article2274.asp

i’m assuming it is based at least somewhat in reality.

search down to:

“It will also introduce Shader Model 4.0, which provides a common shader model for the vertex, pixel, and geometry shader. It provides:”

* Full integer/bitwise instruction set
* Custom resource decompression schemes
* More robust flow control and logic
* Uses: FFT, Codecs
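as a concrete example of what a full integer/bitwise instruction set would buy you: a shader could unpack tightly packed data itself. the sketch below uses hypothetical SM4.0-style GLSL (uint types and the bitwise operators are reserved words in current GLSL, so none of this compiles on anything today):

```glsl
// HYPOTHETICAL: assumes SM4.0-class GLSL with real integer/bitwise ops.
// Unpack an RGBA8 color that was packed into a single 32-bit uint.
uniform uint packedColor;

vec4 unpackRGBA8(uint p)
{
    // Shift and mask out each 8-bit channel, then normalize to [0,1].
    float r = float((p >> 24u) & 0xFFu) / 255.0;
    float g = float((p >> 16u) & 0xFFu) / 255.0;
    float b = float((p >>  8u) & 0xFFu) / 255.0;
    float a = float( p         & 0xFFu) / 255.0;
    return vec4(r, g, b, a);
}

void main()
{
    gl_FragColor = unpackRGBA8(packedColor);
}
```

the “custom resource decompression” bullet is basically this idea taken further: shifts and masks let the shader decode whatever packed format you invent.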

and:

“The geometry shader joins pixel and vertex shaders, and presents some interesting possibilities:”

i dunno, maybe the opengl arb will be slow to pick up on this stuff… i just hope not too slow. sounds pretty revolutionary.

any other takers?

GLSL already has integer types, but not quite without restrictions: the bitwise operators (&amp;, |, ^, &lt;&lt;, &gt;&gt;) and % are still reserved in the 1.10 spec, so ints are mainly useful as loop counters and array indices.
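for reference, a minimal GLSL sketch of the integer usage that is safe today, i.e. loop counters and array indexing (the uniform names here are just illustrative):

```glsl
// GLSL 1.10: int works for counters and indexing, but no bitwise ops.
uniform vec4 weights[4];
uniform sampler2D tex;
varying vec2 uv;

void main()
{
    vec4 sum = vec4(0.0);
    // Integer loop counter driving a small filter kernel.
    for (int i = 0; i < 4; ++i)
        sum += weights[i] * texture2D(tex, uv + float(i) * vec2(0.01, 0.0));
    gl_FragColor = sum;
}
```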

I hope the ARB works on a geometry shader extension. There is still some time till the first cards that can do it appear, and even more time till they become popular (and even more till I buy one :slight_smile: )

If you look at the raw binaries for the latest Nvidia drivers, you’ll see the following strings, which appear to be opcodes for some kind of assembly-level geometry shading extension:

XOR SHR SHL SET OR AND NOT NEG
MOD DIVVS DIVSQ SCC RRO LIF
I2I I2F F2I F2F
CEIL MEMB INDX MSWZ JMAT
RSPLIT RJOIN SJOIN JOIN REF MERGE
IPA MVI MVC POP PSH MVA A2R IMV OUT AGGR
CALL RETC IFE BREAK FORLOOP BRB JT JF JMP
VARY UNIF CNST UND

It appears that there might also be texture sampling capability in the geometry shader.

It would be more interesting to talk about what you can do with geometry shaders. It’s clear that the hardware needed to support this feature is not something you have now; you’ll have to throw away your current graphics card and buy something new, as usual.

I am sure OpenGL will have something similar to DX10, it just takes time.

Originally posted by Eric Lengyel:
If you look at the raw binaries for the latest Nvidia drivers, you’ll see the following strings, which appear to be opcodes for some kind of assembly-level geometry shading extension:

Didn’t they use a unified driver model? Maybe there’s no GL version yet and they use those opcodes to parse a common model.
Anyway, I recently checked the SDK and noticed there is support for what we call NVX_conditional_render. The Direct3D term is predicated rendering.

Another really interesting thing (check out the demos): using a geometry shader to expand the vertex stream and MRT (it now seems to be accessible as an array, but I still have to check), the documentation claims it is possible to render a whole cube map in a single pass.
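to illustrate the single-pass cube map idea (purely hypothetical syntax, since no GL geometry shader extension exists yet): the geometry stage could replicate each incoming triangle six times and route each copy to one cube face via a layer output. gl_Layer, gl_PositionIn, and EmitVertex/EndPrimitive below are guesses at what a future extension might look like:

```glsl
// HYPOTHETICAL geometry-shader sketch: emit each triangle once per cube face.
uniform mat4 faceViewProj[6]; // one view-projection matrix per cube face

void main()
{
    for (int face = 0; face < 6; ++face)
    {
        gl_Layer = face; // route this copy to cube-map face 'face'
        for (int i = 0; i < 3; ++i)
        {
            gl_Position = faceViewProj[face] * gl_PositionIn[i];
            EmitVertex();
        }
        EndPrimitive();
    }
}
```

the win is that the scene is submitted once instead of six times; the MRT-as-array (layered render target) part is what lets each emitted primitive land on a different face.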

There are some things called ‘views’ which smell like uberbuffers to me (but I admittedly never paid much attention to the issue).

I’m pretty sure xbox360 will not support it.

The SDK documentation also claims batch performance is much improved. Maybe I’ll check it out (GL vs. D3D9 vs. D3D10). Anyway, it’s just a pre-release, so I believe we all agree it’ll take some time before we can ‘play’ with those things.

I’m not sure, but I think ASM shaders are ‘now’ allowed only in debug mode. :rolleyes:

I wonder if there will be much change to the API with the introduction of geometry shaders?

The geometry shading stage, as proposed in DirectX 10, sits between the vertex and fragment shaders and can do things such as creating additional vertices and primitives on the fly. I tried to test it, but it needs Vista and DirectX 10.
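as a rough illustration of what such a stage could do (hypothetical syntax, since no shipping GLSL has a geometry stage yet), here is a sketch that expands a single input point into a screen-aligned quad, a classic billboard use case. the in/out built-ins and EmitVertex/EndPrimitive are guesses at a possible future extension’s syntax:

```glsl
// HYPOTHETICAL: expand one input point into a two-triangle strip (a quad).
uniform float halfSize; // half the billboard's width/height in clip space

void main()
{
    vec4 c = gl_PositionIn[0]; // the single input point
    gl_Position = c + vec4(-halfSize, -halfSize, 0.0, 0.0); EmitVertex();
    gl_Position = c + vec4( halfSize, -halfSize, 0.0, 0.0); EmitVertex();
    gl_Position = c + vec4(-halfSize,  halfSize, 0.0, 0.0); EmitVertex();
    gl_Position = c + vec4( halfSize,  halfSize, 0.0, 0.0); EmitVertex();
    EndPrimitive();
}
```

today you would do this on the CPU or with four pre-expanded vertices per particle; doing it in the pipeline is exactly the “creating additional vertices” part.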

I’m sure that if it is implemented in hardware, OpenGL extensions will be available.

If you use geometry shaders with DX10 right now, the whole stage is emulated in software, so you won’t have any real use for it at the moment; it’s just for testing.
When hardware implementations and drivers become available, you can be sure at least NVIDIA will provide GL extensions. It may take some time until they (or some slightly modified versions) become ARB-approved, but rest assured, GL doesn’t stop development :slight_smile:

The GL API won’t need many changes. A glCreateShader(GL_GEOMETRY_SHADER) or something like that should be all :wink:
and, of course, something like GLSL 2.1 :wink:

i appreciate the clarification, everyone.

sorry btw, i lost track of the thread (and forum) …it kinda slipped my mind i’m afraid.

entirely understandable, given the demands made on your time. It was good of you to contribute in the first place. I had assumed your modem had packed in, or the water wheel stopped turning.

A while ago I talked with some NVIDIA guys about extensions for SM4.0 hardware, and they only confirmed that there will be some. I wasn’t able to learn whether they will be ARB, EXT, or NV extensions.