T&L?!

nVidia talks a lot about it, but where is it?! I think I may be a bit confused by vertex programs and such, but I’m working on a project and it would be nice to use T&L… but how?
The best I’ve managed so far is to use the vertex array range extension… is T&L exposed by this extension? I’m working on a GeForce2 MX, so I know I have support for it…

T&L is nothing special in OpenGL.
I mean that the OpenGL commands are sent to the driver and the driver does what it must: on a card with hardware T&L, the driver performs the transform and lighting operations on the hardware, while on a card without it, they are done in software.

You don’t have to do anything special to use hardware T&L, other than letting OpenGL do the transform and lighting instead of doing it yourself.

Easy, isn’t it?
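To make that concrete, here is a minimal sketch of what “letting OpenGL do the T&L” looks like in practice: plain fixed-function calls, nothing extension-specific. This assumes a current GL context, and names like `angle`, `vertices`, `normals` and `vertex_count` are placeholders your app would supply.

```c
/* Sketch: let OpenGL's transform and lighting pipeline do the work.
   On a GeForce2 MX the driver runs this on the T&L hardware;
   on older cards the same calls are done in software. */
GLfloat light_pos[] = { 0.0f, 5.0f, 5.0f, 1.0f };  /* illustrative position */

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glFrustum(-1.0, 1.0, -1.0, 1.0, 1.0, 100.0);

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glTranslatef(0.0f, 0.0f, -5.0f);
glRotatef(angle, 0.0f, 1.0f, 0.0f);         /* 'angle' supplied by your app */

glEnable(GL_LIGHTING);                      /* fixed-function lighting...   */
glEnable(GL_LIGHT0);
glLightfv(GL_LIGHT0, GL_POSITION, light_pos);

glEnableClientState(GL_VERTEX_ARRAY);       /* ...and let GL transform the  */
glEnableClientState(GL_NORMAL_ARRAY);       /* vertices for you             */
glVertexPointer(3, GL_FLOAT, 0, vertices);  /* your vertex data             */
glNormalPointer(GL_FLOAT, 0, normals);
glDrawArrays(GL_TRIANGLES, 0, vertex_count);
```

Every matrix, light and vertex you hand to OpenGL this way is work the driver can push to the T&L unit.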

T&L is used inside each vendor’s OpenGL implementation if it’s available; you can’t turn it on or off (NVIDIA, for example). Whenever you set up a modelview matrix or a projection matrix and give OpenGL some vertex data, OpenGL has to transform the vertices. Whether that’s done in software or in hardware is not your decision; it’s handled at a lower level.
To use as much T&L as possible, use as much OpenGL functionality as possible (lights, transformation matrices, …) instead of doing everything on your own (custom lighting engine, custom transformation, …). If you want to light your scenes with your own calculations, try to mix them with OpenGL’s.

jan
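For intuition, this is roughly the math the T&L stage has to do per vertex, written out in plain C. It’s a conceptual sketch, not driver code: OpenGL multiplies each object-space vertex by the modelview and projection matrices (column-major layout, as `glLoadMatrixf` expects), and the simplest piece of the lighting stage is a clamped dot product of normal and light direction.

```c
/* Conceptual sketch of the "T" in T&L: multiply a 4-vector by a 4x4
   matrix stored column-major, OpenGL's layout (translation lives in
   elements 12..14). */
static void mat4_mul_vec4(const float m[16], const float v[4], float out[4])
{
    for (int i = 0; i < 4; ++i)
        out[i] = m[i]      * v[0] + m[4 + i]  * v[1]
               + m[8 + i]  * v[2] + m[12 + i] * v[3];
}

/* The simplest piece of the "L": a clamped diffuse term, max(N.L, 0).
   Real GL lighting adds ambient, specular, attenuation and so on. */
static float diffuse_term(const float n[3], const float l[3])
{
    float d = n[0] * l[0] + n[1] * l[1] + n[2] * l[2];
    return d > 0.0f ? d : 0.0f;
}
```

Whether this math runs on your CPU or on the T&L unit is exactly the decision the driver makes for you.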

Ingenu and jabe are right, but there is another kind of T&L that gives you nearly full access to the hardware: pixel and vertex shaders. You can think of them as small programs that are executed on the GPU side.

Pixel and vertex shaders will actually first be available on NVIDIA’s NV20 and the Xbox’s NV2A (hmmm, welcome to the monopoly world. Well, I don’t care as long as they keep going in the right direction…)

So no, your GeForce2 MX doesn’t have hardware pixel and vertex shaders, but the original T&L is there, and it’s cool.

[This message has been edited by holocaust (edited 02-28-2001).]
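To illustrate what such a GPU-side “script” looks like: under NVIDIA’s GL_NV_vertex_program extension (the OpenGL side of what DX8 calls vertex shaders), the canonical do-nothing-but-transform program is a short assembly string. This is a sketch only; it assumes a GL context exposing the extension, and `vp_id` is just a placeholder program name.

```c
/* Sketch: the classic pass-through NV_vertex_program. c[0]..c[3] hold
   the tracked modelview-projection matrix, one row per register. */
static const char vp_source[] =
    "!!VP1.0\n"
    "DP4 o[HPOS].x, c[0], v[OPOS];\n"   /* one dot product per       */
    "DP4 o[HPOS].y, c[1], v[OPOS];\n"   /* clip-space component      */
    "DP4 o[HPOS].z, c[2], v[OPOS];\n"
    "DP4 o[HPOS].w, c[3], v[OPOS];\n"
    "MOV o[COL0], v[COL0];\n"           /* pass the colour through   */
    "END";

GLuint vp_id = 1;  /* placeholder program name */
glLoadProgramNV(GL_VERTEX_PROGRAM_NV, vp_id,
                (GLsizei)strlen(vp_source), (const GLubyte *)vp_source);
glBindProgramNV(GL_VERTEX_PROGRAM_NV, vp_id);
glTrackMatrixNV(GL_VERTEX_PROGRAM_NV, 0,
                GL_MODELVIEW_PROJECTION_NV, GL_IDENTITY_NV);
glEnable(GL_VERTEX_PROGRAM_NV);
```

Note that while a program like this is bound and enabled, it replaces the fixed-function T&L stage entirely, so anything you don’t compute yourself (lighting, texgen, …) simply doesn’t happen.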

GeForce2 MX does have pixel shaders. They’re called register combiners.

  • Matt

Mmmhhh… register combiners are not exactly pixel shaders as meant in DX8.

Well, they can do the DX8 pixel shader math (better, in fact), they just can’t do the texture operations.

The DX8 pixel shaders are MS’s way of lumping together what are really two very different features in GeForce3: an improved version of the combiners and a dependent texturing engine. Our OpenGL extensions expose these as the separate features they really are. Of course, MS decided to drop some rather useful features and added some strange features not supported by any hardware (!).

  • Matt
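For reference, this is what combiner setup looks like through GL_NV_register_combiners: here configured for the most basic DX8-style per-pixel math, the texture colour modulated by the interpolated primary colour. A sketch only, assuming a GL context that exposes the extension; enums and signatures follow the extension spec.

```c
/* Sketch: one general combiner stage computing tex0 * primary colour,
   handed unchanged to the final combiner. */
glCombinerParameteriNV(GL_NUM_GENERAL_COMBINERS_NV, 1);

/* Stage 0, RGB portion: A = texture unit 0, B = primary colour. */
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_A_NV,
                  GL_TEXTURE0_ARB, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_B_NV,
                  GL_PRIMARY_COLOR_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);

/* Write A*B into spare0; discard the other products, no scale or bias. */
glCombinerOutputNV(GL_COMBINER0_NV, GL_RGB,
                   GL_SPARE0_NV, GL_DISCARD_NV, GL_DISCARD_NV,
                   GL_NONE, GL_NONE, GL_FALSE, GL_FALSE, GL_FALSE);

/* Final combiner computes A*B + (1-A)*C + D; zero out A and C so the
   result is just D = spare0. */
glFinalCombinerInputNV(GL_VARIABLE_A_NV, GL_ZERO,
                       GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glFinalCombinerInputNV(GL_VARIABLE_C_NV, GL_ZERO,
                       GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glFinalCombinerInputNV(GL_VARIABLE_D_NV, GL_SPARE0_NV,
                       GL_UNSIGNED_IDENTITY_NV, GL_RGB);

glEnable(GL_REGISTER_COMBINERS_NV);
```

The same stage could just as easily compute a signed dot product of two textures, which is where the combiners go beyond what plain texture environments can do.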

Actually, the entire GeForce series has support for pixel shaders. How is it that nVidia has had so many demos of them for so long if the “only” card that supports them isn’t even out yet? DURRR!!! And I’ve seen the shader effect in DX8. Go to some good search engine and search for “.fr-08”. It is a 64 KB demo that produces the equivalent of 1.9 gigs of data… really cool! It even makes my GeForce2 GTS suffer. But then, my install of WinME is going south (again :P) and I need to find a way to get hold of Win2000. Then I’ll see some performance outta that thing. The part with the pixel shader (I think :P) is the spinny thing going round and round. It looks like a blur at first, but it’s really the spinny thing with more than a few layers on it, all shifting about along a line.