
View Full Version : Does ATI Radeon support hardware T&L ?



PossumTT
03-17-2001, 07:19 PM
Hi,

I am not a programmer, nor very familiar with OpenGL and the ATI Radeon.

Any info I can pass on to a friend regarding the topic would be much appreciated.

Does anyone know if Apple's OpenGL 1.2.1 supports hardware T&L on the ATI Radeon card (BTO or retail)?

TIA,

PossumTT

harsman
03-20-2001, 02:32 AM
If the card has HW T&L, any OpenGL implementation will support it. And yes the Radeon does T&L in hardware.

PossumTT
03-21-2001, 06:02 AM
Harsman,

Thanks for your reply. Again, I am not a programmer, but I would like to pass on any info I can regarding Mac ATI Radeon hardware T&L. Is there somewhere on the Web you could point me to that would help me learn more about how to use hardware T&L on the Radeon? More specifically, where could I point my programmer friend?

PossumTT

harsman
03-22-2001, 04:28 AM
OK, first of all I'm assuming ATI supports the Mac OS with its own drivers; if the driver is from a third party (like the DRI drivers on Linux), then the driver writers may not know exactly how the hardware works, and not all functions will be accelerated.

Secondly, you don't have to "do" anything to enable hardware T&L in OpenGL: you just set up the modelview matrix and send in your untransformed vertices (if you don't get this, your programmer friend will). The reason some games don't support hardware T&L is that they do the transform calculations themselves on the CPU, which was faster back when graphics cards didn't do T&L.

To get good performance on a HW T&L card, tell your programmer friend to use vertex arrays, avoid state changes, and never ever read back results from the card. It really isn't more complex than that: you throw as many triangles at the card as possible while avoiding CPU work.

[This message has been edited by harsman (edited 03-22-2001).]

PossumTT
03-22-2001, 06:14 AM
Thanks Harsman :)

PossumTT