Per pixel lighting

Hello guys,
I would like to know how to implement per-pixel lighting in OpenGL. I have heard that in Direct3D it is done with the dot3 product feature.
Can I do it in OpenGL?
Do the new extensions GL_EXT_texture_env_combine and GL_EXT_texture_env_add make life easier?
Can someone give me pointers to tutorials/articles that explain this issue?
Many thanks,
Yossi

Hi,
Try http://www.nvidia.com/developer.nsf
They have some demos and tutorials about it.
(Check the Advanced Per-Pixel Lighting demo).

Make sure you have a card from the GeForce family installed

Is it possible to do per-pixel lighting with something else than a Geforce?
If yes how?

And is there a way of doing it that is supported by more than one vendor?

When you do the per-pixel lighting without the register combiners but with GL_EXT_texture_env_dot3, you can run it on the GeForce, GeForce2, GeForce3 (soon ), and Radeon. Voodoos are dead, and other vendors’ cards are more than a year old… so these are the only ones to support currently…

I wouldn’t go so far as to say Voodoo cards were dead. Driver support for them is, but still, a lot of people have them. I’m sure if you were to ask any major game developer if they would release a product that didn’t support Voodoo cards they would answer with a resounding no, for now.

[This message has been edited by DFrey (edited 02-21-2001).]

As far as I’m concerned, if it’s not a GeForce, there’s really no point supporting it.

  • Matt

Why does everybody think that the NV20 is going to be called the GeForce3?

I haven’t seen anything official from nVidia saying so.

When the GeForce256 was released, most people thought it would be the TNT3, but it wasn’t.

That said, I think GeForce3 would be a cool name for a graphics card.

End Rant

Sorry, I don’t know what came over me…

(edit): Just found out that the NV20 actually will be called GeForce3.

I guess that makes me look pretty stupid

j

[This message has been edited by j (edited 02-22-2001).]

If that ain’t the definition of bias, I don’t know what is, Matt.

[This message has been edited by DFrey (edited 02-21-2001).]

Some website posted a picture of the alleged box for the Hercules NV20-based card, and printed on the box in big bold green text was GeForce3.

Ah, I see.

Two things. First, you can’t do much with Voodoos except rendering faces… so when you support them, you can forget every nice feature of your GeForce/Radeon or even Matrox…

Second, I call it GeForce3 because sometimes when I say NV20, guys don’t know what it is… but the GeForce 1 and 2, everyone knows… so as a result, I say GeForce3 so everyone knows what I mean (the NV20, in fact )

you can’t do much with Voodoos except rendering faces

The same could be said of the majority of video cards in use. Remember, most people don’t even know what video card their computer has, much less contemplate upgrading it. You can write your programs for a limited audience if you want, but I would rather shoot for as large an audience as is practical. Especially since I’m one of the ones that does not have, and cannot afford, anything better at the moment than my TNT.

Even if people do contemplate upgrading their graphics card, many can’t, because it’s built into their north bridge or soldered onto the motherboard.

I’m hoping there is a future where hardware transform & lighting is as available as multitexturing is today. Oh, wait, the i810 doesn’t multitexture, does it now?

Originally posted by mcraighead:
[b]As far as I’m concerned, if it’s not a GeForce, there’s really no point supporting it.

  • Matt[/b]

I think it’s time for nVidia to change their attitude; many consumers really don’t like it (me included). I think it’s a shame that nVidia didn’t take responsibility for the 3dfx consumers after they bought up 3dfx. How much would it take to let a few of the old driver writers from 3dfx, whom you most likely have employed, continue to update the drivers for Voodoo cards for at least a year or so? No respect for the consumer => no consumer respect for you. If you look at forums for gamers, you’ll see that most old Voodoo owners are upgrading to Radeons, while few go for GF cards.

Geez, it was a joke, get over it.

As to your question – how much would it take – it would be a far bigger burden than you probably think, and at the same time, it would distract us from supporting our customers and from focusing on producing future products. One of the concerns 3dfx management obviously had in looking for a deal was continued support for customers. However, I don’t think a lot of people realize just how poor 3dfx’s financial situation has been for the last 6 months. We’ve all heard horror stories about buying a product from a company that it turns out has gone out of business, leaving no options for support or product returns. If 3dfx had not made the deal they did, they probably would have already shut down all end-user support, whereas if you go to their site today, you’ll see that 3dfx customers still can get end-user support. (If you doubt me, a quick look at 3dfx’s balance sheet should convince you that the company was really on death’s door. Even though I did not know about the deal until the day it happened, it was plain as day to me last October that 3dfx as an independent company was done for. When I read message boards, I couldn’t understand how so many folks could be in such deep denial about it.)

I remember when we were the “good guys”. Now, for some strange reason, all sorts of people seem to think of us as some kind of “evil big corporation”. As far as I can tell, not a single thing in our behavior has changed; we’re simply on the other side of the “root for the underdog” equation. I don’t understand it. If striving to always put out the best product and wanting to be part of a successful, industry-leading company is a crime, consider me guilty as charged.

  • Matt

>When I read message boards, I couldn’t
>understand how so many folks could be in
>such deep denial about it.

You just WAIT until the AMIGA comes back!

ELVIS will introduce it!

Originally posted by mcraighead:
Geez, it was a joke, get over it.

Actually, in a way, it wasn’t a joke. As far as you are concerned, you get paid to update and fix drivers and provide developer support for the GeForce cards. Your job isn’t to support Voodoos and Radeons, so in a manner of speaking, you were telling the truth.

[This message has been edited by LordKronos (edited 02-22-2001).]

mcraighead, I think one of the reasons some people now consider NV a “bad company” is that back in the old days, there weren’t many proprietary features in your HW. Sure, the TNT had 32-bit support and even a stencil buffer, but since the majority of consumer boards didn’t, almost no games supported it back then.
Now the situation is different, and there are probably a few owners of 3dfx video boards who don’t like it when they take a look at their GeForce friends’ bump-mapped games with per-pixel lighting.
Just a thought, I’m probably all wrong

But I think NV should work more with ATI (and maybe Matrox, but … well, we’ll see what will happen to them in the next two years) to set more standards. I personally don’t care about supporting Voodoo[1|2|3|4|5], but I do care about supporting the Radeon, since it is about equal to the GeForce in features.
The ARB doesn’t have to approve the extensions, IMO. If there were something like NV_vertex_program and some pixel shader extension, and you both supported ATI’s pn_triangles extension, it would be just fine.
The important thing is that extensions are supported by multiple vendors, not that they are called “ARB_something”.

I think NV is a damn fine company, and I think ATI is almost as fine, but writing proprietary extensions is evil. And stupid… what’s the point of having one NV pixel shader extension and one ATI pixel shader extension? (ATI is working on one, and I assume you are too.)

Work together, take OpenGL to the skies, and I’ll like you even more. I think that’s enough ranting for today…

Originally posted by mcraighead:
[b]Geez, it was a joke, get over it.

  • Matt[/b]

I understood it was a joke, but it had the tone of the overall attitude I see coming more and more from nVidia these days. nVidia’s attitude has most definitely changed. The care for the customers is gone. (Riddle: if ATi and nVidia were both to define a getCustomerSupport(char *problem) function, how would they differ?
A: nVidia’s would return void.)
I see nVidia walking the same way as M$ and Intel, and it’s really sad. Sure, the underdog stuff is also part of the equation. But in the end, ATi is the bigger company and has been for a long time, yet they still don’t have that attitude. 3dfx lost many customers because of their attitude. I think nVidia will now too, especially among their potential customers (the old 3dfx customers).

Originally posted by Siigron:
writing proprietary extensions is evil. and stupid…

This is one of those attitudes that I simply do not understand at all.

The inescapable fact is that 3D hardware is diverging. SGI previously set the standard (remember that what is today called “OpenGL” was previously a proprietary SGI standard). There were competing architectures, but the SGI architecture won out in the end.

Today, in the consumer space, people have finished copying virtually all of the SGI features. What does that mean? New innovation requires moving in new directions. This means that new chips will be fundamentally incompatible with each other.

This is not optional – in fact, it is a prerequisite for innovation. If every company moved in the same direction, there would be no way for any one company to produce anything novel!

The inevitable result of this inevitable trend is that there will be more and more proprietary extensions.

Furthermore, design cycles are long enough that companies can’t just “get together” and “agree” on a common feature set. Even if this does occur, the ARB is really not the best forum for it, for a number of reasons.

The choice is simple: you can either have proprietary extensions, or you can have no innovation. I think you’d rather have proprietary extensions.

  • Matt