Blending errors on Nvidia GF4MX460 (OT?)

Hello all!

This might be a little bit off topic, but perhaps there is an on-topic solution.
My problem: I have a new ASUS V8170pro graphics card with a GeForce 4 MX 460 and 64 MByte. When I use additive blending (GL_ONE, GL_ONE) I see bright lines at the edges of adjacent triangles. If you know the Microsoft software renderer, you know what I mean. These lines disappear when I use FSAA (2x) or if I use my ‘old’ GF2Ti. I use Win XP and the 28.32 drivers.
Do the glHint calls have anything to do with this behaviour? My initialisation contains the following lines:
glHint( GL_POLYGON_SMOOTH_HINT, GL_DONT_CARE );
glHint( GL_POINT_SMOOTH_HINT, GL_DONT_CARE );
glHint( GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST );

Can I somehow set a flag or something so that adjacent triangle edges are not rendered twice?
Thank you in advance.

Marc

What kind of primitives? Try triangle strip.

Stencil could help, but …

V-man

Quads, quad strips and triangle strips.

If I’m right, edges shouldn’t be rendered twice in OGL, should they?

Originally posted by Marc:
If I’m right, edges shouldn’t be rendered twice in OGL, should they?

If you mean that 2 triangles which share an edge will not both draw the same pixel, that is correct. Every pixel along a shared edge will “belong” to one triangle or the other, never both.

OK, and my problem is that on this particular graphics card the pixels along shared edges are drawn twice, resulting in a bright line around every blended triangle (as long as the blend color isn’t black). I think this is a flaw in the MX460 section of the XP drivers (is the OGL part different from that for the GF2Ti? :frowning: ).
Ahhh, I forgot to mention that the rest of the computer is an ASUS A7N-266C motherboard (Nvidia chipset) and an Athlon XP 2000+. On the same board with the same driver everything runs fine with a GF2Ti (NO new driver installation; the driver/XP detected the different graphics card without any complaint).
Perhaps this helps someone from Nvidia. I’m sorry that I’m not able to deliver any images at the moment except by eMail.

I dunno if that helps but…
http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/006483.html

Y.

The hints alone would not do that.
From the given behaviour you have glEnable(GL_POLYGON_SMOOTH) somewhere, and pixels along shared edges get coverage values calculated from both triangles sharing that edge. With the “wrong” blending modes this will give seams.
And as the thread linked above shows, this does not happen in multisample modes, because polygon smoothing is switched off then.
Standard rendering produces no double hits or gaps on edges shared by two polygons. Only apps using geometry with T-vertices will show that.

I switched off GL_POLYGON_SMOOTH and everything works fine!!! THANKS!
Why do all these quality options give bad results? Why are these options ignored by old graphics cards and interpreted wrongly on new ones? Any ideas?

“Why do all these quality options give bad results?”

They don’t, if they’re used as they were meant to be.

Look at the Red Book chapter 6 on Polygon Antialiasing:


Now you need to blend overlapping edges appropriately. First, turn off the depth buffer so that you have control over how overlapping pixels are drawn. Then set the blending factors to GL_SRC_ALPHA_SATURATE (source) and GL_ONE (destination). With this specialized blending function, the final color is the sum of the destination color and the scaled source color; the scale factor is the smaller of either the incoming source alpha value or one minus the destination alpha value. This means that for a pixel with a large alpha value, successive incoming pixels have little effect on the final color because one minus the destination alpha is almost zero. With this method, a pixel on the edge of a polygon might be blended eventually with the colors from another polygon that’s drawn later. Finally, you need to sort all the polygons in your scene so that they’re ordered from front to back before drawing them.

(Or search the forum, you were not the first with this problem.)

“Why are these options ignored by old graphics cards and interpreted wrongly on new ones?”

Older chips (or drivers) probably didn’t support it. Newer ones do the right thing (even if it’s slow sometimes).

Oooops…Hmmm…should read more…

Thanks!