glPolygonStipple



remdul
01-17-2011, 03:48 AM
Am I correct that GL_POLYGON_STIPPLE has been removed from the spec since GL3.x? I suppose the intended replacement is to drop fragments in the shader, using a texture as the bitmask?

Bit of a pity; it was an elegant way to do smooth transitions between different LOD levels of geometry without complex shaders, and it doesn't have the transparency-ordering problems of alpha blending. And alpha-to-coverage only works if MSAA is enabled.

Also, has anyone ever had compatibility problems with GL_POLYGON_STIPPLE in pre-GL2.x? For example, in combination with multisampling? It seems to work just fine on nVidia hardware, anyway.
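The shader-and-texture replacement mentioned in the first paragraph can be sketched roughly like this. Below is a hypothetical CPU-side Python simulation of the per-fragment test (all names are mine, not from any real API); in GLSL the equivalent would be to fetch a 32x32 mask texel at `ivec2(gl_FragCoord.xy) % 32` and `discard` the fragment when the mask bit is off:

```python
# CPU-side sketch of the stipple test a fragment shader would perform.
# GLSL equivalent (roughly):
#   if (texelFetch(stippleMask, ivec2(gl_FragCoord.xy) % 32, 0).r < 0.5)
#       discard;

def make_checker_mask(size=32):
    """A 50% checkerboard stipple mask (True = keep the fragment)."""
    return [[(x + y) % 2 == 0 for x in range(size)] for y in range(size)]

def keep_fragment(mask, frag_x, frag_y):
    """Emulates the per-fragment discard: index the mask with the
    window-space coordinate modulo the mask size, just as the old
    32x32 glPolygonStipple pattern repeated across the window."""
    size = len(mask)
    return mask[frag_y % size][frag_x % size]

mask = make_checker_mask()
# A 50% mask keeps exactly half of any aligned 32x32 tile of fragments.
kept = sum(keep_fragment(mask, x, y) for x in range(32) for y in range(32))
print(kept)  # 512 of 1024 fragments survive
```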

mhagain
01-17-2011, 05:45 AM
I can't say for certain but I have a strong suspicion that GL_POLYGON_STIPPLE is one of those semi-exotic OpenGL 1.0 features that is unlikely to be hardware-accelerated on modern consumer cards.

Things working on NVIDIA are not normally an indication of them being supported elsewhere, as NVIDIA is well known to be quite lax with some of the stricter elements of the spec.

remdul
01-17-2011, 06:11 AM
Actually, I think one of the reasons it appears well-supported (and fully hardware accelerated) by nVidia (and hopefully others) is that it has been used by CAD software, so I wouldn't call it 'exotic' per se. And everyone knows nVidia caters very sweetly to those customers (one of the things that led to the GL3.0 controversy). I was hoping the same applied to other vendors.
Polygon stipple also appears to be used in certain window systems for GUI rendering.

I suspect that modern hardware has similar functionality in the chip either way. Features like the stencil buffer and scissor test involve similar operations, and even the pixel-ownership test is still there in the latest specs.

Personally, I've found nVidia to adhere to the spec rather closely compared to other vendors (*cough* Intel), as far as the PC platform is concerned anyway.

aqnuep
01-17-2011, 07:24 AM
Polygon stipple is not supported in hardware; it is actually "emulated" by the driver using an internal fragment shader and texture, exactly the way you would do it with core GL3.


Bit of a pity; it was an elegant way to do smooth transitions between different LOD levels of geometry without complex shaders, and it doesn't have the transparency-ordering problems of alpha blending.

Actually, polygon stippling has nothing to do with alpha blending; it is done with alpha testing, which is completely different. It is order-independent because nothing is written to the depth buffer for the transparent fragments; they are simply discarded.

remdul
01-17-2011, 12:16 PM
Actually polygon stippling has nothing to do with alpha blending
I'm aware of that. What I meant is that if you crossfade between two LOD levels using alpha blending, you can see the insides of the geometry, you need to depth-sort objects, and two models at 50% opacity don't add up to 100% where they overlap. So blending is icky for this.
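The arithmetic behind that complaint can be made concrete. Here is a small illustration of my own (not from the thread) showing why "over"-blending two 50%-opaque LODs leaves a hole where they overlap, while complementary screen-door masks give exact full coverage:

```python
def over_alpha(a_src, a_dst):
    """Resulting coverage of the Porter-Duff 'over' operator."""
    return a_src + a_dst * (1.0 - a_src)

# Two LOD levels cross-faded at 50% opacity each: where they overlap,
# the combined coverage is only 75%, so the background shows through.
print(over_alpha(0.5, 0.5))  # 0.75

# With screen-door stippling, the two LODs use complementary masks:
# every pixel is covered by exactly one model, so coverage is exactly
# 100% and no depth sorting is needed.
mask_a = [(x + y) % 2 == 0 for y in range(2) for x in range(2)]
mask_b = [not m for m in mask_a]
coverage = [a or b for a, b in zip(mask_a, mask_b)]
print(all(coverage))  # True
```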

mhagain
01-17-2011, 12:29 PM
In general terms, if a feature has been deprecated it's because either (a) it's not hardware accelerated on a sizeable enough proportion of modern consumer hardware, or (b) it's emulated through some other mechanism (that you can normally code yourself) on the same hardware. (This isn't an exhaustive list of reasons, but it serves for the purpose of this discussion.)

CAD customers are very different to most consumers, and generally use graphics hardware from the professional/workstation realm. That means NVIDIA Quadro, ATI FireGL, or whatever the current players in the market are. These cards typically accelerate parts of the pipeline and/or API that consumer cards don't. It's comparing apples to oranges.

In the case of NVIDIA, their OpenGL driver is known to be very forgiving of bad code. That's what I mean when I say that they're "quite lax with some of the stricter elements of the spec". Something working on NVIDIA is not a reliable indicator of the same thing working with other vendors.

Specifically with deprecated features, NVIDIA are on record as saying that they will never deprecate a feature in their driver.

So the moral of the story is that if you want your code to work reliably and perform well on platforms other than NVIDIA (or CAD hardware), don't use this particular feature.

remdul
01-19-2011, 01:43 PM
Probably, yes; that's why I wanted to ask.


In the case of NVIDIA, their OpenGL driver is known to be very forgiving of bad code. That's what I mean when I say that they're "quite lax with some of the stricter elements of the spec". Something working on NVIDIA is not a reliable indicator of the same thing working with other vendors.
That is true, and I found this out myself. In particular, the "disable error reporting" option being enabled by default in the driver control panel should be outlawed in future specs. Concerning GLSL, if you explicitly define the version (#version directive), you can get pretty strict spec compliance. From experience, I've found that the NV GLSL compiler is then as strict as ATI's.

(I hope I didn't sound like an nVidia fanboy/apologist here, NV still sucks in other ways.)

Lance Corrimal
01-11-2012, 05:02 AM
So what exactly do I do now if I need something to replace glPolygonStipple with?

V-man
01-11-2012, 08:34 AM
Option 1: don't use a GL 3 forward-compatible context. Use OpenGL 1.1 - 2.1; glPolygonStipple is available.

Option 2: use GL 3 but create a backward-compatible (compatibility) context (available on Windows with AMD and nVidia). glPolygonStipple is available.

Option 3: use GL 3 with a forward-compatible context and don't use glPolygonStipple.
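The three options boil down to a simple availability rule. A toy Python summary (purely illustrative; this is not a real GL query, just the decision table restated):

```python
def polygon_stipple_available(major_version, forward_compatible):
    """Toy restatement of the options above: glPolygonStipple exists in
    GL 1.1-2.1 contexts and in GL 3+ backward-compatible contexts, but
    not in a forward-compatible context."""
    if major_version < 3:
        return True               # Option 1: legacy context always has it
    return not forward_compatible  # Option 2 vs. Option 3

print(polygon_stipple_available(2, False))  # True  (option 1)
print(polygon_stipple_available(3, False))  # True  (option 2)
print(polygon_stipple_available(3, True))   # False (option 3)
```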

michagl
01-11-2012, 06:04 PM
Having worked with games from around the turn of the century, I can assure you a lot of stuff has been deprecated / is not supported by drivers targeting older APIs like DirectX. Dither is probably the biggest one, but there are many.

I implemented a dither shader... and I gotta say, I was surprised. A dither filter looks better to me than any linear filter. It doesn't suffer from the left-right/top-bottom bias of the linear filters. The eye sees it differently, and even very amateur textures look very nice. Anyway, taking out dither really screwed the pooch for most older games. They look really bad with all the banding you get without dither at 16-bit colour.
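An ordered-dither filter like the one described can be sketched as follows; this is a minimal Python version of my own using the standard 4x4 Bayer threshold matrix (names are illustrative), quantizing each pixel by a threshold that varies with screen position, which is exactly what breaks up the banding:

```python
# Standard 4x4 Bayer threshold matrix, values 0..15.
BAYER4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_to_1bit(value, x, y):
    """Quantize a 0..1 intensity to 0 or 1 using a screen-position
    threshold; neighbouring pixels get different thresholds, so flat
    areas turn into a pattern instead of a hard band edge."""
    threshold = (BAYER4[y % 4][x % 4] + 0.5) / 16.0
    return 1 if value >= threshold else 0

# A flat 50% grey dithers to a pattern with exactly half the pixels on:
tile = [dither_to_1bit(0.5, x, y) for y in range(4) for x in range(4)]
print(sum(tile))  # 8 of 16 pixels on
```

In a shader the same lookup would be done per fragment from `gl_FragCoord`, with more quantization levels for 16-bit colour rather than a 1-bit output.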

Color keying is another thing that is poorly supported, if supported at all.

EDITED: Oh yeah, I never heard of using stipple for LOD. Does it really look good? What games do that, for instance?