View Full Version : Rendering quality of graphics primitives



Qu0ll
12-31-2009, 11:24 AM
I come from a Java background and I am curious as to why the rendering of primitives like lines, ellipses and curves in OpenGL results in a much coarser appearance than with a CPU-based graphics toolkit like Java2D. Even when using multisampling and with the maximum antialiasing settings configured in the graphics card driver, Java2D consistently renders these primitives much smoother.

Is there some inherent reason why rendering of basic graphics is of poorer visual quality in OpenGL? I don't quite understand because surely both are addressing the same frame buffer and manipulating the same pixels?
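For reference, here is a minimal sketch of the Java2D side of the comparison — the `AALineDemo` class, the `render` helper, and the 64x64 image size are made up for illustration, but `RenderingHints.KEY_ANTIALIASING` is the standard hint that switches Java2D's rasterizer into its smooth, coverage-based mode:

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;

public class AALineDemo {
    // Render a diagonal line into an offscreen image, with or without AA.
    public static BufferedImage render(boolean antialias) {
        BufferedImage img = new BufferedImage(64, 64, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = img.createGraphics();
        g.setColor(Color.WHITE);
        g.fillRect(0, 0, 64, 64);
        if (antialias) {
            // This single hint is what makes Java2D primitives look smooth:
            // the software rasterizer computes per-pixel coverage and
            // blends the edge pixels accordingly.
            g.setRenderingHint(RenderingHints.KEY_ANTIALIASING,
                               RenderingHints.VALUE_ANTIALIAS_ON);
        }
        g.setColor(Color.BLACK);
        g.drawLine(2, 5, 60, 40); // shallow diagonal: shows stair-steps without AA
        g.dispose();
        return img;
    }
}
```

With the hint on, the line's edge pixels come out as intermediate grays (partial coverage) rather than pure black or white — that per-pixel coverage computation is exactly the quality being compared against the GPU path.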

Alfonse Reinheart
12-31-2009, 01:49 PM
First, this has nothing to do with OpenGL per se. You'd get the same results from D3D. This has to do with graphics cards.

Second, graphics cards do not specialize in rendering lines, curves, and ellipses. They specialize in rendering triangles, using programs that run on the GPU to generate the vertex and pixel data for those triangles. I'm guessing that Java2D isn't very good at exploiting that.

What graphics cards are optimized for is that particular situation: triangle rendering driven by programs that determine the output. Drawing a line in a particular color is not something they spend a lot of silicon on.

Third, graphics card makers care about performance first and foremost. Can you render a complex Java2D scene at 75fps, where you're pulling from hundreds of megabytes of image data per frame? I rather doubt it.

Graphics card makers simply do not care about high-quality line antialiasing. That kind of specialized scan conversion is simply not an important case for cards that are primarily meant to play games on.

Dark Photon
12-31-2009, 06:12 PM
I come from a Java background and I am curious as to why the rendering of primitives like lines, ellipses and curves in OpenGL results in a much coarser appearance than with a CPU-based graphics toolkit like Java2D.

Have you tried:


glEnable (GL_LINE_SMOOTH);  // coverage-based line antialiasing
glEnable (GL_BLEND);        // LINE_SMOOTH needs blending to have any effect
glBlendFunc (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glHint (GL_LINE_SMOOTH_HINT, GL_NICEST);  // ask for best quality, if the driver offers a choice
glLineWidth (1.5f);         // non-integer widths are allowed for smooth lines

Dark Photon
12-31-2009, 06:15 PM
Graphics card makers simply do not care about high-quality line antialiasing. That kind of specialized scan conversion is simply not an important case for cards that are primarily meant to play games on.
Alfonse, this is just flat false. The entire world isn't games. You forget (or don't realize) that CAD/DCC is a big market for GPU vendors. Since the early days of high-end graphics cards, line AA has been supported. I believe this has been pushed down to the low end as well. There's nothing super hard about it.

Alfonse Reinheart
12-31-2009, 08:25 PM
Since the early days of high-end graphics cards, line AA has been supported. I believe this has been pushed down to the low end as well.

You will note that I said "high-quality line antialiasing." What you get with GL_LINE_SMOOTH is not high-quality compared to any decent software 2D rasterizer.

Hardware line scan conversion is functional, not good.