View Full Version : Radeon 9700 gamma corrected AA

08-19-2002, 06:01 PM
Looks like someone finally did antialiased sample weighting the right way! Has anyone used this? Is there a second gamma factor, or does it make an assumption about the monitor's gamma response? How is the subsample gamma factor controlled?

"SmoothVision 2.0 - Now with gamma correction and Anisotropic Filtering for free

In addition, ATi has improved on their SmoothVision engine for Anti-Aliasing and Anisotropic Filtering. SmoothVision 2.0 utilizes a 2X, 4X or 6X Multi-Sample AA approach but also includes a new gamma correction technique. In addition to sampling jagged image pixels in a given scene, the Radeon 9700's SmoothVision engine also adjusts gamma correction for those samples when they are applied and it determines the best color uniformity for each pixel. ATi claims this will produce superior AA image quality compared to anything on the market. "

08-19-2002, 09:32 PM
that's sweet as well..

.. about the fsaa and anisotropic: how it looks, well, we'll see (at least i will, soon.. someday :D), but about the "for free":
well, there is a drop, but it's quite okay imho, if you think about how many more samples are done per pixel...

[This message has been edited by davepermen (edited 08-20-2002).]

08-19-2002, 11:44 PM
I'm very impressed by the Radeon 9700. I have yet to find any weak spots on it. :) Massive performance lead, especially with AA and anisotropic. Not to mention the quality of the AA. :)

I find these shots from the hothardware review quite telling:
GF4 4X (http://www.hothardware.com/reviews/images/Parhelia512test/jk2aashots/gf410244xaa.htm)

Parhelia 16X (http://www.hothardware.com/reviews/images/Parhelia512test/jk2aashots/parhelia102416xfaa.htm)

Radeon 9700 6X (http://www.hothardware.com/reviews/images/r9700/jkii6xaa.htm)

The difference really shows at the windows in the upper right corner, which look perfectly clean on the R9700 but kind of aliased on everything else. There's a slight difference in viewing angle between the shots, but it shouldn't make that big a difference.

Edit: Corrected 4x to 6x on the R9700 link.

[This message has been edited by Humus (edited 08-22-2002).]

08-20-2002, 12:05 AM
haha, gf4 4x looks CRAPPPPPPPPPPPPP :D

08-20-2002, 08:17 AM
I have yet to find any weak spots on it

Well, it'd be nice if the higher pixel precision lasted all the way through to the framebuffer. I'm pretty sure it can only be 10 bits per component. I don't know if this is true of the 9700 (correct me if I'm wrong; it was hard to get exact info out of the ATI guy I talked to at Siggraph... I have at least two conflicting reports about the 9700's precision from him), but I believe that some cards with 10 bits per component do 10-10-10-2, so they skimp on alpha. I think the Matrox Parhelia works like that.
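The 10-10-10-2 layout mentioned above is easy to sketch. A minimal Python illustration of packing and unpacking such a pixel into one 32-bit word; the component order is an assumption for illustration, since the actual bit layout varies by API and hardware:

```python
def pack_1010102(r, g, b, a):
    # r, g, b: 10-bit color components (0..1023); a: 2-bit alpha (0..3).
    # Component order is illustrative; real hardware layouts vary.
    assert 0 <= r < 1024 and 0 <= g < 1024 and 0 <= b < 1024 and 0 <= a < 4
    return (a << 30) | (b << 20) | (g << 10) | r

def unpack_1010102(word):
    # Reverse of pack_1010102: extract the four components again.
    return (word & 0x3FF, (word >> 10) & 0x3FF, (word >> 20) & 0x3FF, word >> 30)
```

The point of the format is visible in the alpha field: only four distinct alpha levels fit, which is the "skimping" being complained about.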

Also, I think branches and loops in the programmable parts of the card would be nice.

Correct me if I'm wrong about any of this, though! I don't have the patience to wait for ati.com to load so I haven't spent much time reading anything official.

-- Zeno

08-20-2002, 08:27 AM
Originally posted by Zeno:
Well, it'd be nice if the higher pixel precision lasted all the way through to the framebuffer. I'm pretty sure the it can only be 10 bits per component.

Isn't it only 10bits/Channel on the RAMDAC? (with the framebuffer being the full 32bit float/channel)

08-20-2002, 08:29 AM
The floating point pixels don't need to survive all the way to the framebuffer. All calculations can be done in floating point by rendering to offscreen buffers (dithered down to 10 bits per component when the final image is to be displayed).

Regarding the AA - I really don't see that big a difference in quality between the three cards. The GeForce4 has high quality AA, and with moving images, who's going to notice small artifacts?

08-20-2002, 08:32 AM
haha, gf4 4x looks CRAPPPPPPPPPPPPP

I think you're being a bit harsh. I can't find any significant differences at all.

I think the windows look better on the GF4. The AA on the new Radeon is a lot better, but we don't know what type of AA they used on each card.


08-20-2002, 08:44 AM
Regarding the AA - I really don't see that big a difference in quality between the three cards. The GeForce4 has high quality AA and with moving images, who's going to notice small artifacts.

I agree. I think the whole anti-aliasing thing is overrated. I'll take anisotropic filtering over anti-aliasing any day, and I'll even be able to notice it when I move around.

-- Zeno

08-20-2002, 11:33 AM
The only place I could really tell a difference is on the stairs. You get the classic example of aliasing on the other two cards, but the ATI looks better than I've ever seen. But, when somebody is coming after you with a gun like that, who has time to admire the stairs ^_^

08-20-2002, 12:46 PM
The underside of the gangway, where it contrasts with the sky, also shows rather worse for the GF4 than for the Parhelia or the 9700.

Also, the ceiling lights in the entranceway on the left look bad on both GF4 and Parhelia, but OK on the 9700.

On the other hand, what do you expect? The GF4 is a LAST generation card. I mean, you've been able to actually BUY it -- for MONTHS!

[This message has been edited by jwatte (edited 08-20-2002).]

08-20-2002, 12:53 PM
they have 96bits per pixel in the pixelshader => 24bits per component

they have a full 128bits per pixel framebuffer

and we could start the debate (==flamewar):

you're BLIND if you don't see the difference in the aa all over the image (edges :D)..

well, i see them, very good.. the others just look so jaggy..

but i know of many people who haven't even noticed ANTIALIASING at all..

its like the ones that can hear the difference between 320kbit mp3 and the original cd :D.. (i can hear the lost quality! i can HEAR IT! :D)

08-20-2002, 01:35 PM
I had another look at the images and I can see the Radeon 9700 image looks far better that the others ( around the spherical base of the sculpture ).
Still, with animations I doubt anyone is going to notice the differences.

I can clearly see the difference between AA and no AA, but I can't see the difference between 2x, 4x, Quincunx, etc. when moving around. Maybe others can?

Anyway, I think the 9700 is one excellent GPU with a lot of great features ( and still able to perform well ).

08-20-2002, 02:12 PM
Ok, I lined the images up in two different browser windows and flipped back and forth between them and I can tell now that the 9700 is much better at anti-aliasing.

I'm not sure if I could tell which card I was playing on, though, if I were to take a blind test without seeing the other one next to it.

One other interesting thing I noticed is that some of the coloration is different between the two pictures. For instance, if you look at the wall at the top right of the stairway, it is much darker in the 9700 image than in the gf4 image. Food for thought http://www.opengl.org/discussion_boards/ubb/wink.gif

-- Zeno

08-20-2002, 05:59 PM
FSAA sucks if you ask me. Each pixel gets color from its neighbors, no matter what the algorithm is, and even if it gives the illusion of smoothness, the image is no longer what it was meant to be. It's kind of like jpeg compression.

It's much better to have *higher resolution screens* than FSAA. At the very least, they should attempt edge AA.

Try this if you don't see the difference. Take a snapshot of the upper right picture of the guy. Go into Photoshop or whatever, and subtract.

What we really need is smaller phosphors on the screen, more video memory and higher resolution (say 4000x4000). More colors (12 bits per component perhaps).


08-20-2002, 09:42 PM
well, vman, you're just plain wrong.. edges still get jaggy, and independent of resolution, the jagginess is easily visible..

why does a dvd on the tv, like shrek, look far more detailed than any game at 800x600? its not the resolution.. its just that there is more detail per pixel..

and they don't get the pixels from the left, the right, etc.. they _DO_ render the screen at a higher res, and downsample it (directly, in hw; on my gf2mx its done in software => ultra-high-res rendering, then sampled down with a box filter)..

so its the same..

the amount of pixels on the screen is enough.
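The box-filter downsample described here is just a 2x2 average per output pixel. A toy Python sketch of a 2x supersample resolve on a grayscale image (nothing hardware-specific, purely for illustration):

```python
def downsample_box_2x(img):
    # img: a 2N x 2M grid of grayscale values rendered at double resolution.
    # Each output pixel is the plain (box filter) average of one 2x2 block.
    h, w = len(img) // 2, len(img[0]) // 2
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4.0
             for x in range(w)]
            for y in range(h)]
```

An edge pixel half covered by white over black comes out as 0.5, which is exactly the naive linear weighting the gamma discussion later in this thread takes issue with.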

08-20-2002, 11:01 PM
I think the fact that people are having to line images up side by side in browsers and squint speaks volumes about the value of fsaa. It's an easy feature to implement, and is just something more to put on the box blurb.
The only time fsaa is valuable is when you're operating at low resolutions, say in a VR headset, when the jaggies are actually apparent. Most people operate at 1024x768 or above, and do not notice jaggies.
FSAA would certainly not even be in my equation when choosing my next card.
And I have good eyesight :)

08-21-2002, 12:32 AM
well if the images are animated, i normally see it even more..

scenes from quake are a stupid way to show it; take some modern scenes, with grass and trees with many leaves and everything. there you can see the aliasing very much even at high resolutions, if they are for example moving in the wind. (see the new cg demo from nvidia, without fsaa 4x it looks terribly jaggy, even at 1024x768..)

thats why fsaa is here, to smooth edges. now q3 normally doesn't have many edges, but a lot of huge flat polygons. they need anisotropic filtering..

08-21-2002, 12:56 AM
Originally posted by knackered:
It's an easy feature to implement, and is just something more to put on the box blurb.

I wouldn't be so sure about that, as we have seen several different FSAA schemes from different vendors lately, but all tend to have some kind of weakness: not good enough quality, not good enough performance, compatibility problems, or something else. The Radeon 8500, for instance, takes a serious performance hit from enabling FSAA, and the quality improvement isn't good enough to justify it, thus I've always left it off. While I've been able to survive so far without a good FSAA scheme, it does get increasingly annoying looking at aliasing artefacts at all resolutions (yes, even at 2048x1536!). I'll certainly value a good FSAA scheme in my next video card purchase.

08-21-2002, 03:43 AM
I don't agree, so obviously it's very subjective.

Moshe Nissim
08-21-2002, 04:12 AM
Originally posted by V-man:
FSAA sucks if you ask me. Each pixel gets the color from its neighbor,
No. In true multisampled fsaa the samples aren't shared between neighbouring pixels. The infamous 'Quincunx' breaks this rule (with a bad impact on overall picture sharpness...).

The reason NV 4x AA looks bad is that the samples are arranged in a regular grid (parallel to the pixel grid), and therefore near-horizontal and near-vertical edges get the same quality as 2x (2-sample) AA. (You can draw this on a piece of paper and prove it to yourself.)
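Instead of paper, you can count distinct sample offsets along one axis. The positions below are illustrative, not the actual NV or ATI patterns; the point is only that an ordered 2x2 grid presents just two distinct y-offsets to a near-horizontal edge, while a rotated pattern presents four:

```python
# Hypothetical 4-sample patterns, offsets within one pixel (0..1):
ORDERED_4X = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
ROTATED_4X = [(0.125, 0.625), (0.375, 0.125), (0.625, 0.875), (0.875, 0.375)]

def coverage_steps_for_horizontal_edge(samples):
    # A near-horizontal edge distinguishes samples only by their y-offset,
    # so the number of distinct y values is the number of intermediate
    # coverage levels the edge can produce as it sweeps through the pixel.
    return len({y for _, y in samples})
```

With the ordered grid this returns 2, with the rotated grid 4, which is exactly the "same quality as 2x AA" claim above.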

08-21-2002, 09:29 AM
Higher resolutions can never remove aliasing. All it does is help hide it. And that only works for certain types of aliasing.

Standard supersampling, which is just rendering at a higher resolution and scaling down, isn't going to get the job done either. The regular grid approach just doesn't work.

It seems that the Radeon 9700's approach is to sample from a distribution of points that have little-to-no correlation with each other. By doing so, their antialiasing actually has a chance of removing aliasing artifacts.

BTW, normal edge antialiasing (as opposed to Quincunx, which only antialiases edges) requires that polygons be finely sorted back-to-front, and appropriately clipped if they intersect each other. You don't want that.

08-21-2002, 10:50 AM

I don't see what having a 32-32-32-32 backbuffer has to do with aliasing. The front buffer is still 8-8-8 or 10-10-10 (can anyone confirm whether the ATI 9700 actually has a 10-10-10 front buffer mode? I heard it has a 10-bit RAMDAC), and if I draw a white triangle on a black background, all that precision does absolutely nothing to help the stair-stepping. Edge aliasing occurs on contrasting edges, and precision does not eliminate it.

08-21-2002, 03:55 PM
Sample summation with gamma correction makes a HUGE difference. If you don't agree then you need to do some research on this. It is an absolutely fundamental principle of computer graphics.

Your monitor has a non-linear response to voltage. In 2-sample antialiasing, for example, 50% coverage gives (1.0 + 0.0) / 2.0 = 0.5. This will not LOOK like half brightness on your monitor though; it will look more like quarter brightness, which is VERY WRONG - your antialiasing will suck. You must set gamma correction in hardware to around 2.5 for most monitors to get a linear response, and therefore correct sample weighting, on current hardware. Most gamers play with hardware gamma at 1.0, so antialiasing looks crap.

The real bummer is that, for reasons of human perception, with an 8-bit framebuffer you NEED hardware gamma at around 1 to give uniform increments in contrast sensitivity for a human viewer, so setting hardware gamma to around 2.5 introduces banding in darker areas that is unavoidable, even if your artist tries to texture in that space or your game software has smart compensation in there.

The ATI announcement is fantastic for a couple of reasons. More precision in general allows gamma to be set high and avoids banding IF the content is there. But more impressively (and I'm making some assumptions about their real capability here, because they are annoyingly silent on this), I suspect you can set gamma in hardware to the value you want for the content, your monitor and human contrast sensitivity, and the AA samples will STILL be weighted correctly. i.e. instead of (1+0)/2 = .5 it sums to some other value that LOOKS like half brightness; so with gamma-correct AA, (1+0)/2 = .75 for example.
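Those numbers can be checked in a few lines of Python, assuming a monitor response of displayed light = value^2.5 (the 2.5 figure is the one used above; real monitors vary):

```python
GAMMA = 2.5  # assumed monitor response exponent: light_out = value ** GAMMA

# 50% coverage of a white poly over black, resolved naively in framebuffer values:
naive_stored = (1.0 + 0.0) / 2.0         # 0.5 written to the framebuffer
naive_light = naive_stored ** GAMMA      # ~0.177: nearer quarter brightness than half

# Gamma-correct weighting: average the *light*, then re-encode for the display:
correct_stored = ((1.0 ** GAMMA + 0.0 ** GAMMA) / 2.0) ** (1.0 / GAMMA)  # ~0.757
correct_light = correct_stored ** GAMMA  # exactly 0.5: genuinely half brightness
```

So the gamma-correct resolve stores ~0.76 for 50% coverage, matching the "(1+0)/2 = .75" figure above.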

This should make it absolutely clear that gamma-correct AA is essential for good antialiasing, and the lower the gamma correction value in hardware, the more compensation is required in the sample weighting (and since most users have gamma around 1, it will make an incredible difference).

All the blends etc. still produce what appears to be non-linear brightness. Longer term, what is required is more precision in the content (textures etc.), rendering and sample summation in linear space (the way it is now), and gamma correction in hardware with enough precision to avoid the issues currently encountered with 8 bits. That's why high precision framebuffers and gamma tables are also interesting, but developers will have to USE this stuff and start setting hardware gamma high by default in games to get it right.

In the mean time ATI appear to have solved the problem because someone there 'gets it' and has finally put one dirty little secret of the graphics industry to rest. But remember I'm making some assumptions based on a bullet point that looks like they do the right thing.

I'm just wondering how you set the sample gamma on a 9700. To make this work in practice you need two gamma factors: one is the final gamma correction (which already exists on most cards), which can be set for content - user prefs - contrast sensitivity; the other is set to the absolute monitor gamma (or the net difference between user and monitor gamma) and is used to weight the samples. My question still stands: ATI, how do you set the AA sample gamma factor, or is it assumed that the monitor requires correction of around ~2.4?

BTW, looking at a screenshot of gamma-corrected AA on YOUR monitor over the web may be a lost cause. It's not even clear if this feature is enabled, how to set the sample gamma, or even if they do the right thing.

08-21-2002, 04:22 PM
Humus, that 9700 AA shot you posted is 6x not 4x.

08-21-2002, 07:23 PM
In all of my time with 3D graphics, I have never really understood gamma correction/non-linear brightness. Do you know any good books/websites on the subject?

08-21-2002, 11:07 PM
Originally posted by dorbie:
Sample summation with gamma correction makes a HUGE difference. If you don't agree then you need to do some research on this. It is an absolutely fundamental principal of computer graphics.

Here we go again; so now we need to research why aa is a good thing... rather than using our eyes.
It's like the emperor's new clothes...
I just don't buy it - and I suspect consumers (i.e. gamers) won't either.
If an effect isn't noticeable, then it's not worth the fill rate hit.
No offense, Dorbie, and I'm sure you can quote lots of white papers and renderman docs to try and persuade me it's a good thing, but we're talking about gamers here, who are not going to be watching these images on a cinema screen... (at least not for a long while :) ).

08-21-2002, 11:13 PM

There's this issue of Dr Dobb's Journal,

I wrote this for a 'simulation issue' and was prompted to emphasize simulation, but it ended up in a graphics issue. In the article I tried to explain the issue with contrast sensitivity in particular, and why it has the effect it does on precision; I haven't seen anyone else draw those precision graphs before or since.

In that I reference Andrew Glassner's two-volume set "Principles of Digital Image Synthesis",
http://www.amazon.com/exec/obidos/tg/det...=books&n=507846 (http://www.amazon.com/exec/obidos/tg/detail/-/1558602763/qid=1030001475/sr=8-1/ref=sr_8_1/002-6842668-8197668?s=books&n=507846)

And there is Charles Poynton's book "A Technical Introduction to Digital Video".
http://www.amazon.com/exec/obidos/tg/det...=books&n=507846 (http://www.amazon.com/exec/obidos/tg/detail/-/047112253X/qid=1030001540/sr=8-1/ref=sr_8_1/002-6842668-8197668?s=books&n=507846)

It boils down to these points:

1) The linear range of values between 0-1 in your framebuffer does not display as a linear range of brightness on a CRT. In fact, they are considerably darker over most of the range.

2) Correct presentation of images on a CRT must compensate for this inherent flaw in the display technology, either by boosting the voltage on the wire to give uniform brightness, or by adjusting the digital values in an image so that the brightness is boosted. This is called "gamma correction".

3) The human visual system can detect smaller discrete increments in brightness in the darker portions of an image than in the lighter portions, and the gamma curve of a monitor is exploited in (good) digital images, such that 8 bits of information is pretty good at presenting a non-linear range of values to a human.

4) By an amazing fluke, the non-linear response of the monitor is pretty close to the response required to give a perceptually uniform brightness increment at each value of a linear scale for the human eye.

5) Post-framebuffer hardware gamma correction, while desirable for stuff like linear blending (and antialiasing), is actually bad for 8-bit framebuffers, because your eye will see discrete value jumps in the shadows.

6) NO hardware gamma correction, where software gamma correction has been applied to a high quality image to produce an 8-bit gamma-corrected image, is actually a GREAT way to display things like photos, which therefore have gamma correction built in already, but were generated in devices like digital cameras from 10- or 12-bit data.

7) Unfortunately, hardware gamma correction of an 8-bit rendered image falls foul of 3), because you will see banding in dark regions. You need source data at higher precision to gamma correct into an 8-bit gamma-corrected, perceptually uniform range.
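Point 7 can be demonstrated numerically. Gamma-correcting 8-bit values with a 1/2.5 exponent (an assumed figure for illustration) stretches the darks so far apart that the very first step is enormous:

```python
GAMMA = 2.5  # assumed net display gamma being corrected for

# Apply a hardware-style gamma lookup to every 8-bit level:
corrected = [round(255 * (v / 255.0) ** (1.0 / GAMMA)) for v in range(256)]

# The first increment alone spans 28 output levels: a visible band in
# exactly the dark region where the eye is most sensitive.
first_step = corrected[1] - corrected[0]

# Meanwhile the bright end compresses, so distinct levels are lost overall.
distinct_levels = len(set(corrected))
```

So an 8-bit source simply doesn't have enough dark levels to survive the correction, which is why higher-precision source data is needed.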

So you can see the dilemma: you want perceptually uniform rendering, but computers don't draw that kind of stuff. Even with a deep framebuffer, the source data needs sufficient precision.

But think about this for a sec. If you have an 8-bit image generated by a camera for display on a PC, it will have built-in gamma, into perceptually uniform space. You could un-gamma-correct to 12-bit linear space, render in linear space at high precision, and gamma correct in hardware, for great results with no banding; all the arithmetic is perfectly correct and the content is correct. The key is more precision in the framebuffer, but that's not going to work for legacy apps, and frankly most graphics software developers have no clue about this :-( so even new apps are unlikely to improve. It also requires the user or the game to set the card's hardware gamma, which makes the Windows desktop look much brighter than normal.

In the meantime, AA weightings would be wrong for existing and many new applications, because they just won't do this. The solution? Render in gamma-corrected space (or make the assumption that you do), un-gamma-correct the fragments, sum in linear space, and gamma correct the result to give accurate sample weightings for your display.

The dilemma is how much to un-gamma-correct and re-gamma-correct by. This depends on the hardware gamma setting and the display gamma. Ideally, this subsample gamma correction factor should be the net correction between hardware and display gamma. Typically, with a normal display and your average PC, the fragments would have to be uncorrected by a factor of 2.4, summed, and recorrected, to give correct weightings for the display.
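That resolve step could be sketched as a tiny function. The 2.4 default is the suggested net factor from the paragraph above, and the whole routine is an assumption about what such hardware might do, not ATI's documented method:

```python
def resolve_gamma_correct(samples, net_gamma=2.4):
    # samples: per-pixel AA sample values, assumed to live in gamma-corrected
    # space. Undo the display gamma, average in linear light, re-encode.
    linear = [s ** net_gamma for s in samples]
    return (sum(linear) / len(linear)) ** (1.0 / net_gamma)

def resolve_naive(samples):
    # The plain average, which under-weights partially covered bright edges.
    return sum(samples) / len(samples)
```

For a half-covered white-on-black edge, resolve_naive gives 0.5 while resolve_gamma_correct gives ~0.75, the corrected weighting argued for throughout this thread.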

Again, I don't know for sure if this is what ATI opted to do (it is what I have advocated in the past), but you can see the assumption in that last sentence. The number I used is 2.4; that needs to be programmable. This needs to work for different displays, and it needs to be the NET of gamma correction and actual display gamma, with the assumption that the hardware gamma is insufficient for the display (the motivation for doing this in the first place).

[This message has been edited by dorbie (edited 08-22-2002).]

08-21-2002, 11:25 PM
knackered, no, you need to research why a gamma-correct sample sum is a good thing for AA (actually, understanding my preceding post would do it). The fact that you think AA sucks is BECAUSE your card's gamma correction is set to 1, while your monitor requires gamma correction of around 2.4.

Try this: run something with AA (a white poly on a black background would be ideal because the gamma won't affect the overall appearance of the image). Draw with AA and hardware gamma set to 1, then look at it with your gamma correction set to 2.5 in your advanced display settings. Then tell us what you think of AA. Now, wouldn't it be nice to get that effect on AA in all your game images without screwing up the overall image brightness in the mid-tones? Exactly!

I don't need to debate whether AA is a good thing. The rest of the graphics industry saw years ago that AA is essential to image quality. Feel free to catch up whenever.

The performance hit for AA is diminishing with each generation of cards designed for it. The point is, if you are going to take that hit, you should get the appropriate return on investment; what is asinine is turning on AA with all that fancy Quincunx etc. and ending up with the wrong sample weightings even after you take the hit.

[This message has been edited by dorbie (edited 08-22-2002).]

08-22-2002, 07:38 AM
Dorbie, great explanation. I simply could not see it before, but now its obvious. Thanks.

08-22-2002, 07:48 AM
Originally posted by dorbie:
Humus, that 9700 AA shot you posted is 6x not 4x.

Yeah, my mistake. Fixed it now ..

08-22-2002, 08:48 AM
No dorbie, you've missed my point. I'm not complaining about the quality of current AA techniques; I'm questioning whether or not it is necessary to use AA at all, in any form.
You're a graphics programmer - it's your business to look for ways to improve images; you look more closely at images than consumers do. A gamer (which is who 'consumer' cards are aimed at) does not look this closely at polygon boundaries, and tends not to notice jaggies anymore when operating at a reasonable resolution - go ahead, ask a gamer his opinion on aliasing artefacts; the silence will deafen you.
Feel free to catch up? You're very cheap Dorbie, and not a little anal.
Just wanted to clarify what I was actually saying - feel free to carry on having a conversation with yourself, Dorbie.

08-22-2002, 08:59 AM
Originally posted by davepermen:
well, vman, you're just plain wrong.. edges still get jaggy, and independent of resolution, the jagginess is easily visible..

why does a dvd on the tv, like shrek, look far more detailed than any game at 800x600? its not the resolution.. its just that there is more detail per pixel..

and they don't get the pixels from the left, the right, etc.. they _DO_ render the screen at a higher res, and downsample it (directly, in hw; on my gf2mx its done in software => ultra-high-res rendering, then sampled down with a box filter)..

so its the same..

the amount of pixels on the screen is enough.

True, they are not getting the final color from left and right. I'm not saying that downsampling (these methods are called high resolution AA) doesn't have its merits, but the problem is that it tends to give a dull look in cases where brightly colored pixels are surrounded by lesser ones.

Doing edge AA will reduce that.

My final point about increasing monitor and video resolution was about forgetting about AA methods completely. If you increase the resolution high enough, your eyes (brain) will do the merging of pixels and, as a bonus, you won't be seeing jagginess. It would be like looking at a photograph.

I don't know about you guys, but I can see the individual pixels (the phosphors) on my monitor and the spaces between them.
Oh yes, Shrek. It's hard to see the details, since TV images (old ones anyway) are jumpy and the vertical resolution is 525 lines, I hear. Try watching it on a computer. Big difference!


08-22-2002, 10:13 AM
Feel free to catch up? You're very cheap Dorbie, and not a little anal.

Anal or not, Dorbie's right. Antialiasing is essential to photorealistic results. Sure, it may not matter for current games, but things like OpenGL 2.0 and so forth aren't about current games. They're about the future. And the future is photorealistic graphics.

There's a reason that ATi put the effort into anisotropic filtering (which, btw, is a form of antialiasing) and other image-quality-enhancing features: image quality is the future. Right now, you can run modern games on a 9700 at 1024x768 with good antialiasing and high-quality visuals at acceptable framerates. No other card can do that.

In terms of gaming, this is nothing less than a revolution. Most gamers can't play a game at much more than 1280x1024 anyway (due to monitor limitations), so dropping down to 1024x768 to get a better image is hardly an unreasonable idea. And, regardless of whether they claim to notice aliasing or not, they subconsciously notice it. You'll never get photorealistic rendering (or anywhere close to it) without antialiasing.

08-22-2002, 11:54 AM
AA does make a lot of sense. With rotated-grid multisampling, you can achieve drastically better image quality at a lower performance cost than by upping the resolution.
And that's the real kicker. 4x AA on a GF4 won't do you much good; it's just a bad tradeoff (because it's an ordered grid). But RGMSAA is IMO the best thing since sliced bread.

Anyone with a GF4 Ti can do a little experiment:

1) Fire up a game, any game, but set it to 800x600. Look around closely.
2) Activate 2x AA, then fire up the game again, this time set to 640x480.

Now, which setting is better? You already know my opinion :D

08-22-2002, 12:28 PM

Screw gamers. The hardcore gamer is a halfwit who probably runs without lighting because he thinks 180fps is better than 120fps. Does that mean we don't need hardware lighting either?

With graphics heaven is in the details and it's the subtle stuff that makes all the difference. The stuff you don't notice until it's taken away.

Those of us in film, tv, science have been waiting for this stuff for way too long while the game guys get more polys that they can't even use.

Good on you ATI, genlock next please. And knackered, drag yourself away from your computer for a couple of hours and read a book. You might learn something.

08-22-2002, 12:36 PM
Originally posted by knackered:
A gamer (which is who 'consumer' cards are aimed at) does not look this closely at polygon bounderies, and tends not to notice jaggies anymore, when operating at a reasonable resolution - go ahead, ask a gamer his opinion on aliasing artefacts, the silence will deafen you.

Well, go visit a gamers' forum like the ones at Rage3D or NvNews and you'll find that AA is a feature that is quite often talked about. Gamers really do request good AA quality these days. Generally speaking, gamers talk about speed, AA and anisotropic. The rest of a card's attributes are batched together under vague titles like "DX8 card" or "DX9 card".

08-22-2002, 03:18 PM

Those of us in film, tv, science have been waiting for this stuff for way too long while the game guys get more polys that they can't even use.

How does a hardware accelerator help in film/tv production? All of this stuff is rendered offline as far as I know, giving artists the freedom to choose the methods they need/want.

08-22-2002, 03:55 PM
How does a hardware accelerator help in film/tv production? All of this stuff is rendered offline as far as I know, giving artists the freedom to choose the methods they need/want.

Yes, but now consumer-level cards are getting close to giving them what they need. With good antialiasing, good shader support, and a few other things, they can start speeding up production through hardware acceleration.

08-22-2002, 10:37 PM
knackered, I'm not being cheap. I said AA needs gamma correction; initially you misstated what *I* had said about what you need to research, but I did understand your claim. It was a bit of a non sequitur, you saying the emperor has no clothes on AA, but your entire tone required a suitable response. You imply that everyone who appreciates the benefits of AA is a fool running around pretending it's a good thing instead of looking for themselves, when the truth is that many industries rely on antialiasing for image quality and the difference is completely obvious, even to the casual observer. I can SEE aliasing; don't tell me it's not there. It is easy to think of cases where AA is vital even in games: think of the lines on a runway in a flight sim, for example, or a distant target like a tank. Aliasing is an obvious distraction in these applications.

When we have card reviews online posting comparisons of 2, 4, 6 & 16 sample AA, I have some evidence that AA is seen as an important issue and quality is carefully tracked. When davepermen posts "geforce aa looks awful", bearing in mind even that is better than no aa, I can be confident that the benefits of good quality AA vs. poorer AA are clear to others. When graphics cards are implementing AA with less and less of a performance penalty each generation, I think it is inevitable that this feature will become a mainstream one (if it hasn't already).

Have you cranked up your gamma correction and looked at AA quality yet? Go to Display Properties -> Settings -> Advanced -> Color Correction, then evaluate AA. I think you may change your mind.

[This message has been edited by dorbie (edited 08-23-2002).]

08-22-2002, 11:19 PM
Humus, why would gamers demand better AA if they haven't ever seen proper AA?
Do they render white triangles against black backgrounds? Maybe Rez will benefit...

Dorbie, I agree that AA might help out on runway markings at oblique angles in a flight sim, but I have my own methods for reducing aliasing in those circumstances, which seem to work ok.

Really, it's subjective. The 9700 is not going to be rendering photorealistic images in its lifetime, but I agree that it's necessary to make small steps to prepare for when graphics cards can produce photorealistic images, at which point AA would be the final smoothing touch.

henryj, 'screw gamers'? Buy a Wildcat, if you're a professional - these geforce/radeon cards are aimed at gamers (the mass market...consumers...etc). And don't make assumptions about my reading habits, you don't know me. My point is not about how much one has read, it's about how much one can 'see'. Do you see?

My point is, I don't see the need for AA at the moment. With that, I shall leave you people to read your books on why that is a foolish opinion to hold.

08-23-2002, 12:01 AM
Yep, there are all sorts of tricks to reduce aliasing in special circumstances, but none are really as easy or generally applicable as FSAA. Looking at things from a gamer's perspective rather than your developer's perspective, the gamer can buy a card with AA and fix aliasing issues. He does not have the luxury of implementing a graphics hack in his favourite game as you suggest; only developers can do that. FSAA solves aliasing in all games and under all circumstances, and it is getting cheaper. I'm still concerned you don't have an appreciation for AA in a general scene :-).

Go on try the gamma adjustment, humour me, let me know what you think.

P.S. you continue to imply that we gained our insight from books rather than practical experience. That is not the case; implying that we're somehow ivory tower theorists is just wrong. I play games, I write software and I experiment with anti-aliasing methods. I expect most posters here think that AA is desirable because they've seen it, not because they read in a book that it was good.

[This message has been edited by dorbie (edited 08-23-2002).]

08-23-2002, 12:34 AM
Look, most current games don't benefit so much from AA simply because they have no high frequency detail, just large textured polys. And aliasing of textures is taken care of by mip mapping. However, as triangle counts keep rising and games get more high frequency geometric detail, you'll see swimming and aliasing even at high resolutions. Then AA helps. A lack of AA is really obvious in high end offline 3d-rendering, even to casual observers, especially when animating.

08-23-2002, 02:06 AM
Dorbie, I was referring to the "no you can see it if you line them up side by side in the browser" comments at the beginning of this thread...it echoed my own experience of my boss getting everyone to put their noses up to the monitor trying to ascertain whether GF3 4x AA was on or not....it just struck me as ridiculous.
I'll try your experiment when I get a minute this weekend, to honour you http://www.opengl.org/discussion_boards/ubb/smile.gif
AA *is* a life saver when using headsets, such as the Kaiser etc., because the resolution can generally be no higher than 800x600 - it *is* worth enabling then.
Sorry for implying you were all in ivory towers - it's true that I was implying that.
Constant references to papers, without acknowledging that it doesn't make a *dramatic* difference to image quality, didn't help to convince me you weren't in ivory towers.

08-23-2002, 02:57 AM
yeah, that way you're okay knackered. it's stupid to fight about things you don't see just because "it's there".. like the quality loss in high-bitrate mp3 files.. you can't hear a difference, but still most guys are complaining..

but on the other hand, i actually _CAN_ see the difference directly by eye between the fsaa 4x of the gf3 and the fsaa 6x of the radeon. and imho it's even better than the Parhelia aa. i can tell very good images without fsaa from ones with the fsaa of the geforce class, and from _real_ fsaa, like if you use brazil to render in 3dsmax, or set up your own raytracer which does 64 samples per pixel.. and the images of brazil are by far the best, really smooth and detailed. that's how the images of this radeon now look: smooth _AND_ detailed.

but still, you won't see much in a quake game, as there aren't many edges. but if you have natural scenes, with high detail and tons of small triangles, then you see it.

and displacement mapping means you tessellate down to about pixel-sized triangles. there, aa will be needed quite badly, else the movement of such objects gets as jaggy as if you had a nearest-filtered texture.. *ouch*

08-23-2002, 03:14 AM
I said humour me, not honour me :-) Nobody referenced papers, I mentioned two books with info on gamma correction because someone asked.

I agree that squinting at a monitor to decide implies the benefit is marginal, but the wrong gamma correction setting can defeat anti-aliasing, which is what I've been saying. I've even seen monitors that require much more gamma correction than the average ~2.5, which would make it very difficult to see the benefits of AA.

All PCs have the wrong gamma correction setting by default for AA arithmetic (and in fact the application/game content REQUIRES the wrong gamma for AA), and human perception makes the default PC gamma correction setting (i.e. none) good for other reasons.

I find it ironic that people compare anti-aliasing on cards where their gamma correction settings are completely whacky, nobody has good AA on PCs except perhaps ATI with smoothvision2. This is why I started this thread and why I think their AA *MIGHT* be uniquely interesting, it's not certain yet though.

Use this to measure your monitor gamma: http://www.cs.berkeley.edu/~efros/java/gamma/gamma.html

Then apply gamma correction in your video driver of the amount measured. When you're done, this applet should match at 1.0. Then your AA will be correctly weighted, but your game content will look too bright.

Alternatively use the following chart and slide your hardware gamma setting until it matches at 1.0 then test your AA: http://www-graphics.stanford.edu/~ericv/gamma/gamma.gif

What I'm hoping is that you can leave the game content the same and have another subsample gamma adjustment somewhere on the 9700 which allows you to correct the sample arithmetic but leave the rest of the game content as the artist intended. As I described earlier, this requires you to do the equivalent of uncorrecting the subsamples to linear display brightness space, summing them, and correcting the result back to nonlinear space. The erroneous assumption is that the samples are already gamma corrected for the display, but that's just an expedient trick to make the subsample arithmetic work for display on your monitor. My question, which still hasn't been answered: does the 9700 do this, is there an additional SmoothVision 2 subsample gamma adjustment like this, or is an assumption made about actual display gamma?
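
The round trip I'm describing can be sketched in a few lines (Python just for illustration; the 2.5 display gamma and the function names are my assumptions, not anything ATI has published):

```python
# Sketch of a gamma-correct multisample resolve, assuming a simple
# power-law display response of 2.5. Subsample values are in [0, 1]
# and are already gamma encoded for the display, as game content is.
def resolve_naive(samples):
    # What hardware typically does today: average the encoded values.
    return sum(samples) / len(samples)

def resolve_gamma_correct(samples, display_gamma=2.5):
    # Undo the display response: encoded value -> linear brightness.
    linear = [s ** display_gamma for s in samples]
    # Filter the subsamples in the space where averaging is meaningful.
    avg = sum(linear) / len(linear)
    # Re-encode the result for the display.
    return avg ** (1.0 / display_gamma)
```

On a half-covered white-on-black edge pixel ([0.0, 1.0]) the naive resolve writes 0.5, which a 2.5-gamma monitor displays at only about 18% brightness; the corrected resolve writes about 0.76, which the monitor displays at the intended 50%.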

[This message has been edited by dorbie (edited 08-23-2002).]

08-23-2002, 03:32 AM
When I look at the graphics in current games I see many things that need improving, and AA is one of those, but it's quite a way down the list. What strikes me most is per-vertex lighting and low poly counts (popping/geomorphing terrain, hexagon-shaped wheels, angular heads, billboard trees, too few characters/models on screen at any one time).

With current AA solutions I have a hard time telling whether it's on or not. The 9700 may change my opinion but right now I play with it off.

08-23-2002, 03:36 AM
Originally posted by dorbie:
I find it ironic that people compare anti-aliasing on cards where their gamma correction settings are completely whacky, nobody has good AA on PCs except perhaps ATI with smoothvision2. This is why I started this thread and why I think their AA *MIGHT* be uniquely interesting, it's not certain yet though.

why? like knackered, i don't need to know that it's a mathematically awesome approach to do something; it has to look good. and imho, the images which we can look at look best on the radeon 9700. why? well, i don't choose good quality by theory (meaning: first i look whether it's good, then i look at why it's good http://www.opengl.org/discussion_boards/ubb/biggrin.gif)

and it looks good because the edges are really antialiased, not simply one of 4 mix values, like they look on the gf3 or gf4 all the time..

08-23-2002, 03:53 AM

try my gamma+AA suggestion. IMHO gamma correction is MUCH more important to AA quality than pure sample count. I've done the tests; you should too, then you'd see the irony. It is ironic that people are fussy about AA quality but are largely unaware of the biggest impediment to good quality, that some turn on AA, don't see much benefit, and dismiss AA as a feature. It's ironic that we have a world full of knowledgeable graphics enthusiasts who don't see the value in an absolutely fundamental feature, because they have never really seen it done well due to default system settings and content.

08-23-2002, 05:00 AM
Originally posted by knackered:
Humus, why would gamers demand better AA if they haven't ever seen proper AA?

Gamers don't care about "proper AA", but they do care whether it looks good or not. The differences between various FSAA schemes are obvious to many gamers.

08-23-2002, 05:16 AM
exactly. all i have to say is that the image from the radeon looks great. if it can look even better with proper screen settings, cool. but that image is already great..

and yes, the number of samples _DOES_ count for a lot. simply said, it decides whether it looks smooth or jaggy. that _is_ true, once and for ever. but the sum and division have to be done in the correct linear space, else the result ****s up and gets "unsweet". that's quite logical as well..

08-23-2002, 05:39 AM
Lack of AA is one of the most grating and horrible artifacts I see on the CG animation show 'Eye Drops' on Tech TV. The show presents CG animation done by amateurs, and while I can get over low polygon counts that look like they could be handled by a GeForce 3 in real time, and texturing that looks like they only used the materials that came with 3D Studio, the most amateurish looking entries are those that are not anti-aliased.

I also remember the days when the biggest flaw everyone was talking about was the lack of anti-aliasing on the Playstation 2. I think what everyone actually noticed was the Japanese developers' total lack of understanding of mip-mapping, because they were coming off the PS1 (all the solutions I ever heard proposed for it were FSAA, but I think that was the game press not understanding either). So, gamers do care, even if they don't know what they are talking about ^_^

08-23-2002, 05:56 AM
Dave, I agree sample count is important, I didn't say it wasn't. I said gamma-correct subsample arithmetic is more important, especially for high contrast edges. You can have a thousand samples in a perfect Poisson disc with no gamma correction, but 6 well placed samples summed in linear display brightness space will do a better job of antialiasing a high contrast edge.

08-23-2002, 06:25 AM

I tried the gamma applet and discovered that my gamma was set to 2.73. No wonder everything was so dark http://www.opengl.org/discussion_boards/ubb/smile.gif. I must admit, gamma correction is something I have on my list of things to look at. I know it's important for correct lighting, but I had never heard of gamma-corrected AA (until now). This is similar to correct lighting, right?

You could just ask ATI if this is the gamma-correct AA you hope it is ( they are very good at responding ).

08-23-2002, 06:25 AM
Can anyone explain why the shadows all look greenish on the Parhelia?

08-23-2002, 07:26 AM
PH, yes, the reasons are similar to those for lighting and blending etc. Basically you want a linear response to brightness for your arithmetic; in the absence of that you need non-linear arithmetic. I was kinda hoping that someone at ATI would have responded to my repeated questions in this thread, but I've been posting overnight; maybe they will reply when they wake up.

Your monitor is pretty dark, yup. You might want to correct a little in general, but don't totally correct it; it'll make all your web images & other content incorrect if you do, since they shoot for PC & Mac platforms and the images are already gamma corrected in software.

There are ways of getting the best of both worlds in your 3D application. For example, if you cared about AA quality you would un-gamma-correct your content (textures etc.) back to linear brightness and apply hardware gamma correction from your 3D app. But the price you pay is banding in the shadows unless you have more precision than the usual 8 bits. These are tradeoffs for smart developers to make.

08-23-2002, 08:34 AM
The gamma correction for AA is designed to allow the multisample resolve to work more correctly for the way applications presently operate. As dorbie mentioned, the right thing to do is to render to a linear buffer and correct on the way to the screen, but with only 8 bits you will end up with unacceptable banding on the low end. The gamma-correct AA resolve takes the non-linear frame buffer contents and maps them to a linear space, then resolves the pixel, then converts back to the non-linear space.

There are additional features that can be used to do an even better job with respect to proper gamma, but they will require support by the app.

I hope this helps clear up some of the questions.


08-23-2002, 09:12 AM
This is awesome. The question remains, does the conversion make an assumption about how much correction is required to get to linear space or is it controllable? Linear space is defined by monitor gamma response, but hardware gamma correction is not set to compensate for this response (by definition). So, there is nothing to tell the subsample AA what the required compensation to get to and from linear gamma is. It needs to be the net difference between hardware gamma and actual monitor gamma. I'm sure an assumption of around 2.5 is vastly better than nothing, I'm just curious, is there a user setting? I saw no gamma factor in the smoothvision 2 GUI interface where I would have expected this control.

08-23-2002, 02:50 PM
This is weird, I'm getting 3.24 consistently when I run the Java program. For the alternative chart, I can see that I need to go above 3.0.

I thought PC monitors needed between 2.2 and 2.6.

I do see the effects on AA, but I have 4x on my GeForce, and perhaps I'm not seeing as huge a benefit as Dorbie insists on.


08-23-2002, 04:02 PM
It is not a hard & fast rule that PC monitors require between 2.2 and 2.6, this is just a rule of thumb (and I think the 2.2 comes from a web graphics compromise with Apple system settings). Your monitor clearly requires more.

3.24, that's high, but it's a fact that the higher your monitor gamma, the more important gamma correction is for AA. Of course you have to set hardware gamma correction to around 3.2 to see good AA on your monitor for your test.
Without hardware gamma correction you'll get even worse antialiasing than most users out there. The higher your monitor gamma, the less effective AA will be, and that will affect your opinion of the feature.

Your monitor is a great example of why ATI needs an adjustable subsample gamma instead of assuming that all monitors have a gamma response of 2.5. I don't know how they determine where linear brightness space is, they need a value from somewhere.

Yes you will see a HUGE difference in AA between gamma & no gamma AA, but it is clear now that you need a 9700 card to do it well with legacy applications and not throw off your application content.

08-24-2002, 02:43 AM
dorbie, i bet you can simply set it somewhere with a slider in the driver settings: a test window, with black/white dots and grey dots, and a slider to match them.

08-24-2002, 02:56 AM
i just have to say i _HATE_ gamma..

i've made a little bitmap, one part black and white dots like a chessboard, the other part solid grey at 127,127,127. and they are by no means the same grey if you compare them..

so i went into my settings and adjusted the gamma till they matched. now i have a gamma of 1.4 (which is not much compared to others, i think http://www.opengl.org/discussion_boards/ubb/biggrin.gif), but now everything looks so cheesy... how long will it take till we get rid of that old wrong standard? what do you think? i want grey to be (white+black)/2.. but doing so on my pc ****s everything up.. and i can't even show it to you with a printscreen http://www.opengl.org/discussion_boards/ubb/biggrin.gif

btw, here is the file (as bmp, huge fat bmp http://www.opengl.org/discussion_boards/ubb/biggrin.gif): http://www.itstudents.ch/users/dave/free/files/gamma.bmp

try it yourself..
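
A rough numeric sketch of the same test (Python; the simple power-law display model and the function name are assumptions for illustration):

```python
# The black/white checker emits 50% of full brightness on average.
# On a display with power-law response g, the solid 8-bit grey that
# visually matches it must encode 50% linear brightness.
def matching_grey(display_gamma):
    return round(255 * 0.5 ** (1.0 / display_gamma))
```

On an uncorrected ~2.5-gamma monitor the matching grey comes out around 193, nowhere near 127; only on a fully linearized display does 127/128 grey match the checker, which is why matching them means cranking the gamma setting.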

(btw, i found the gamma settings in the nview dialog http://www.opengl.org/discussion_boards/ubb/biggrin.gif dunno where you will have it..)

08-24-2002, 03:58 AM
Yep, the checker dither vs the half-tone grey is a classic way to determine if gamma is set correctly. Did you follow the applet link or look at the greyscale image I posted earlier in the thread? It's slightly more sophisticated, but it operates on the same principle. I think using a 1 pixel checker pattern is not ideal because of the monitor gun's response to the high frequency fluctuation in voltage; monitors aren't perfect. I suspect that's why the applet uses lines. Same principle, different pattern.

Gamma complicates things (yep it is a mess) but some say it's a hidden blessing for 8 bit per component displays because of the issues with contrast sensitivity. I explain this in the Dr Dobb's article.

08-24-2002, 04:04 AM
ahh, dorbie, why did you say that? now my gamma is at 2.5 to be correct, according to the applet.. :| now everything is nearly white.. http://www.opengl.org/discussion_boards/ubb/biggrin.gif

08-24-2002, 04:15 AM
P.S. I'll take that bet. I bet ATI just make an assumption that display gamma is somewhere around 2, maybe 2.4. I assume that they at least subtract whatever hardware gamma is applied and use the net to get to monitor linear color space, i.e. the delta between hardware correction and monitor gamma. I think they SHOULD expose subsample gamma, but what I think they should do and what they do are two different things. As you can see, you need 2 gamma factors, not one, unless you assume an absolute monitor gamma, and that kinda makes you recoil at first.

I just went looking for this thread: http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/006796.html

In my last two posts I suggested gamma compensated AA accumulation, and I thought the response I remembered might have been from ATI saying make an assumption about monitor gamma, but it wasn't, it was Dodger. But it shows that it's easy to recoil from the double gamma and just assume a default; it will STILL be better than most other AA, just not adjustable to specific displays.

08-24-2002, 04:22 AM
So now you REALLY hate gamma right? :-)

08-24-2002, 06:18 AM
yes.. and i hate it because of two completely different things:

a) all is so ****ing bright..
b) grey != grey..

to a)

i've taken this picture, here: http://www.itstudents.ch/users/dave/free/files/normal.jpg
that's how it looks normally, on a normal screen..

then i set my screen to correct gamma according to the java applet, so that grey == grey on the applet with gamma = 1 there..

now it looked like this: http://www.itstudents.ch/users/dave/free/files/gammacorrected.jpg

so.. i had to reverse-gamma-correct the image; now it looks like this: http://www.itstudents.ch/users/dave/free/files/reversegammacorrected.jpg

the last image looks correct again, on my screen..

but well, on flipcode for example you can't really read the text, as it is green on grey now http://www.opengl.org/discussion_boards/ubb/biggrin.gif terrible..

to b)

my test with the chessboard-type grey gave me a correct gamma of 1.5.. but the java applet, which has the same amount of black and white pixels, gave me 2.5.. that means grey != grey, depending on how the pixels are arranged. that means colors simply do **** up, more or less... that's annoying..

now, how do we get correct gamma settings, meaning how do we get it so that a pixel with (.5,.5,.5) is half as bright as a pixel with (1,1,1), INDEPENDENT of the other pixels?

i think it's impossible...

oh, btw, one other thing..

if you linearly interpolate normals in a normal map with bilinear filtering, you get unnormalized vectors in between. this leads to darker regions, as the dot product gets smaller due to the shorter normal. now.. say the normal is only at 90% of its length, so the resulting dot product is only at 90% of the brightness.. without linear space (meaning with normal settings) this gets MUCH darker than only 10% darker..
now with gamma-correct settings and a linear space, as i currently have, those unnormalization errors get _MUCH_ smaller.. at least one cool thing http://www.opengl.org/discussion_boards/ubb/biggrin.gif
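
The size of that error is easy to put numbers on. A quick sketch (Python; the 2.5 display gamma is an assumed round figure):

```python
# A bilinearly filtered normal shrunk to 90% length scales the diffuse
# dot product to 0.9. On an uncorrected display the framebuffer value
# is shown through the monitor's ~2.5 power response, so the emitted
# light drops much further than the intended 10%.
def displayed_brightness(intensity, display_gamma=2.5):
    return intensity ** display_gamma

error_uncorrected = 1.0 - displayed_brightness(0.9)  # roughly 23% darker
error_linear = 1.0 - 0.9                             # 10% darker, as computed
```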

08-24-2002, 07:43 AM
> Can anyone explain why the shadows all
> look greenish on the Parhelia?

After you start getting your gamma right, you can start talking about color balance...

At first, when you adjust your gamma, you'll think everything is soooo bright. But you get used to it, and after a while, you'll find yourself adjusting the display settings of all your co-workers, because all their screens are so obviously wrong.

Then it hits you: the flat gray square has a bluish tint, but the white/black lines are more brown. Aha! You need color correction!

Because graphics cards and monitors are all different, you have to adjust the gamma of each gun individually. I find that on my LCD display, I need a gamma of 0.98 for blue, but 1.45 for red, and green somewhere in between. Unfortunately, the nVIDIA gamma control panel only gives me steps of 0.10 on the >1 side, but steps of 0.01 on the <1 side. What's up with that?

Artists who care actually buy little monitor color measurement pick-ups, and adjust their monitors until they are "perfect white". Or as close as technology will get you, anyway.

08-24-2002, 09:50 AM
But you can also build a color profile into your software and adjust the values in your image when they are displayed. This is closer to the right thing to do when you only have 8 bits of precision in the framebuffer. You should not spend those bits linearly and monitor gamma helps you use them nonlinearly.
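
To put a number on that, here is a quick count (Python sketch; power-law display model assumed) of how many of the 256 codes land in the darkest tones, where the eye is most sensitive to contrast steps:

```python
# Number of 8-bit codes whose displayed brightness falls below a given
# linear level, for a power-law display with the given gamma.
def codes_below(linear_level, display_gamma):
    return sum(1 for c in range(256)
               if (c / 255.0) ** display_gamma < linear_level)

linear_dark = codes_below(0.1, 1.0)   # 26 codes cover the darkest 10%
gamma_dark = codes_below(0.1, 2.5)    # 102 codes cover the same range
```

With a linear display only 26 of 256 codes are spent on the darkest tenth of the brightness range; a ~2.5 gamma response spends about 102 codes there, which is why banding shows up in the shadows when you linearize an 8-bit framebuffer.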

If you gamma correct in hardware and then look at content designed for use on a PC, it WILL look wrong, because it has built-in gamma correction (most of it does anyway) or it has been artistically designed to look right with no gamma correction. If you use a system that's been designed from the ground up (including the content) to have hardware gamma correction, then an 8 bit framebuffer is not enough.

Yup Dave, it's worse than you think.

08-24-2002, 10:23 AM
why the hell did dorbie post about this? WHY?!?!!? http://www.opengl.org/discussion_boards/ubb/biggrin.gif

that whole system is more corrupt than the mafia, and there will be no way to get rid of it as long as _EVERYTHING_ works the corrupted way.. damn.. i need two windows, one for using and one for developing. one looks normal, and one is correct..


no, really, how can we get rid of this? i mean, users could adjust the settings simply if, for example, at installation you got some sliders to set that stuff up, with some references (gamma, for example, with the method of the java applet). but all of today's stuff would get messed up. no one wants that..

any way to get reverse gamma correction working in the gdi? so that only windows-drawn stuff gets pre-****ed-up and re-corrected by the real settings, and new stuff works okay then.. i mean, xp started to support really new windows rendering features; how would it be with an xp which uses correct gamma and simply corrects the old stuff automatically?..

any ideas on how to move to the future? i just think that with that wrong gamma, much math gets much more complicated, image compression for example.. how do you compress a nonlinear space? starting with linear data is much simpler imho..

08-24-2002, 11:43 AM
Originally posted by dorbie:

Try this: run something with AA (a white poly on a black background would be ideal because the gamma won't affect the overall appearance of the image). Draw with AA and hardware gamma set to 1, then look at it with your gamma correction set to 2.5 in your advanced display settings. Then tell us what you think of AA. Now, wouldn't it be nice to get that effect on AA in all your game images without screwing up the overall image brightness in the mid-tones? Exactly!

i'm a bit slow; i tried the above and saw no difference between the images (with gamma and without).
is this the intended effect?

08-24-2002, 02:24 PM
Zed, no, there should be a difference. The polygon edges should be very noticeably smoother with the gamma correction on (unless you have a 9700, which will do the right thing all the time and defeat the test, if ATI are being honest).

Make sure you have a nice antialiased slope on your polygon edges and watch the screen as you slide your gamma setting in the GUI; you will see the edge magically smooth out as you approach the correct setting. Don't squint up close at the edge, sit back and observe the overall effect. When I show this to people their jaw drops: antialiasing goes from simple intermediate tones on an edge that still looks jaggy to a genuinely smooth looking edge. Keep the polygon displayed as you slide the gamma. Try it with 4 sample or quincunx AA or better, not just 2 sample.

Dave, yes, it's nasty :-) Welcome to the dark side of real-time computer graphics. Don't worry, after a few years the nasty taste in your mouth will go away. Remember, it's not just AA that's displayed incorrectly, but all your lighting, blending and texture modulation arithmetic :-). What you are asking for is something like independent gamma correction per window; someone might actually do that, other vendors have planned this as a feature in the past.

08-24-2002, 09:33 PM

But why would color correction affect a screen shot? I believe that color correction is a feature of the RAMDAC CLUT and should not affect the contents of the frame buffer which has been saved. I was wondering why the dark areas have a greenish tint when displayed on my TNT Vanta at work, but the others have grey shadows. It seems like the Parhelia is actually screwing up the colors in the framebuffer, not on the screen.

In other words, why should the framebuffer of the Parhelia have a different color tint than the 9700 or GeForce 4 when I view all the screenshots on a different computer?


I must say that that woman is really cute, post more examples of gamma correction like that. Much cuter than the classic image used to demonstrate color in computer graphics, the one in Foley & van Dam's book (and certainly better than the baboon).

[This message has been edited by Nakoruru (edited 08-24-2002).]

08-24-2002, 11:39 PM
http://www.opengl.org/discussion_boards/ubb/biggrin.gif sweet girl, isn't she..

if some other tests come along, i'm sure i'll find some other nice pics useful for them.. http://www.opengl.org/discussion_boards/ubb/biggrin.gif

well, i now live in the world of a linear-color pc. features:
it uses much less power, as brightness is not at max now, but at minimum.
per-pixel bump mapping errors due to unnormalized vectors are much less visible, as a bit less bright than white does not mean near-black, but just a bit less bright http://www.opengl.org/discussion_boards/ubb/biggrin.gif
antialiasing looks a bit better, at least my fsaa 4x on the gf2mx http://www.opengl.org/discussion_boards/ubb/biggrin.gif
FONTS ARE AWESOMELY GOOD TO READ! black text on a white background, like here, looks close to the cleartype of windows XP! it's really good for reading, and for coding.
i think i don't get tired as fast, as the screen is not that bright anymore (i should get a flat panel, i know..)

gamma correction per window would be awesome.. and the only way to dive into a future with correct colors, ain't it? gamma correction wasn't a big problem back when we just had textured triangles, as the textures could be precorrected and then mapped, and all looked okay.. but the moment lightmaps came, we all remember, they got damn dark http://www.opengl.org/discussion_boards/ubb/biggrin.gif and now that complex pixel shaders are coming.. ohoh.. i think we need linear space..

Moshe Nissim
08-25-2002, 09:51 AM
Anti-aliasing exists not only in the 3D domain. It also has huge importance in font rendering. Does anyone know if TrueType anti-aliased font rendering takes 'under consideration' the current gamma setting? Is there any way to check?

08-25-2002, 10:29 AM
Originally posted by dorbie:
Make sure you have a nice antialiased slope on your polygon edges and watch the screen as you slide your gamma setting in the GUI,

doh! i was drawing a quad aligned with the window, thus there wasn't much (as in none) aliasing in the first place

08-25-2002, 10:55 AM
Originally posted by Moshe Nissim:
Anti-aliasing exists not only in the 3D domain. It also has huge importance in font rendering. Does anyone know if TrueType anti-aliased font rendering takes 'under consideration' the current gamma setting? Is there any way to check?

there are different kinds of antialiased fonts. the _old_ way is simple antialiasing and does not take gamma into account (at least, it doesn't look like it does; on a gamma-correct screen the font antialiasing now works much better).
second is cleartype. cleartype takes about everything into account that is possible, even that r, g and b occupy different parts of a pixel => our screen is built up of 3 times the pixels. it uses that information to render a font with 3 times the detail. and it looks awesome. i _think_ it uses gamma correction as well. it's in windows xp, and you can enable it in the display settings somewhere. it is actually the main thing i miss here in win2000..

the matrox parhelia provides exactly this: hardware-accelerated truetype font rendering (with cleartype technology?) which takes gamma into account.

i don't know much more, but it should give you some hints..

08-25-2002, 01:01 PM
To answer Dorbie's question about a gamma slider: the GL driver presently always uses the same gamma value. This is obviously sub-optimal, but as you can see from the screenshots, the value used does pretty well on average.

FWIW, when I find out what info I can provide publicly, I will. This sort of improvement is obviously one we want developers to understand, as it is most useful when the developer understands exactly what is going on.


08-25-2002, 09:23 PM
ehart, thanks for the info. Using the existing gamma setting only makes half sense; you need a target gamma assumption for the display. For example, a user gamma setting of 1 would then mean full fragment gamma correction of around 2.5, but a user gamma setting of 2.5 would mean no fragment gamma correction. There is an implicit assumption in my example that the monitor gamma is 2.5; some such assumption is always required when using the user gamma setting, because you cannot assume the user setting is correct. If you just used the user setting for fragment gamma correction you'd actually do more harm than good and undo attempts to linearize the brightness of fragments. What you need to use is the net difference between user gamma (in the current sense) and display gamma, which you cannot know without input.
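
Since power-law corrections compose by multiplying exponents, the net factor can be written down directly (Python sketch; the simple power-law model of the whole chain is an assumption):

```python
# The driver's user setting u raises values to the power 1/u on the way
# out; the monitor then raises them to its response m. The residual
# power the AA resolve must undo is therefore m/u, not u itself.
def net_fragment_gamma(monitor_gamma, user_setting):
    return monitor_gamma / user_setting

# user setting 1.0 (no correction) on a 2.5 monitor -> resolve must undo 2.5
# user setting 2.5 on the same monitor -> display chain is linear, undo 1.0
```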

[This message has been edited by dorbie (edited 08-25-2002).]

08-25-2002, 10:49 PM
It's the monitor's job to display correctly! The bloody monitors should come with built-in gamma correction http://www.opengl.org/discussion_boards/ubb/biggrin.gif

08-25-2002, 11:22 PM
A gamma response of about 2.5 is actually desirable when you have an 8 bit framebuffer. See my earlier posts. If your monitor had a gamma response of 1.0 you'd take it back and tell the store it was busted.

08-26-2002, 05:26 AM
Even if the monitor does the gamma correction, it still doesn't help now, because of all the manual pre-correction that has already been done. So maybe we need two modes, with correction and with no correction http://www.opengl.org/discussion_boards/ubb/biggrin.gif. But how many people use an 8-bit framebuffer? I said that only because at least then everyone would have gamma correction applied and this pre-correction s*** would stop.
Maybe if nothing had ever been pre-corrected, users would have found the images looking pretty bad and would have set the gamma correction themselves. Sigh!
Dorbie, you have made me very restless with this thread http://www.opengl.org/discussion_boards/ubb/biggrin.gif

08-26-2002, 06:02 AM
Yeah, it's _SOOOOOOOOO_ annoying. Think about it: you install Windows, and up comes one sheet with some grey images, some pixelized grey images, and some sliders. You have to set the sliders so that it looks correct according to the description. But no, they just leave the wrong starting settings in place (they didn't even try to find a good base setting, say 2.5), resulting in billions of gigabytes of data that is stored wrong. All images, all photos, all textures, all videos, all webpages; simply everything is wrong now. Getting rid of this will be hell, but it's important and needed for the future, since otherwise all math on colors ****s up.

It's sooooooooo annoying.

I want per-window gamma settings that you can choose manually, so you can use the user-specified correct gamma instead of the standard one. That way, old applications would look the same, but new ones would have the ability to get a linear color space, provided the user set things up correctly in the display settings. Which he will http://www.opengl.org/discussion_boards/ubb/biggrin.gif

08-26-2002, 07:12 AM
If the electron gun response has a large effect on the output image, then true color can never be achieved no matter where you implement the fix. Let's face it, CRT is not that great.

What about all the flat screen technologies out there? Are they any better? My screen looks like the flatscreens now with this gamma correction, which is all right I suppose, but I'm still not used to this brightness!

I would like to bring it down to 2.4, where it was more comfortable and the AA actually looked better.

Don't forget to adjust your gamma in every game you have! There is one more thing that wasn't talked about: contrast and brightness. Is that something to be decided on a per-user basis, or is there a method?


08-26-2002, 12:53 PM
Some games now adjust hardware gamma (Warcraft 3 for example), so you can't do a software un-gamma correction to compensate for your hardware gamma correction, which makes it impossible to get the content looking right with hardware gamma correction set high. Unless you have a very specific reason (like AA or linear lighting) I'd advise against setting gamma correction to 2.4; it screws up your content in many applications and causes banding even where your content is adjustable. If you are developing, you need to be careful about how you use this if you intend to look correct on other systems, although more is possible.

08-26-2002, 01:52 PM
To follow up on the gamma stuff, the resolve is a relatively fixed function, selectable to a couple of values.

The resolve is presently done to look good on most systems with the gamma in the DAC set to linear or roughly linear. The resolve is compatible with the sRGB color space, which is an industry-wide initiative to use a standard response.

For the gamma curve on a 9700, a programmer should generally leave the gamma linear and allow the framebuffer to be a non-linear, effectively compressed, color space. This will provide the best results with the multisample resolve.

Additionally, the 9700 can gamma adjust textures and the results of the pixel shader. These will be exposed by extensions shortly. The effect is that you can render to a compressed 8-bit-per-component color space and perform your lighting in linear space. Likewise, you can use textures from the compressed color space.
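The payoff of a gamma-aware resolve is that subsamples get averaged in linear light rather than in the compressed framebuffer values. A sketch of the difference, with a plain 2.2 power law standing in for the hardware's actual sRGB-compatible curve (which isn't documented in this thread):

```python
def resolve(samples, gamma=2.2):
    """Gamma-correct multisample resolve: decode each stored subsample
    to linear light, average, then re-encode for the framebuffer."""
    linear = [s ** gamma for s in samples]
    avg = sum(linear) / len(linear)
    return avg ** (1.0 / gamma)

def naive_resolve(samples):
    """Averaging the stored (non-linear) values directly."""
    return sum(samples) / len(samples)

# A 4x AA edge pixel half covered by white geometry over black:
edge = [1.0, 1.0, 0.0, 0.0]
correct = resolve(edge)        # ~0.73: displays as half the *light*
wrong = naive_resolve(edge)    # 0.5: displays far darker than half
```

The naive average of 0.5, pushed through a gamma-2.2 display, emits only about 22% of full luminance, so edges look too dark and the AA gradient bunches up; the gamma-correct value displays at the intended 50%, which is why the R9700 edges in the screenshots look so clean.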


08-26-2002, 04:29 PM
Thanks Evan. So AA looks good with hardware gamma correction at 1.0 and only at 1.0 on the 9700, but at ~2.5 and only at ~2.5 on other cards :-). I assume there's some kind of calibration to get sRGB to display correctly on individual displays, but this implies that the subsample gamma-compensated 'resolve' would need to be flexible, even if it isn't exposed. Well, however it's done, it's probably impressive; now I want to see it in action :-).

[This message has been edited by dorbie (edited 08-26-2002).]

08-28-2002, 01:43 PM
I just wanted to post my view on AA from a gamer's perspective (I code too, though).

I have a GF2MX, so AA is not really an option for me; I only enable it sometimes when playing against bots.

I find the aliased edges in Q3 very annoying; it's so much more obvious when you play, because the popping pixels draw the eye's attention. So in that respect I would want to enable it so I can focus better on what's important in a scene.

Also, I think the game looks more "uniform", i.e. geometry and textures blend together much better; it looks much more vivid.

IMO, of course.