How to turn off antialiasing?

07-13-2003, 06:23 PM
I'm studying visibility culling. When I implemented ID rendering to find the exactly visible objects in a given frame, I ran into a problem: the read-back color map contains more colors than were drawn. If only one object is drawn with one color, four colors end up in the color buffer, while I expected at most two (the foreground and background colors). I use only the red component to distinguish the objects (one color per object). The object identifiers are normalized into the range [0, 1) and the objects are drawn with these normalized identifiers. After examining the images, it seems that antialiasing has to be turned off. I have disabled both dithering and antialiasing, but the problem still exists. Could anyone tell me how to turn off antialiasing?

The code for turning off antialiasing is as follows:
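[The original snippet did not survive in this archive; for reference, the usual calls to disable smoothing, multisampling, and dithering look like this. This is a sketch of the typical approach, not the original poster's code:]

```
glDisable(GL_POINT_SMOOTH);
glDisable(GL_LINE_SMOOTH);
glDisable(GL_POLYGON_SMOOTH);
glDisable(GL_MULTISAMPLE);   /* GL_MULTISAMPLE_ARB on pre-1.3 headers */
glDisable(GL_DITHER);
glDisable(GL_BLEND);
glShadeModel(GL_FLAT);
```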


Thank you in advance.

07-13-2003, 08:12 PM
These are all off by default; however, your graphics driver has an override option.

Go to Display->Properties->Advanced->OpenGL and select "application default" for AA, or, if you really want to be sure, turn it off explicitly. The exact interface you see will vary with the manufacturer and model.

07-14-2003, 09:16 AM
Some other advice:
- Check that you're rendering in 24-bit color.
- Use glShadeModel(GL_FLAT).
- Don't normalize your IDs; pass raw byte values with glColor3ub(r, g, b).

With all of these plus your previous settings, the color values in the framebuffer should be exactly what you sent.

07-15-2003, 03:48 AM
Did you use wglChoosePixelFormatARB() to request your pixel format?
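[For context, a plain ChoosePixelFormat() setup requesting a 24-bit RGBA buffer on Windows typically looks like the following. The field values are common defaults for illustration, not taken from this thread:]

```
PIXELFORMATDESCRIPTOR pfd = {0};
pfd.nSize      = sizeof(pfd);
pfd.nVersion   = 1;
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 24;
pfd.cDepthBits = 24;
int format = ChoosePixelFormat(hdc, &pfd); /* hdc: the window's device context */
SetPixelFormat(hdc, format, &pfd);
```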

07-15-2003, 04:19 AM
I have turned AA off but still get four colors. My graphics card is based on the NVIDIA GeForce4 Ti 4200 chipset. I used the driver provided by NVIDIA, whose version is

[This message has been edited by Isaac (edited 07-15-2003).]

07-15-2003, 05:27 AM
Thank you for your reply. I used ChoosePixelFormat() to request a 24-bit color buffer. As you suggested, flat shading is used. However, when I draw a cube with glColor3ub(255, 255, 255), I still get four colors: (0, 0, 0), (1, 1, 1), (255, 255, 255), and (254, 254, 254). The two unexpected colors appear on the upper and lower edges of the projection. Has anyone done the same thing (ID rendering)?

Thank you in advance.

[This message has been edited by Isaac (edited 07-15-2003).]

07-15-2003, 06:15 AM
This shouldn't make any difference, but it's worth a try: set Intellisample to Quality if it isn't already.