View Full Version : Antialiasing



Mr_Smith
02-10-2002, 07:22 AM
Anyone know how I can enable full-scene antialiasing on a Kyro 2 using the GL_ARB_multisample extension, and set the level (e.g. 2x or 4x)?

Mr_Smith
02-10-2002, 11:55 AM
Someone must know :(
It's not rocket science... well, it wasn't before I started looking into it, but now I'm not so sure :)

wedge
02-10-2002, 10:27 PM
http://developer.nvidia.com/view.asp?IO=ogl_multisample

The paper
"OpenGL Pixel Formats and Multisample Antialiasing"
is very good for understanding this.
wedge

Mr_Smith
02-11-2002, 12:41 AM
Looks good, but I'm not too sure how to use it. I'm currently doing this:

static PIXELFORMATDESCRIPTOR pfd = {
    sizeof(PIXELFORMATDESCRIPTOR), 1,
    PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER,
    PFD_TYPE_RGBA, bits,
    0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
    16, 0, 0, PFD_MAIN_PLANE, 0, 0, 0, 0 };

if (!(hDC = GetDC(hWnd))) return FALSE;
if (!(PixelFormat = ChoosePixelFormat(hDC, &pfd))) return FALSE;
if (!SetPixelFormat(hDC, PixelFormat, &pfd)) return FALSE;

How would I initialize the sample buffers and set the number of samples?

tcobbs
02-11-2002, 07:46 PM
The article posted above goes through the process step-by-step and includes all the necessary information. I have only tried it on an nVidia-based video card, but it was my sole source for adding FSAA support to a program I wrote, and it works fine.

Unfortunately, there are a significant number of changes from the standard initialization method, since you can't use the PIXELFORMATDESCRIPTOR method of setting up your pixel format. (As I'm sure you noticed, it doesn't provide any mechanism for requesting extra samples.) Please read the article carefully; skimming will likely result in missing something important.

--Travis Cobbs

Mr_Smith
02-12-2002, 04:50 AM
I tried the code but got nowhere :( <- why does it make it look angry? I'm only fed up.

I don't think the Kyro 2 supports WGL_ARB_pixel_format. If it's not in the extensions list it doesn't support it, does it, or is it part of the standard OpenGL stuff?

[This message has been edited by Mr_Smith (edited 02-12-2002).]

richardve
02-12-2002, 05:33 AM
Look at the name.. it's a WGL extension, not a GL extension..

tcobbs
02-12-2002, 07:22 AM
Originally posted by richardve:
Look at the name.. it's a WGL extension, not a GL extension..

To be a little more explicit, you have to use wglGetProcAddress to get the address of the wglGetExtensionsStringARB function, and then examine the WGL extension string returned by THAT function to determine whether or not your hardware supports the WGL_ARB_pixel_format extension. This is also in that document, along with the fact that you have to do all this AFTER creating your initial OpenGL window in order to get an OpenGL-compatible HDC to pass into the wglGetExtensionsStringARB function.
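Since the extension check comes up repeatedly in this thread, here is a minimal sketch in plain C of a token-exact search (the helper name is my own invention; a bare strstr can false-match when one extension name is a prefix of another, e.g. WGL_ARB_pixel_format vs a longer name starting with the same characters):

```c
#include <string.h>

/* Returns 1 if `name` appears as a complete, space-delimited token in
 * the extension string `list`, 0 otherwise.  A plain strstr() is not
 * enough: we must also check the token boundaries, or a search for
 * "WGL_ARB_pixel_format" would match inside a hypothetical
 * "WGL_ARB_pixel_format_float". */
int hasExtension(const char *list, const char *name)
{
    size_t len = strlen(name);
    const char *p = list;

    while ((p = strstr(p, name)) != NULL)
    {
        int startOK = (p == list || p[-1] == ' ');
        int endOK = (p[len] == '\0' || p[len] == ' ');
        if (startOK && endOK)
            return 1;
        p += len;    /* skip past this partial match and keep looking */
    }
    return 0;
}
```

Usage would be along the lines of `hasExtension(wglGetExtensionsStringARB(hDC), "WGL_ARB_pixel_format")`, once the function pointer has been fetched as described above.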

Also, while the document is fairly long, it's that long for a reason. The WGL_ARB_pixel_format extension is just not a trivial thing to throw into your program. If it were trivial, I'd just list the steps. Page 14 lists all the steps you have to perform (filling the page), but you have to refer back to the previous 13 pages of the document for detailed instructions on how to perform each step.

--Travis Cobbs

wedge
02-12-2002, 11:28 PM
As the paper tells you, you have to create a dummy window to get the proc address of the extension. This is essential. I don't like it, but it is the best way, isn't it?

Mr_Smith
02-13-2002, 06:12 AM
I did read it fully :(
It's just that when I tried the command they specified it didn't work, and I'm not too good with extensions. Does anyone know a good tutorial?

Chris_S
02-13-2002, 09:25 AM
I am in the same boat as Mr_Smith, and I am just as frustrated. I read the Pixel Format paper and it leaves out a LOT of information. I would REALLY like a response from you guys out there who got this to work.

I have several specific points and questions:

(1) For the record, I'm using W2K with a Nvidia GeForce2/MX400 (32MB RAM).

(2) If I use the FSAA mode in Display | Settings and manually turn antialiasing Off/On, it works from the Control Panel. Then I set it to "Allow programs to control AA" and the fun begins...

(3) I called wglGetExtensionsStringARB, and I do get WGL_ARB_extensions_string, WGL_ARB_multisample, WGL_ARB_pbuffer, WGL_ARB_pixel_format, etc. in the extension string. That indicates everything should work.

(4) I set up a temp window to init the proc addresses, and I do everything the paper says, but my scenes don't get rendered with antialiasing. And I get no AVs or errors.

(5) But I have a question about these crazy tokens and their values. There are two very similar named token pairs:

GL_SAMPLE_BUFFERS_ARB = $80A8;
GL_SAMPLES_ARB = $80A9;

WGL_SAMPLE_BUFFERS_ARB = $2041;
WGL_SAMPLES_ARB = $2042;

If I use the "GL_" versions, I get a PixelFormat, but there is no AA active.
If I use the "WGL_" values I get ZERO PixelFormats returned from the wglChoosePixelFormatARB() call. I am simply setting ...SAMPLE_BUFFERS_ARB=1 and ...SAMPLES_ARB=4 in both cases.

(6) Another question: what are these tokens used for? They are not mentioned anywhere. Should these be involved somehow?

MULTISAMPLE_ARB = $809D;
MULTISAMPLE_BIT_ARB = $20000000;

(BTW, is "Multisample" talking about the same thing as "Sample_Buffers"? I assume it is.)

(7) Is it required to set the Coverage params? I took this to be an option.

(8) Another thing: I tried out the Nvidia PixelFormat listing program (PxlFmt.Exe), and all 57 formats that pop out have '0' in the AA and AAS columns. Doesn't that mean that there are no multisample buffers available?

(9) I ran my own listing with wglGetPixelFormatAttribivARB() just like the paper describes, and I get the same identical results to the Nvidia program. I'm beginning to think the Nvidia guys couldn't get this working either in their program.

What do these results mean? Are there multisample pixel formats or aren't there? These listings say NO. But yet the Control Panel FSAA works, and the WGL extension string says it has multisample modes. This makes NO sense!


IMHO, this multisample area of OpenGL is an outright mess. Extremely confusing with thin documentation.

I hope someone out there can really help.

Thanks, Chris.

Diapolo
02-13-2002, 04:57 PM
I read the NVIDIA doc mentioned above, too, but I still have some questions.

WGL_ARB_extensions_string -> needed, and it's clear to me how to use it.

WGL_ARB_pixel_format -> needed, and I know how to initialize it, but my question is: if I choose a pf with that extension, I don't need the good old PFD, or do I?

WGL_ARB_multisample -> Am I right to assume that this WGL extension consists only of 2 new tokens, or are there even new functions?

WGL_SAMPLE_BUFFERS_ARB 0x2041
WGL_SAMPLES_ARB 0x2042

GL_ARB_multisample -> Guess I have to use glEnable(GL_MULTISAMPLE_ARB); to enable the AA after I've chosen the right pixel format, is that right?
Or is the AA active once I've chosen an AA-capable pixel format, so I don't need GL_ARB_multisample?
(By the way, the latest GLEXT.h in the NVSDK beta contains an obsolete function called glSamplePassARB that is not listed in the paper I got from SGI's extension registry ... time for a brand new one :) Cass?)

Another thing that really annoys me is that the extension registry seems a bit out of date (many new NV extensions are missing and some docs are obsolete).

Diapolo

[This message has been edited by Diapolo (edited 02-13-2002).]

Diapolo
02-13-2002, 05:25 PM
>(4) I setup a temp window to init the proc adr's, I do everything the paper says, but my scenes don't get rendered with antialiasing. And I get no AV's or errors.

I guess you have to enable the AA, but I'm not really sure about that.
Try: glEnable(GL_MULTISAMPLE_ARB);
You have to init the whole extension and the tokens first.
It would be cool if you could report back after you've tried it (I still have to code the stuff for WGL_ARB_pixel_format).

>(5) But I have a question about these crazy tokens and their values. There are two very similar named token pairs:

>GL_SAMPLE_BUFFERS_ARB = $80A8;
>GL_SAMPLES_ARB = $80A9;

>WGL_SAMPLE_BUFFERS_ARB = $2041;
>WGL_SAMPLES_ARB = $2042;

>If I use the "GL_" versions, I get a PixelFormat, but there is no AA active.
>If I use the "WGL_" values I get ZERO PixelFormats returned from the wglChoosePixelFormatARB() call. I am simply setting ...SAMPLE_BUFFERS_ARB=1 and ...SAMPLES_ARB=4 in both cases.

I think the GL versions are not for the WGL_ARB_pixel_format function, but for some glGet functions.

---

Accepted by the <pname> parameter of GetBooleanv, GetDoublev, GetIntegerv, and GetFloatv:

SAMPLE_BUFFERS_ARB 0x80A8
SAMPLES_ARB 0x80A9
SAMPLE_COVERAGE_VALUE_ARB 0x80AA
SAMPLE_COVERAGE_INVERT_ARB 0x80AB

---

Accepted by the <piAttributes> parameter of
wglGetPixelFormatAttribivEXT, wglGetPixelFormatAttribfvEXT, and the <piAttribIList> and <pfAttribIList> of wglChoosePixelFormatEXT:

WGL_SAMPLE_BUFFERS_ARB 0x2041
WGL_SAMPLES_ARB 0x2042

---

I got that from the SGI doc about multisample (and they are talking about the EXT version, not the ARB one -> obsolete). http://oss.sgi.com/projects/ogl-sample/registry/ARB/multisample.txt

>(6) Another question, what are these tokens used for? They are not mentioned anywhere. Should these be involved somehow?

>MULTISAMPLE_ARB = $809D;
>MULTISAMPLE_BIT_ARB = $20000000;

The first one is for glEnable(GL_MULTISAMPLE_ARB); and the second one should be for glPushAttrib(GL_MULTISAMPLE_BIT_ARB);
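To summarize which entry point consumes which token, here is a tiny illustrative sketch in C (the classifier function is my own invention; the hex values are the ones quoted from the specs above, normally supplied by glext.h / wglext.h):

```c
/* Token values quoted from the ARB_multisample / WGL_ARB_multisample
 * specs above; normally these come from glext.h and wglext.h. */
#define GL_SAMPLE_BUFFERS_ARB  0x80A8       /* glGetIntegerv query     */
#define GL_SAMPLES_ARB         0x80A9       /* glGetIntegerv query     */
#define GL_MULTISAMPLE_ARB     0x809D       /* glEnable / glDisable    */
#define GL_MULTISAMPLE_BIT_ARB 0x20000000   /* glPushAttrib mask bit   */
#define WGL_SAMPLE_BUFFERS_ARB 0x2041       /* pixel-format attribute  */
#define WGL_SAMPLES_ARB        0x2042       /* pixel-format attribute  */

/* Which entry point accepts a given token.  Purely illustrative. */
const char *tokenConsumer(unsigned int token)
{
    switch (token)
    {
    case GL_SAMPLE_BUFFERS_ARB:
    case GL_SAMPLES_ARB:
        return "glGetIntegerv";            /* query the current context */
    case GL_MULTISAMPLE_ARB:
        return "glEnable";                 /* toggle multisampling      */
    case GL_MULTISAMPLE_BIT_ARB:
        return "glPushAttrib";             /* save/restore enable state */
    case WGL_SAMPLE_BUFFERS_ARB:
    case WGL_SAMPLES_ARB:
        return "wglChoosePixelFormatARB";  /* request a pixel format    */
    default:
        return "unknown";
    }
}
```

The point is simply that the GL_ and WGL_ families never mix: the WGL_ tokens exist only in the attribute lists, and the GL_ tokens exist only inside an active rendering context.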

Hope I didn't write too many wrong things, but I'm still in the phase of understanding the whole AA / multisample stuff ;).

Diapolo

[This message has been edited by Diapolo (edited 02-13-2002).]

tcobbs
02-13-2002, 05:47 PM
You have to set WGL_SAMPLE_BUFFERS_ARB to GL_TRUE (1) to indicate that you want sample buffers. You have to set WGL_SAMPLES_ARB to the number of samples per pixel that you want. 2x AA uses 2, 4x uses 4, etc.

So, once you've performed your extensions check, and set everything up using the temp OpenGL window, your code should look something like this (note that this only supports 2x and 4x):




int myChoosePixelFormat(HDC hdc, BOOL use2xAA)
{
    int retValue = -1;
    int intValues[] = {
        WGL_SAMPLES_ARB, 4,
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB, GL_TRUE,
        WGL_SAMPLE_BUFFERS_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_ACCELERATION_ARB, WGL_FULL_ACCELERATION_ARB,
        WGL_PIXEL_TYPE_ARB, WGL_TYPE_RGBA_ARB,
        WGL_COLOR_BITS_ARB, 16,
        WGL_DEPTH_BITS_ARB, 16,
        0, 0
    };
    FLOAT floatValues[] = { 0.0f, 0.0f };
    int index;
    UINT count;

    if (use2xAA)
    {
        intValues[1] = 2; // 2x instead of 4x
    }
    if (wglChoosePixelFormatARB(hdc, intValues,
        floatValues, 1, &index, &count) && count)
    {
        retValue = index;
    }
    return retValue;
}


Once you have used the above to get your index, you do the rest using the standard WGL functions (unless you want to examine the extended attributes of the pixel format you got).
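As a footnote to the code above, the attribute list generalizes to any sample count. Below is a sketch of a list-building helper (the helper itself is my own; the #define values are the standard ones from wglext.h / the ARB specs, shown inline so the snippet is self-contained):

```c
/* Token values from the WGL_ARB_pixel_format and WGL_ARB_multisample
 * specs; normally these come from wglext.h. */
#define WGL_DRAW_TO_WINDOW_ARB    0x2001
#define WGL_ACCELERATION_ARB      0x2003
#define WGL_SUPPORT_OPENGL_ARB    0x2010
#define WGL_DOUBLE_BUFFER_ARB     0x2011
#define WGL_PIXEL_TYPE_ARB        0x2013
#define WGL_COLOR_BITS_ARB        0x2014
#define WGL_DEPTH_BITS_ARB        0x2022
#define WGL_FULL_ACCELERATION_ARB 0x2027
#define WGL_TYPE_RGBA_ARB         0x202B
#define WGL_SAMPLE_BUFFERS_ARB    0x2041
#define WGL_SAMPLES_ARB           0x2042

/* Fills `attribs` (room for at least 20 ints) with a zero-terminated
 * attribute/value list requesting `samples` samples per pixel.
 * Pass samples == 0 to request an ordinary non-AA format. */
void buildPixelFormatAttribs(int *attribs, int samples)
{
    int i = 0;
    attribs[i++] = WGL_DRAW_TO_WINDOW_ARB; attribs[i++] = 1;
    attribs[i++] = WGL_SUPPORT_OPENGL_ARB; attribs[i++] = 1;
    attribs[i++] = WGL_DOUBLE_BUFFER_ARB;  attribs[i++] = 1;
    attribs[i++] = WGL_ACCELERATION_ARB;   attribs[i++] = WGL_FULL_ACCELERATION_ARB;
    attribs[i++] = WGL_PIXEL_TYPE_ARB;     attribs[i++] = WGL_TYPE_RGBA_ARB;
    attribs[i++] = WGL_COLOR_BITS_ARB;     attribs[i++] = 16;
    attribs[i++] = WGL_DEPTH_BITS_ARB;     attribs[i++] = 16;
    if (samples > 0)
    {
        attribs[i++] = WGL_SAMPLE_BUFFERS_ARB; attribs[i++] = 1;
        attribs[i++] = WGL_SAMPLES_ARB;        attribs[i++] = samples;
    }
    attribs[i++] = 0; attribs[i++] = 0; /* terminator pair */
}
```

The resulting array is what gets handed to wglChoosePixelFormatARB; per the spec the float attribute list may be left empty.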

If you have a GeForce 3 or above and want Quincunx, you do the following AFTER your OpenGL rendering context has been created and made current:



glHint(GL_MULTISAMPLE_FILTER_HINT_NV, GL_NICEST);

Note that this will work in 4x mode as well, and give you a Quincunx-style sample pattern in 4x mode.

--Travis Cobbs


[This message has been edited by tcobbs (edited 02-13-2002).]

Chris_S
02-13-2002, 09:50 PM
Thanks for the reply, Travis. Your code is basically the same as what I was already doing. However, in any case, I pasted your exact attribute code into my module and ran it. I got the same thing as before: ZERO pixel formats returned.

This is the same thing that the listings from wglGetPixelFormatAttribivARB() are saying: there is no AA PixelFormat available. When I run the Nvidia PxlFmt.Exe program I get the same thing again -- none of the 57 formats has TRUE in the AA column.

Here's the list if you want to see it:

OpenGL Pixel Formats, Version 1.3.0
Nvidia GeForce2/MX400
========================================
Indx Type Accl Colr Dpth Sten AA AAS
========================================
1 RGBA Full 32 16 0 F 0
2 RGBA Full 32 24 0 F 0
3 RGBA Full 32 24 8 F 0
4 RGBA Full 32 16 0 F 0
5 RGBA Full 32 16 0 F 0
6 RGBA Full 32 24 0 F 0
7 RGBA Full 32 24 0 F 0
8 RGBA Full 32 24 8 F 0
9 RGBA Full 32 24 8 F 0
10 RGBA Full 32 16 0 F 0
11 RGBA Full 32 16 0 F 0
12 RGBA Full 32 24 0 F 0
13 RGBA Full 32 24 0 F 0
14 RGBA Full 32 24 8 F 0
15 RGBA Full 32 24 8 F 0
16 RGBA None 32 32 8 F 0
17 RGBA None 32 16 8 F 0
18 RGBA None 32 32 8 F 0
19 RGBA None 32 16 8 F 0
20 RGBA None 32 32 8 F 0
21 RGBA None 32 16 8 F 0
22 RGBA None 32 32 8 F 0
23 RGBA None 32 16 8 F 0
24 CIDX None 32 32 8 F 0
25 CIDX None 32 16 8 F 0
26 CIDX None 32 32 8 F 0
27 CIDX None 32 16 8 F 0
28 RGBA None 24 32 8 F 0
29 RGBA None 24 16 8 F 0
30 RGBA None 24 32 8 F 0
31 RGBA None 24 16 8 F 0
32 CIDX None 24 32 8 F 0
33 CIDX None 24 16 8 F 0
34 RGBA None 16 32 8 F 0
35 RGBA None 16 16 8 F 0
36 RGBA None 16 32 8 F 0
37 RGBA None 16 16 8 F 0
38 CIDX None 16 32 8 F 0
39 CIDX None 16 16 8 F 0
40 RGBA None 8 32 8 F 0
41 RGBA None 8 16 8 F 0
42 RGBA None 8 32 8 F 0
43 RGBA None 8 16 8 F 0
44 CIDX None 8 32 8 F 0
45 CIDX None 8 16 8 F 0
46 RGBA None 4 32 8 F 0
47 RGBA None 4 16 8 F 0
48 RGBA None 4 32 8 F 0
49 RGBA None 4 16 8 F 0
50 CIDX None 4 32 8 F 0
51 CIDX None 4 16 8 F 0
52 RGBA Full 16 16 0 F 0
53 RGBA Full 16 16 0 F 0
54 RGBA Full 16 16 0 F 0
55 RGBA Full 0 16 0 F 0
56 RGBA Full 0 24 0 F 0
57 RGBA Full 0 24 8 F 0
========================================

That's all the PixelFormats available, and none have AA=True. Obviously wglChoosePixelFormatARB() is going to return ZERO when I ask for a format with AA. That's exactly what's happening.

I could easily believe that perhaps my card does not support this, but if so then why does the WGL extension string include all the proper tokens? This is what makes no sense to me.

What video card do you have? I'd really like it if you could run the Nvidia test program and tell me if you see any AA columns in your listing with TRUE. Here's a link so you can go right to it:
http://developer.nvidia.com/view.asp?IO=ogl_multisample

I'm pretty convinced now that there is nothing wrong with my code for setting up the PixelFormat. The problem seems to be that there just aren't any formats with AA available. Why? I guess that's the big question.

I am wondering if there is some other "flag" or bit I need to set to enable multisample modes. I can't find any info or doc on that.

I'm at a loss to know what to try next when the Nvidia listing program itself says there are no AA formats.

HELP!

Thanks, Chris.

Chris_S
02-13-2002, 10:07 PM
Thanks, Diapolo, for your response. You mentioned glEnable(GL_MULTISAMPLE_ARB). That does enable/disable AA once it's working, but you have to get the context set up and working first. There's no way I can call that until I get at least one pixel format that I can set up first. (That's supposed to be ON by default anyway.)

You made another interesting point about glPushAttrib(GL_MULTISAMPLE_BIT_ARB). I have not used glPushAttrib() before. Now that sounds like the kind of thing that might be the "flag enable" bit I think is needed. Can I make that call before a rendering context is active?

Something has to be changed before I make the call to wglChoosePixelFormatARB() or there still won't be any PixelFormats available. What calls can be made prior to an active rendering context? Could it be an environment variable or something like that?

Thanks, Chris.

wedge
02-13-2002, 10:26 PM
I think we had the same problem on the GeForce2/MX. I don't think it works on that card!
You have to check for two extensions:
GL_ARB_multisample AND! WGL_ARB_multisample
I think one is missing on the GF2/MX.
wedge

Chris_S
02-13-2002, 10:43 PM
Thanks for the response. You may have just hit the nail on the head. Here are the actual extension strings for the regular and wgl extensions:

GL_ARB_imaging GL_ARB_multitexture GL_ARB_texture_compression GL_ARB_texture_cube_map GL_ARB_texture_env_add GL_ARB_texture_env_combine GL_ARB_texture_env_dot3 GL_ARB_transpose_matrix GL_S3_s3tc GL_EXT_abgr GL_EXT_bgra GL_EXT_blend_color GL_EXT_blend_minmax GL_EXT_blend_subtract GL_EXT_clip_volume_hint GL_EXT_compiled_vertex_array GL_EXT_draw_range_elements GL_EXT_fog_coord GL_EXT_packed_pixels GL_EXT_paletted_texture GL_EXT_point_parameters GL_EXT_rescale_normal GL_EXT_secondary_color GL_EXT_separate_specular_color GL_EXT_shared_texture_palette GL_EXT_stencil_wrap GL_EXT_texture_compression_s3tc GL_EXT_texture_edge_clamp GL_EXT_texture_env_add GL_EXT_texture_env_combine GL_EXT_texture_env_dot3 GL_EXT_texture_cube_map GL_EXT_texture_filter_anisotropic GL_EXT_texture_lod GL_EXT_texture_lod_bias GL_EXT_texture_object GL_EXT_vertex_array GL_EXT_vertex_weighting GL_IBM_texture_mirrored_repeat GL_KTX_buffer_region GL_NV_blend_square GL_NV_evaluators GL_NV_fence GL_NV_fog_distance GL_NV_light_max_exponent GL_NV_packed_depth_stencil GL_NV_register_combiners GL_NV_texgen_emboss GL_NV_texgen_reflection GL_NV_texture_env_combine4 GL_NV_texture_rectangle GL_NV_vertex_array_range GL_NV_vertex_array_range2 GL_NV_vertex_program GL_SGIS_generate_mipmap GL_SGIS_multitexture GL_SGIS_texture_lod GL_WIN_swap_hint WGL_EXT_swap_control

WGL_ARB_buffer_region WGL_ARB_extensions_string WGL_ARB_multisample WGL_ARB_pbuffer WGL_ARB_pixel_format WGL_EXT_extensions_string WGL_EXT_swap_control

The WGL string is there, but the GL string is NOT. Now I know why I feel like I've been beating my head against the wall!

What still does not make sense is that the AA mode works in the Control Panel, and they also show the option for "Control AA by Program". I guess the programming interface is broken on this model. (Thanks a lot, Nvidia.)

Wedge, since you seem to be on top of this problem, can you give me the names of a couple models that you know actually work? I've got to use something to finish my development with.

Thanks, Chris.

wedge
02-14-2002, 05:48 AM
"What still does not make sense is that the AA mode works in the Control Panel, and they also show the option for "Control AA by Program". I guess the programming interface is broken on this model. (Thanks a lot, Nvidia.)"

In my opinion the GF2 only has supersampling AA.

We work on GF3 cards; I don't know which models. Check with NVIDIA which cards support this. To use Quincunx you need a new driver version! It works with the newest.

Mr_Smith
02-14-2002, 05:58 AM
Has anyone got a sample program I can try? I'm having trouble getting this to work on the Kyro 2, and I'm not sure if it's me or the card.

Diapolo
02-14-2002, 04:20 PM
OK, I guess I'm very close to getting it working, but I have one problem left.

I use Borland C++ Builder 5 and its forms.
I only have one form for the OGL program.

Now if I set up a temporary GLRC in order to initialize the WGL functions, that works.
But if I try to call SetPixelFormat a second time (after using wglChoosePixelFormatARB) with a pixel format index that IS capable of AA (I checked the index and its values with the NVIDIA PixelFormat app), the call fails.

I think the NVIDIA doc mentions this in some way (when you try to use the desktop window's hWnd and hDC).

So I guess I have to set up a REAL new window in order to retrieve its hWnd and hDC first, then initialize the WGL functions, and after that the second call to SetPixelFormat should work.

But my question is (for many people here probably a pretty simple one *g*):

HOW do I create a temp window?
I guess I have to use the CreateWindow or CreateWindowEx function, but I have NEVER had to use them, because I use Borland's forms.

Any very easy samples for that (I know the NeHe tuts use the CreateWindowEx function, but SO MANY lines of code for "ONLY" a new window)?

Please HELP me :).

Diapolo

[This message has been edited by Diapolo (edited 02-14-2002).]

Chris_S
02-14-2002, 04:51 PM
Hi folks, I'm back again. This just keeps getting more frustrating all the time.

I just bought a new GF4/MX400 ($139). Of course I thought that any GF4 board must have the multisample programming support working by now. Wedge said that he has it working on GF3 boards.

Well, guess what: I ran the Nvidia PxlFmt.Exe program and it shows NO AA pixel formats again! My code says the same thing. So this is still not working.

I also noticed, as wedge mentioned, that this GF4/MX board does NOT have GL_ARB_multisample in the extension string either!

I am beginning to think that Nvidia does not support the programming interface for the multisample mode on ANY MX board. I'm going to take this GF4/MX board back and get the GF3/TI ($199) instead. Now I know why it costs more money.

My advice to you guys trying to get this working as well is to download the Nvidia PxlFmt.Exe program right away and test your cards first (I gave the link above). If you don't see any pixel formats with '1' in the AA column, your card is not going to work. You'd better find out if your card is even capable of offering these pixel modes.

To Diapolo:
Yes, you MUST use a temp wnd. Once you set a pixel format on a window, it won't let you change it. That's why your second call failed. I saw the same thing.

I don't use the Win API CreateWindow procs, but you do need the message pump/handler from the main application. It's best just to create a form at runtime. I use FormTmp = TForm.CreateNew(). You don't need to actually show the form. After you do the CreateNew, you can use the window Handle from it to get your DC set up and then the rendering context. Grab your pointers to the wgl procs, then deactivate the context, delete the context, and then delete the FormTmp. From there you can set up the real window with the new wgl calls for AA formats.

I've got the code written for that now, and at least that's what I expect to see happen whenever I get a board that actually gives me AA pixel formats!

Bye.. Chris.

tcobbs
02-14-2002, 06:38 PM
Originally posted by Chris_S:
I just bought a new GF4/MX400 ($139). Of course I thought that any GF4 board must have the multisample programming support working by now. Wedge said that he has it working on GF3 boards.

Well, guess what: I ran the Nvidia PxlFmt.Exe program and it shows NO AA pixel formats again! My code says the same thing. So this is still not working.

I also noticed, as wedge mentioned, that this GF4/MX board does NOT have GL_ARB_multisample in the extension string either!

I am beginning to think that Nvidia does not support the programming interface for the multisample mode on ANY MX board. I'm going to take this GF4/MX board back and get the GF3/TI ($199) instead. Now I know why it costs more money.


Well, unfortunately, nVidia listened to their marketing people instead of ignoring them when they were coming up with names for the new inexpensive boards. The GF4/MX is closer to a GeForce2 than a GeForce3, much less a GeForce4TI.

However, while I haven't tried the PxlFmt.exe program you mention, I have successfully produced FSAA windows using my GeForce 3 in my own program (using the info in the nVidia doc). So it does definitely work using a GeForce 3 (at least with the 23.11 drivers, anyway).

One thing I did notice, is that program crashes when using an FSAA window seem significantly more likely to cause the driver to get corrupted, requiring a reboot. In fact, on my computer at least, I occasionally have to power down. Rebooting doesn't always fix the problem.

--Travis Cobbs

dorbie
02-14-2002, 09:20 PM
Well it looks like we have one of the first of what could be many dissatisfied NVIDIA customers.

GF4 MX may yet be renamed Dan Vivoli's folly, but what was the alternative? It's not even a GF3, so you'd have had GF4 on the high end and GF2 uber-ultra-ptang-ptang at the low end.

It's still a nice card, but MX may be too cryptic a label.

wedge
02-14-2002, 10:44 PM
@Chris_S: Dorbie is right. If you want to use extensions, DON'T buy a card with MX. We also have problems on the GF2/MX. If I use the GL_TEXTURE_RECTANGLE_NV and GL_EXT_texture_env_combine extensions I get problems on the GF2/MX. But I never examined the problem closely; we do not support MX cards *g*.
The GF4/MX is based on the GF2/MX, so I think it will have the same problems and will not support all the extensions.
So buy a GF3 or GF4/TI. But the 4 does not have that many new things: a better shader, a new AA method, and it is a little faster. In my opinion the interesting thing on the GF4 is that you can use 2 displays.
I heard the new AA method on the GF4 only works with DirectX? Does anyone know anything about that?

Mr_Smith
02-15-2002, 12:54 AM
Does anyone have a simple program with antialiasing that I could use to test my Kyro 2? My prog doesn't seem to work, and I can't check the pixel formats because the Nvidia pixel format prog won't run on my PC either; I guess it requires an Nvidia card. I have a Kyro 2, which supposedly supports antialiasing :(

Basically, has anyone got a prog which will draw a triangle and show me it either antialiased or crashing? :(

Diapolo
02-15-2002, 02:20 AM
To Chris_S:

You are on Borland, too, right?
Or are there forms in VC++, too?

I grabbed some simple code for creating a window from some webpage; could anyone have a short look to see if it could be problematic anywhere?




LRESULT CALLBACK WindowProcedure(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    return DefWindowProc(hWnd, msg, wParam, lParam);
}

void SetTempWindow(void)
{
    bool bSuccess = true;
    HWND hWnd = NULL;
    HGLRC hGLRC = NULL;

    WNDCLASS wClass;
    wClass.style = CS_OWNDC;
    wClass.lpfnWndProc = WindowProcedure;
    wClass.cbClsExtra = 0;
    wClass.cbWndExtra = 0;
    wClass.hInstance = HInstance;
    wClass.hIcon = LoadIcon(NULL, IDI_WINLOGO); // system icon: instance must be NULL
    wClass.hCursor = LoadCursor(NULL, IDC_ARROW);
    wClass.hbrBackground = NULL;
    wClass.lpszMenuName = NULL;
    wClass.lpszClassName = "OpenGL";

    RegisterClass(&wClass);

    hWnd = CreateWindow("OpenGL",          // class name
                        "Temp OGL Window", // window name
                        WS_DISABLED,       // window style
                        CW_USEDEFAULT, CW_USEDEFAULT, // starting position
                        0, 0,              // width and height
                        NULL,              // parent handle
                        NULL,              // menu handle
                        HInstance,         // instance handle
                        NULL);             // other parameters

    if(hWnd == NULL)
    {
        // ERROR
    }

    ... more code
}


By the way, the AA works for me on a GF3, with 2 or 4 samples, in 16 or 32 bit, and with the GL_NV_MULTISAMPLE_FILTER_HINT :).

Diapolo

[This message has been edited by Diapolo (edited 02-15-2002).]

ScottManDeath
02-15-2002, 03:41 AM
Hi

I also have a GF2MX400, and nvpixelformat doesn't recognize AA.
But when I manually force the driver to do 2x or 4x AA, it gets rendered with AA, just not under the control of the app.

Bye
ScottManDeath

Diapolo
02-15-2002, 05:11 AM
I think the solution to the whole problem is that only the GeForce3 (and higher) supports multisampling AA (and therefore only GF3 and higher support the required GL and WGL extensions).
I dunno what the ATI cards are capable of.

A GF2 / GF2MX and all the NV cards below GF3 support a form of AA that is called supersampling. (Perhaps this can be turned on with glEnable(GL_POLYGON_SMOOTH); the red book says that blending has to be active for it to work. There is also a hint available that could be useful for choosing the quality: glHint(GL_POLYGON_SMOOTH_HINT, GL_NICEST or GL_FASTEST) ;).)
But it could be that the SS AA can only be turned on via the NV control panel.

Diapolo

[This message has been edited by Diapolo (edited 02-15-2002).]

tcobbs
02-15-2002, 07:41 AM
Originally posted by Diapolo:
A GF2 / GF2MX and all the NV cards below GF3 support a form of AA that is called supersampling. (Perhaps this can be turned on with glEnable(GL_POLYGON_SMOOTH); the red book says that blending has to be active for it to work. There is also a hint available that could be useful for choosing the quality: glHint(GL_POLYGON_SMOOTH_HINT, GL_NICEST or GL_FASTEST) ;).)
But it could be that the SS AA can only be turned on via the NV control panel.

While the GL_POLYGON_SMOOTH_HINT will likely work, it's completely unrelated to FSAA. You have to sort your polygons back to front before drawing them in order for the hint to produce good results.

Also, while it runs significantly slower, the supersampling AA implemented on the GF2, etc cards is capable of producing significantly better results in certain situations (for example, when used with alpha textures).

One thing I forgot to ask. For those of you trying this and having it fail on GF2 hardware, do you have the latest drivers from nVidia installed? I don't think the extension was supported in the pre-GF3 drivers at all.

--Travis Cobbs

ScottManDeath
02-15-2002, 07:56 AM
Hi

I'm using the 27.42 drivers, and as I said before, nvpixelformat doesn't let me test AA :(

Bye
ScottManDeath

Chris_S
02-15-2002, 11:07 AM
Success. I traded the GF4/MX for a GF3/TI200, and yes, it DOES have programmable pixel formats. Basically anything above an MX should have these modes, but the GF2, GF3, or GF4 MX all do NOT. The MX cards show 57 pixel formats; the TI200 shows me 76 pixel formats, with AA modes all over the place. Whopping big change. (BTW, this card sure is speedy.)

I will post a link shortly to an OpenGL test program I have that will work on any card, such as Mr_Smith's Kyro. But my guess is that it probably does not support PAA (programmable antialiasing).

A lot of cards list HRAA, FSAA, SSAA, etc. on their specs but you can't tell if they have PAA. I'd bet all the bottom end cards do not.

I'd be interested to know which of the other brands and models support PAA. The test program I'll link shortly has a place to email me your card's results. It would be interesting to know just how many, and which ones, support PAA. I'll post the results.

(TO: D, Yes I use Borland Delphi, C-Builder, JBuilder, MSVC, MASM.)

Thanks, Chris.

Diapolo
02-15-2002, 05:31 PM
What is programmable AA, or what do you mean by it?

I think the MULTISAMPLE extensions are only for doing MS AA, not for doing supersampling AA.

If this is the case, I don't understand why the other non-GF3/GF4 cards (should) list the WGL_ARB_multisample extension in the WGL extensions string.

You all said that you could not get any pixel format that supports AA with pre-GF3/GF4 cards, so I don't think MS AA will work on these cards at all.

But that leads me to the question: how do you use application-controlled AA on pre-GF3/GF4 cards, if not via GL_POLYGON_SMOOTH?

It would be pretty cool if someone could clear up these things :).

Diapolo

Chris_S
02-16-2002, 03:26 AM
Here is a link to my OpenGL test program. It will list all the extensions and PixelFormats of any 3D card, and give you a bunch of other stats on the card. I'd like to see the results of some of these other brands of cards. You can email me the results of your card using the link on the About tab.
http://www.linearx.com/cgi-bin/filebot.pl?/misc/OpenGLtp.zip

PAA is simply my term I use to indicate the presence of the ability to control the multisample mode under "progamming" control.

Supersampling I believe was a term created by Nvidia, and I don't think it is an OpenGL standard. I think Nvidia came out with that prior to the multisampling standard, and it has now been replace by multisampling. Supersampling was only controllable from the ControlPanel.

The MX cards can do multisampling, but you can only activate it globally using the Nvidia Control Panel (I guess you could consider this their supersampling mode). You can't do it using OpenGL calls from an application program. Hence what I mean by lack of 'PAA'.

As Wedge accurately stated, in order to have 'PAA' there must be two extensions present: GL_ARB_multisample and WGL_ARB_multisample. The MX boards do not have the former.

GL_POLYGON_SMOOTH and GL_POLYGON_STIPPLE have nothing to do with multisampling. They concern the rasterization pattern within the polygon itself.

Bye.. Chris.

Diapolo
02-16-2002, 06:23 AM
The download of your test program seems broken; I only get a 1KB file, and WinZip tells me it's corrupt :(.

OK, I guess we have got the same opinion about the "PAA".

But WHY does NVIDIA list the WGL_ARB_multisample extension on non-GF3/GF4 cards, if it's not usable for OGL programmers (we don't get an AA-capable pixel format, and because of that application-controlled AA is impossible on these cards under OGL)?

By the way, I don't think supersampling is an NVIDIA term, but I could be wrong.
It means you render the scene at a higher resolution and sample it down to the screen resolution, which means edges get smoothed.

Perhaps a person from NVIDIA could clarify why the WGL extension is listed, or why we can't use it.

Diapolo

UPDATE:

This seems interesting (got it out of the OGL 1.3 specs)

F.3 Multisample
Multisampling provides an antialiasing mechanism which samples all primitives multiple times at each pixel. The color sample values are resolved to a single, displayable color each time a pixel is updated, so antialiasing appears to be automatic at the application level. Because each sample includes depth and stencil information, the depth and stencil functions perform equivalently to the single-sample mode.
When multisampling is supported, an additional buffer, called the multisample buffer, is added to the framebuffer. Pixel sample values, including color, depth, and stencil values, are stored in this buffer.
Multisampling is usually an expensive operation, so it is usually not supported on all contexts. Applications must obtain a multisample-capable context using the new interfaces provided by GLX 1.4 or by the WGL_ARB_multisample extension.
Multisampling was promoted from the GL_ARB_multisample extension; the definition of the extension was changed slightly to support both multisampling and supersampling implementations.

--------------

There it says multisampling AND supersampling, but how can we use the supersampling :)?

[This message has been edited by Diapolo (edited 02-16-2002).]

Chris_S
02-16-2002, 08:58 AM
You probably did a right-click and got the HTML header page; that won't work. Just click on the link. The filebot should give you an initial HTML window that lists the file as 0.41MB. I just downloaded it and unzipped it with no problem.

Both the GL_... and WGL_... extensions have to be present to have programmable AA control. At least, that is the way the Nvidia cards behave. Don't ask me why. If some other people download my test program, we may see whether the other cards behave differently.

There are no "supersampling" tokens or extensions in OpenGL, that is why I say it is an Nvidia name.

The generic term for this whole area is AA - AntiAliasing. That is simply rendering to a higher resolution buffer, and then downsampling (averaging, interpolating) to the final screen resolution. OpenGL provides a standardized API call set under the name "multisampling" for this buffer.

Individual vendors have made up marketing names for this, like High Resolution AntiAliasing (HRAA), Full Screen AntiAliasing (FSAA), Accuview, etc. Nvidia has their special Quincunx AA algorithm, and so on. They all add different bells and whistles to the "AA" arena. But many if not most of the AA solutions are only controllable from the Control Panel - not through the app.

Bye.. Chris.

Diapolo
02-16-2002, 09:33 AM
The generic term for this whole area is AA - AntiAliasing. That is simply rendering to a higher resolution buffer, and then downsampling (averaging, interpolating) to the final screen resolution. OpenGL provides a standardized API call set under the name "multisampling" for this buffer.


I don't agree with you.
The generic term is antialiasing, that's right, but multisampling and supersampling are two very different methods of achieving it.
Supersampling renders the scene at a higher resolution and samples it down to the screen resolution, while multisampling works with a set of screen-resolution samples that are offset in some pattern and combined together (I don't know if that's exactly how it works, but I think that's the right direction for multisampling).



There are no "supersampling" tokens or extensions in OpenGL, that is why I say it is an Nvidia name.


There are no tokens with "supersampling" in them, but the OpenGL 1.3 Specification mentions:



Multisampling was promoted from the GL ARB multisample extension; The
definition of the extension was changed slightly to support both multisampling and
supersampling implementations.


And there you see, multisampling and supersampling are two different ways of implementing the AA :).

But still the question:

I know pre-GF3/GF4 cards do supersampling, but how can we control it, if the GL 1.3 specs say there is a way?
Perhaps NVIDIA has to update its drivers in order to get the supersampling AA to work?

Diapolo

UPDATE:

I got your test program and it works (I had to disable my download manager). I get 84 pixel formats, many of them with AA sample buffers (2 and 4 samples), like I saw before with the NVIDIA PixelFormat 1.0 app.

GeForce 3 with 27.30 drivers!

[This message has been edited by Diapolo (edited 02-16-2002).]

Chris_S
02-16-2002, 10:05 AM
As your quote says:

"Multisampling was promoted from the GL ARB multisample extension; The definition of the extension was changed slightly to support BOTH multisampling and supersampling implementations."

We may be splitting hairs here. They are talking about slight differences in implementation, and as it says they were BOTH rolled into the common "multisample" calls in OpenGL.

There are different option modes (1.5x1.5), (2x2), (4x4), Quin, etc., which may reflect these differences in the resolutions or algorithms used, and supersampling may at one time have been the name of one of these specific modes. But my point is there is only one AA frame buffer, and it is accessed by the "multisample" category of OpenGL calls.

Bye... Chris.

Chris_S
02-16-2002, 02:59 PM
I got interested in looking further at the AA capabilities of these other cards, so I downloaded some info from ATI and Kyro. The Kyro doc gives some general mention of AA abilities but not much in the way of public technical details. Whether it has prog AA is hard to determine, but doubtful. Perhaps Mr_Smith can test his card and report back.

The ATI site had some interesting technical info. They are now pushing a new AA method called "SmoothVision" with oversampling levels of 2X, 3X, 4X, 5X, and 6X. But the new approach they are attaching to the "SmoothVision" moniker is "jittered" oversampling pixel locations. Rather than using fixed pixel coords, either straight or on an angle, they are moving them around in a somewhat random fashion. Just like audio dithering, we now have video pixel dithering.

As ATI mentions the terms supersampling and multisampling are becoming blurred:

"The two most commonly used anti-aliasing solutions are 'Super sampling' and 'Multi-sampling'. Both of these methods blend sub-pixel samples to correct aliasing, but generate pixel samples in different ways. The exact distinction between these two methods is somewhat unclear, and has been defined mostly by marketing material more than anything else."

While ATI has some good enhancements to the AA scene, it appears that they are not at present supporting these AA modes under programmable control. I downloaded two files, glATI.h and wglATI.h, which they state give the OpenGL extension support available for their cards:

/* GL_ARB_multisample
** Rage 128 * based : Not Supported
** Radeon * based : Not Supported
*/
/* WGL_ARB_multisample
** Rage 128 * based : Not Supported
** Radeon * based : Not Supported
*/

In spite of their new AA enhancements, it appears that ATI is not yet supporting prog AA at app runtime. Control Panel setup is the only way. Perhaps that will change in the future.

So at present the only video cards (at least in the PC market) that fully support programmable AA seem to be the Nvidia TI or higher cards.

Bye.. Chris.

Mr_Smith
02-17-2002, 05:18 AM
I tried the program.

On the Kyro 2 the only information it could get was the extensions and the resolution (as well as the system info); everything else crashed, including the PixelFormats tab, which came up with the error "invalid DC".
On the Rage 128 I had exactly the same error, except it could get the name of the card.
On the Savage 4 everything worked; the tests all ran at about double the good value, except the last two tests, which locked the system up.

Was this written for Nvidia cards?
Help?

PS: all cards were in different systems, all running Windows 98.

Extensions list

Kyro 2
GL_ARB_multisample, GL_ARB_multitexture, GL_ARB_texture_compression,
GL_ARB_texture_env_add, GL_ARB_texture_env_combine, GL_ARB_texture_env_dot3,
GL_EXT_abgr, GL_EXT_bgra, GL_EXT_compiled_vertex_array,
GL_EXT_draw_range_elements, GL_EXT_packed_pixels, GL_EXT_secondary_color,
GL_EXT_separate_specular_color, GL_EXT_stencil_wrap, GL_EXT_texture3D,
GL_EXT_texture_compression_s3tc, GL_EXT_texture_env_add, GL_EXT_texture_env_combine,
GL_EXT_texture_filter_anisotropic, GL_EXT_vertex_array, GL_S3_s3tc,
WGL_ARB_extensions_string, WGL_EXT_swap_control, WGL_ARB_pixel_format,
WGL_EXT_swap_control, GL_EXT_bgra

Rage 128
GL_ARB_multitexture, GL_ARB_texture_border_clamp, GL_ARB_texture_env_add,
GL_ARB_transpose_matrix, GL_ARB_vertex_blend, GL_ATIX_pn_triangles,
GL_ATI_texture_mirror_once, GL_ATI_vertex_streams, GL_EXT_abgr,
GL_EXT_bgra, GL_EXT_clip_volume_hint, GL_EXT_compiled_vertex_array,
GL_EXT_draw_range_elements, GL_EXT_fog_coord, GL_EXT_packed_pixels,
GL_EXT_rescale_normal, GL_EXT_secondary_color, GL_EXT_separate_specular_color,
GL_EXT_texgen_reflection, GL_EXT_texture3D, GL_EXT_texture_edge_clamp,
GL_EXT_texture_env_add, GL_EXT_texture_env_combine, GL_EXT_texture_object,
GL_EXT_vertex_array, GL_KTX_buffer_region, GL_MESA_window_pos,
GL_NV_texgen_reflection, GL_SGI_texture_edge_clamp, GL_SGIS_texture_border_clamp,
GL_SGIS_texture_lod, GL_SGIS_multitexture, GL_WIN_swap_hint,
WGL_EXT_extensions_string, WGL_EXT_swap_control, WGL_ARB_make_current_read,
WGL_ARB_pbuffer, WGL_ARB_pixel_format, WGL_EXT_swap_control,
GL_EXT_bgra

Savage 4
GL_EXT_abgr, GL_EXT_bgra, GL_EXT_clip_volume_hint,
GL_EXT_compiled_vertex_array, GL_EXT_packed_pixels, GL_EXT_stencil_wrap,
GL_EXT_vertex_array, GL_KTX_buffer_region, GL_S3_s3tc,
GL_SGI_cull_vertex, GL_SGI_index_array_formats, GL_SGI_index_func,
GL_SGI_index_material, GL_SGI_index_texture, GL_WIN_swap_hint

The forum seems to totally lose spaces when I use them.
Also, does anyone know what the SGI extensions are on the Savage 4?

[This message has been edited by Mr_Smith (edited 02-17-2002).]

Chris_S
02-17-2002, 11:53 AM
Thanks for the info. Either the other boards are producing a very different (unexpected) set of extensions, or they have some bugs in their drivers. I've seen crashes on other cards that were due to their drivers.

I'll study your extension strings and see if I can find something. I hope you sent me the full data through the email link too. That is most helpful.

Bye.. Chris.

Mr_Smith
02-17-2002, 11:55 AM
I tried the email link, but while trying to examine the systems it crashed, just like when running it normally.

Chris_S
02-18-2002, 03:56 AM
I just uploaded a new build (106) of the OpenGLtp program. It should be a little more robust now, more error traps/msgs, and it only tries to setup an AA context on the "6-Tests" tab. All of the other text info tabs now use the legacy routines which should always pass with any card.

This program should run with any 3D card. However the problem is that the behavior of these other cards with the newer wgl procs and extensions is somewhat unknown.
They may not behave like the Nvidia cards/drivers, which is what I have mainly tested with.

We have a Savage4 board at the office around here somewhere that I tested months ago. It did have problems (bugs) in its drivers that only showed up on some of the graphics tests.

It looks like your Kyro returns GL_ARB_multisample but not WGL_ARB_multisample. That is exactly backwards of the Nvidia MX units. Like I said, these other brands may act different.

Give it another try. You should get a pixel format listing this time, and then you will see for certain whether it has AA formats.

Bye.. Chris.

Diapolo
02-18-2002, 06:39 AM
I saw that your program lists GL_EXT_bgra as a GLU extension ... why?

By the way, in my eyes NVIDIA produces the BEST OpenGL drivers, with the fewest bugs ... so I guess it's really common that drivers from other vendors have to be treated with much more safety code :D.

I really would like to get a statement from a NVIDIA OGL engineer about the AA thing, or perhaps someone from ATI could explain, if Radeon and other ATI cards will support "programmable AA".

Diapolo

[This message has been edited by Diapolo (edited 02-18-2002).]

Chris_S
02-18-2002, 09:59 AM
There are three extension string calls made, for the GL, WGL, and GLU. I show you exactly what comes back from each of these calls in individual fields. That's what the driver returned.

Yes, when it comes to 3D drivers, support, and performance Nvidia is way ahead of the pack. Of course there's not much of a pack left. With S3 gone, 3dfx gone, and now ATI merging with Hercules the pickings are slim.

The ATI cards don't support prog AA. They say so in their code, and if you look at Mr_Smith's listing for the Rage 128, it lacks both the GL_ARB_multisample and WGL_ARB_multisample extensions, which matches exactly what their doc said.

Bye.. Chris.

sdomine
02-18-2002, 02:58 PM
Here is a quick answer to what I understand people are asking about WGL_ARB_multisample and nVidia cards:

1) Why is WGL_ARB_multisample exposed on GF256/GF2 if I can't get any multisampled pixel format?

WGL_ARB_multisample doesn't tell you whether or not the current card supports multisample pixel formats. It only allows you to query for multisampled pixel formats without making the driver crash :-) . In the case of the GF2MX, you will be able to query, but you won't get any multisampled pixel format.

2) How can I tell if my card supports multisample pixel format?

If GL_ARB_multisample is exposed in the extension string.

3) Why isn't GL_ARB_multisample exposed for GF MX cards in supersampling mode?

Because, we don't fully comply with the requirements.

Hope this somewhat helps,

-s

Mr_Smith
02-22-2002, 02:27 PM
I tried downloading the new program; it downloads 400K-ish, but WinZip says it's corrupt?

Also, the Kyro supports GL_ARB_multisample; does that mean it supports multisampling?
And does anyone know how I can use it, since I don't have the other extension?

Diapolo
02-22-2002, 06:10 PM
Why don't you ask PowerVR for advice? Perhaps they have an answer for you.

I don't think you will be able to use the AA if you can't request a pixel format that is capable of AA (WGL_ARB_multisample would be necessary for that).

Did you try the NV PixelFormat app? I don't think it's "NVIDIA cards only".

You could try to use the WGL_ARB_multisample tokens even if the extension is not listed in the WGL extension string; perhaps it works and only the extension listing is missing in the current drivers.

Good luck!

Diapolo

Chris_S
02-22-2002, 09:52 PM
Yep, the ZIP was bad. This is just plain weird. I zipped up a fresh copy, uploaded it again via FTP, then downloaded it, and it was bad too. Something seems to have gone haywire at the ISP with the FTP server, or my FTP client is hosed.

Anyway I uploaded another copy using IE5 (html) and just downloaded it, and it unzipped fine.

Yes, see if you get any AA PixelFormats listed on Tab-5.

Like I said before, this program should work with any 3D board.

Bye.. Chris.

Mr_Smith
02-23-2002, 08:11 AM
The program now runs and all pages work, but I'm not sure about the info on one of them, the Limits page:
Element Vertices/Indices, Texture Level and Min/Max = 2097151k?

I get no pixel format list.
The tests all run at about the good setting, except:
Pipes 20 fps :)
Planes 920 fps :)
Sprites 500 fps :)
Particles 400 fps :)
Kyro culling is nice :)

I'm trying to contact PowerVR about the antialiasing now.

PS: why don't I get any pixel formats?
And what is this extension it says mine supports: WGL_ARB_pixel_format?

Chris_S
02-23-2002, 01:09 PM
I'm glad you finally got some results. Are you saying that the only Extension string you saw listed in all three windows was just WGL_ARB_pixel_format? That's it?

According to the OpenGL Extensions Registry, you must have WGL_ARB_extensions_string to have WGL_ARB_pixel_format.

You get no pixel formats listed - strange. Obviously if you are seeing the graphics tests, it found a pixel format. Which pixel format index is shown that it is running the tests with?

Why they don't show up in the enum list is wild. The code simply tests for the wglGetPixelFormatAttribivARB proc. If it's present, it uses that function to enum the formats; if not, it uses the old DescribePixelFormat to enum the list. Yet you get no list.

I may have to buy some of these other cards like the Kyro to test with as well.

Bye.. Chris.

Chris_S
02-23-2002, 01:46 PM
I put another new build (106) of the OpenGLtp program on the server. I'm making some blind assumptions here.

Assuming that the wglGetPixelFormatAttribivARB proc is getting a mapped address, but it is somehow failing to return any formats, I now force it to call the old DescribePixelFormat if no formats are returned from the new proc. Hopefully that should always produce a listing.

I am also now displaying the Enum proc that was actually used below the format ListView.

If you can, try this build.

Bye.. Chris.

Mr_Smith
03-01-2002, 06:19 AM
It works now and tells me the pixel formats, but lists none as having AA :(

Does this mean the Control Panel is the only way to activate it?
:(

Chris_S
03-01-2002, 04:39 PM
Yep. I didn't think that the Kyro would have programmable AA. I'd appreciate getting a copy of your results for our 3D support; click on the SendData link in About and it will do it all automatically.

If you want to get a board with prog AA, the lowest cost solution is to buy a GF3/TI200. I paid about $160.

Bye.. Chris.

Mr_Smith
03-02-2002, 05:15 AM
I clicked send; dunno if it worked though, as my hotmail account is dreadful.

p9931734
03-04-2002, 06:55 PM
Help!!!

When I tried

glSampleCoverageARB = (AAA)wglGetProcAddress("glSampleCoverageARB");

the function pointer is assigned as 0x00000000?
However, all of my WGL extension code works well.

Thanks in advance

Chris_S
03-04-2002, 11:59 PM
According to the GL_ARB_multisample document there are two dependencies:

WGL_EXT_extensions_string is required.
WGL_EXT_pixel_format is required.

Read that document in the Extension Registry on the opengl.org web site. Your card may not support these functions.

p9931734
03-05-2002, 04:52 PM
When I use wglGetExtensionsStringARB, the result is "WGL_ARB_buffer_region,
WGL_ARB_extensions_string,
WGL_ARB_multisample,
WGL_ARB_pbuffer,
WGL_ARB_pixel_format,
WGL_EXT_extensions_string,
WGL_EXT_swap_control",
hence I believe the driver and hardware do support this function. So, is there any other possible solution? THANKS :), and thanks chris_s.

Chris_S
03-05-2002, 07:24 PM
If you are getting those strings, then yes, the function should get a mapped address. I just checked this in one of my apps, and it did return an address for the proc. This is the exact call I make (in Delphi):
glSampleCoverageARB := wglGetProcAddress('glSampleCoverageARB');

This was on a GF3/TI200, with 23.11 driver. What video card and driver are you using?
Try the latest driver.

Bye.. Chris.

Chris_S
03-05-2002, 07:26 PM
Oh, one more thing. You can only make this call when you've got an active rendering context. Don't know if you are aware of that.

Bye.. Chris.

p9931734
03-06-2002, 06:10 AM
I'm using an else syn 2 card (as for the driver I'm not very sure, but I'm pretty sure it's the latest).

I did call the function after I had an active rendering context, but it's still not working. :<

Diapolo
03-06-2002, 07:57 AM
This should be the C code to get the function pointer:

glSampleCoverageARB = (PFNGLSAMPLECOVERAGEARBPROC)wglGetProcAddress("glSampleCoverageARB");

I dunno why you used (AAA) ... perhaps that's the problem?

Diapolo

p9931734
03-06-2002, 02:31 PM
Actually, I didn't use (AAA) in my program; it's just that the code is a bit long and I'm in a rush, so I replaced it with (AAA). Anyway, PFNGLSAMPLECOVERAGEARBPROC is a user-defined data type created with typedef, so I think any naming convention will be just fine.

MikeCarter
12-18-2002, 11:30 AM
The problem with not seeing any PFDs with multisample capability is that you have to turn UNIFIED BACK BUFFER *OFF*. This same thing had me stumped for a while, too.

halo
11-14-2004, 03:38 PM
What a horrible design.

How about a new command.

AntiAlias True.

rgpc
11-14-2004, 07:07 PM
Personally, I didn't find ARB_multisample too bad at all. Once you set up your pixel format, you can just use glEnable/glDisable to turn it on and off.

Of course, this thread is almost two years old, so people are probably used to ARB_multisample by now, and I doubt the ARB would have designed it the way they did without reason (for example, you need a context to be able to query the pixel formats, so you have to create one first; then you can create another with your desired multisample level). Just specifying "multisample = true" ignores the fact that multisampling will be available at certain resolutions, probably certain depths, with certain features enabled/disabled, and in a variety of levels (2x, 2Qx, 4x, etc.).

halo
11-15-2004, 08:45 AM
I don't care about all that! I just want my screen to not be jagged!

dorbie
11-15-2004, 03:58 PM
Originally posted by halo:
I don't care about all that! I just want my screen to not be jagged!

If you want to be lazy about it, then just go to your desktop settings and force antialiasing on.

AA cannot be as programmatically simple as you want it to be. There are framebuffer resources and hardware configuration requirements for this. In addition, there are more features possible with multisample AA, like multiple passes for higher quality and alpha-to-mask for order-independent transparency.

The OpenGL API is pretty clean, believe it or not.