@NVIDIA: GF4 Z-Buffer problem in OGL?



Diapolo
10-15-2002, 10:41 PM
Hi to all the NV guys here,

I've heard from quite a few people that they see some sort of Z-buffer problem (Z-fighting) in, for example, Serious Sam 2 while using 32-bit color depth.
Their guess is that the Z-buffer is "broken" or that only a 16-bit Z-buffer is being used (that was my guess, too).
Could it be that if an application coder requests a 32-bit Z-buffer on NV hardware, he gets a 16-bit one instead of a better-fitting 24-bit one?
Oh, and this seems to be a problem only on GF4-class hardware.
My guess is that there might be an error in NV's ChoosePixelFormat algorithm.
And I think that if the WGL_ARB_pixel_format extension were used, the request for a 32-bit Z-buffer would generate an error, and then the application coder could have seen that he is only getting a 16-bit Z-buffer.

Not a big thing to fix, if this bug really exists, right?
I can't confirm it myself, because I use a GF3, which doesn't have this problem.

Could some NV guy look into this, or have other people here had similar experiences?

Regards,
Diapolo

Nutty
10-15-2002, 11:45 PM
If you ask for a 32-bit Z-buffer in Windows and one isn't available, it'll give you a 16-bit one instead of a 24-bit one.

NVIDIA has never supported 32-bit Z-buffers, and the difference between 24-bit and 32-bit is minimal. It's really a bug in Windows, or in the application.

You should check whether a 32-bit Z-buffer is available; if not, request 24. Don't leave it up to Windows to give you something it thinks will do.
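For example, a rough sketch of that check-and-retry approach (assuming <windows.h>, a valid HDC named hdc, and an RGBA double-buffered window format; untested):

// Sketch: ask for 32 bits of Z, check what ChoosePixelFormat actually
// picked, and retry with 24 (then 16) if it came back short.
PIXELFORMATDESCRIPTOR pfd = { sizeof(PIXELFORMATDESCRIPTOR), 1 };
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 32;

const BYTE depthTries[] = { 32, 24, 16 };
int format = 0;
for (int i = 0; i < 3; ++i)
{
    pfd.cDepthBits = depthTries[i];
    format = ChoosePixelFormat(hdc, &pfd);
    if (format != 0)
    {
        PIXELFORMATDESCRIPTOR chosen;
        DescribePixelFormat(hdc, format, sizeof(chosen), &chosen);
        if (chosen.cDepthBits >= depthTries[i])
            break;   // got at least what we asked for
    }
}
SetPixelFormat(hdc, format, &pfd);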

Nutty

Korval
10-15-2002, 11:58 PM
The odd part about this is that my programs (or, at least, the last ones I tested in my pre-GeForce4 days) didn't have a problem with degrading from a 32-bit z-buffer to a 24-bit one. If you asked for 32-bit, the driver would happily give you a 24-bit one. Later in my programming, I specifically request 24-bit Z with 8-bit stencil (for the stencil, of course), so I don't have a problem with newer software.

I guess something has changed in recent drivers. I'd get into the habit of just requesting the 24/8 buffer instead of the 32.

Diapolo
10-16-2002, 12:27 AM
You are both right.
I would request a 24-bit Z-buffer with an 8-bit stencil buffer, too.
But it seems that quite a few serious game coders out there didn't do this and now have a 16-bit Z-buffer where they wanted a 32-bit Z-buffer (and could have gotten a 24-bit one).
I think one could blame the WGL ChoosePixelFormat function for that, because it defaults to a 16-bit Z-buffer on GF4 if a 32-bit Z-buffer is requested (but not available).
And another thing: I would always use the WGL_ARB_pixel_format extension.

But what IS strange is the fact that this seems to happen only on GF4-based cards.

So perhaps NV should look into this one and implement a work-around or a fix for it.

I read in a forum that the customers who only PLAY the games get really angry when they see the Z-fighting on GF4 but not on ATI cards.

Diapolo

mcraighead
10-16-2002, 12:57 AM
If the app doesn't ask for 24 bits of Z, it won't get 24 bits of Z. I don't see how the SS2 issue is anything other than an app bug. Since 16 bits of Z can be faster, we'd rather give an app less bits if it doesn't think it needs those bits.

- Matt

Diapolo
10-16-2002, 02:57 AM
Thanks for your reply, Matt :).

I think the app should properly check that the Z-buffer depth of the chosen PF equals the requested Z-buffer depth, and perhaps that check isn't there in SS2.
So from that point of view it's the app's fault.
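A quick sanity check after context creation could look roughly like this (just a sketch; assumes a current GL context):

// Sketch: verify the depth/stencil precision the context actually got.
GLint depthBits = 0, stencilBits = 0;
glGetIntegerv(GL_DEPTH_BITS, &depthBits);
glGetIntegerv(GL_STENCIL_BITS, &stencilBits);
if (depthBits < 24)
{
    // Fewer bits than the engine needs: recreate the context with an
    // explicit 24-bit depth / 8-bit stencil request, or warn the user.
}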

In other words, you're saying that the NV OGL driver defaults to the lowest available Z-buffer depth if the requested depth isn't available:
32 Bits -> 16 Bits
24 Bits -> 24 Bits
16 Bits -> 16 Bits

But then, why does this happen only on GF4-class hardware and not on GF3 (like I said, I'm only relaying other users' experiences)?

Any idea?

Diapolo

ToolChest
10-16-2002, 05:46 AM
Originally posted by mcraighead:
If the app doesn't ask for 24 bits of Z, it won't get 24 bits of Z. I don't see how the SS2 issue is anything other than an app bug. Since 16 bits of Z can be faster, we'd rather give an app less bits if it doesn't think it needs those bits.

- Matt

That’s bs… If I ask for 32bit I’m obviously interested in the precision and not the speed.

Relic
10-16-2002, 06:41 AM
Never forget the wonders of ChoosePixelFormat. You can't rely on anything when using it.
Long story short: just do your own pixel format evaluation with DescribePixelFormat or the WGL extensions, to be sure you're getting what you're asking for.
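Something along these lines, for instance (a sketch only; assumes a valid HDC named hdc and that an accelerated, double-buffered 32-bit RGBA window format is wanted):

// Sketch: walk every pixel format the driver exposes and pick the one
// with the deepest Z-buffer among the accelerated 32-bit RGBA formats.
int best = 0, bestDepth = -1;
int count = DescribePixelFormat(hdc, 1, sizeof(PIXELFORMATDESCRIPTOR), NULL);
for (int i = 1; i <= count; ++i)
{
    PIXELFORMATDESCRIPTOR pfd;
    DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);

    if (!(pfd.dwFlags & PFD_DRAW_TO_WINDOW)) continue;
    if (!(pfd.dwFlags & PFD_SUPPORT_OPENGL)) continue;
    if (!(pfd.dwFlags & PFD_DOUBLEBUFFER))   continue;
    if (pfd.dwFlags & PFD_GENERIC_FORMAT)    continue;   // skip MS software formats
    if (pfd.iPixelType != PFD_TYPE_RGBA)     continue;
    if (pfd.cColorBits < 24)                 continue;

    if (pfd.cDepthBits > bestDepth)
    {
        bestDepth = pfd.cDepthBits;
        best      = i;
    }
}
// 'best' now holds the index to hand to SetPixelFormat (0 if nothing matched).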

Nutty
10-16-2002, 07:48 AM
That’s bs… If I ask for 32bit I’m obviously interested in the precision and not the speed.

If you ask for 32-bit and don't bother checking what you actually get, then you can't be that bothered really, can you?

If you don't get 32-bit, then ask for 24-bit; if you don't get that, then try it with an 8-bit stencil; if you still don't get it, then resort to 16-bit.

It's not exactly rocket science now, is it?

Nutty

Humus
10-16-2002, 08:33 AM
The point is that one should get as close as possible to what's requested. If the card supports 16 and 24 and the app requests 32, it doesn't make much sense to give it 16.

ToolChest
10-16-2002, 08:33 AM
Originally posted by Nutty:
If you ask for 32-bit and don't bother checking what you actually get, then you can't be that bothered really, can you?

If you don't get 32-bit, then ask for 24-bit; if you don't get that, then try it with an 8-bit stencil; if you still don't get it, then resort to 16-bit.

It's not exactly rocket science now, is it?

Nutty

Ya, nice try... here is a quote from the MSDN under ChoosePixelFormat:

"If the function succeeds, the return value is a pixel format index (one-based) that is the closest match to the given pixel format descriptor."

Hmmm... ya, I think 24 is closer to 32 than 16 is... :p

It would be nice if the function worked properly...

SirKnight
10-16-2002, 09:36 AM
Yeah, I totally agree with John and Humus. It does NOT make sense that if you request a 32-bit Z you get a 16-bit one when the hardware can't support 32-bit Z. To me it's very obvious that if one specifies 32 bits, one wants a high-precision Z-buffer.

Yeah, it would be nice if ChoosePixelFormat DID work like it's supposed to. :) The best thing for now is of course to check whether what you requested can be chosen, and if not, specify the next best thing yourself. But really, we shouldn't have to do that.

-SirKnight

ToolChest
10-16-2002, 10:05 AM
Quick question:

Some of the older DX games would give the player a list of supported formats to choose from. Do you think this is a cool feature, or too much of a hassle for the average gamer? I've never been able to make a decision either way...

Thanks…

John.

Btw: from what I remember, you had to choose before playing.

[This message has been edited by john_at_kbs_is (edited 10-16-2002).]

dorbie
10-16-2002, 10:12 AM
Come on, John has a point.

Making proclamations about the logic of defaulting to fast, on the grounds that the app didn't explicitly ask for 24-bit when it requested 32-bit, doesn't hold a lot of water.

This stuff has always been a complete mess though.

Matching pixel formats is not always straightforward. There may be some other visual property being requested that makes the 16 bit z visual a 'closer' match.

Diapolo
10-16-2002, 11:03 AM
You should all see the gamer's point of view, too.
If someone sees Serious Sam 2 on his GF4 and wonders about the "Z-fighting artifacts", and then compares it to SS2 on an ATI Radeon or GF3 card (where there seems to be a 32- or 24-bit Z-buffer), then he thinks his GF4 is broken or that the drivers SUCK (which is bad for NV's financials and its standing with customers).

If this really is the issue, then NV should change the default behavior of the PF-choosing algorithm so that a high-precision Z-buffer is chosen if one is requested (whether 32 or 24 bits doesn't matter, only "high" precision).

Have any of you observed Z-fighting in OGL games on GF4, but not on other cards where there should not be Z-fighting?

Like I said, I can't prove it to be a GF4-only problem, but I really would like to track this "bug" / "default driver behavior" down and then talk about a possible solution ;).

Regards,
Diapolo

mproso
10-16-2002, 11:57 AM
This is probably off topic, but did anyone notice this in the PIXELFORMATDESCRIPTOR specs:

cColorBits
Specifies the number of color bitplanes in each color buffer. For RGBA pixel types, it is the size of the color buffer, excluding the alpha bitplanes. For color-index pixels, it is the size of the color-index buffer.

So according to this we should request 24 bits for cColorBits and 8 bits for the cAlphaBits parameter to get 32-bit RGBA color, but we all use 32 for the cColorBits parameter.
This is from the WGL_ARB_pixel_format spec:

WGL_COLOR_BITS_ARB
The number of color bitplanes in each color buffer. For RGBA pixel types, it is the size of the color buffer, excluding the alpha bitplanes. For color-index pixels, it is the size of the color index buffer.

Same thing.
Am I reading the spec right, or what?
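Read literally, the descriptor would then be filled out like this (just an illustrative sketch of that reading, not how most code actually does it):

// Taking the wording literally: color bits exclude the alpha planes.
PIXELFORMATDESCRIPTOR pfd = { sizeof(PIXELFORMATDESCRIPTOR), 1 };
pfd.dwFlags      = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType   = PFD_TYPE_RGBA;
pfd.cColorBits   = 24;   // R + G + B only, per the spec wording
pfd.cAlphaBits   = 8;    // alpha planes requested separately
pfd.cDepthBits   = 24;
pfd.cStencilBits = 8;
// ... even though in practice most code simply passes 32 for cColorBits.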

Thanks.

Diapolo
10-16-2002, 12:00 PM
This is the code I use for that, and it works :).

// Depth / color attribute pairs for wglChoosePixelFormatARB, chosen from the
// current desktop color depth (24/8/24/8 in 32-bit mode, 16/0/16/0 in 16-bit mode):
iAttributes[10] = WGL_COLOR_BITS_ARB;
iAttributes[11] = GetCurColorDepth() == 32 ? 24 : 16;
iAttributes[12] = WGL_ALPHA_BITS_ARB;
iAttributes[13] = GetCurColorDepth() == 32 ? 8 : 0;
iAttributes[14] = WGL_DEPTH_BITS_ARB;
iAttributes[15] = GetCurColorDepth() == 32 ? 24 : 16;
iAttributes[16] = WGL_STENCIL_BITS_ARB;
iAttributes[17] = GetCurColorDepth() == 32 ? 8 : 0;
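For completeness, the list would then be handed to the extension roughly like this (a sketch; assumes the earlier array entries hold the usual WGL_DRAW_TO_WINDOW_ARB / WGL_SUPPORT_OPENGL_ARB / WGL_DOUBLE_BUFFER_ARB / WGL_PIXEL_TYPE_ARB pairs, that the array has room for the terminating zero, that a context is already current so wglGetProcAddress works, that wglext.h provides the PFNWGLCHOOSEPIXELFORMATARBPROC typedef, and that hdc and pfd exist in the surrounding code):

iAttributes[18] = 0;   // the attribute list must be zero-terminated

int  pixelFormat = 0;
UINT numFormats  = 0;
PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB =
    (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress("wglChoosePixelFormatARB");

if (wglChoosePixelFormatARB &&
    wglChoosePixelFormatARB(hdc, iAttributes, NULL, 1, &pixelFormat, &numFormats) &&
    numFormats > 0)
{
    SetPixelFormat(hdc, pixelFormat, &pfd);   // a PFD is still needed for SetPixelFormat
}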

Diapolo

ToolChest
10-16-2002, 12:03 PM
Ya, I read it the same way. I'm not near my dev PC, but I swear I get 32 back for color bits. That's strange...

Diapolo
10-16-2002, 12:07 PM
If you are talking about
glGetIntegerv(GL_INDEX_BITS, &iIndexBits); I guess you get 32 because of 24-bit RGB plus 8-bit alpha.
And maybe that's why it's called GL_INDEX_BITS and not GL_COLOR_BITS?
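One way to check it, for instance (a sketch; needs a current GL context):

// Sketch: query the individual component sizes of the current pixel format.
GLint r, g, b, a;
glGetIntegerv(GL_RED_BITS,   &r);
glGetIntegerv(GL_GREEN_BITS, &g);
glGetIntegerv(GL_BLUE_BITS,  &b);
glGetIntegerv(GL_ALPHA_BITS, &a);
// A typical 32-bit mode should report 8/8/8/8 here, i.e. 24 color bits
// plus 8 alpha bits, which matches the PIXELFORMATDESCRIPTOR wording.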

Diapolo

BTW: back to the real topic ;) ... please *g*.

ToolChest
10-16-2002, 12:13 PM
No, I don't use color index... I'll look at it when I get home and see if I'm crazy (highly possible)... ;)

mcraighead
10-16-2002, 10:43 PM
You completely misinterpreted what I said.

If the app asks for 16, we will give them 16.

If the app asks for 24 or 32, we should give them 24.

But if an app asks for 16, it is just stupid for us to give them 24.

To the best of my knowledge, SS asks for 16.

Also, we don't implement ChoosePixelFormat, Microsoft does. We only implement the WGL_ARB_pixel_format version of it. Most likely, if you ask for 32, you will get a Microsoft pixel format that has 32 depth bits but that is not accelerated...

- Matt

Diapolo
10-17-2002, 12:32 AM
Originally posted by mcraighead:
You completely misinterpreted what I said.

If the app asks for 16, we will give them 16.

If the app asks for 24 or 32, we should give them 24.

But if an app asks for 16, it is just stupid for us to give them 24.

To the best of my knowledge, SS asks for 16.

Also, we don't implement ChoosePixelFormat, Microsoft does. We only implement the WGL_ARB_pixel_format version of it. Most likely, if you ask for 32, you will get a Microsoft pixel format that has 32 depth bits but that is not accelerated...

- Matt

Hi Matt,

I'm pretty sure most of us didn't misinterpret your statement.

It's obvious that I want 16 bits of Z-buffer if I request 16 bits; there's no argument that you should hand out 24 bits of Z-buffer in that case :).

What I meant is that if a 32-bit Z-buffer is requested via ChoosePixelFormat, a 16-bit one is chosen.
And that is what I complain about.

There seems to be a difference between GF3 / ATI Radeon and GF4 in SS2.
Are you really sure this game only requests a 16-bit Z-buffer?

I have been told that on GF4 there is obviously some sort of Z-fighting / flickering, but that it is not visible:
1. in D3D mode
2. on GF3 and ATI Radeon cards

And I heard from several people that this is not only the case in SS2, so perhaps you could look into it?

Regards,
Diapolo

harsman
10-17-2002, 02:08 AM
Diapolo, games that flicker on your GF4 probably only request 16 bits of depth, or don't care what depth they get. However, they *do* probably request 32 bits of color, which on old GeForce cards used to give you 24 bits of depth. They relied on this behavior, so they break when they get what they ask for instead of what the hardware was forced to give them in the old days.

Tobias
10-17-2002, 02:13 AM
Originally posted by mcraighead:
You completely misinterpreted what I said.

But if an app asks for 16, it is just stupid for us to give them 24.

To the best of my knowledge, SS asks for 16.
- Matt

I think it is stupid not to fix the problem, because every OpenGL game (RTCW, Medal of Honor, ...) looks terrible on a GeForce4 compared to a GeForce3 or an ATI card. There are many angry GeForce4 users (like me), and there are more of them every day...

Serious Sam wants a 32-bit Z-buffer... On every other card the game gets a 32-bit or 24-bit Z-buffer; only on GeForce4 cards does it get a 16-bit Z-buffer.
This behavior can be seen in many other games...

ToolChest
10-17-2002, 05:20 AM
Originally posted by mcraighead:
You completely misinterpreted what I said.

If the app asks for 16, we will give them 16.

If the app asks for 24 or 32, we should give them 24.

But if an app asks for 16, it is just stupid for us to give them 24.

To the best of my knowledge, SS asks for 16.


Ahhhh, yes I did... I also didn't realize that the older games are requesting 16... in that case it sounds like a game bug.

John.

Humus
10-17-2002, 09:22 AM
Originally posted by Tobias:
I think it is stupid not to fix the problem, because every OpenGL game (RTCW, Medal of Honor, ...) looks terrible on a GeForce4 compared to a GeForce3 or an ATI card. There are many angry GeForce4 users (like me), and there are more of them every day...

The game writers should fix their games rather than the driver having to guess what the app really wants or needs. It could be useful, though, to have a "force 24-bit Z" checkbox in the driver's control panel if it's a common problem, if that doesn't already exist.

knackered
10-17-2002, 10:42 AM
Just to add something - one of my fellow developers has the unfortunate task of writing an app using WorldToolKit (don't know if you've heard of it; it's quite an out-of-date scenegraph API).
Anyway, he is getting z-fighting artifacts on JUST the GeForce4s in the office - it works fine with all the GeForce2s. We naturally have the latest Detonator drivers.
So it would seem this is a GF4-specific driver problem. Honestly, blaming the app is just not on - your drivers should at least be consistent.

knackered
10-17-2002, 11:08 AM
BTW, I must be missing something, because people seem to be preoccupied with this ChoosePixelFormat lark, and not with the fact that there is obviously something wrong if the same drivers work fine with a GF2 but show z-fighting on a GF4. Are you saying that the GF2 cannot support a 16-bit Z-buffer and therefore gives you a 24-bit one instead, but the GF4 can support 16 bits and so gives you that in preference?
Madness!

Humus
10-17-2002, 01:56 PM
The GF2 only supports a 16-bit Z-buffer in 16-bit color mode. In 32-bit you'll get 24-bit Z. So if the app was developed on a GF2 and run in 32-bit mode asking for 16-bit Z, you'd get 24-bit Z, and you wouldn't see that the 16-bit Z you're requesting really isn't enough.

Korval
10-17-2002, 02:37 PM
Given all of this, let me see if I can't derive what nVidia's drivers may be doing, in pseudo-code.

non-GeForce4:

// Code to determine depth bits
if (requestedColorBits == 32 && requestedDepthBits >= 24)
{
    givenDepthBits = 24;
}
else if (requestedColorBits >= 16 && requestedDepthBits >= 16)
{
    givenDepthBits = 16;
}
else
{
    return CANNOT_USE_THIS_ICD;
}


Note that the requestedColorBits is, essentially, given priority over the depth.

On a GeForce4, it probably looks more sane, like this:

if (requestedDepthBits == 24)
{
    givenDepthBits = 24;
}
else
{
    givenDepthBits = 16;
}


A simple change from '==' to '>=' would fix the issue.


BTW, I'd like to take this opportunity to point out that it is precisely things like this (asking for driver fixes rather than app fixes) that caused nVidia to implement the GL_CLAMP/CLAMP_TO_EDGE bug in the first place. And it is precisely instances like this that keep nVidia from fixing that bug. As such, everyone involved in clamoring for nVidia to fix this rather than the Serious Sam people no longer has the right to give nVidia crap about that bug.

(btw, I'm not asking for it to be fixed. Rather, I'm just pointing out how simple it probably is. Therefore, I retain my rights ;) ).

[This message has been edited by Korval (edited 10-17-2002).]

Diapolo
10-17-2002, 03:38 PM
@Korval:

You are completely right.
I have always maintained that it seems to be a GF4-specific problem and that the NV driver guys should investigate it :D.
A button to force a 24-bit Z-buffer would be very nice, too ...

Diapolo

Relic
10-18-2002, 12:21 AM
Korval, that pseudo-code doesn't make much sense.
Drivers expose what the hardware can do. If newer hardware exposes more pixel formats, because it can offer 32-bit color with 16-bit depth, then that obviously gets a different pixel format ID than a format offering 32-bit color with 24-bit depth.
It's still the app's responsibility to choose the desired one from the list.

Nutty
10-18-2002, 07:16 AM
Well said Relic.

If something is done wrong by the app, which causes inconsistent results across different boards, that doesn't make it the driver's fault.

I'm always a bit wary of hard-forced check boxes. If you give users those, you remove the responsibility from app developers to get it right, meaning other card vendors have to implement them amid a torrent of complaints from users.

A good example of this is the good old "why do I get black lines on textures on my ATI xxx, and not my nvidia xxx??".

App developers should do stuff properly in the first place!

Nutty

mcraighead
10-18-2002, 07:16 AM
You (e.g. Korval and others) all seem to be assuming that we have control over the pixel format selection algorithm. We *don't*.

We expose a list of pixel formats, and Microsoft and/or the app pick one. We do not implement ChoosePixelFormat! In fact, the app may not even call ChoosePixelFormat.

If Serious Sam needs more bits of Z than it selects, that is a bug in Serious Sam. (Why do you assume that Serious Sam asks for 32 bits of Z, anyhow? I wouldn't be surprised if it asks for 16.)

If Microsoft thinks 16 bits is closer to 32 than 24, that is a bug in Microsoft's opengl32.dll.

- Matt

Diapolo
10-18-2002, 07:39 AM
If Serious Sam needs more bits of Z than it selects, that is a bug in Serious Sam. (Why do you assume that Serious Sam asks for 32 bits of Z, anyhow? I wouldn't be surprised if it asks for 16.)


Hey Matt,

I only wonder why it works on GF3 / ATI Radeon cards, but people see Z-fighting on GF4.
What's your guess for that observation?

I think developers should use ARB_pixel_format where it's available, because the ChoosePixelFormat path (the MS opengl32.dll) is quite a bit obsolete :D.

By the way, could NV integrate a button to force a 24-bit Z-buffer? What do you say?
I think that would be GREAT for users that experience Z-buffer problems with 16-bit Z-buffers :).

Diapolo

mcraighead
10-18-2002, 08:17 AM
I suspect that it "works" on the cards that don't support, or at least expose, 16-bit Z.

I doubt that we would add a registry key for forcing 24-bit Z; or at least I would be rather unexcited about adding one.

- Matt

Humus
10-18-2002, 01:26 PM
I just checked with my Radeon 8500. It turns out I always get a 24-bit Z-buffer even if I request 16-bit. There is an option in the control panel to force it to either 16-bit or 24-bit, though.

Diapolo
10-18-2002, 01:47 PM
Perhaps that's an ATI work-around for badly coded engines :)?

Diapolo

PH
10-18-2002, 01:48 PM
I'm always a bit wary of hard-forced check boxes. If you give users those, you remove the responsibility from app developers to get it right, meaning other card vendors have to implement them amid a torrent of complaints from users.


That's a very good point and something that has been bothering me as well. Take anisotropic filtering, for example. Forcing it on will produce some interesting artifacts for per-pixel specular. It's okay to use anisotropic filtering with the normal map for diffuse light, but it looks very bad with specular.

knackered
11-01-2002, 01:50 AM
I don't know where we're up to in this conversation, but I can absolutely confirm that if you SetPixelFormat with 32 bits for the colour buffer, 0 bits for the stencil and 32 bits for the Z-buffer, you get a 16-bit Z-buffer! If you request a 24-bit Z-buffer, then you get a 24-bit Z-buffer, which is correct... but surely the driver (or MS layer) should fall down from 32-bit to 24-bit, rather than to 16-bit, which is what it currently does.
BTW, this is only on the GeForce4 (it behaves logically on GF3 & below), so it must be a problem at NVidia's end; otherwise how do you explain it?

Diapolo
11-01-2002, 02:29 AM
I think the GF3 gets 24 bits of Z because it has a HW limitation.
The 16-bit Z formats on the GF3 in 32-bit color mode ALL have a multisample buffer, so that could / should be the reason why CPF chooses a 24-bit Z-buffer :D!

Diapolo

J_Kelley_at_OGP
11-01-2002, 04:28 AM
Hi everyone, I was just reading along and thought I might add my input. I think the answer to this problem has already been pointed out along the way here. If the information provided is correct, then we have the following situation:
Older cards (GF2/3) only give a 16-bit Z-buffer in 16-bit color mode, which means ChoosePixelFormat can only report 16-bit Z in that mode.

Newer cards (GF4) appear to be able to give 16-bit Z regardless of color depth.

Therefore, if SS is requesting a 16-bit Z but was always running in > 16-bit color mode, it would have gotten a bigger Z-buffer than it asked for, without consequence. NOW, however, it requests the 16-bit Z and gets the 16-bit Z, causing the loss of precision everyone seems to be experiencing. This looks like just another case of new hardware offering more flexibility, with unexpected results for older software.

As more than a beginner but less than a guru at OpenGL programming, I've found it very difficult to distill down all the information I need to remember about the various platforms and what works with them and what doesn't.

I'm unfortunately forced to program for S3 Savage4-level cards in my job, and I ran into similar problems with the Z-buffer on that platform. If I don't ask for a 24-bit Z-buffer with an 8-bit stencil, my window crashes; I could not get any other format to work with ChoosePixelFormat.

knackered
11-01-2002, 05:28 AM
No, the situation is this:-

Geforce3 & below:-

PIXELFORMATDESCRIPTOR....
Colourbits: 32
Zbits: 32
Stencilbits: 0

ChoosePixelFormat returns:
a 24 bit Zbuffer

Geforce4:-

PIXELFORMATDESCRIPTOR...
Colourbits: 32
Zbits: 32
Stencilbits: 0

ChoosePixelFormat returns:
a 16 bit Zbuffer!


I can't be any clearer than that.

Diapolo
11-01-2002, 07:06 AM
You are right, but read my last post.
I assume that if the GF3 didn't have that HW limitation, we would get a 16-bit Z-buffer there, too (dunno about GF2 or below).

Another thing you should know is that SS2 currently leaves the cDepthBits field of the PFD structure empty (= zero).
But that routine has been rewritten, and perhaps in a patch or the next engine release it will be more robust and straightforward!

But what is curious is why NV or MS decide to give 16 bits of Z (and not 24 bits) when a high-quality Z-buffer (32 bits) is requested.

Diapolo

zeckensack
11-01-2002, 07:14 AM
And before somebody asks why ATI cards are unaffected:

Current ATI drivers always give you 24/8 z/stencil (in conjunction with 32-bit color, I might add).

The hardware can do 16-bit Z, but you'll have to force it on in the driver panel.
That's why you don't see these issues on Radeons. It might, after all, indeed be MS's fault.

Won
11-01-2002, 07:18 AM
Moral: ChoosePixelFormat is weird.

Corollary: Check what ChoosePixelFormat gives you. Better yet, choose your own damn pixel format. It really isn't hard.

Observation: Matt is usually right about NVIDIA drivers, and it seems he's answered all your questions already. Direct your angst at Microsoft's unmaintained OpenGL code.

Here's a feature suggestion: instead of adding registry settings that globally control various OpenGL settings, why not do a "Quack"-like thing? Have a database (probably in the registry, actually) that looks up driver settings based on the executable file name. It would probably be pretty easy to do.
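Roughly this sort of thing on the driver side (purely an illustrative sketch assuming <windows.h> and <string.h>; the registry key path and the value meaning are made up):

// Sketch: look up per-application driver overrides keyed on the exe name.
char exePath[MAX_PATH];
GetModuleFileNameA(NULL, exePath, MAX_PATH);
const char *exeName = strrchr(exePath, '\\');
exeName = exeName ? exeName + 1 : exePath;

HKEY  key;
DWORD forceDepthBits = 0, size = sizeof(forceDepthBits);
if (RegOpenKeyExA(HKEY_LOCAL_MACHINE,
                  "SOFTWARE\\SomeVendor\\OpenGL\\AppProfiles",   // hypothetical key
                  0, KEY_READ, &key) == ERROR_SUCCESS)
{
    RegQueryValueExA(key, exeName, NULL, NULL, (LPBYTE)&forceDepthBits, &size);
    RegCloseKey(key);
}
// A non-zero forceDepthBits would then override whatever depth the app requested.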

-Won

Diapolo
11-01-2002, 07:19 AM
Have a look at the PFs my GF3 gives me in 32-bit color mode; you can easily see that all the 32-bit color formats with a 16-bit Z-buffer have a multisample buffer:
http://Phil.Kaufmann.bei.t-online.de/Diapolo.html

Diapolo