Crashes on GF3

I’m working on this little game engine:
http://hem.passagen.se/emiper/3d.html

There’s a demo based on this engine that I released about a week ago on the page above, which should work on Radeons and GF3/4. The problem, though, is that it crashes on GF3s on all drivers. Since I don’t have a GF3 I can’t really debug it.
The GF3 takes the same path as Radeon 7xxx series cards and should be able to run it without problems. However, this decently recent topic tells me there might be a problem with 3d textures in nVidia’s drivers. Since I’m heavily dependent on 3d textures I suppose this could be the problem, but I really don’t know.

Anyway, since nVidia didn’t reply to the mail I sent them about this problem, I’ve put up the source code here in case any helpful soul with a GF3 wants to help me. It would be cool if someone tried to run it in debug mode and found where the crash occurs.

Humus -

I haven’t noticed any problems with 3d textures on the geforce3, and I’ve used them quite a bit.

If someone doesn’t beat me to it, I’ll try your program tonight when I get home.

– Zeno

The crash occurs in RenderTexture::upload() because the wglChoosePixelFormatARB function pointer is NULL.
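
(For anyone else poking at the source: the usual guard against this kind of crash is to check that the entry point actually came back from wglGetProcAddress before calling through it. A rough sketch; the typedef comes from wglext.h and the early-out is only illustrative, not Humus's actual code:)

PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB =
    (PFNWGLCHOOSEPIXELFORMATARBPROC) wglGetProcAddress("wglChoosePixelFormatARB");

if (wglChoosePixelFormatARB == NULL){
    // WGL_ARB_pixel_format isn't exposed (or no GL context is current yet),
    // so skip the render-texture setup instead of calling a NULL pointer.
    return false;
}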

Ah!

That pretty much explains it all. I did some experimenting with rendering to texture in this project but did away with it, though I’m thinking of adding it back at a later time. Somehow I forgot to remove the code for creating the render texture when I took it out …

Thanks for helping me. I guess I’ll put out a new version of it tonight then.

@Humus:

Could you please post in this thread when the fixed version is online? I really would like to see what you’ve done.

By the way, the problems I had with 3D textures disappeared. I really don’t know why, but now I can watch the demos that didn’t work before.
I’m currently using the 27.30 drivers.

Diapolo


Humus,

I just tried your demo (I assume it’s updated since it doesn’t crash anymore), but now I’m only seeing a slowly flashing white/black screen with some white lines, and music.

PIII 933 - 256 MB - GeForce3 Ti 200

(don’t know if I can find some time to debug the demo)

Yeah, I updated it a few hours ago and posted it on the forums at beyond3d.com, hoping it would work perfectly now. Well, the crash is at least gone, but things aren’t working quite right. After studying various screenshots I think the problem is that the compressed textures don’t get uploaded correctly and probably end up inconsistent. I know from earlier experience that nVidia’s drivers are more sensitive to inconsistent textures: they will produce all white, while ATi’s drivers will work even if some mipmap levels are missing.

Anyway, I’ve added support for register combiners, and you may edit the .ini file to enable it. It looks like this is working (except for the missing textures), which I find quite amazing since I have no hardware to test on; I just wrote the code while carefully reading the spec.
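
(As an aside, for anyone curious what a minimal NV_register_combiners setup looks like, here is a rough sketch of one general combiner that modulates texture 0 with the primary color, pieced together from the spec; it is just an illustration, not the engine’s actual combiner code:)

glEnable(GL_REGISTER_COMBINERS_NV);
glCombinerParameteriNV(GL_NUM_GENERAL_COMBINERS_NV, 1);

// General combiner 0, RGB portion: spare0 = texture0 * primary color
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_A_NV, GL_TEXTURE0_ARB, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_B_NV, GL_PRIMARY_COLOR_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glCombinerOutputNV(GL_COMBINER0_NV, GL_RGB, GL_SPARE0_NV, GL_DISCARD_NV, GL_DISCARD_NV, GL_NONE, GL_NONE, GL_FALSE, GL_FALSE, GL_FALSE);

// Final combiner computes A*B + (1-A)*C + D, so A = spare0, B = 1, C = D = 0
glFinalCombinerInputNV(GL_VARIABLE_A_NV, GL_SPARE0_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glFinalCombinerInputNV(GL_VARIABLE_B_NV, GL_ZERO, GL_UNSIGNED_INVERT_NV, GL_RGB);
glFinalCombinerInputNV(GL_VARIABLE_C_NV, GL_ZERO, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glFinalCombinerInputNV(GL_VARIABLE_D_NV, GL_ZERO, GL_UNSIGNED_IDENTITY_NV, GL_RGB);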

I’ve just uploaded new sources too (same link as above) if anyone wants to help debug it. If my theory about compressed textures being the problem is correct, I suppose changing from .dds to any other image format would make it work. I guess the problem would then lie in the upload2DImage() function in TextureUtils.h, probably something with the last mipmap levels. I’m not all that sure how S3TC actually handles mipmap levels smaller than 4x4, but I suppose it takes a full 4x4 block anyway; at least it seems that’s how it works in the .dds file format, so that’s how I’m doing it. If it’s supposed to be handled another way, I guess that’s the problem.
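
(For what it’s worth, the per-level size I’m assuming for S3TC data, with levels smaller than 4x4 still taking a full block, works out as below; this is a sketch of my understanding rather than the exact upload2DImage() code, and blockSize is 8 bytes for DXT1 and 16 for DXT3/DXT5:)

// Bytes occupied by one DXT-compressed mipmap level: the data is stored as
// 4x4 blocks rounded up, so even a 1x1 or 2x2 level takes one full block.
unsigned int compressedMipSize(unsigned int w, unsigned int h, unsigned int blockSize){
    return ((w + 3) / 4) * ((h + 3) / 4) * blockSize;
}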

Originally posted by Humus:
I know from earlier experience that nVidia’s drivers are more sensitive to inconsistent textures: they will produce all white, while ATi’s drivers will work even if some mipmap levels are missing.

ATi is wrong. Running your engine with the software implementation (impossible, I know), the texture would also show white. Matt said that if all the mipmap levels are not present, the texture unit should act as if disabled. ATi should fix that.

This caused me some headaches with the vanilla Radeon. I had my texture filtering set to use mipmapping and I did not create any mipmaps.

The Radeon was happily showing the texture up close, but when the mipmap level changed, it would display trash. Very nice!!

The software implementation acts as if the texture unit was disabled and shows the last primary color.
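
(To spell out the rule for anyone who hits this: with a mipmapped min filter the texture is only complete if every level down to 1x1 is defined, so you either build the full chain or drop to a non-mipmapped filter. A minimal sketch, not taken from Humus’s engine; width, height and pixels are placeholders:)

// Option 1: no mipmaps, so the min filter must not sample mip levels.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

// Option 2: mipmapped filtering, so every level down to 1x1 must exist
// (gluBuild2DMipmaps generates and uploads the whole chain).
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGBA8, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);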

Yeah, I know. It makes it harder to debug when things appear to work even though they shouldn’t, because I actually have a bug.

Yes, it’s annoying. BTW, nice demo, even though I would prefer it with more colors!! (I have a GeForce3.) What about putting in an option to remove mipmaps as a workaround?


Humus,

After fiddling with your source, I have finally managed to compile and run it on my machine. I’ll have a look on Monday to see if I can fix this texture compression problem with GF3…

Regards.

Eric

P.S.: I noticed some memory leaks when exiting the program!

Originally posted by Gorg:
Yes, it’s annoying. BTW, nice demo, even though I would prefer it with more colors!! (I have a GeForce3.) What about putting in an option to remove mipmaps as a workaround?

Colors are coming. I think I’ve solved the bug now; I had the imageSize parameter wrong. But ATi’s drivers weren’t exactly helping me find that bug, since they ignore this parameter altogether.

This code,
glCompressedTexImage2DARB(target, level++, internalFormat, w, h, 0, 23 /*size*/, src);
works just as well as
glCompressedTexImage2DARB(target, level++, internalFormat, w, h, 0, size, src);

No errors generated …

Oh well, the error was that I calculated the size parameter before shifting width and height one bit right, so I got the size of the previous mipmap level.
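
(Roughly, the fixed ordering looks like this; a sketch rather than the literal upload2DImage() code, with blockSize again being 8 for DXT1 and 16 for DXT3/DXT5:)

int level = 0;
for (;;){
    // size must come from THIS level's dimensions, before w and h are shifted
    int size = ((w + 3) / 4) * ((h + 3) / 4) * blockSize;
    glCompressedTexImage2DARB(target, level++, internalFormat, w, h, 0, size, src);
    src += size;

    if (w == 1 && h == 1) break; // the 1x1 level was the last one
    if (w > 1) w >>= 1;
    if (h > 1) h >>= 1;
}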

Originally posted by Eric:
P.S.: I noticed some memory leaks when exiting the program!

How did you spot that, and do you have any info about where the leak is?

BoundsChecker?

Originally posted by Humus:
How did you spot that, and do you have any info about where the leak is?

BoundsChecker is a good tool.

Anyway, in VS6 you can set some flags and you will have memory leak information when debugging from VS (in debug configuration).
Add the following lines at the beginning of your code (in a debug build):
#include <crtdbg.h> // declares the CRT debug-heap flags used below

int flag = _CrtSetDbgFlag(_CRTDBG_REPORT_FLAG); // Get the current flag
flag |= _CRTDBG_LEAK_CHECK_DF; // Turn on the leak-checking bit
// flag |= _CRTDBG_CHECK_ALWAYS_DF; // Turn on _CrtCheckMemory
_CrtSetDbgFlag(flag); // Set the flag to the new value

(Notice I have _CRTDBG_CHECK_ALWAYS_DF commented out, as it slows down the code too much. But sometimes it is good to test with it enabled.)
You will find information about those flags in the VS help.

Cool, I never knew that feature existed! But it’s kinda hard to know where the leak is … the output isn’t showing too much useful info, and it’s hard to identify where that binary data comes from.
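
(One thing that may help, assuming it’s the CRT debug heap producing that dump: defining _CRTDBG_MAP_ALLOC before the CRT headers makes the leak report include the source file and line of each allocation made through malloc in that translation unit; allocations made with C++ new would still need a debug operator new to get the same info. A sketch:)

// Must appear before the CRT headers in each file whose allocations you want tagged.
#define _CRTDBG_MAP_ALLOC
#include <stdlib.h>
#include <crtdbg.h>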

I got it confirmed that it works on the GF3 now! It does look a lot brighter in the screenshot I saw though, to the point that the bumps got unbumped; not sure if that’s real or just the JPEG compression.

First off, it looks great! I have a GF3 (original).

Just as a note to others trying to run it on a GF3, though: it doesn’t work at all if anisotropic filtering is turned on in the nVidia control panel. It runs, and the music plays, but the graphics don’t even approach correctness. Setting aniso back to Disabled made everything work fine.

Finally colors!! Looks great.

It runs fine, unless you have forced anisotropic filtering in the registry. Good job!