
View Full Version : Crashes on GF3



Humus
02-20-2002, 08:46 AM
I'm working on this little game engine:
http://hem.passagen.se/emiper/3d.html

There's a demo based on this engine that I released about a week ago on the page above; it should work on Radeons and GF3/4. The problem is that it crashes on GF3s with all drivers, and since I don't have a GF3 I can't really debug it. :(
The GF3 takes the same path as the Radeon 7xxx series cards and should be able to run it without problems. However, this fairly recent topic (http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/005497.html) tells me there might be a problem with 3d textures in nVidia's drivers. Since I'm heavily dependent on 3d textures, I suppose this could be the problem, but I really don't know.

Anyway, since nVidia didn't reply to the mail I sent them about this problem, I've put up the source code here (http://hem.passagen.se/emiper/temp/Src.rar) in case any helpful soul with a GF3 wants to help me. It would be cool if someone tried to run it in debug mode and found where the crash occurs.

Zeno
02-20-2002, 08:57 AM
Humus -

I haven't noticed any problems with 3d textures on the geforce3, and I've used them quite a bit.

If someone doesn't beat me to it, I'll try your program tonight when I get home.

-- Zeno

opla
02-20-2002, 09:45 AM
the crash occurs in RenderTexture::upload() because wglChoosePixelFormatARB is NULL

Humus
02-20-2002, 11:49 AM
Ah! :)

That pretty much explains it all. I did some experimenting with rendering to texture in this project but did away with it, though I'm thinking of putting it back in at a later time. Somehow I forgot to remove the code for creating the render texture when I took it out ...

Thanks for helping me. I guess I'll put out a new version tonight then :)

Diapolo
02-20-2002, 12:54 PM
@Humus:

Could you please post in this thread when the fixed version is online? I really would like to see what you've done :).

By the way, the problems I had with 3D textures have disappeared. I really don't know why, but now I can watch the demos that didn't work before.
I'm currently using the 27.30 drivers.


Diapolo

[This message has been edited by Diapolo (edited 02-20-2002).]

richardve
02-21-2002, 10:34 AM
Humus,

I just tried your demo (I assume it's updated, since it doesn't crash anymore), but now I'm only seeing a slowly flashing white/black screen with some white lines, and music.

P!!! 933 - 256 MB - GeForce3 Ti 200


(don't know if I can find some time to debug the demo)

Humus
02-21-2002, 11:45 AM
Yeah, I updated it a few hours ago and posted it on the forums at beyond3d.com, hoping it would work perfectly now. Well, the crash is at least gone, but things aren't working quite right. After studying various screenshots I think the problem is that the compressed textures don't get uploaded correctly, probably ending up inconsistent. I know from earlier experience that nVidia's drivers are more sensitive to inconsistent textures: they will produce all white, while ATi's drivers will work even if some mipmap levels are missing.

Anyway, I've added support for register combiners, and you may edit the .ini file to enable it. It looks like this is working (except for the missing textures), which I find quite amazing since I have no hardware to test on; I just wrote the code while carefully reading the spec. :)

I've just uploaded new sources too (same link as above) if anyone wants to help debug it. If my theory about compressed textures being the problem is right, I suppose changing from .dds to any other image format would make it work. I guess the problem would then lie in the upload2DImage() function in TextureUtils.h, probably something with the last mipmap levels. I'm not all that sure how s3tc actually handles mipmap levels smaller than 4x4, but I suppose it takes a full 4x4 block anyway; at least it seems that's how it works in the .dds file format, so that's how I'm doing it. If it's supposed to be handled another way, I guess that's the problem.

Gorg
02-21-2002, 05:36 PM
Originally posted by Humus:
I know from earlier experiences that nVidia's drivers are more sensitive to inconsistent textures, they will produce all white while ATi's drivers will work even if some mipmap levels are missing.

ATi is wrong. Running your engine with the software implementation (impossible, I know), the texture would also show white. Matt said that if all the mipmap levels are not present, the texture unit should act as if disabled. ATi should fix that.

This caused me some headaches with the vanilla Radeon. I had my texture filtering set to use mipmapping, but I hadn't created any mipmaps.

The Radeon was happily showing the texture up close, but when the mipmap level changed, it would display trash. Very nice!!

The software implementation acts as if the texture unit were disabled and shows the last primary color.

Humus
02-21-2002, 10:55 PM
Yeah, I know. It makes it harder to debug stuff when it's working even though it shouldn't because I have a bug.

Gorg
02-22-2002, 06:26 AM
Yes, it's annoying. BTW, nice demo, even though I would prefer it with more colors!! (I have a GeForce 3.) What about putting in an option to remove mipmaps as a workaround?

[This message has been edited by Gorg (edited 02-22-2002).]

Eric
02-22-2002, 07:37 AM
Humus,

After fiddling with your source, I have finally managed to compile and run it on my machine. I'll have a look on Monday to see if I can fix this texture compression problem with GF3...

Regards.

Eric

P.S.: noticed some memory leaks when exiting the program! ;)

Humus
02-22-2002, 08:02 AM
Originally posted by Gorg:
Yes it's annoying. BTW, nice demo even though I would prefer it with more colors!! (I have a geforce 3). What about putting an option to remove mipmaps as a workaround.


Colors are coming :) I think I've solved the bug now; I had the imageSize parameter wrong. But ATi's drivers weren't exactly helping me find that bug: they ignore this parameter altogether. :rolleyes:

This code,
glCompressedTexImage2DARB(target, level++, internalFormat, w, h, 0, 23/*size*/, src);
works just as well as
glCompressedTexImage2DARB(target, level++, internalFormat, w, h, 0, size, src);

No errors generated ...

Oh well, the error was that I calculated the size parameter before shifting the width and height one bit to the right, so I got the size of the previous mipmap level.

Humus
02-22-2002, 08:05 AM
Originally posted by Eric:
P.S.: noticed some memory leaks when exiting the program! ;)

How did you spot that, and do you have any info about where the leak is?

opla
02-22-2002, 08:11 AM
BoundsChecker?

Zak McKrakem
02-22-2002, 11:20 AM
Originally posted by Humus:
How did you spot that, and do you have any info about where the leak is?

BoundsChecker is a good tool.

Anyway, in VS6 you can set some flags and you will get memory leak information when debugging from VS (in the debug configuration).
Add the following lines at the beginning of your code:
int flag = _CrtSetDbgFlag(_CRTDBG_REPORT_FLAG); // Get current flag
flag |= _CRTDBG_LEAK_CHECK_DF; // Turn on leak-checking bit
// flag |= _CRTDBG_CHECK_ALWAYS_DF; // Turn on CrtCheckMemory
_CrtSetDbgFlag(flag); // Set flag to the new value

(Notice I have _CRTDBG_CHECK_ALWAYS_DF commented out, as it slows down the code too much. But sometimes it is good to test with it enabled.)
You will find information about those flags in the VS help.

Humus
02-22-2002, 12:02 PM
Cool, I never knew that feature existed! But it's kinda hard to know where the leak is ... the output isn't showing much useful info, and it's hard to identify where that binary data comes from :)

Humus
02-22-2002, 01:14 PM
I've got confirmation that it works on GF3 now! It does look a lot brighter on the screenshot I saw though, such that the bumps got unbumped; not sure if that's real or just the JPEG compression.

tcobbs
02-22-2002, 03:56 PM
First off, it looks great! I have a GF3 (original).

Just as a note to others trying to run it on a GF3, though: it doesn't work at all if anisotropic filtering is turned on in the nVidia control panel. It runs, and the music plays, but the graphics don't even approach correctness. Setting aniso back to Disabled made everything work fine, though.

Gorg
02-22-2002, 05:00 PM
Finally colors!! Looks great.

JackM
02-22-2002, 06:07 PM
It runs fine, unless you have forced anisotropic filtering in the registry. Good job!

FXO
02-22-2002, 06:45 PM
First off, Humus, your engine looks great!
How big are the 3D-textures you use?

The memory leak checking in MSVC sounds cool, but I can't get it to work.

MSVC can't find any of the functions. I have put the code Zak McKrakem posted in main.cpp.

Should I include some extra headers?

Thanks and good luck with the engine Humus!

richardve
02-22-2002, 10:43 PM
Holy mother of all cows!!

This is sweet! :)

In case you haven't seen them yet, I took two screenshots:

Anisotropic filtering disabled (http://www.richardve.f2s.com/11.png)
Anisotropic filtering enabled (2x) (http://www.richardve.f2s.com/12.png)

With anisotropic filtering disabled I'm getting between 30 and 130 FPS, but with anisotropic filtering enabled it's very slow (almost seconds per frame instead of frames per second).

Nutty
02-22-2002, 11:37 PM
WTF! That anisotropic-enabled screenshot looks well shagged!

richardve
02-23-2002, 02:38 AM
You should see it in motion :)

Anyway, I forgot to say that I'm using the 23.12 drivers.

I've heard that the 27.xx drivers are much better, so I'll try to run it with those in a few minutes.

Humus
02-23-2002, 03:48 AM
Originally posted by FXO:
First off, Humus, your engine looks great!
How big are the 3D-textures you use?

The memoryleak checking in MSVC sounds cool, but I cant get it to work.

MSVC cant find any of the functions.
I have put the code Zak McKrakem posted in main.cpp.

Should I include some extra headers?

Thanks and good luck with the engine Humus!

Thanks!
It still doesn't look as intended on GF3; the diffuse-lit object looks twice as bright as on my Radeon 8500. I'm not sure which driver does it the right way, but after some short testing I've found that my card looks much closer to what I'd expect, though I must look deeper into it. After changing RGB_SCALE I found that I got some texture flashing, and after looking at the screenshots from GF3 it looks like there is texture flashing there too (upper right and upper left corners of richardve's non-aniso screenshot). Could be something wrong with my code.
The anisotropic error must be a driver bug though.

The 3d textures are 64 x 64 x 64.

About the memory leaks: I just cut'n'pasted the code into mine and it just worked, no extra headers, though it could of course be defined in some header I had already included. I was able to track the memory leak down btw; I had forgotten to free a struct in my .png loading code.

[This message has been edited by Humus (edited 02-23-2002).]

richardve
02-23-2002, 03:53 AM
Well, 27.42 doesn't solve the problem.

Humus
02-23-2002, 12:00 PM
The extreme brightness problem on GF3 is solved now and the demo is updated. It should now run and look equally good on a GF3 and a Radeon 8500. I've added coronas too, with soft fading in and out of visibility.

richardve
02-23-2002, 12:19 PM
Yeah, looks way better now!

And it's a few frames faster too (~10)

Humus
02-24-2002, 05:10 AM
Yeah, but the performance increase is mostly because I made the light in the middle a little smaller (800 -> 700) :) It saves some fillrate, and looks a little better too.

ehart
02-25-2002, 06:55 AM
On the texture completeness problem, this is something we have fixed recently.

Separately, I wanted to address the issue of finding bugs. A couple of you seem to have known the texture completeness bug was there. In the future, I encourage you to report these issues. This particular bug was simply a cut-and-paste error that was quickly fixed. I posted the info on how to report issues in the "Correct VAO usage" thread.

- Evan

Gorg
02-25-2002, 09:48 AM
Originally posted by ehart:
On the texture completeness problem, this is something we have fixed recently.

I had this problem 8 months ago!

I believe I sent an app, but I could not find the email, so most likely I forgot :( :)

Humus
02-25-2002, 10:33 AM
I had that problem too, quite a long time ago. Or more correctly, I didn't have any problem, but others with nVidia cards did. At the time I didn't think of it as a driver bug, more as a "driver makes the best of the situation" thing, so I didn't report it, but I suppose I should have.

Otherwise, I report all issues I experience when I feel pretty sure it's the driver and not my own code. If I'm unsure, I usually post a question on this forum instead. During this project I've reported several bugs and provided a number of apps illustrating the problems in question.