You can use GL_PROXY_TEXTURE_3D to see if it fits. It takes different settings into consideration, like screen resolution (I think). Did you try uploading using GL_UNSIGNED_BYTE just in case?
Thanks PH. I just now tried uploading as unsigned byte and got the same result (no image). I will try GL_PROXY_TEXTURE_3D, though I don’t think it “fitting” could be the problem…
Update: I just tried proxy texture and queried the depth value afterwards…it says 256, so I guess it thinks it worked? At any rate, it doesn’t work when I give it actual data.
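For reference, the proxy check I’m doing looks roughly like this (just a sketch, assuming a current GL context and a 1.2-level glTexImage3D entry point; the function name is mine):

```c
/* Sketch of a GL_PROXY_TEXTURE_3D capacity check. Assumes a current
 * GL context and that glTexImage3D is available (GL 1.2+). */
#include <GL/gl.h>

int volume_fits(GLsizei w, GLsizei h, GLsizei d, GLenum internalFormat)
{
    GLint gotWidth = 0;

    /* The proxy target uploads nothing; it only asks the driver whether
     * this exact size/format combination would succeed. */
    glTexImage3D(GL_PROXY_TEXTURE_3D, 0, internalFormat,
                 w, h, d, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    /* If the check fails, the proxy's state is set to 0, so reading any
     * dimension back tells us whether the texture would fit. */
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_3D, 0,
                             GL_TEXTURE_WIDTH, &gotWidth);
    return gotWidth != 0;
}
```

In my case `volume_fits(256, 256, 256, GL_RGBA8)` comes back non-zero, yet the real upload still shows nothing.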
– Zeno
[This message has been edited by Zeno (edited 08-20-2002).]
Originally posted by PH: Hmm, you could also try using GL_RGBA8 as the internal format. Does that change anything?
Hmm. GL_RGBA8 was no different, but your idea led me to try GL_RGBA4 for the internal format, which does work (I can see my dataset correctly). What is up with this? I need that precision and there’s no reason (memory-wise) that it shouldn’t work…
It almost seems like it doesn’t fit. The proxy mechanism is supposed to take the different texture parameters (filtering, etc.) into consideration and say ‘go’ if the operation would succeed (using glGet for the maximum texture sizes is not really safe).
I don’t know what NEAREST filtering is supposed to accomplish, but not enough memory (or there is enough but the card doesn’t want to use it) is probably bang on. Turn off mipmapping if you have that on.
512 x 512 x 512 doesn’t mean much if you don’t mention the bpp.
Proxy is the best test to see if it fails or passes. I guess GL doesn’t raise an error for unsupported texture dimensions. It would be better if it did instead of using proxy textures.
Perhaps this is heresy, but have you tried volume texture compression? If you’re doing medical imaging, the artefacts are likely to be too bad, but if this is for some snazzy volvis demo, it just might be the ticket. Cuts down on fetch rate, too!
512 x 512 x 512 doesn’t mean much if you don’t mention the bpp.
I mentioned that the particular volume that is failing is 256^3 at 32 bits per pixel on a 128 MB card. There should be plenty of memory for this since it’s the only texture I’m using right now.
Proxy is the best test to see if it fails or passes
Apparently not. Proxy returns the correct values for width, height, and depth (it’s supposed to set them all to 0 if the check fails, right?), yet my texture is not there when I use the whole volume unless I specify GL_RGBA4 as the internal format.
Perhaps this is heresy, but have you tried volume texture compression?
No, not yet, so I don’t know how bad the artifacts would be. I assume they’re as bad as regular texture compression, which is pretty ugly. At any rate, I want to stress again that this should not be necessary, as the card has TWICE the RAM of the dataset that I’d like to display.
Thanks for all the comments, guys. Hopefully I’ll get this figured out. I wish Matt or Cass would reply to comment on whether it could be a driver issue.
– Zeno
[This message has been edited by Zeno (edited 08-20-2002).]
Yeah, the problem is that reporting bugs to NVIDIA (at least in my experience) is best done here if you want a response. Also, it’s a big pain in the butt because I can’t just send them what I’m working on.
Here is some more info I got on the problem: 128x256x512 doesn’t work either…I haven’t tried anything larger. GL_NEAREST filtering didn’t help anything.
I tried downloading Klaus Engel’s volvis program from here http://wwwvis.informatik.uni-stuttgart.de/~engel/pre-integrated/ and loading a 256^3 dataset – it had almost the same results as mine (didn’t work), but I could swear that I played with all of his datasets back when 3d textures were first enabled for the Geforce3.
So I am really starting to believe it’s a driver bug. I’ll report it soon if we run out of ideas here.
Some new findings - I was able to use my program to load the 256^3 dataset at home on my Geforce3 (64 MB RAM) with 29.42 drivers. I installed those drivers at work, but still no luck. I guess it’s a Geforce4 problem now…