
View Full Version : large 3D texture does not generate GL_OUT_OF_MEMORY on NVidia cards



COZE
06-07-2005, 04:03 AM
Hello everybody.
I asked about this problem some time ago but have not solved it yet, so I am asking again: how do you handle it?
On NVidia cards (I have checked everything from the GeForce4 Ti, through the FX series, to the GeForce 6600 and 6800): if you create a large 3D texture (for example 512*512*256 at 12 bit on a card with 128 MB on board), the proxy texture says "OK, I can do it for you", and glTexImage3D does not generate GL_OUT_OF_MEMORY. I understand that this texture is too large to fit in memory, but how can I handle the problem?
One possibility, the hardest way, is to calculate the texture size and check the card's on-board memory and…
Sometimes GL_OUT_OF_MEMORY is generated later, for example by the next call to glProgramStringARB. I understand why, but it is awkward to wait and hope that some later command raises the error and…
On a 128 MB card a 12-bit 512^3 texture does return the error, but 512*512*256 does not.
Does anybody have an idea how to handle this?

Relic
06-07-2005, 07:08 AM
12 bits are not natively supported. Check the texture format table on developer.nvidia.com.
512*512*256 will be stored at 16 bit, so that is 128 MB. It does not all have to be resident in video memory; that is what AGP and PCI Express memory is for.
If you have a large enough AGP aperture set in the system BIOS, or a PCI Express system, the driver should handle this situation fine and be able to download and render the texture, as long as you do not exceed GL_MAX_3D_TEXTURE_SIZE.

Zengar
06-07-2005, 07:30 AM
First of all, I don't know of any 12-bit formats that are supported natively by NVidia; 12-bit formats will be converted to 8 bit.

And 512*512*256*1 byte is 64 MB, so where's the problem? Such a texture would fit without problems. If you use mipmaps, this will be about 83 MB.

Relic
06-07-2005, 09:56 PM
First of all, I don't know of any 12-bit formats that are supported natively by NVidia; 12-bit formats will be converted to 8 bit.
See http://developer.nvidia.com/object/nv_ogl_texture_formats.html
It depends on the graphics chip. GeForce FX and later will convert to 16 bit.

COZE
06-08-2005, 01:16 AM
Thanks for the replies.
I think the problem is not the internal texture representation; as long as it has the precision I need, that is all I care about. What I need is to catch the situation where I have a pbuffer, the normal frame buffer, a lookup texture and a large volume texture at the same time. I need to know whether it is possible to work with such a large texture, or whether I need to divide it into smaller pieces.
So what Relic writes is probably the problem. But my motherboard, a GA-8IPE1000, has no BIOS option for the AGP aperture. I am going to find some older computer with an AGP aperture setting and check whether that is the problem.

COZE
06-14-2005, 12:14 AM
No luck. I found an old computer with a configurable AGP aperture size, and it gives me the same trouble. In the fullscreen program I am using, at the same time: a 3D texture of 512*512*256*2 = 128 MB, one 1D or 2D lookup texture with a maximum size of 2^16 * RGBA8, and a 512^2 pbuffer, so it cannot all fit in memory. Or does the driver only check whether the texture alone fits in memory, and not care about the framebuffers and other stuff that must be resident at the same time?