Using excessive texture

My understanding is that if I send more texture to OpenGL than will fit in texture memory, OpenGL is supposed to thrash the textures in and out of main memory automatically. Correct?

I have two counterexamples, and I’m wondering whether these are common driver bugs:

  1. On a machine with a GeForce2 MX and outdated OpenGL drivers, my program crashes when I send it a lot of texture;
  2. On all my Mac platforms, I get garbage textures displayed when I send them very large textures.

In both cases, I've ensured that I'm not sending textures larger than 2048 in either dimension (the reported maximum for each machine).

Of course, it’s possible that I have bugs, but I doubt it: the same code works when I send smaller textures down the pipeline.

So I guess my question is: historically, how robust have OpenGL drivers been at handling large amounts of texture?

Could you explain in more detail what happens in 1)?

Originally posted by jide:
Could you explain in more detail what happens in 1)?
We have two nearly identical machines with GeForce2 MX AGP cards in them. One has 64MB of video memory, and drivers from 2001 (5.13.01.1520). The other has 32MB of video memory, and drivers from 2003 (6.14.10.5216). The current drivers from NVIDIA for these cards are circa 2006.

On the 64MB machine with the older drivers, everything works great. On the 32MB machine with the newer drivers, I get a crash (a segfault on a write) inside glArrayElement when there is a lot of texture.

Interestingly, using a proxy texture to determine whether the textures will “fit” doesn’t help on these machines; they always report that the textures fit, presumably because the driver could bring them in over the AGP bus.
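For reference, the proxy check I mean looks roughly like this (a sketch; it assumes a current GL context, and as noted it only validates per-texture limits, not total texture memory):

```c
#include <GL/gl.h>

/* Ask the driver whether a w x h RGBA8 texture would be accepted.
   With GL_PROXY_TEXTURE_2D no data is uploaded; the driver records
   a width of 0 for the proxy if it would reject the real call. */
static int texture_fits(GLsizei w, GLsizei h)
{
    GLint got_width = 0;
    glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                             GL_TEXTURE_WIDTH, &got_width);
    return got_width != 0;
}
```

On these cards the proxy says yes to anything within GL_MAX_TEXTURE_SIZE, since any single texture can in principle be streamed over AGP; it tells you nothing about whether the aggregate will thrash or crash.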

(On Mac OS X, I’ve found a discussion which indicates that paging texture in is broken on that platform, which explains the results I’ve seen there.)

Are there any differences elsewhere, like the amount of RAM or the AGP aperture size? If not, it seems to be a driver bug, which most unfortunately will not be fixed. 5216 should work well for a GeForce2.

Note: did you ensure that the texture size fits what GL can support (GL_MAX_TEXTURE_SIZE)?

Originally posted by jide:
[b] Are there any differences elsewhere, like the amount of RAM or the AGP aperture size? If not, it seems to be a driver bug, which most unfortunately will not be fixed. 5216 should work well for a GeForce2.

Note: did you ensure that the texture size fits what GL can support (GL_MAX_TEXTURE_SIZE)? [/b]
I don’t know what an AGP aperture is, but the RAM is the same. I’m pretty convinced it’s a driver bug; I was mainly wondering how common these kinds of bugs are…

(Yes, we’re testing GL_MAX_TEXTURE_SIZE, and are, in fact, scaling the textures down from 4096^2 to 2048^2 on that card.)

Take a look at the BIOS; that’s where you set the AGP aperture size. It should be large enough (generally about twice the VRAM, or half the system RAM). But I can’t be sure this is what causes such an error.

Note (again): well, you say you’ve ensured your textures fit the maximum size GL supports. Do you use anything special like threading? If not, it’s really strange from my point of view. But maybe other people can tell you what the problem really is.