glGetIntegerv(GL_MAX_TEXTURE_UNITS,...) is lying

02-27-2004, 07:30 AM
Why is it that my call to glGetIntegerv(GL_MAX_TEXTURE_UNITS, ...) returns a value of 4, when my GeForce FX 5600 supports 16 textures per pass?

02-27-2004, 10:16 AM
It's telling you the truth. GL_MAX_TEXTURE_UNITS reports only the fixed-function texture units, and the GeForce FX line exposes just four of those. To use all 16 texture image units you have to go through a fragment program/shader, and the corresponding limit is queried with GL_MAX_TEXTURE_IMAGE_UNITS_ARB (from ARB_fragment_program) instead.
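A minimal sketch of querying both limits. This assumes an OpenGL context has already been created and made current (e.g. via GLUT or SDL), and that the driver exposes ARB_fragment_program, which defines the second enum; without a current context the results are undefined.

```c
#include <stdio.h>
#include <GL/gl.h>
#include <GL/glext.h>  /* defines GL_MAX_TEXTURE_IMAGE_UNITS_ARB */

/* Call only after a GL context is current. */
void print_texture_limits(void)
{
    GLint ffUnits  = 0;
    GLint imgUnits = 0;

    /* Fixed-function texture units: 4 on a GeForce FX. */
    glGetIntegerv(GL_MAX_TEXTURE_UNITS, &ffUnits);

    /* Texture image units usable from a fragment program: 16. */
    glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS_ARB, &imgUnits);

    printf("fixed-function units: %d, fragment-program units: %d\n",
           (int)ffUnits, (int)imgUnits);
}
```

On a GeForce FX this should print 4 for the first value and 16 for the second.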