Texture memory

We are writing a program that uses textures, and when it runs the cube appears solid white. It is possible that the problem is the size of the texture, so we reduced the texture size, but we don’t know how much texture memory there is. How can we obtain the size of the GPU’s texture memory? Thanks.

This is most likely a simple error and not an advanced topic.

if you are using GL_TEXTURE_2D your texture dimensions must be powers of two.

did you enable texture mapping via glEnable(GL_TEXTURE_2D)?

did you properly create your texture object and bind it prior to rendering your cube?

i find the standard max texture size to be 512x512, for what it’s worth.
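To make that checklist concrete, here is a minimal sketch of the setup it describes, assuming a valid GL context and a 256x256 RGBA image already loaded into a pixels pointer (both are placeholders):

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
/* GL_LINEAR avoids the mipmapping default, so no mipmap levels are required */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 256, 256, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
glEnable(GL_TEXTURE_2D);
/* bind tex again before drawing the cube, and supply texture coordinates per vertex */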

If you want to find the implementation-specific restriction on texture size for your platform you can call:

glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxSize);
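Note that glGetIntegerv takes a pointer for the result; a minimal sketch of the query (the function name is just for illustration):

#include <stdio.h>
#include <GL/gl.h>

/* requires a current GL context */
void print_max_texture_size(void)
{
    GLint maxSize = 0;
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxSize);
    printf("GL_MAX_TEXTURE_SIZE = %d\n", maxSize);
}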

Originally posted by michagl:
i find the standard max texture size to be 512x512, for what it’s worth.
you mean 2048x2048, see here:
http://www.delphi3d.net/hardware/listreports.php

Originally posted by zed:
Originally posted by michagl:
i find the standard max texture size to be 512x512, for what it’s worth.
you mean 2048x2048, see here:
http://www.delphi3d.net/hardware/listreports.php

this is awesome! i’ve asked for a database like this in these forums countless times… finally!

still, for my current card the database says my max texture size should be 4096x4096, but i swear, if i give it more than 512x512 i get a solid white texture.

i’m pretty sure the same goes even for a single-channel 8-bit texture map… what gives here?

if i could get my card to do 4096x4096 i would be in heaven!!!

PS: thanks Tom Nuydens for the database! (name looked familiar) -michael

Could well be that you haven’t set a hardware-accelerated pixel format and got Microsoft’s software implementation? Query glGetString(GL_VENDOR).
Again, use the glGetIntegerv(GL_MAX_TEXTURE_SIZE) query to test it inside your app.
Make sure your minification filter is set correctly. It defaults to mipmapping, and if you don’t download mipmaps the texture is incomplete and the texture unit bound to it is switched off.
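In practice that means either moving the minification filter off the mipmapping default or supplying a full mipmap chain. A minimal sketch of both options, assuming a bound GL_TEXTURE_2D and RGBA pixel data in a pixels pointer (placeholder names):

/* option 1: no mipmaps, so pick a non-mipmapping minification filter */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

/* option 2: keep the default GL_NEAREST_MIPMAP_LINEAR filter and build every mipmap level */
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGBA, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);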

Originally posted by Relic:
Could well be that you haven’t set a hardware-accelerated pixel format and got Microsoft’s software implementation? Query glGetString(GL_VENDOR).
Again, use the glGetIntegerv(GL_MAX_TEXTURE_SIZE) query to test it inside your app.
Make sure your minification filter is set correctly. It defaults to mipmapping, and if you don’t download mipmaps the texture is incomplete and the texture unit bound to it is switched off.

i’m looking into this today. i’m using the standard nvidia drivers. i believe i use a win32 api function to retrieve the pixel format.

it couldn’t have anything to do with mipmapping, could it? maybe if mipmapping is not used you are limited to 512x512. is that possible? just a thought i had last night. i’ve been using mipmapping for about a year, but i’ve had the card a lot longer than that and have just taken for granted that 512 is the max texture size.

so i will try the queries now i guess.

i get max texture equals 4096 and vendor is NVIDIA Corporation.

i’m sure though that i can’t upload a texture any larger than 512, but i will try again just now.

i get solid white for a 1024x1024 texture.

if i can get better than 512 i have to get to the bottom of this!.. please help me out here everyone.

i believe there is a general performance hit in using larger textures… probably time spent loading the texture into ‘resident’ memory i believe it is called.

still, for what it’s worth, my card says it does 4096, but i can’t get better than 512.

any ideas?

i’m not really targeting any systems at textures that large, but it would be useful to be able to use larger textures to hack a screenshot together on short notice.

Originally posted by michagl:
i get solid white for a 1024x1024 texture.

if i can get better than 512 i have to get to the bottom of this!.. please help me out here everyone.

i believe there is a general performance hit in using larger textures… probably time spent loading the texture into ‘resident’ memory i believe it is called.

still, for what it’s worth, my card says it does 4096, but i can’t get better than 512.

any ideas?

Try a 4096x1 texture. The maximum texture size is just that, a static maximum (normally limited by the size of the internal calculations of the texture addressing unit of the graphics chip).
In order to be able to create a texture, you also need to take into account runtime considerations like the amount of memory available, etc. (the driver may refuse to create a texture if it cannot fit it completely, including mipmaps and expansion to RGBA, in video memory).

Check glGetError after calling glTexImage and you will know what the problem is.
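A minimal sketch of that check (the 1024x1024 size and the pixels pointer are placeholders); a GL_OUT_OF_MEMORY or GL_INVALID_VALUE here points at the allocation rather than the rendering:

/* needs <stdio.h> for printf */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1024, 1024, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
{
    GLenum err = glGetError();
    if (err != GL_NO_ERROR)
        printf("glTexImage2D failed: 0x%04X\n", err);
}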

if it says my max texture size is 4096, shouldn’t this mean i should be able to manage a 4096x4096 RGBA texture? or does this mean i can do a 4096x1x8bits or what???

can i get a 100% confident opinion please?

if it doesn’t mean this, then wouldn’t a GL_MAX_TEXTURE_ELEMENTS_RGBA or something be more useful?

Yes… you can use a 4096x4096 RGBA texture, but it takes 64MB (4096 x 4096 x 4 bytes). With a full mipmap chain it is roughly a third more, about 85.33MB.

yooyo
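For reference, the arithmetic behind those numbers (the extra third for a full mipmap chain is the usual approximation):

/* RGBA8 texture: width * height * 4 bytes; mipmaps add roughly one third on top */
unsigned long base      = 4096UL * 4096UL * 4UL;  /* 67,108,864 bytes = 64 MB     */
unsigned long with_mips = base + base / 3UL;      /* ~89,478,485 bytes ~ 85.33 MB */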

i’m just trying to get 1024x1024, that’s only a few megabytes.

i can get all the 512x512 maps i want, but i can’t get a 1024x1024 even if it is more or less the only video memory i have allocated.

You must be doing something wrong. I have been using 1024x1024 textures for years (GeForce 3/4 days?). I have also used 2048x1024 with no problems…

A solid white texture should indicate a driver problem if the driver reports higher possible texture resolutions.
Framebuffer limitations should not be the problem either.
I am using 8 4096x4096 textures via multitexturing. With texture compression it runs smoothly; without compression it still runs, sloooowly, because the textures are being used from main memory, but it still displays correctly.
(using GeForceFX5900)

I had the same effect (solid white texture) once while reaching the texture size limit on my current card. After switching drivers everything was OK.
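If texture compression is the route you want to try, one option (assuming the driver exposes ARB_texture_compression, which GeForce-class hardware of that era does) is to request a generic compressed internal format and let the driver compress on upload; a sketch:

/* ask for a compressed internal format; the driver picks the actual scheme (e.g. S3TC) */
glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_ARB, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);

GLint compressed = GL_FALSE;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_COMPRESSED_ARB, &compressed);
/* compressed == GL_TRUE means the texture really was stored compressed */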

michagl, do other people’s programs show the same problem on your system?
If your reproducer is not complex, show the whole code.
Check glGetError after the glTexImage2D to see if it succeeded.
Newer drivers are always good advice.

More involved checks:
AGP aperture size in the system BIOS.
glGetString(GL_RENDERER) should contain AGP on an AGP board. If it says PCI there, your motherboard chipset drivers are not correctly installed (huge performance loss, and texture space limited to video memory).
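A quick way to check all of this from inside the app is to dump the implementation strings along with the texture limit; a minimal sketch (requires a current GL context):

#include <stdio.h>
#include <GL/gl.h>

void dump_gl_info(void)
{
    GLint maxTex = 0;
    printf("GL_VENDOR   = %s\n", (const char *)glGetString(GL_VENDOR));
    printf("GL_RENDERER = %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GL_VERSION  = %s\n", (const char *)glGetString(GL_VERSION));
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTex);
    printf("GL_MAX_TEXTURE_SIZE = %d\n", maxTex);
}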

thanks everyone, i’m very excited to learn that i can use larger textures… just have to figure out why i can’t!

ummm, if someone has a quick demo binary on hand that uses larger textures i would like to give it a try.

i’ve always been limited to about 512, even since my last $400 graphics card… i don’t think it’s an agp issue, because i’ve had agp issues before and they are crippling. i get good performance, about as good as any game on the same hardware, i just can’t use larger textures.

i’m running a student site-licensed copy of win2k professional which the local university gives to students and staff free of charge. that couldn’t be an issue, could it?

i can’t really think of much else.

i can pop in on my bios and see if i have any agp options, but i seriously doubt that will go far.

on this machine i’ve changed out the motherboard/processor probably three times, the graphics card 2 or 3 times, reinstalled the os countless times, and i’ve never seen a change in that fact… well, to be honest, i’d more or less taken it as a given, so it’s not like i try to upload >512 on any regular basis.

i’ve been running a secondary riva tnt pci card, but that shouldn’t have any issues.

my drivers throughout the ages have generally been the sanctioned nvidia.com drivers, and the occasional detonator drivers or whatever.

i’ve never actually been able to build one of my massive projects in linux (horrible development environment)… and i’ve never tried throwing together a glx app to test the textures. i had just thought 512 was the real-time limit all these years. this is all quite shocking to me.

sincerely,

michael

i have a few AGP options in the bios:

aperture size (same as my card’s ram)

driving control??? (in auto mode)

fast write (was disabled, i enabled it)

master 1 ws write (disabled)
master 1 ws read (disabled)

cpu to agp post write (enabled)
delay transaction (enabled)

none of these have descriptions in the bios… i think i can see the manual for this board buried over there… i may look them up.

i don’t expect any of these to fix my texture problem… but if anyone feels like giving me the skinny on the pros and cons of these options that would be cool.

some of the enabled options read like they are counter productive. i may disable them just for the hell of it unless anyone cares to issue some warnings.

i generally don’t monkey with this stuff unless the effect is obvious. i’m not really a glutton for performance, but restricted texture size is a functional constraint, so i would like to fix this if possible.

aren’t 512x512 textures about the ideal size for state switching… you don’t want them too big, right, especially if you swap them more often than not.

sincerely,

michael

Try NeHe’s lesson 6. The texture loading code is resolution independent (it just needs to be a power of two), so all you need to do is exchange the bmp or resize it to test different resolutions.

http://nehe.gamedev.net/data/lessons/lesson.asp?lesson=06
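If you would rather probe the limit directly than swap bitmaps, one standard technique (not used in the lesson, but plain OpenGL) is the proxy texture mechanism: specify GL_PROXY_TEXTURE_2D at increasing sizes and ask whether the driver would accept it. A sketch:

#include <stdio.h>
#include <GL/gl.h>

/* requires a current GL context; a returned width of 0 means the size was rejected */
void probe_texture_sizes(void)
{
    GLint size;
    for (size = 512; size <= 8192; size *= 2) {
        GLint width = 0;
        glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGBA, size, size, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);
        glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &width);
        printf("%dx%d RGBA: %s\n", size, size, width ? "accepted" : "rejected");
    }
}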