Memory texture



locualo
04-27-2005, 09:32 AM
We are writing a program that uses textures, and when it runs, a white cube appears. We suspect the problem is the texture size, so we reduced it, but we don't know the maximum texture size the GPU supports. How can we find the GPU's maximum texture size? Thanks.

Aeluned
04-27-2005, 09:37 AM
This is most likely a simple error and not an advanced topic.

if you are using GL_TEXTURE_2D your texture must be a power of 2.

did you enable texture mapping via glEnable(GL_TEXTURE_2D)?

did you properly create your texture object and bind the texture prior to rendering your cube?

michagl
04-27-2005, 10:21 AM
i find the standard max texture size to be 512x512 for what it's worth.

Aeluned
04-27-2005, 10:39 AM
If you want to find the implementation-specific restriction on texture size for your platform, you can query it:

GLint maxSize;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxSize);

zed
04-27-2005, 04:41 PM
Originally posted by michagl:
i find the standard max texture size to be 512x512 for what it's worth.

u mean 2048x2048, see here:
http://www.delphi3d.net/hardware/listreports.php

michagl
04-27-2005, 05:43 PM
Originally posted by zed:

Originally posted by michagl:
i find the standard max texture size to be 512x512 for what it's worth.

u mean 2048x2048, see here:
http://www.delphi3d.net/hardware/listreports.php

this is awesome! i've asked for a database like this in these forums countless times... finally!

still, for my current card the database says my max texture size should be 4096x4096, but i swear, if i give it more than 512x512 i get a solid white texture.

i'm pretty sure the same goes even for a single-channel 8-bit texture map... what gives here?

if i could get my card to do 4096x4096 i would be in heaven!!!

PS: thanks Tom Nuydens for the database! (name looked familiar) -michael

Relic
04-28-2005, 04:26 AM
Could well be that you haven't set a hardware-accelerated pixel format and got Microsoft's software implementation? Query glGetString(GL_VENDOR).
Again, use the glGetIntegerv(GL_MAX_TEXTURE_SIZE) query to test it _inside_ your app.
Make sure your minification filter is set correctly. It defaults to mipmapping, and if you don't download mipmaps then the texture is inconsistent and the texture unit bound to it is switched off.
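
(a minimal sketch of those checks, assuming a current GL context and <stdio.h>:)

/* "Microsoft Corporation" here means you got the software fallback */
printf("GL_VENDOR: %s\n", (const char *)glGetString(GL_VENDOR));

GLint maxSize = 0;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxSize);
printf("GL_MAX_TEXTURE_SIZE: %d\n", maxSize);

/* the default GL_NEAREST_MIPMAP_LINEAR min filter needs a full mipmap chain;
   if you only upload level 0, switch to a non-mipmapped filter */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);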

michagl
04-28-2005, 08:26 AM
Originally posted by Relic:
Could well be that you haven't set a hardware-accelerated pixel format and got Microsoft's software implementation? Query glGetString(GL_VENDOR).
Again, use the glGetIntegerv(GL_MAX_TEXTURE_SIZE) query to test it _inside_ your app.
Make sure your minification filter is set correctly. It defaults to mipmapping, and if you don't download mipmaps then the texture is inconsistent and the texture unit bound to it is switched off.

i'm looking into this today. i'm using the standard nvidia drivers. i use a win32 api function i believe to retrieve the pixelformat.

it couldn't have anything to do with mipmapping could it? maybe if mipmapping is not used you are limited to 512x512. is that possible? just a thought i had last night. i've been using mipmapping for about a year, but i've had the card a lot longer than that and have just taken for granted that 512 is the max texture size.

so i will try the queries now i guess.

michagl
04-28-2005, 08:36 AM
i get max texture equals 4096 and vendor is NVIDIA Corporation.

i'm sure though that i can't upload a texture any larger than 512, but i will try again just now.

michagl
04-28-2005, 08:48 AM
i get solid white for a 1024x1024 texture.

if i can get better than 512 i have to get to the bottom of this!... please help me out here everyone.

i believe there is a general performance hit in using larger textures... probably time spent loading the texture into 'resident' memory, as i believe it's called.

still, for what it's worth, my card says it does 4096, but i can't get better than 512.

any ideas?

i'm not really targeting any systems at textures that large, but it would be useful to be able to use larger textures to hack a screenshot together on short notice.

evanGLizr
04-28-2005, 12:25 PM
Originally posted by michagl:
i get solid white for a 1024x1024 texture.

if i can get better than 512 i have to get to the bottom of this!... please help me out here everyone.

i believe there is a general performance hit in using larger textures... probably time spent loading the texture into 'resident' memory, as i believe it's called.

still, for what it's worth, my card says it does 4096, but i can't get better than 512.

any ideas?

Try a 4096x1 texture. The maximum texture size is just that, a static maximum (normally limited by the size of the internal calculations in the texture addressing unit of the graphics chip).
To actually create a texture you also need to take runtime considerations into account, like the amount of memory available: the driver may refuse to create a texture if it cannot fit it completely - including mipmaps and expansion to RGBA - in video memory.

Check glGetError after calling glTexImage and you will know what the problem is.
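
(a sketch of that check, assuming a current GL context and <stdio.h>; pixels is a placeholder. The GL_PROXY_TEXTURE_2D probe is a complementary way to ask up front whether the implementation can create a given texture at all:)

/* probe first: a proxy upload allocates nothing, but reports whether the size fits */
glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGBA, 4096, 4096, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
GLint probedWidth = 0;
glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &probedWidth);
if (probedWidth == 0)
    printf("implementation cannot create a 4096x4096 RGBA texture\n");

/* real upload, then check for errors such as GL_INVALID_VALUE or GL_OUT_OF_MEMORY */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 4096, 4096, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);
GLenum err = glGetError();
if (err != GL_NO_ERROR)
    printf("glTexImage2D failed: 0x%x\n", err);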

michagl
04-28-2005, 01:19 PM
if it says my max texture size is 4096, shouldn't this mean i should be able to manage a 4096x4096 RGBA texture? or does this mean i can do a 4096x1x8bits or what???

can i get a 100% confident opinion please?

if it doesn't mean this, then wouldn't a GL_MAX_TEXTURE_ELEMENTS_RGBA or something be more useful?

yooyo
04-28-2005, 01:46 PM
Yes... you can use a 4096x4096 RGBA texture, but it takes 64MB. If you have mipmaps it could be ~85.33MB.
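
(the arithmetic behind those numbers: 4096 x 4096 texels x 4 bytes (RGBA8) = 64MB for the base level; a full mipmap chain adds a geometric series of smaller levels totalling roughly one third more, so 64MB x 4/3 ≈ 85.33MB.)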

yooyo

michagl
04-28-2005, 04:05 PM
i'm just trying to get 1024x1024, that's only 4MB for RGBA.

i can get all the 512x512 maps i want, but i can't get a 1024x1024 even if it is more or less the only video memory i have allocated.

sqrt[-1]
04-28-2005, 08:56 PM
You must be doing something wrong. I have been using 1024x1024 textures for years. (GeForce 3/4 days?) I have also used 2048x1024 with no problems.

def
04-29-2005, 12:06 AM
A solid white texture should indicate a driver problem if the driver reports higher possible texture resolutions.
Framebuffer limitations should not be the problem either.
I am using 8 4096x4096 textures via multitexturing; with texture compression it runs smoothly, without compression it still runs, sloooowly, because textures are being pulled from main memory, but it still displays correctly.
(using GeForceFX5900)

I had the same effect (solid white texture) once while reaching the texture limit on my current card with some textures. After switching drivers everything was ok.

Relic
04-29-2005, 01:16 AM
michagl, do other people's programs show the same problem on your system?
If your reproducer is not complex, show the whole code.
Check glGetError after the glTexImage2D to see if it succeeded.
Newer drivers are always good advice.

More involved checks:
AGP aperture size in the system BIOS.
glGetString(GL_VERSION) must contain AGP on an AGP board. If you have PCI there your motherboard chipset drivers are not correctly installed. (huge performance loss and texture space limited to video mem).
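
(a sketch of those string queries, assuming a current GL context and <stdio.h>; per Relic's correction later in this thread, the AGP/PCI tag actually appears under GL_RENDERER:)

printf("GL_VENDOR  : %s\n", (const char *)glGetString(GL_VENDOR));
printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER)); /* e.g. ".../AGP/..." */
printf("GL_VERSION : %s\n", (const char *)glGetString(GL_VERSION));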

michagl
04-29-2005, 01:37 PM
thanks everyone, i'm very excited to learn that i can use larger textures... just have to figure out why i can't!

ummm, if someone has a quick demo binary on hand that uses larger textures i would like to give it a try.

i've always been limited to about 512 since my last $400 graphics card even... i don't think it's an agp issue, because i've had agp issues before and they are crippling. i get good performance, about as good as any game on the same hardware, just can't use larger textures.

i'm running a student site-licensed copy of win2k professional which the local university gives to students and staff free of charge. that couldn't be an issue, could it?

i can't really think of much else.

i can pop in on my bios and see if i have any agp options, but i seriously doubt that will go far.

on this machine i've changed out the motherboard/processor probably three times, graphics card 2 or 3 times, reinstalled the os countless times, and i've never seen a change in that fact... well, to be honest, i'd more or less taken it as a given, so it's not like i try to upload >512 on any real basis.

i've been running a secondary riva tnt pci card, but that shouldn't have any issues.

my drivers throughout the ages have generally been the sanctioned nvidia.com drivers, and the occasional detonator drivers or whatever.

i've never actually been able to build one of my massive projects in linux (horrible development environment)... and i've never tried throwing together a glx app to test the textures. i had just thought 512 was the real-time limit all these years. this is all quite shocking to me.

sincerely,

michael

michagl
04-29-2005, 01:47 PM
i have a few AGP options in the bios:

aperture size (same as my cards ram)

driving control??? (in auto mode)

fast write (was disabled, i enabled it)

master 1 ws write (disabled)
master 1 ws read (disabled)

cpu to agp post write (enabled)
delay transaction (enabled)

none of these have descriptions in the bios... i think i can see the manual for this board buried over there... i may look them up.

i don't expect any of these to fix my texture problem... but if anyone feels like giving me the skinny on the pros and cons of these options that would be cool.

some of the enabled options read like they are counterproductive. i may disable them just for the hell of it unless anyone cares to issue some warnings.

i generally don't monkey with this stuff unless the effect is obvious. i'm not really a glutton for performance, but restricted texture size is a functional constraint, so i would like to fix this if possible.

aren't 512x512 textures about the ideal size for state switching... you don't want them too big, right? especially if you swap them more often than not.

sincerely,

michael

def
04-30-2005, 05:04 AM
Try NeHe's lesson 6. The texture loading code is resolution independent (it just needs to be a power of two), so all you need to do is exchange the bmp or resize it to test different resolutions.

http://nehe.gamedev.net/data/lessons/lesson.asp?lesson=06

michagl
04-30-2005, 10:37 AM
Originally posted by def:
Try NeHe's lesson 6. The texture loading code is resolution independent (it just needs to be a power of two), so all you need to do is exchange the bmp or resize it to test different resolutions.

http://nehe.gamedev.net/data/lessons/lesson.asp?lesson=06

i can do a 4096x4096 texture with the demo even if i rebuild it!

any ideas why this won't work with a realistic application? i'm not uploading what i would think would be a lot of textures. just maybe 6~10 512x512 textures at any one time i would guess. if i try to upload a 1024x1024 texture at any point you can bet it will come up all white.

what could be causing this, given that i could run the nehe demo with 4096 textures even if i rebuilt it? as far as i can tell my context instantiation and texture upload process are identical to the nehe demo as well.

sincerely,

michael

michagl
04-30-2005, 05:06 PM
Originally posted by Relic:
michagl, do other people's programs show the same problem on your system?
If your reproducer is not complex, show the whole code.
Check glGetError after the glTexImage2D to see if it succeeded.
Newer drivers are always good advice.

More involved checks:
AGP aperture size in the system BIOS.
glGetString(GL_VERSION) must contain AGP on an AGP board. If you have PCI there your motherboard chipset drivers are not correctly installed. (huge performance loss and texture space limited to video mem).

for the record, glGetError is not flagged after the failed texture is uploaded, and glGetString(GL_VERSION) returns "1.5.2" ... no AGP, PCI or anything... i'm assuming everything must be running in full AGP 8x though. my machine runs other people's demos at their reported frame rates on similar hardware.

def
05-01-2005, 01:20 AM
Since you seem absolutely positive your OpenGL code is not the problem, check your texture loading code.
You could check your textures with glGetTexImage() to see if the problem occurs during rendering or before.
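
(a sketch of that readback, assuming a current GL context, the texture bound to GL_TEXTURE_2D, and <stdio.h>/<stdlib.h>/<string.h>; width, height and pixels are placeholders for the uploaded image:)

/* read the texture back out of GL and compare it against what was uploaded */
unsigned char *readback = malloc(width * height * 4);
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, readback);
if (memcmp(readback, pixels, width * height * 4) != 0)
    printf("texture data was lost or altered before rendering\n");
free(readback);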

michagl
05-01-2005, 12:37 PM
oiiii this one was a bugger...

i did a lot of testing and tracked the problem back to an old routine that checks whether or not a texture is a power of 2.

as it also happens, there was a constant i was not aware of, 'MAX_TEXTURE_SIZE', that it also checks against.

i can't remember where i originally picked up the beginnings of what would eventually become my texture management system some 4 years ago... but i've totally overhauled it in the meantime, and this little constant and this seemingly benign-looking function, which i presumed only tested for pow2, caused the whole thing. turns out the only time i actually used the function was right before uploading a texture... there was no assertion break because i had modified the texture system to replace its error handling, and in haste i had just commented out the old error handling in that bit, leaving only a return out of the function. i thought this was harmless and i would just get a white texture if somehow i had used a non power of two... but like i said, the function also rejected on the buried MAX_TEXTURE_SIZE constant... that being the only escape, i had never bothered tracing into the upload routine.

anyhow, as disappointed with myself as i am for still having this old textbook code in my system due to neglect... i'm also very excited to be able to use comparably massive textures! so please don't spoil my party with boo~hoos.

sincerely,

michael
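
(for anyone hitting the same wall, a sketch of the kind of silent validation failure michagl describes - the names and the 512 cap are hypothetical, reconstructed from his description:)

#define MAX_TEXTURE_SIZE 512   /* stale constant left over from old textbook code */

/* supposedly just a power-of-2 check, but it also rejects on the hidden size cap */
static int texture_size_ok(int w, int h)
{
    if ((w & (w - 1)) != 0 || (h & (h - 1)) != 0)
        return 0;   /* not a power of 2 */
    if (w > MAX_TEXTURE_SIZE || h > MAX_TEXTURE_SIZE)
        return 0;   /* the silent 512 limit */
    return 1;
}

static void upload_texture(int w, int h, const void *pixels)
{
    if (!texture_size_ok(w, h))
        return;     /* old error handling commented out: nothing uploaded, solid white */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}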

Relic
05-02-2005, 12:16 AM
Originally posted by michagl:
glGetString(GL_VERSION) returns "1.5.2" ... no AGP, PCI or anything...

Oops, my bad, it's under GL_RENDERER.

(The single-step debugger is your friend. ;) )