Do I have any way to specify where my textures are downloaded to? Right now I'm using gluBuild2DMipmaps, but I don't know where they're stored (system, AGP, or video memory). I really don't want to use AGP memory, since it's slower than video memory…
With OpenGL you cannot specify where your textures go. The driver will handle that for you.
Be sure to use glGenTextures and glBindTexture and you'll be fine. You might want to sort your geometry by texture (so that you make the fewest possible glBindTexture calls), but that depends entirely on what you're doing.
OpenGL’s texture management is completely opaque to the developer. On consumer-class PC hardware, it goes like this:
• You load textures into system memory
• OpenGL copies as many of them as possible to video memory
• When rendering, textures that aren’t in video memory yet are transferred over from system memory
• The least recently used textures are thrown out of video memory to make room for the new ones
I don't know how current drivers use AGP memory in all this, but I suspect it's used instead of system memory to keep the backup copies of the textures. I also don't know what happens to dynamically created textures (glCopyTexImage2D()). If anyone knows, please do tell.
I don’t think there’s anything you can do to change the way your drivers manage all this… Although you could always try storing textures in memory that you get with wglAllocateMemoryNV()
For NVIDIA hardware, in release10 drivers and beyond, glCopyTex{Sub}Image2D() only updates the on-card texture – if the formats match closely enough for a “fast copy”. If the texture ever has to be evicted, you pay for readback then. That way you may never have to pay for readback.
Originally posted by zed: you might want to try a hint with glPrioritizeTextures(…), but I don't think drivers take too much heed of it
glPrioritizeTextures(…) is the way to go.
Higher priority textures are more likely to remain in video memory
USE THIS
Why? Well, think about it. If you could control which textures went into video memory, you…
- would have to adjust your code to suit the fact that each card has a different amount of on-board memory
- would have to use an LRU paging algorithm or something similar to dump textures out of video memory
- would have to prioritize textures to make their pages stickier in the LRU algorithm
- would have to prioritize textures so they're sorted in order of which ones get video memory when it's available.
This is what glPrioritizeTextures(…) does.
Don't reinvent the wheel!
- OpenGL is a software abstraction of the hardware
- Directly controlling the hardware, i.e. knowing which texture is in which memory, goes against OpenGL's development philosophy
Basically:
- assign priorities to your textures
- spend your time finding other optimizations
Originally posted by cass: For NVIDIA hardware, in release10 drivers and beyond, glCopyTex{Sub}Image2D() only updates the on-card texture – if the formats match closely enough for a “fast copy”. If the texture ever has to be evicted, you pay for readback then. That way you may never have to pay for readback.
Are you implying that you read texture data back from the card when you need to evict a texture? That sounds slow. I had been assuming that you’d store a backup copy of the texture in system RAM (maybe even pageable RAM). That way, on machines with lots of memory (like mine) there is no “penalty” except locking the backup memory and re-uploading it.
Originally posted by jwatte:
[b] Are you implying that you read texture data back from the card when you need to evict a texture? That sounds slow. I had been assuming that you’d store a backup copy of the texture in system RAM (maybe even pageable RAM). That way, on machines with lots of memory (like mine) there is no “penalty” except locking the backup memory and re-uploading it.
For NVIDIA hardware, in release10 drivers and beyond, glCopyTex{Sub}Image2D() only updates the on-card texture – if the formats match closely enough for a “fast copy”. If the texture ever has to be evicted, you pay for readback then. That way you may never have to pay for readback.
[/b]
Hmm, do you really do that? There was a lengthy discussion on ogl-gamedev some time ago that concluded with "it's not possible", because blue screens like 'CD Missing' can clobber video memory without telling you. What do you do there?