Memory management?

What percentage of a video card's memory should be used for textures, in order to leave room for the depth buffer and vertex information? I mean, if you have an 8 MB video card, how many MB should be dedicated to texture use? Or if you have a 64 MB video card, what percentage of that should be used for textures?

I think it depends on your app. You have to compute the size of your back and front buffers plus the z buffer and maybe a stencil buffer. For example, if those 4 buffers each have 16 bits per pixel of precision at an 800x600 resolution, they use 3.84 MB, so an 8 MB video card leaves you about 4 MB for textures. Beyond those 4 MB, further textures go in the aperture, a region of system RAM allocated so textures can be downloaded quickly to the card via AGP. The aperture size can be configured.
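Just to make that arithmetic concrete, here is a rough sketch in C of the same calculation. The 800x600 / 16-bit / 4-buffer / 8 MB numbers are simply the assumptions from above, and MB is counted as 1,000,000 bytes here to match the 3.84 MB figure:

#include <stdio.h>

int main(void)
{
    const long width        = 800;
    const long height       = 600;
    const long bitsPerPixel = 16;                 /* precision of each buffer */
    const long bufferCount  = 4;                  /* front, back, z, stencil  */
    const long videoMemory  = 8L * 1024 * 1024;   /* 8 MB card                */

    long bytesPerBuffer = width * height * (bitsPerPixel / 8);
    long bufferBytes    = bytesPerBuffer * bufferCount;   /* 3,840,000 = 3.84 MB */
    long textureBytes   = videoMemory - bufferBytes;      /* roughly 4 MB left   */

    printf("buffers:  %ld bytes (%.2f MB)\n", bufferBytes,  bufferBytes  / 1e6);
    printf("textures: %ld bytes (%.2f MB)\n", textureBytes, textureBytes / 1e6);
    return 0;
}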

Ok then does this sound right?

If I have a 1024x768 window and I am using 32-bit buffers, I have 3 buffers in use: the front buffer, the back buffer, and the depth buffer. I would have (1024*768)*32*9;

Because I have 1024*768 = 786432 pixels, 32 bits per value, and 9 32-bit values for each pixel (2 sets of RGBA, with the front and back buffers taking 4 values apiece per pixel, plus 1 set of 32-bit float values for the depth buffer), making a grand total of
226,492,416 bits for the buffers altogether?

No, the format you describe is 32 bits per color component (red, green, blue, alpha), i.e. 128 bits per pixel.

I think when you talk about RGBA at 32 bits per pixel, you mean:
8 bits for red component
8 bits for green component
8 bits for blue component
8 bits for alpha component

and not 32 bits for the red component, etc.

so back buffer + front buffer =
(1024 * 768 * 32) * 2

plus the 32-bit z buffer:
(1024 * 768 * 32)

finally: 75,497,472 bits,
i.e. 9,437,184 bytes (9 MB)
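Here is a quick sketch of that corrected sum in C, assuming the 1024x768 window with 32-bit color (8 bits per RGBA component) and a 32-bit z buffer discussed above:

#include <stdio.h>

int main(void)
{
    const long pixels       = 1024L * 768;   /* 786432 pixels               */
    const long bitsPerPixel = 32;            /* RGBA8 color, or a 32-bit z  */

    long colorBits = pixels * bitsPerPixel * 2;   /* front + back buffers   */
    long depthBits = pixels * bitsPerPixel;       /* single 32-bit z buffer */
    long totalBits = colorBits + depthBits;

    printf("%ld bits = %ld bytes = %ld MB\n",
           totalBits, totalBits / 8, totalBits / 8 / (1024 * 1024));
    return 0;
}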

OK, one more quick question. How can I choose my depth buffer precision? Like if I wanted a 16-bit or a 32-bit value. Is there a way I can set this value, or is it just created without me being able to set it?

It depends. If you use OpenGL under Windows without GLUT, you can specify the z-buffer precision in the PIXELFORMATDESCRIPTOR structure: set the member cDepthBits to 16 or 32. With GLUT I don't know a way to change the precision, but to find out which precision is in use you can call:
GLint bitPrecision;
glGetIntegerv(GL_DEPTH_BITS, &bitPrecision);

The result is returned in bitPrecision.

You can set it in the PIXELFORMATDESCRIPTOR.
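For what it's worth, here is a minimal sketch of filling in a PIXELFORMATDESCRIPTOR with cDepthBits under Windows and then querying what precision OpenGL actually gave you. The function names and the hdc parameter are just illustrative; note that ChoosePixelFormat only returns the closest matching format, so the depth precision you get may differ from what you asked for, which is why the GL_DEPTH_BITS query is useful.

#include <stdio.h>
#include <windows.h>
#include <GL/gl.h>

/* Request a pixel format with a 32-bit depth buffer (hypothetical helper). */
void setupPixelFormat(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(PIXELFORMATDESCRIPTOR);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;   /* 8 bits each for red, green, blue, alpha  */
    pfd.cDepthBits = 32;   /* requested z-buffer precision (16 or 32)  */

    int format = ChoosePixelFormat(hdc, &pfd);  /* closest available match */
    SetPixelFormat(hdc, format, &pfd);
}

/* Once a GL context is current, report the depth precision actually in use. */
void printDepthBits(void)
{
    GLint bitPrecision;
    glGetIntegerv(GL_DEPTH_BITS, &bitPrecision);
    printf("depth buffer: %d bits\n", (int)bitPrecision);
}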