Pixel buffer sizes

Does anyone know (or feel the inclination to test for me) the maximum pixel buffer sizes that can be allocated on various hardware? Using a GeForce Ti4600 I seem to max out at 2048x2048, pretty much regardless of the actual pixel buffer format.
I’m especially curious whether the ‘workstation class’ cards like the Wildcat series can handle higher-resolution pbuffers, and also the newer GeForce and Radeon cards.
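
For reference, this is roughly how I’m reading the advertised limits - a minimal sketch, assuming the WGL_ARB_pixel_format entry point has already been loaded with wglGetProcAddress and that wglext.h supplies the tokens and typedef (the function name is just for illustration):

```c
/* Sketch: query the driver's advertised pbuffer limits for a given pixel
   format. Assumes wglGetPixelFormatAttribivARB was obtained earlier via
   wglGetProcAddress and that wglext.h provides the tokens and typedef. */
#include <stdio.h>
#include <windows.h>
#include <GL/gl.h>
#include "wglext.h"

void print_pbuffer_limits(HDC hdc, int pixelFormat,
                          PFNWGLGETPIXELFORMATATTRIBIVARBPROC wglGetPixelFormatAttribivARB)
{
    const int attribs[3] = {
        WGL_MAX_PBUFFER_WIDTH_ARB,
        WGL_MAX_PBUFFER_HEIGHT_ARB,
        WGL_MAX_PBUFFER_PIXELS_ARB
    };
    int values[3] = { 0, 0, 0 };

    if (wglGetPixelFormatAttribivARB(hdc, pixelFormat, 0, 3, attribs, values))
        printf("max pbuffer: %d x %d (%d pixels total)\n",
               values[0], values[1], values[2]);
}
```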

TIA,
Wintal

Presumably, the pixel buffer size is, at the very least, limited by the maximum texture size. And if it isn’t limited by that, it’s limited by on-board memory. A 2048x2048x32bpp buffer takes up 16MB; a 4096x4096x32bpp buffer takes 64MB. That’s quite a bit to be asking for a buffer.

“Presumably, the pixel buffer size is, at the very least, limited by the maximum texture size. And if it isn’t limited by that, it’s limited by on-board memory. A 2048x2048x32bpp buffer takes up 16MB; a 4096x4096x32bpp buffer takes 64MB. That’s quite a bit to be asking for a buffer.”

It’s actually more than that, since you have a z-buffer too, but this card has 128MB. Taking out two 1024x768x32bpp buffers for my screen, there’s still plenty of RAM left. I’m not aware of texture size limits - what are they?

This is also why I’m curious about the professional cards - they have a lot more video ram.

Wintal

Sizes of pbuffers are not restricted to power-of-two values - so you can determine the limits of your graphics hardware much more precisely.
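
For example, something along these lines should find the real ceiling - a rough sketch that assumes the WGL_ARB_pbuffer entry points are already loaded, and that a size which fails once will keep failing (which ignores fragmentation and other transient failures):

```c
/* Sketch: since pbuffer sizes aren't limited to powers of two, the real
   limit can be found by binary-searching the largest square pbuffer the
   driver will actually create. Assumes these WGL_ARB_pbuffer entry points
   were loaded earlier with wglGetProcAddress. */
#include <windows.h>
#include <GL/gl.h>
#include "wglext.h"

extern PFNWGLCREATEPBUFFERARBPROC  wglCreatePbufferARB;
extern PFNWGLDESTROYPBUFFERARBPROC wglDestroyPbufferARB;

static int try_create(HDC hdc, int pixelFormat, int size)
{
    const int attribs[] = { 0 };                 /* no extra pbuffer attributes */
    HPBUFFERARB pb = wglCreatePbufferARB(hdc, pixelFormat, size, size, attribs);
    if (!pb)
        return 0;
    wglDestroyPbufferARB(pb);
    return 1;
}

int probe_max_square_pbuffer(HDC hdc, int pixelFormat)
{
    int lo = 1, hi = 8192;                       /* upper bound is a guess */
    while (lo < hi) {
        int mid = (lo + hi + 1) / 2;
        if (try_create(hdc, pixelFormat, mid))
            lo = mid;                            /* mid x mid worked, go bigger */
        else
            hi = mid - 1;                        /* failed, go smaller */
    }
    return lo;                                   /* largest size that succeeded */
}
```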

It’s actually more than that, since you have a z-buffer too, but this card has 128MB.

Um, you are aware, then, that the aforementioned sizes are now doubled? The 2048 one costs 32MB, and the 4096 one costs 128MB. You’re certainly not going to get a 4096x4096 on any consumer card available today.
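
To make the arithmetic explicit - a back-of-the-envelope sketch only, since a real driver may add padding or extra surfaces on top of this:

```c
/* Sketch: rough video-memory footprint of a pbuffer (color + depth/stencil),
   ignoring any padding or alignment the driver might add. */
#include <stddef.h>

size_t pbuffer_bytes(int width, int height, int colorBits, int depthStencilBits)
{
    return (size_t)width * height * (size_t)((colorBits + depthStencilBits) / 8);
}

/* 2048x2048 with 32-bit color and 24/8 depth/stencil: 2048*2048*8 bytes = 32 MB
   4096x4096 with 32-bit color and 24/8 depth/stencil: 4096*4096*8 bytes = 128 MB */
```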

I’m not aware of texture size limits - what are they?

GeForce3 and later can handle 4096x4096. ATi Radeons can handle 2048x2048.
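
You can also query the ceiling from the driver directly rather than relying on per-card numbers - a minimal sketch, assuming a GL context is current:

```c
/* Sketch: read the texture-size limit straight from the driver.
   Requires a current GL context. */
#include <stdio.h>
#include <windows.h>
#include <GL/gl.h>

void print_max_texture_size(void)
{
    GLint maxTexSize = 0;
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTexSize);
    printf("GL_MAX_TEXTURE_SIZE = %d\n", (int)maxTexSize);
}
```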

This is also why I’m curious about the professional cards - they have a lot more video ram.

True, but they will probably impose limits of their own.

I’d do what Flo said: find the precise limits yourself.

BTW, I have to ask: why do you need such large pixel-buffers? You realize that this size is coming out of your texture space, right? After all, if you had a 96MB pixel buffer, your card would act like it only had 32MB of video RAM in it. Because the pixel buffer is a render surface, it can’t be paged out to AGP memory, so it all has to be there at once.

Um, you are aware, then, that the aforementioned sizes are now doubled? The 2048 one costs 32MB, and the 4096 one costs 128MB. You’re certainly not going to get a 4096x4096 on any consumer card available today.

That depends on things like the depth of the z-buffer; I’ve experimented with it a fair bit. I’m not convinced the limit is based on memory usage - switching to 16bpp seems to make no difference, for example.

True, but they will probably impose limits of their own.

Yeah, but what are they? My problem is that I’m working entirely on guesswork. I would think that large maximum pbuffer sizes are the kind of thing professional workstation cards are likely to support, but I really don’t know.

I’d do what Flo said: find the precise limits yourself.

It’s a nice idea, but I’m having trouble justifying to management that they buy me a $3k graphics card just to find out whether I can allocate a larger buffer. Along the same lines, I’d rather not go the workstation-card route if, say, a GeForce FX or Radeon will do as good a job for me.

BTW, I have to ask: why do you need such large pixel-buffers?

My rendering technique requires compositing multiple full-window renders into a single output render. For reasons relating to state management, it’s almost impossible for me to use multiple pbuffers to render these views (or to composite them one at a time, etc.), so they all get rendered into a single pbuffer (with different viewports into it). With a maximum pbuffer size of 2048x2048, a screen res of 1600x1200, and 8 renders being composited, I’m losing a lot of resolution. Even with a modest res of 1024x768 I’m losing a third of my vertical res.
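
In outline it looks something like the sketch below - not my actual code, just an illustration of the tiling idea; draw_view() is a hypothetical stand-in for the app’s per-view rendering, and the pbuffer’s context is assumed to be current:

```c
/* Sketch: render N views as tiles of one large pbuffer by moving the
   viewport, then composite from that surface afterwards. */
#include <windows.h>
#include <GL/gl.h>

extern void draw_view(int viewIndex);   /* hypothetical per-view draw call */

void render_views_tiled(int numViews, int tileW, int tileH, int tilesPerRow)
{
    glEnable(GL_SCISSOR_TEST);
    for (int i = 0; i < numViews; ++i) {
        int x = (i % tilesPerRow) * tileW;
        int y = (i / tilesPerRow) * tileH;
        glViewport(x, y, tileW, tileH);
        glScissor(x, y, tileW, tileH);   /* keep the clear inside this tile */
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        draw_view(i);
    }
    glDisable(GL_SCISSOR_TEST);
}
```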

You realize that this size is coming out of your texture space, right? After all, if you had a 96MB pixel buffer, your card would act like it only had 32MB of video RAM in it. Because the pixel buffer is a render surface, it can’t be paged out to AGP memory, so it all has to be there at once.

Yes, I’m aware of all this. Through a nice quirk of fate my app uses a grand total of zero textures (unless you count the one I create with the pbuffer). I’ve found that a lot of scientific visualisation apps are actually very light on textures.

Wintal…

Originally posted by wintal:
My rendering technique requires compositing multiple full-window renders into a single output render. For reasons relating to state management, it’s almost impossible for me to use multiple pbuffers to render these views (or to composite them one at a time, etc.), so they all get rendered into a single pbuffer (with different viewports into it). With a maximum pbuffer size of 2048x2048, a screen res of 1600x1200, and 8 renders being composited, I’m losing a lot of resolution. Even with a modest res of 1024x768 I’m losing a third of my vertical res.

What kind of display device are you using, or what are you doing with the rendered data?

What kind of display device are you using, or what are you doing with the rendered data?

It’s an autostereoscopic 3D display.