
View Full Version : A way to blend in 16-bit/channel modes?



Andru
06-27-2003, 03:55 AM
Hello,

I've been pulling my hair out with all this testing... I've tried several modes and none of them seems to work. Here's the problem:

Running on a Radeon 9800 Pro, I need a pixel buffer format with 16 bits/channel and blending enabled.

The blending doesn't need to be implemented using alpha channels; even a two-channel (R16G16) format would be OK, though R16G16B16 would be preferable.

Is there a way?

Thanks for any input, I'm getting desperate here...

Andru

jwatte
06-27-2003, 06:51 AM
You have to render to a framebuffer, turn that framebuffer into a texture, and then render into a new (or re-used) framebuffer, using the texture as one input.

For "framebuffer" read "back buffer" or "pbuffer" or "back buffer of pbuffer" or "texture render target" or whatever you can whip up.

This means that individual triangles within the same geometry won't blend with each other.
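
In rough C, the round trip might look like this (a sketch only: the 16-bit pbuffer is assumed to have been created elsewhere via WGL_ARB_pixel_format/WGL_ARB_pbuffer, and the make_pbuffer_current and draw_* functions are placeholders, not real API):

    /* Pass 1: draw the first layer into the 16-bit pbuffer. */
    make_pbuffer_current();            /* placeholder for wglMakeCurrent(...) */
    draw_first_layer();                /* placeholder */

    /* Copy the result into a 16-bit fixed-point texture.
       GL_RGB16 is only a request; the driver may allocate less. */
    glBindTexture(GL_TEXTURE_2D, tex);
    glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16, 0, 0, width, height, 0);

    /* Pass 2: draw the second layer with that texture bound,
       doing the "blend" in the texture stage instead of glBlendFunc. */
    draw_second_layer_using(tex);      /* placeholder */

The point is that the combine happens during texturing, since glBlendFunc-style framebuffer blending is exactly what isn't available at 16 bits/channel here.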

Andru
06-27-2003, 09:52 PM
Originally posted by jwatte:
You have to render to a framebuffer, turn that framebuffer into a texture, and then render into a new (or re-used) framebuffer, using the texture as one input.

This means that individual triangles within the same geometry won't blend with each other.

So you mean I should use the first pass's framebuffer contents as one texture input and the other data as a second texture input, and combine them with glTexEnv (or something similar) earlier in the pipeline, instead of relying on the blending stage?

Thanks for bearing with me,

Andru

jwatte
06-28-2003, 03:07 PM
Correct.
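
For the glTexEnv route Andru mentions, the combine could be set up with ARB_texture_env_combine along these lines (a sketch; the texture IDs are placeholders, and this simply adds the two inputs, with results clamped to [0,1]):

    /* Unit 0: the texture copied from the first pass. */
    glActiveTextureARB(GL_TEXTURE0_ARB);
    glBindTexture(GL_TEXTURE_2D, first_pass_tex);   /* placeholder ID */
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

    /* Unit 1: the new data, added to unit 0's result. */
    glActiveTextureARB(GL_TEXTURE1_ARB);
    glBindTexture(GL_TEXTURE_2D, new_data_tex);     /* placeholder ID */
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
    glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_ADD);
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_PREVIOUS_ARB);
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB, GL_TEXTURE);

On a Radeon 9800 an ARB_fragment_program would allow more general blend math than this fixed-function add.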