blending
I am writing code for volume rendering. It texture-maps all the slices and blends them. It works fine on a GeForce4, but the code is not working on a GeForce2 or an FX 5200.
Does anyone know why? Is this because of the hardware difference?
// blend setting for transparency
glBlendColorEXT(1.f, 1.f, 1.f, 1.f/50);
Sorry, I didn't state what the problem is with the GeForce2 and FX 5200:
The blended image shows green colors in the middle where the brain is, and shows patterns within the image (little rectangles all over the blended image).
The final image should resemble an x-ray image.
[This message has been edited by Jay2 (edited 09-25-2003).]
09-26-2003, 04:32 AM
Could be an RGB565 and/or dithering artifact.
Make sure you're running in true color and not in an RGB565 high-color format, both for your screen resolution (display control panel!) and for your texture downloads (use GL_RGBA8 instead of GL_RGBA as the internal format in glTexImage!).
Thanks for the reply, Relic.
I've switched between three cards on the same computer and run the same code. The result, as I said, is that the GeForce2 and FX 5200 show the artifacts, but the GeForce4 shows a very nicely blended image. Why is that? A hardware difference?
The problem was the screen color depth, as you mentioned. When I changed the card, the setting switched to 16 bits. After changing it to 32-bit, it looks OK.
09-09-2009, 03:53 AM
Please, can you tell me which OpenGL extension contains the functions glBlendEquation and glBlendColorEXT? I am facing the same problem of achieving an x-ray effect. I used these lines of code some time ago while developing under Delphi with the DGL libs.
Now I am using C++ and GLEW, but I don't know the alternative for them in GLEW.
If you or anyone else can help me, please reply; your help will be appreciated.
My VGA card is a Quadro FX 4600.