mixing RGB colours

my first post!

does anyone know of a function to mix two RGB colours together in C++ and output them as a single RGB colour? And if this is possible, is it done by the GPU or the CPU?

Hi,

well, of course… but if you don’t know how to step through an array and perform a weighted (or otherwise) average of two numbers, then you probably want to read some introductory programming books first…

blending is entirely possible to do under OpenGL.

cheers
John

thanks for that, I’m happy it’s very possible. I’ve been doing a couple of tutorials and getting help from a friend with C++ for 2 years. Is there a specific function for mixing the colours when using OpenGL?

Hi,

yes… the blend function, glBlendFunc :wink:

http://nehe.gamedev.net/data/lessons/lesson.asp?lesson=08

http://www.essi.fr/~buffa/cours/synthese_image/DOCS/Tutoriaux/Nehe/lesson8.htm

and so on
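in short, it boils down to something like this sketch (assuming a current GL context and immediate-mode drawing; GL_SRC_ALPHA/GL_ONE_MINUS_SRC_ALPHA is just one common choice of blend factors):

  glEnable(GL_BLEND);                                 // turn blending on
  glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);  // weight by the incoming alpha

  // anything drawn from here on is mixed with what is already
  // in the framebuffer, e.g. red at 50% alpha:
  glColor4f(1.0f, 0.0f, 0.0f, 0.5f);
  // ... draw geometry ...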

cheers
John

Hi,

btw, here is a code fragment that mixes two colours together:

  
{
  const float bias = 0.5f;                  // how much of colour1 ends up in the result
  float colour1[3] = {1.0f, 0.0f, 0.0f};    // first colour, e.g. red
  float colour2[3] = {0.0f, 1.0f, 0.0f};    // second colour, e.g. green
  float result[3];

  // linear interpolation, one channel at a time
  for(unsigned int t=0; t<3; t++)
    result[t] = bias*colour1[t] + (1.0f-bias)*colour2[t];
  // ...
}

this fragment mixes two colours and stores the result in result[]. bias determines how much of the result comes from the first colour: hence, you can mix red and green together and get blends from red (bias=1.0) to green (bias=0.0), with yellow in between (bias=0.5).
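if it helps, here is the same idea as a self-contained program (mixColours is just a name I made up for this post):

  #include <cstdio>

  // weighted average of two RGB colours, one channel at a time
  void mixColours(const float a[3], const float b[3], float bias, float out[3])
  {
    for(unsigned int t=0; t<3; t++)
      out[t] = bias*a[t] + (1.0f-bias)*b[t];
  }

  int main()
  {
    const float red[3]   = {1.0f, 0.0f, 0.0f};
    const float green[3] = {0.0f, 1.0f, 0.0f};
    float result[3];

    mixColours(red, green, 0.5f, result);  // gives (0.5, 0.5, 0.0)
    std::printf("%.2f %.2f %.2f\n", result[0], result[1], result[2]);
    return 0;
  }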

good luck with your programming studies!

cheers
John

thank you so much, that helped me a lot, I’m nearly finished. Is the blending you use there calculated by the CPU or the GPU?

That’s calculated by the CPU (unless it’s run as a fragment program).
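For what it’s worth, on hardware with shader support the same mix can run on the GPU as a fragment shader. A rough GLSL sketch (the uniform names are made up for illustration):

  uniform vec3 colour1;
  uniform vec3 colour2;
  uniform float bias;

  void main()
  {
    // GLSL's built-in mix(x, y, a) returns x*(1.0-a) + y*a,
    // so this matches bias*colour1 + (1.0-bias)*colour2
    gl_FragColor = vec4(mix(colour2, colour1, bias), 1.0);
  }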

The only way to get the GPU to do the work is to use operations that are hardware-accelerated by the graphics card.

This falls under GPGPU - you can Google it.

Most OpenGL calls are mapped to the graphics hardware and executed by the GPU. If they are not supported by the underlying hardware, they fall back to software and suffer a pretty heavy blow to performance - this is undesirable.