Color mixing

Hi

Does anyone know how the AWT mixes two colors with an alpha channel?

If I have two BufferedImages bi1 and bi2, both with an alpha channel, and I create a Graphics2D object on bi1 and draw bi2 onto it with drawImage(), how does Java mix the two colors?

If one of the images doesn’t have an alpha channel, it’s totally clear. But with both images having an alpha channel, there are different possible ways. Still, the way the AWT mixes the colors seems correct for all of my current use cases, and I would like to know how it works.

Marvin

If there is no composite set, I’m fairly sure it simply does:
out{a,r,g,b} = {a1*a2, r1*r2, g1*g2, b1*b2}

Thanks for the reply mate. :)

I am pretty sure that this cannot work. The alpha channel of bi1 should affect the RGB of bi1 only. If a1 is something like 0.1 (assuming smaller means more transparent) and r1 is 1.0, then r1 should only influence the resulting color with a weight of 0.1.

Well, I might be wrong, and I probably am. But it looks strange to me.

Marvin

The default is AlphaComposite.SRC_OVER

Dealing with color values between 0 and 1, wouldn’t it be something like:

bi1 before drawing bi2:
color = bi1.color * bi2.alpha;

after drawing bi2:
bi1 color influence = 1 - bi2.alpha
bi2 color influence = bi2.color * bi2.alpha
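
For what it’s worth, you can also just let Graphics2D do the blend on two translucent images and inspect the result. A minimal sketch (the class name and pixel values are made up for illustration):

import java.awt.AlphaComposite;
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class SrcOverProbe {
    public static void main(String[] args) {
        // Two 1x1 ARGB images; the colors and alphas are arbitrary test values.
        BufferedImage bi1 = new BufferedImage(1, 1, BufferedImage.TYPE_INT_ARGB);
        BufferedImage bi2 = new BufferedImage(1, 1, BufferedImage.TYPE_INT_ARGB);
        bi1.setRGB(0, 0, new Color(255, 0, 0, 128).getRGB()); // half-transparent red
        bi2.setRGB(0, 0, new Color(0, 0, 255, 64).getRGB());  // quarter-transparent blue

        Graphics2D g = bi1.createGraphics();
        g.setComposite(AlphaComposite.SrcOver); // this is the default anyway
        g.drawImage(bi2, 0, 0, null);
        g.dispose();

        // Print the blended ARGB pixel of the destination image.
        System.out.printf("blended ARGB = 0x%08X%n", bi1.getRGB(0, 0));
    }
}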

Edit: oops, I just posted after ryanm. Anyway, his is the right answer.

Thanks a million. This works almost perfectly. Only my fonts are messed up now; the rest looks perfect.

The strange thing is that the above math leads to unnormalized color components (greater than 1.0). I have clamped them to 1.0 now, but my fonts are still messed up.

I render my fonts by drawing them onto a separate BufferedImage and then merging the pixels onto the destination texture (there is a certain reason for this).

Marvin

Out of interest, what are you doing the color mixing for? Direct manipulation of a BufferedImage’s pixel data?

Well, the BufferedImage is actually only a special case. I have created my own image implementation that works on a ByteBuffer or byte array. For some cases where I need Graphics2D drawing, I create a BufferedImage backed by my bytes. For the ByteBuffer case a plain BufferedImage is horribly slow, so I needed my own drawing and color mixing routines that directly manipulate the bytes in the ByteBuffer or byte array. They worked great for years. Now I’ve done something new and discovered weaknesses in my mixing code.
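
For reference, wrapping an existing byte array in a BufferedImage looks roughly like this. Just a minimal sketch, not my actual class; the RGBA band order and the helper name are assumptions:

import java.awt.Point;
import java.awt.Transparency;
import java.awt.color.ColorSpace;
import java.awt.image.BufferedImage;
import java.awt.image.ComponentColorModel;
import java.awt.image.DataBuffer;
import java.awt.image.DataBufferByte;
import java.awt.image.Raster;
import java.awt.image.WritableRaster;

public final class ByteBackedImage {

    // Wraps an existing RGBA byte array (4 bytes per pixel) in a BufferedImage
    // without copying, so Graphics2D can draw directly into the array.
    public static BufferedImage wrap(byte[] rgba, int width, int height) {
        DataBufferByte buffer = new DataBufferByte(rgba, rgba.length);
        int[] bandOffsets = { 0, 1, 2, 3 }; // assumed R, G, B, A layout
        WritableRaster raster = Raster.createInterleavedRaster(
                buffer, width, height, width * 4, 4, bandOffsets, new Point(0, 0));
        ComponentColorModel colorModel = new ComponentColorModel(
                ColorSpace.getInstance(ColorSpace.CS_sRGB),
                true,   // has alpha
                false,  // alpha is not premultiplied
                Transparency.TRANSLUCENT,
                DataBuffer.TYPE_BYTE);
        return new BufferedImage(colorModel, raster, false, null);
    }
}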

Marvin

Interesting, is it for OpenGL-related stuff, which uses ByteBuffers? I thought OpenGL has similar methods/operations to Java2D, plus more.

Yes, originally it was for Xith3D (OpenGL). Currently I use my TextureImage2D class in another project that uses Direct3D. I only need the byte arrays there, so I don’t necessarily need my own drawing routines.

The thing is, I need to do some drawing offline and only send the result to the graphics card, because Direct3D has a very strange way of locking the texture buffer, which makes it extremely expensive. But I guess I’m not telling you anything new ;).

Marvin

I found the solution. The math in this link is either wrong or assumes some kind of premultiplication. The correct math is this:

Ar = As + Ad * (1 - As)
Cr = Cs * As + Cd * Ad * (1 - As)
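
Expressed as a per-pixel routine over RGBA values in the 0..1 range, that comes out to something like the sketch below. This is not my actual routine; the final divide by Ar is only needed if the destination stores straight (non-premultiplied) colors:

public final class SrcOverBlend {

    // src and dst are {r, g, b, a}, all components in 0..1, straight (non-premultiplied) colors.
    public static float[] blend(float[] src, float[] dst) {
        float as = src[3];
        float ad = dst[3];
        float ar = as + ad * (1f - as);                        // Ar = As + Ad * (1 - As)
        float[] out = new float[4];
        for (int i = 0; i < 3; i++) {
            float cr = src[i] * as + dst[i] * ad * (1f - as);  // Cr = Cs*As + Cd*Ad*(1 - As)
            // cr is premultiplied by ar; divide to get back a straight color.
            out[i] = (ar > 0f) ? cr / ar : 0f;
        }
        out[3] = ar;
        return out;
    }
}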

It looks perfect now.

Marvin

Cool, so you’re using Direct3D with Java? Is there a new binding for that? I didn’t know one existed.

I had a look around last night for a pure-Java software image API but couldn’t find anything that was meant for real-time use. The only one I’ve heard of is in pulpcore (http://www.interactivepulp.com/pulpcore/).

Well, I wrote my own little C++ DLL. Not a binding though, it just does what I need. I am working on an rFactor plugin called rfDynHUD to draw dynamic HUDs on top of rFactor, and therefore I need a little adapter DLL to invoke a JVM. Well, rFactor uses Direct3D, so I had no choice :).

Thank you so much man.

Marvin