[SOLVED] Alpha blending causing transparency through window (LWJGL)

I’m having a strange problem with alpha in LWJGL. I’m drawing 2D graphics with transparency (so no depth testing is involved). Wherever a translucent graphic is drawn onto the screen, it seems to override the alpha of the pixels underneath: if I draw a translucent graphic over a fully opaque background, the resulting pixels come out translucent despite the opaque background. This is ultimately causing my game window to have transparent holes that show through to whatever window is behind the game.

Here’s a partial screenshot. You can see the text from the IDE showing through spots on the screen where some translucent shadow graphics were drawn.
(screenshot hosted on Imgur)

Here’s the relevant code. initGl() is called once during setup. update() is called in a loop to draw the screen. drawTexture() is just a helper function.
http://pastebin.java-gaming.org/4b71c810c98

Side note: the Texture class referenced is not from Slick-Util; it’s a class I made myself, which is why drawTexture uses an image width and height of 1 instead of calling getWidth() and getHeight().
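In case the pastebin doesn’t load, here’s a rough sketch of the kind of setup being described. This is not the actual code from the post; the window size, the projection setup, and the Texture.bind() method are all assumptions on my part:

import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.GL11;

public class Game {

    // Called once during setup
    void initGl() throws LWJGLException {
        Display.setDisplayMode(new DisplayMode(800, 600)); // assumed size
        Display.create();

        // 2D orthographic projection in pixel coordinates, no depth testing
        GL11.glMatrixMode(GL11.GL_PROJECTION);
        GL11.glLoadIdentity();
        GL11.glOrtho(0, 800, 600, 0, -1, 1);
        GL11.glMatrixMode(GL11.GL_MODELVIEW);

        GL11.glEnable(GL11.GL_TEXTURE_2D);
        GL11.glEnable(GL11.GL_BLEND);
        GL11.glBlendFunc(GL11.GL_SRC_ALPHA, GL11.GL_ONE_MINUS_SRC_ALPHA);
    }

    // Helper: draws a texture as a quad; texture coordinates run 0..1,
    // the "image width and height of 1" mentioned in the side note
    void drawTexture(Texture tex, float x, float y, float w, float h) {
        tex.bind(); // assumed method on the custom Texture class
        GL11.glBegin(GL11.GL_QUADS);
        GL11.glTexCoord2f(0, 0); GL11.glVertex2f(x, y);
        GL11.glTexCoord2f(1, 0); GL11.glVertex2f(x + w, y);
        GL11.glTexCoord2f(1, 1); GL11.glVertex2f(x + w, y + h);
        GL11.glTexCoord2f(0, 1); GL11.glVertex2f(x, y + h);
        GL11.glEnd();
    }
}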

I’m not new to Java, but I am rather new to LWJGL and OpenGL in general. Any help is appreciated; I’ve been trying to figure this out for the past couple of days without any luck.

Uh, that’s an amazing effect! ;D
I have seriously never seen that before…

The only thing that comes to mind right now is to try setting the clear color:


GL11.glClearColor(1f, 1f, 1f, 1f);

This should set the clear color to opaque white. Maybe it uses 0, 0, 0, 0 as the clear color by default, which would be transparent black :)
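For context, the clear color only takes effect when the buffer is actually cleared, so the pairing would look something like this (a minimal sketch, assuming a standard clear at the top of the render loop):

GL11.glClearColor(1f, 1f, 1f, 1f); // once, during initialization
GL11.glClear(GL11.GL_COLOR_BUFFER_BIT); // every frame, before drawing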

I tried setting the clear color to fully opaque using glClearColor(), but unfortunately that had no effect on the problem. Thanks for the suggestion though :)

Here’s my understanding of the problem so far (and keep in mind I’m new to OpenGL):

The screen’s alpha channel is set to opaque when the fully opaque background is drawn (the first layer of textured quads). When a translucent pixel is drawn over that (the problem doesn’t occur with fully transparent pixels, only translucent ones), the blended alpha drops from fully opaque to translucent. The window manager (I’m running Gnome 3) then sees these translucent pixels in the final screen image and respects them, making the window transparent and showing the windows behind it.

I think the reason the opaque background is turning translucent has to do with how the alpha blending works. I am using GL_SRC_ALPHA for the source factor and GL_ONE_MINUS_SRC_ALPHA for the destination factor.

Referring to the OpenGL documentation at https://www.opengl.org/sdk/docs/man/html/glBlendFunc.xhtml, this means the resulting alpha after blending a translucent graphic onto an opaque one is (with all values as floats between 0 and 1):

resultAlpha = sourceAlpha * sourceAlpha + (1 - sourceAlpha) * destAlpha

By that math, blending a translucent graphic onto an opaque one can never produce an opaque result (see the worked example below). Either I’m misunderstanding the math behind this blend mode and the problem lies elsewhere, or this is indeed how the blending works, in which case the question becomes which blend mode I should use instead to avoid the problem.
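To make that concrete, here’s the arithmetic for a 50%-translucent pixel drawn over an opaque background (example numbers of my own, not taken from the code above):

sourceAlpha = 0.5, destAlpha = 1.0
resultAlpha = 0.5 * 0.5 + (1 - 0.5) * 1.0 = 0.25 + 0.5 = 0.75

So a once-opaque pixel ends up 75% opaque. In general the result is 1 - sourceAlpha * (1 - sourceAlpha), which is strictly less than 1 for any sourceAlpha strictly between 0 and 1. If changing the blend mode were the route taken, one candidate (just a sketch, not what I ended up doing) would be glBlendFuncSeparate from GL14, which lets the alpha channel blend with different factors than the color channels:

GL14.glBlendFuncSeparate(GL11.GL_SRC_ALPHA, GL11.GL_ONE_MINUS_SRC_ALPHA, GL11.GL_ONE, GL11.GL_ONE_MINUS_SRC_ALPHA);

With GL_ONE / GL_ONE_MINUS_SRC_ALPHA on the alpha channel, an opaque destination stays opaque: resultAlpha = sourceAlpha * 1 + destAlpha * (1 - sourceAlpha) = 1.0 when destAlpha is 1.0.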

Hopefully this more detailed information can spark some ideas of what’s going wrong or what I’m misunderstanding about the OpenGL system. Once again, any help is appreciated.

I managed to solve the problem ;D For those interested:

To test my hypothesis above, I ran the program on Windows, and the problem went away. This means the problem was coming from the display itself having an alpha channel, which is what allowed the holes. On Windows, an alpha channel on the window doesn’t seem to be honored, at least not by default; on Linux, or at least under Gnome 3, it is.

So when the display was created, it was given an alpha channel, and that channel was unwanted. Preventing it took a one-line change in the GL initialization:


Display.create(new PixelFormat(0, 0, 0)); // 0 alpha bits, 0 depth bits, 0 stencil bits

This just asks for a display with no alpha buffer (that’s the first argument to PixelFormat; the other two are depth and stencil bits, which are irrelevant to this problem). Without an alpha buffer, the display ignores the alpha value of the blending result, so the window stays opaque and the entire problem is avoided.
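For completeness, the display creation in context looks something like this (a minimal sketch; the window size is an assumption, and only the PixelFormat argument is the actual fix):

import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.PixelFormat;

Display.setDisplayMode(new DisplayMode(800, 600)); // assumed window size
// PixelFormat(alpha, depth, stencil): request 0 alpha bits so the window's
// framebuffer has no alpha channel for the compositor to honor
Display.create(new PixelFormat(0, 0, 0));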