[LWJGL] Use 2 images to create shadows

Hi!
I am trying to create a simple “baked” lighting model engine in LWJGL. I tried this idea in Game Maker and it worked perfectly, but I cannot seem to replicate it in Java.

The idea:
-Have your normal scene draw as it would
-Draw a second scene to an image (this scene would draw your baked model, but at the same camera position)
-Overlay the second scene’s image on top of the normal scene (with a blend mode that multiplies the colors)

What I have already done in Java:

What I want to have as a desired effect:

The first two pictures represent scene 1 and scene 2.

I need to figure out how to draw scene 2 on top of scene 1, and blend its colors in.
To do this, I would need to figure out how to save scene 2 to a surface (image) of some kind. But even then I still wouldn’t know how to overlay it and use blend modes.

Any ideas?

You could render the first part, disable depth testing, and then draw the lighting. You couldn’t stop there unless you wanted to see light through walls, though; that’s just where I would start if I were you.

It’s probably more efficient to do this through shaders, as using blend modes requires two draw passes.

Make a shader that multiplies the two texture colours together, and then you only need one render call. Just make sure you bind the two textures properly.
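Something along these lines (untested sketch, compatibility-profile GLSL; baseTextureId and lightTextureId stand in for whatever your two texture handles are):

String fragSrc =
    "uniform sampler2D baseTex;\n" +
    "uniform sampler2D lightTex;\n" +
    "void main() {\n" +
    "    vec4 base  = texture2D(baseTex,  gl_TexCoord[0].st);\n" +
    "    vec4 light = texture2D(lightTex, gl_TexCoord[0].st);\n" +
    "    gl_FragColor = base * light; // multiply the two colours\n" +
    "}\n";

int frag = GL20.glCreateShader(GL20.GL_FRAGMENT_SHADER);
GL20.glShaderSource(frag, fragSrc);
GL20.glCompileShader(frag);

int program = GL20.glCreateProgram();
GL20.glAttachShader(program, frag);
GL20.glLinkProgram(program);

GL20.glUseProgram(program);
GL20.glUniform1i(GL20.glGetUniformLocation(program, "baseTex"), 0);  // texture unit 0
GL20.glUniform1i(GL20.glGetUniformLocation(program, "lightTex"), 1); // texture unit 1

// bind one texture to each unit before drawing
GL13.glActiveTexture(GL13.GL_TEXTURE0);
GL11.glBindTexture(GL11.GL_TEXTURE_2D, baseTextureId);
GL13.glActiveTexture(GL13.GL_TEXTURE1);
GL11.glBindTexture(GL11.GL_TEXTURE_2D, lightTextureId);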

The only problem is that the two scenes are actually two different models as well as two different textures.

I could attempt to scrap the whole idea and do lights through shaders… but I have no idea where to start with them ;x

Why do you have two different scenes? You really should only have the base scene with no lighting, and then add in overlays with the green effects. If it’s baked in completely, like it’ll never change at all, just create overlays for each light; don’t overcomplicate it by rendering two scenes.

But you really should just use shaders

It looks like the models’ shapes are identical, so why couldn’t you put both textures onto one shape?

Because then I would severely lose texture quality. Both are UV mapped differently (one UV mapped to support the map’s normal texturing, and the other UV mapped to hold the baked texture).

Use the following blend mode:


GL11.glBlendFunc(GL11.GL_DST_COLOR, GL11.GL_ONE_MINUS_SRC_ALPHA); // multiplies the overlay by the colour already in the framebuffer

I hope it works. Haven’t checked it myself.
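The surrounding state would be roughly this (also untested; bakedSceneTexture and drawFullscreenQuad are placeholders for however you hold and draw the overlay):

// after rendering the normal scene, lay the baked scene over it
GL11.glDisable(GL11.GL_DEPTH_TEST);
GL11.glEnable(GL11.GL_BLEND);
GL11.glBlendFunc(GL11.GL_DST_COLOR, GL11.GL_ONE_MINUS_SRC_ALPHA);
GL11.glBindTexture(GL11.GL_TEXTURE_2D, bakedSceneTexture);
drawFullscreenQuad(); // a screen-sized textured quad, however you normally draw one
GL11.glDisable(GL11.GL_BLEND);
GL11.glEnable(GL11.GL_DEPTH_TEST);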

You need a VBO that contains vertex position and two sets of texture coordinates. Texcoord A would be the UV map of your diffuse. Texcoord B would be the UV map of your baked global illumination / occlusion / etc. Then in a shader you would blend them together as you see fit.
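A rough sketch of such an interleaved layout (vertexCount and the attribute locations 0/1/2 are placeholders; they have to match your model and your shader):

// 3 floats position, 2 floats UV set A (diffuse), 2 floats UV set B (baked) = 7 floats per vertex
FloatBuffer data = BufferUtils.createFloatBuffer(vertexCount * 7);
// ... fill data per vertex: x, y, z, uA, vA, uB, vB ...
data.flip();

int vbo = GL15.glGenBuffers();
GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vbo);
GL15.glBufferData(GL15.GL_ARRAY_BUFFER, data, GL15.GL_STATIC_DRAW);

int stride = 7 * 4; // bytes per vertex
GL20.glEnableVertexAttribArray(0); // position
GL20.glVertexAttribPointer(0, 3, GL11.GL_FLOAT, false, stride, 0);
GL20.glEnableVertexAttribArray(1); // texcoord A (diffuse UVs)
GL20.glVertexAttribPointer(1, 2, GL11.GL_FLOAT, false, stride, 3 * 4);
GL20.glEnableVertexAttribArray(2); // texcoord B (baked UVs)
GL20.glVertexAttribPointer(2, 2, GL11.GL_FLOAT, false, stride, 5 * 4);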

Using two texture units is explained here:

If you are not familiar with VBOs and vertex data, you have quite a bit of reading ahead of you. The easiest route would be to use LibGDX, which handles VBOs, shader compilation, vector math, etc.

In either case this doesn’t sound like an optimization. It sounds like your UV maps for the baked lighting will be pretty huge. You may as well just bake the lighting in your 3D program, and export a single set of textures with all the lighting baked in.

Or, another sane approach is to use something like SSAO or irradiance volumes:


http://codeflow.org/entries/2012/aug/25/webgl-deferred-irradiance-volumes/

To answer the OP’s initial question, google: FBO render to texture OpenGL
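The rough shape of what that search turns up, in LWJGL (untested sketch; width and height are whatever your display uses, and error/status checking is left out):

// colour texture that the second scene will be rendered into
int sceneTex = GL11.glGenTextures();
GL11.glBindTexture(GL11.GL_TEXTURE_2D, sceneTex);
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_LINEAR);
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_LINEAR);
GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA8, width, height, 0,
        GL11.GL_RGBA, GL11.GL_UNSIGNED_BYTE, (java.nio.ByteBuffer) null);

// depth renderbuffer so the second scene depth-tests correctly
int depthRb = GL30.glGenRenderbuffers();
GL30.glBindRenderbuffer(GL30.GL_RENDERBUFFER, depthRb);
GL30.glRenderbufferStorage(GL30.GL_RENDERBUFFER, GL14.GL_DEPTH_COMPONENT24, width, height);

// framebuffer tying the two together
int fbo = GL30.glGenFramebuffers();
GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, fbo);
GL30.glFramebufferTexture2D(GL30.GL_FRAMEBUFFER, GL30.GL_COLOR_ATTACHMENT0,
        GL11.GL_TEXTURE_2D, sceneTex, 0);
GL30.glFramebufferRenderbuffer(GL30.GL_FRAMEBUFFER, GL30.GL_DEPTH_ATTACHMENT,
        GL30.GL_RENDERBUFFER, depthRb);

// each frame: render scene 2 into the FBO...
GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, fbo);
// ... draw the baked scene here with the same camera ...
GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, 0);
// ...then sceneTex can be drawn over the normal scene with the multiply blend discussed above.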