Transition from PBuffers to FBOs breaks shaders [solved]

Just as the title implies, I’ve transitioned from PBuffers to FBOs in my engine.

I can draw to an FBO and render to a texture just fine. Drawing the texture is no problem either. However, I get problems when I try to feed the FBO's texture to a shader via a uniform variable, which the shader receives as a sampler2D. The result is blank pixels.

If I mess with the texture format of the FBO, I can get the shader to spit out pixels somewhat representative of what they should be. For example, if I draw a blank green screen to the FBO, a piece of the shader's output will be green. I can change the color and the end result will follow suit. This tells me that the texture format the shader expects is different from what's created by the FBO. Unfortunately, this is the point where things are over my head.

Anybody care to share their FBO implementation whose resulting texture can be correctly fed into a sampler2D?

FYI: Radeon 9800, Driver Date 3rd of October 2005, WinXPsp2

I have no such problems with FBO. If no errors are raised during FBO setup and rendering, you should be able to use the result as a normal texture.

What you’re describing sounds like bad texture coordinates used when rendering to the FBO or when using the texture. So, first make sure everything is fine wrt texcoords (e.g. bind another texture that you’re sure is ok). If that doesn’t work, try newer drivers, force an RGBA format (most supported) or try setting GL_NEAREST min/mag filters (GL_LINEAR is problematic sometimes with FBO and certain texture formats).
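For reference, here's roughly how a completeness check looks right after attaching everything (a minimal sketch, assuming LWJGL's EXTFramebufferObject bindings and that the FBO is still bound):

// Ask the driver whether the currently bound FBO can actually be rendered to
int status = EXTFramebufferObject.glCheckFramebufferStatusEXT(EXTFramebufferObject.GL_FRAMEBUFFER_EXT);
if (status != EXTFramebufferObject.GL_FRAMEBUFFER_COMPLETE_EXT) {
    // Any other status code means an attachment is inconsistent or the format is unsupported
    System.err.println("FBO incomplete, status: 0x" + Integer.toHexString(status));
}

If that reports complete and glGetError() stays clean, the FBO itself is almost certainly fine.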

Thanks for the suggestion. That's odd how GL_LINEAR can mess things up. I'll have to test things when I get home. Also, I've noticed that the EXT_framebuffer_object spec sets up the texture like this:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0, GL_RGB, GL_INT, NULL);

The difference under LWJGL is the final NULL parameter. LWJGL, of course, doesn't seem to like this, so I do the following:

ByteBuffer buffer = ByteBuffer.allocateDirect(width * height * 4);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

Could this perhaps be the issue?

Yeah, some GPUs don't like GL_LINEAR, especially with floating-point texture formats. Anyway, that's probably not your problem; it would result in an incomplete FBO status.

You actually can pass a null parameter. Try this:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, (ByteBuffer)null);
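With a null pointer the driver simply allocates the texture storage without uploading any pixel data, which is exactly what you want for a render target.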

Switching GL_LINEAR to GL_NEAREST has no effect. Updating drivers seems to cause more problems than it solves: the entire display slows down as a whole (even in Windows! Weird!).

I’ll just post my code to see if there’s anything wrong:

FBO Initialize

EXTFramebufferObject.glGenFramebuffersEXT(fbID);
EXTFramebufferObject.glGenRenderbuffersEXT(rbID);
// Bind the renderbuffer (rbID, not fbID) and give it depth storage
EXTFramebufferObject.glBindRenderbufferEXT(EXTFramebufferObject.GL_RENDERBUFFER_EXT, rbID.get(0));
EXTFramebufferObject.glRenderbufferStorageEXT(EXTFramebufferObject.GL_RENDERBUFFER_EXT, GL11.GL_DEPTH_COMPONENT, width, height);

GL11.glGenTextures(intBuffer);
textureID = intBuffer.get(0);
GL11.glBindTexture(GL11.GL_TEXTURE_2D, textureID);
GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA, width, height, 0, GL11.GL_RGBA, GL11.GL_UNSIGNED_BYTE, (ByteBuffer)null);
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_NEAREST);
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_NEAREST);

Enable FBO

EXTFramebufferObject.glBindFramebufferEXT(EXTFramebufferObject.GL_FRAMEBUFFER_EXT, fbID.get(0));
EXTFramebufferObject.glFramebufferTexture2DEXT(EXTFramebufferObject.GL_FRAMEBUFFER_EXT, EXTFramebufferObject.GL_COLOR_ATTACHMENT0_EXT, GL11.GL_TEXTURE_2D, textureID, 0 );
EXTFramebufferObject.glFramebufferRenderbufferEXT(EXTFramebufferObject.GL_FRAMEBUFFER_EXT, EXTFramebufferObject.GL_DEPTH_ATTACHMENT_EXT, EXTFramebufferObject.GL_RENDERBUFFER_EXT, rbID.get(0) );

Pseudocode of what I Do

enableFBO();
// Draw Some Random Scene
disableFBO();

enableShader(); // Set the shader's sampler2D to the FBO's texture.
// Draw Quad
disableShader();
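For reference, disableFBO() is essentially just unbinding the framebuffer, so rendering goes back to the window again. Roughly:

// Unbind the FBO; subsequent draws hit the default framebuffer again
EXTFramebufferObject.glBindFramebufferEXT(EXTFramebufferObject.GL_FRAMEBUFFER_EXT, 0);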

Like I said, if I skip the shader part and instead bind the FBO's texture via glBindTexture(), the quad is drawn fine with the random scene's result. Ugh.

The FBO code seems fine. Could you please post the shader code? The shader setup and draw code could help too.

Sure thing.

Shader Enable

ARBShaderObjects.glUseProgramObjectARB(programID);
GL13.glActiveTexture(GL13.GL_TEXTURE0);
GL11.glBindTexture(GL11.GL_TEXTURE_2D, Engine.fbo.getTextureID()); // Quick and dirty- fbo is static in the engine.
GL11.glEnable(GL11.GL_TEXTURE_2D);
ARBShaderObjects.glUniform1iARB(texture1, 0); // texture1 = getUniformLocation(programID, "tex"); 0 is the texture unit index, not the texture ID
GL13.glActiveTexture(GL13.GL_TEXTURE0);
GL11.glDisable(GL11.GL_TEXTURE_2D);

Fragment Shader


uniform sampler2D tex;

void main()
{
vec2 verts = gl_TexCoord[0].xy;
vec4 color = texture2D(tex, verts);
color += color;
gl_FragColor = color;
}

Vertex Shader


void main()
{
gl_TexCoord[0] = gl_MultiTexCoord0;
gl_Position = ftransform();
}

Drawing


fbo.enable();
    float width = 512.0f;
    float height = 512.0f;
    GL11.glBindTexture(GL11.GL_TEXTURE_2D, RANDOMTEXTUREID); // Texture ID of some picture
    GL11.glBegin(GL11.GL_QUADS);
        GL11.glVertex3f(-width/2, -height/2, 0.0f); GL11.glTexCoord2f(0.0f, 0.0f);
        GL11.glVertex3f(width/2, -height/2, 0.0f);  GL11.glTexCoord2f(1.0f, 0.0f);
        GL11.glVertex3f(width/2, height/2, 0.0f);   GL11.glTexCoord2f(1.0f, 1.0f);
        GL11.glVertex3f(-width/2, height/2, 0.0f);  GL11.glTexCoord2f(0.0f, 1.0f);
    GL11.glEnd();
fbo.disable();

// Note: Enabling the fbo texture here does nothing
shader.enable();
// Note: Enabling here does nothing either
    GL11.glBegin(GL11.GL_QUADS);
        GL11.glVertex3f(-width/2, -height/2, 0.0f); GL11.glTexCoord2f(0.0f, 0.0f);
        GL11.glVertex3f(width/2, -height/2, 0.0f);  GL11.glTexCoord2f(1.0f, 0.0f);
        GL11.glVertex3f(width/2, height/2, 0.0f);   GL11.glTexCoord2f(1.0f, 1.0f);
        GL11.glVertex3f(-width/2, height/2, 0.0f);  GL11.glTexCoord2f(0.0f, 1.0f);
    GL11.glEnd();
shader.disable();

I think that's everything. I simplified out parts that should (I hope!) have no effect on things (translates, rotates). Thanks for your help on this, by the way.

Using updated drivers now - same effect :confused:

Everything looks fine again. With the above drawing code, and if no GL errors are raised, you should be seeing the contents of the RANDOMTEXTUREID texture. If not, then you're doing something wrong with the shader and the problem has nothing to do with FBO. If you can see RANDOMTEXTUREID but not the same result when binding the FBO texture instead, then there's something wrong with the FBO indeed.

Other notes:

- You don't need to bind the texture when you're setting the sampler2D uniform (in the Shader Enable code).
- Have you tried binding any other texture? Does it get drawn with the shader enabled?
- Have you tried outputting a debug color in the shader (e.g. gl_FragColor = vec4(1.0, 0.0, 1.0, 1.0);)? Does it get drawn correctly?

Hehe, just noticed that you're calling glTexCoord2f after glVertex3f! In immediate mode, glVertex3f triggers the vertex submission to the pipeline, so each vertex gets the texcoord that was meant for the previous one. Basically, every vertex is using wrong texcoords!
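The fixed quad sets each texcoord before submitting the matching vertex, e.g.:

GL11.glBegin(GL11.GL_QUADS);
    GL11.glTexCoord2f(0.0f, 0.0f); GL11.glVertex3f(-width/2, -height/2, 0.0f);
    GL11.glTexCoord2f(1.0f, 0.0f); GL11.glVertex3f(width/2, -height/2, 0.0f);
    GL11.glTexCoord2f(1.0f, 1.0f); GL11.glVertex3f(width/2, height/2, 0.0f);
    GL11.glTexCoord2f(0.0f, 1.0f); GL11.glVertex3f(-width/2, height/2, 0.0f);
GL11.glEnd();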

Well, I solved the issue, but you’ll never guess what it was…

I was disabling GL_TEXTURE_2D in a line of code I overlooked. :o
All I had to do was get rid of that line… everything functions normally. Silly how I can be dealing with shaders and FBOs and all these other things… but the small stuff still catches me off guard.
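For anyone hitting the same thing later, the pattern that broke it boiled down to this (a sketch of the idea, not the exact line from my engine):

// Broken: GL_TEXTURE_2D ended up disabled before the shaded quad was drawn
GL11.glDisable(GL11.GL_TEXTURE_2D);

Removing that stray disable (i.e. leaving GL_TEXTURE_2D enabled while the shader samples the FBO texture) made everything work on this driver.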

Yep. I'm a genius. :-\ :slight_smile:

Side note: I still call glTexCoord2f after glVertex3f; it doesn't seem to mess things up.
Sorry to waste your time, Spasi, I appreciate your help. Good work on the FBO support too, this will really speed things up compared to PBuffers.