Framebuffer viewport problem

I'm having issues with the current framebuffer. It should be really easy to solve, but I'm racking my brain over it.
Currently my implementation of binding / unbinding looks like this:


    public void Bind(){
        EXTFramebufferObject.glBindFramebufferEXT( EXTFramebufferObject.GL_FRAMEBUFFER_EXT, FBOId);
       
        glPushAttrib(GL_VIEWPORT_BIT);
        glPushAttrib(GL_COLOR_BUFFER_BIT);
        glViewport( 0, 0, viewportwidth, viewportheight);
        glMatrixMode(GL_PROJECTION);   
        glClearColor(0,0,0,0);
       
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    }
    
    public void Unbind(){         
        EXTFramebufferObject.glBindFramebufferEXT( EXTFramebufferObject.GL_FRAMEBUFFER_EXT, 0);
        glPopAttrib();
        glPopAttrib();
    }

However, the framebuffer does not scale to viewportwidth, viewportheight.
Instead, the output is oddly scaled, somewhere between the framebuffer dimensions and the normal screen dimensions.

I'm trying to render an image to a 512 x 512 buffer; the screen, however, is 1200 x 900.

So how do I set the framebuffer to 512 x 512 instead of having it locked at 1200 x 900?
Any help appreciated.

I know it has something to do with glOrtho, but when I change this, everything gets really messed up.

It’s possible things aren’t getting attached correctly.
I’m not too familiar with the way you’re binding the FBO (fixed pipeline?).

Aside from that, my process is similar to what’s being done here:
http://www.opengl-tutorial.org/intermediate-tutorials/tutorial-14-render-to-texture/

Basically you generate an FBO handle, set some parameters and attach some textures to it.
I don’t see where any textures are being attached, so maybe you’re doing it in a different place?
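For reference, here’s a rough sketch of that setup using the same EXT calls you’re already on (the size, format, and filtering below are just placeholders, not values from your code):


    // Hypothetical FBO size for illustration
    int width = 512, height = 512;

    // Generate and bind the FBO
    int fboId = EXTFramebufferObject.glGenFramebuffersEXT();
    EXTFramebufferObject.glBindFramebufferEXT(EXTFramebufferObject.GL_FRAMEBUFFER_EXT, fboId);

    // Create the texture that will receive the color output
    int texId = GL11.glGenTextures();
    GL11.glBindTexture(GL11.GL_TEXTURE_2D, texId);
    GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_LINEAR);
    GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_LINEAR);
    GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA8, width, height, 0,
            GL11.GL_RGBA, GL11.GL_UNSIGNED_BYTE, (java.nio.ByteBuffer) null);

    // Attach the texture to the FBO's color attachment point
    EXTFramebufferObject.glFramebufferTexture2DEXT(
            EXTFramebufferObject.GL_FRAMEBUFFER_EXT,
            EXTFramebufferObject.GL_COLOR_ATTACHMENT0_EXT,
            GL11.GL_TEXTURE_2D, texId, 0);

    // Unbind until you actually want to render into it
    EXTFramebufferObject.glBindFramebufferEXT(EXTFramebufferObject.GL_FRAMEBUFFER_EXT, 0);
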

At the very least, this should help. Check if the FBO is getting created properly.


if(glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE){
      System.out.println("Why Do you HATE ME!?!");
}

There are a couple of error codes it will spit out if things didn’t go according to plan. Unfortunately I’m stuck in my 9-to-5 prison, so I’ll dig those out of my project once my captors feel I’ve kludged enough code for today.

Firstly, off topic: in OpenGL, any constant ending in _BIT is a bitfield. This means they can be bitwise OR'd together, just like you already do with GL_COLOR_BUFFER_BIT and GL_DEPTH_BUFFER_BIT.

So


glPushAttrib(GL_VIEWPORT_BIT);
glPushAttrib(GL_COLOR_BUFFER_BIT);
...
glPopAttrib();
glPopAttrib();

can become


glPushAttrib(GL_VIEWPORT_BIT | GL_COLOR_BUFFER_BIT);
...
glPopAttrib();

Just a little pointer.

Now, on topic. You set the matrix mode to GL_PROJECTION, but you don’t actually load the identity matrix or anything, so whatever matrix was there before is still there. Try adding glLoadIdentity() after setting the matrix mode.
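Something like this inside Bind(), assuming what you actually want is a pixel-space ortho projection the size of the FBO (the glOrtho bounds are my guess, not something taken from your code):


    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    // Map 0..viewportwidth and 0..viewportheight onto the FBO
    glOrtho(0, viewportwidth, 0, viewportheight, -1, 1);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
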

Secondly, I wonder if you know what pushing GL_COLOR_BUFFER_BIT does? This page will clarify if you don’t: http://www.opengl.org/sdk/docs/man2/ (it just seemed an odd thing to be doing there).

Lastly, make sure that viewportwidth and viewportheight are actually the width and height of the texture you have bound to the framebuffer. The number of times problems like this turn out to be a single line left out of a constructor… and I’d expect glViewport would probably flag an error if you tried to use 0 for the width or height.
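If you want to verify that at runtime, you can ask GL for the size of the currently bound texture. A debug-only sketch, assuming the FBO texture is bound to GL_TEXTURE_2D at that point (and that I’m remembering the LWJGL overload correctly):


    java.nio.IntBuffer size = org.lwjgl.BufferUtils.createIntBuffer(16);
    GL11.glGetTexLevelParameter(GL11.GL_TEXTURE_2D, 0, GL11.GL_TEXTURE_WIDTH, size);
    int texWidth = size.get(0);
    GL11.glGetTexLevelParameter(GL11.GL_TEXTURE_2D, 0, GL11.GL_TEXTURE_HEIGHT, size);
    int texHeight = size.get(0);
    System.out.println("FBO texture is " + texWidth + "x" + texHeight
            + ", viewport is " + viewportwidth + "x" + viewportheight);
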

Post lastly, +1 to @Redocdam. Make sure you have checked for OpenGL errors (including framebuffer completeness).

Thank you for the responses.

GL_COLOR_BUFFER_BIT does seem strange indeed, but without it the default clear color gets replaced by the framebuffer's clear color.
This code is still a bit raw (don't touch what works, haha), but thanks for the suggestion; it looks a little better now.

glLoadIdentity() messes stuff up; I don't know how I can revert that state in Unbind().

As for the errors:


        int framebuffer = EXTFramebufferObject.glCheckFramebufferStatusEXT( EXTFramebufferObject.GL_FRAMEBUFFER_EXT ); 
        switch ( framebuffer ) {
                case EXTFramebufferObject.GL_FRAMEBUFFER_COMPLETE_EXT:
                        break;
                case EXTFramebufferObject.GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT_EXT:
                        throw new RuntimeException( "FrameBuffer: " + FBOId
                                        + ", has caused a GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT_EXT exception" );
                case EXTFramebufferObject.GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT_EXT:
                        throw new RuntimeException( "FrameBuffer: " + FBOId
                                        + ", has caused a GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT_EXT exception" );
                case EXTFramebufferObject.GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS_EXT:
                        throw new RuntimeException( "FrameBuffer: " + FBOId
                                        + ", has caused a GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS_EXT exception" );
                case EXTFramebufferObject.GL_FRAMEBUFFER_INCOMPLETE_DRAW_BUFFER_EXT:
                        throw new RuntimeException( "FrameBuffer: " + FBOId
                                        + ", has caused a GL_FRAMEBUFFER_INCOMPLETE_DRAW_BUFFER_EXT exception" );
                case EXTFramebufferObject.GL_FRAMEBUFFER_INCOMPLETE_FORMATS_EXT:
                        throw new RuntimeException( "FrameBuffer: " + FBOId
                                        + ", has caused a GL_FRAMEBUFFER_INCOMPLETE_FORMATS_EXT exception" );
                case EXTFramebufferObject.GL_FRAMEBUFFER_INCOMPLETE_READ_BUFFER_EXT:
                        throw new RuntimeException( "FrameBuffer: " + FBOId
                                        + ", has caused a GL_FRAMEBUFFER_INCOMPLETE_READ_BUFFER_EXT exception" );
                default:
                        throw new RuntimeException( "Unexpected reply from glCheckFramebufferStatusEXT: " + framebuffer );
        }

I think I need to leave the extension for what it is and aim for OpenGL 3.0.
I'll rewrite the framebuffer to a cleaner version (based on the example Redocdam linked).

I only need to find out how to do


    int depthRenBuff = EXTFramebufferObject.glGenRenderbuffersEXT();
    // Bind it so we can set it up
    EXTFramebufferObject.glBindRenderbufferEXT(EXTFramebufferObject.GL_RENDERBUFFER_EXT, depthRenBuff);

the right way, but I bet there is enough info about it out there.

Thanks for the help, guys :slight_smile:

I would never dissuade someone from rewriting something to be cleaner, but I have to warn you that the only difference with the core, as opposed to the extension, is that you call it with GL30.* rather than EXTFramebufferObject.*
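For example, the renderbuffer snippet you posted maps almost one-to-one onto the core calls. A sketch, assuming a 24-bit depth renderbuffer and that width/height are the size of your FBO texture:


    int depthRenBuff = GL30.glGenRenderbuffers();
    // Bind it so we can set it up
    GL30.glBindRenderbuffer(GL30.GL_RENDERBUFFER, depthRenBuff);
    // Allocate depth storage matching the FBO texture size
    GL30.glRenderbufferStorage(GL30.GL_RENDERBUFFER, GL14.GL_DEPTH_COMPONENT24, width, height);
    // Attach it as the FBO's depth attachment
    GL30.glFramebufferRenderbuffer(GL30.GL_FRAMEBUFFER, GL30.GL_DEPTH_ATTACHMENT,
            GL30.GL_RENDERBUFFER, depthRenBuff);
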

As for reverting the matrix,



// Bind framebuffer
glMatrixMode(GL_PROJECTION);
glPushMatrix(); // Works like glPushAttrib(), but for matrices
glLoadIdentity();

...

glPopMatrix();
// Unbind framebuffer