Tips for rendering directly to Image

Hi All,

I want to take a ‘snapshot’ of many small 3D scenes as ImageIcons to use on some buttons.
Any tips or ideas where to start? Can JOGL even do this? Or do I need to render to screen and then somehow screen-scrape the image?

Cheers

Peter

You can use a pbuffer or a renderbuffer and then copy the frame buffer to a BufferedImage. You can take a look at GLJPanel to get an idea of how to do it. It will be slow, so if you are going to do an animated button, make it small.

I’m using GLCanvas normally, so this is a bit involved for my level of understanding :slight_smile:

So can I use something like the following to create a pbuffer?

    GLCapabilities offscreenCaps = new GLCapabilities();
    offscreenCaps.setDoubleBuffered(false);
    offscreenCaps.setPbufferRenderToTexture(true);
    GLPbuffer pbuffer = GLDrawableFactory.getFactory().createGLPbuffer(offscreenCaps, null, 128, 128, shareWith);

Then I need a GLContext from somewhere for ‘shareWith’. Is there some way I can access the context being used by my other displayed GLCanvases? Currently I only pass the Animator down to this class, since originally I just had a few GLCanvases doing this job. I had a look at the demos, the user guide, and the source code for GLJPanel, but I’m really not sure what I should be doing here.

I think I need something like the following?

GLDrawable drawable = GLDrawableFactory.getFactory().getGLDrawable(this, capabilities, chooser);
GLContextImpl context = (GLContextImpl) drawable.createContext(null);

but then if I get this far, where do my init and display methods come into play? Am I going to end up rewriting GLJPanel?
And that’s before I try to somehow extract the image from the pbuffer…

any more pointers or tips would be really appreciated. I’m happy to work through this, but I’m a bit lost at the moment.

Cheers

Peter

It isn’t nearly this complicated. You don’t need a shareWith for your GLPbuffer; it behaves from the point of view of your GLEventListener exactly the same as a GLCanvas, only the rendering results aren’t displayed on screen. You just need to manually call display() on your GLPbuffer to get it to do its rendering. Once that is done (at the end of your GLEventListener.display() method) use glReadPixels as the GLJPanel does to read back the frame buffer into a BufferedImage. I would strongly recommend against playing with bits like setPbufferRenderToTexture as they are not portable.
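For the readback step described above, a minimal sketch of turning the raw bytes into a BufferedImage (the glReadPixels call itself is JOGL; this helper is plain JDK, and the class name `PixelCopy` is made up). It assumes the buffer was filled with `glReadPixels(0, 0, w, h, GL.GL_RGB, GL.GL_UNSIGNED_BYTE, buf)`, and it handles the vertical flip, since OpenGL’s origin is the bottom-left corner:

```java
import java.awt.image.BufferedImage;

public class PixelCopy {
    /**
     * Converts tightly packed RGB bytes (bottom row first, as OpenGL
     * returns them) into a BufferedImage, flipping vertically so the
     * image comes out the right way up.
     */
    public static BufferedImage toImage(byte[] rgb, int w, int h) {
        BufferedImage img = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < h; y++) {
            int srcRow = h - 1 - y;               // flip: GL row 0 is the bottom
            for (int x = 0; x < w; x++) {
                int i = (srcRow * w + x) * 3;
                int r = rgb[i]     & 0xFF;
                int g = rgb[i + 1] & 0xFF;
                int b = rgb[i + 2] & 0xFF;
                img.setRGB(x, y, (r << 16) | (g << 8) | b);
            }
        }
        return img;
    }
}
```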

Great! It seems to work fine, apart from having to flip the image at the end.

Thanks again.

Peter

BTW, the com.sun.opengl.util.Screenshot class can help do some of this work for you.

thanks.

One strange thing I noticed now. To test and debug this functionality I was using the same GLEventListener twice, once with a standard GLCanvas approach, and secondly with a GLPbuffer. This was working fine as I mentioned, until I removed the GLCanvas rendering; then my pbuffer only shows the glClear color.

    //without these two lines the pbuffer is just background
    canvas.addGLEventListener(offscreenCanvasListener );
    canvas.display();

    offscreenCanvas.addGLEventListener(offscreenCanvasListener );
    offscreenCanvas.display();

Cheers

Peter

Problem solved: reshape doesn’t get called when using a pbuffer, and that’s where I was calculating the new aspect ratio of the screen. Of course I can calculate this from the dimensions of the pbuffer instead, and then it works :slight_smile:
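Since reshape can’t be relied on here, the ratio can be computed in display() from the drawable’s own size (e.g. `drawable.getWidth()` / `drawable.getHeight()`). A trivial helper, with a guard against a zero height; the name `Aspect` is made up:

```java
public class Aspect {
    /**
     * Aspect ratio for gluPerspective, computed from the pbuffer's own
     * dimensions rather than relying on reshape() being called.
     */
    public static float of(int width, int height) {
        return height == 0 ? 1f : (float) width / height;
    }
}
```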

Cheers

Peter

I’ve noticed that now that I’m using this to render around 30 small (128x64) images, it seems quite slow (150ms per image), although I’m caching the rendered images so it’s only a problem on the first generation. But would you expect offscreen pbuffers to be slower than normal on-screen rendering?

Cheers

Peter

No, not really, depending on your graphics card. On NVidia and ATI hardware I’ve seen no problems with speed of pbuffers.

Tracked it down to my needless recreation and destruction of the pbuffers. I re-use everything now, and the first call is 200ms and the rest around 30ms each :slight_smile:
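The re-use pattern might look like this sketch: one long-lived pbuffer, with snapshots cached by scene key so each scene is rendered only once. `SnapshotCache` and the `sceneId` key are made-up names, and the render callback stands in for “call display() on the shared pbuffer and read back the pixels”:

```java
import java.awt.image.BufferedImage;
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class SnapshotCache {
    private final Map<String, BufferedImage> cache = new HashMap<>();
    private final Function<String, BufferedImage> render;

    /** render: renders one scene into the shared pbuffer and reads it back. */
    public SnapshotCache(Function<String, BufferedImage> render) {
        this.render = render;
    }

    /** Renders the scene on first request; returns the cached copy after. */
    public BufferedImage get(String sceneId) {
        return cache.computeIfAbsent(sceneId, render);
    }
}
```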

Peter