The HWShadowmapsSimple and ProceduralTexturePhysics demos in the jogl-demos workspace use pbuffers.
I was going to take a look at using pbuffers, or at least see what they are and what their advantages are, but I noticed that the Javadoc says it’s an experimental class that might be removed at any time and shouldn’t be used.
Any comments on whether I should use it, and what I can use it for?
It’s been in its current form in the JOGL workspace for a pretty long time and works on all of the supported platforms, so I would go ahead and use it. It’s useful for performing hardware-accelerated off-screen rendering. Hopefully soon pbuffers will be superseded by the GL_EXT_framebuffer_object extension, but until support for that extension becomes widespread there is still a lot you can do with pbuffers.
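To give you an idea, creating one looks roughly like this (a minimal sketch, assuming the net.java.games.jogl API that the demos use; the 512x512 size and the listener body are just placeholders, and you should check canCreateOffscreenDrawable() first since not every card/driver supports pbuffers):

    import net.java.games.jogl.*;

    public class PbufferSetup {
        private static final int PBUFFER_SIZE = 512; // placeholder size
        private GLPbuffer pbuffer;

        // Creates an off-screen pbuffer as a subordinate drawable of an existing GLCanvas.
        public void createPbuffer(GLCanvas canvas) {
            if (!canvas.canCreateOffscreenDrawable()) {
                throw new GLException("Pbuffers are not supported by this card/driver");
            }
            GLCapabilities caps = new GLCapabilities();
            caps.setDoubleBuffered(false); // an off-screen surface has no need to double buffer
            pbuffer = canvas.createOffscreenDrawable(caps, PBUFFER_SIZE, PBUFFER_SIZE);
            // The pbuffer gets its own GLEventListener; its display() callback is where
            // the off-screen rendering happens.
            pbuffer.addGLEventListener(new GLEventListener() {
                public void init(GLDrawable drawable) { /* one-time GL state setup */ }
                public void display(GLDrawable drawable) {
                    GL gl = drawable.getGL();
                    gl.glClear(GL.GL_COLOR_BUFFER_BIT | GL.GL_DEPTH_BUFFER_BIT);
                    // ... render the off-screen scene here ...
                }
                public void reshape(GLDrawable drawable, int x, int y, int w, int h) {}
                public void displayChanged(GLDrawable drawable, boolean modeChanged, boolean deviceChanged) {}
            });
        }
    }

You then trigger the off-screen pass by calling the pbuffer’s display() from your main display callback, which is how the demos drive theirs if I remember right.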
So in the situation where I have clearly defined layers of drawing (one for static lines/filled polygons, one for changing/moving filled polygons, and another for moving labels/text (texture fonts)), could I render each layer to a pbuffer, only redraw a layer when something on it changed, and then on each screen refresh (I’m using an FPS animator) merge the buffers together and flush the result to the screen?
Or is there a better way?
It probably depends on your application. I would be pretty surprised if you needed to go to those lengths with any modern card. If the card supports pbuffers, it is probably recent enough that it would be faster just to push all of the polygons down every frame. Unless you can enable render-to-texture functionality for pbuffers in your app (JOGL only supports this in its pbuffer implementation on Windows, IIRC) and thereby avoid a glReadPixels call per frame, the readback speed will probably hurt you. If you do want to try it, I’ve sketched below what the compositing pass would look like.
I recall that the Wurm Online guys (Markus_Persson for example) use pbuffers in their app but I’m not sure how and for what.
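To make the layered idea concrete, here is roughly what the compositing pass in your on-screen display() callback might look like if each layer were cached in its own texture. This is just a sketch: layerTextures is a hypothetical array of texture IDs that the per-layer pbuffer passes have already filled in, and the quad coordinates assume identity modelview/projection matrices.

    import net.java.games.jogl.*;

    // Composites pre-rendered layer textures in the on-screen canvas.
    class LayerCompositor implements GLEventListener {
        private final int[] layerTextures; // hypothetical texture IDs, one per layer

        LayerCompositor(int[] layerTextures) { this.layerTextures = layerTextures; }

        public void init(GLDrawable drawable) {}

        public void display(GLDrawable drawable) {
            GL gl = drawable.getGL();
            gl.glClear(GL.GL_COLOR_BUFFER_BIT);
            gl.glEnable(GL.GL_TEXTURE_2D);
            gl.glEnable(GL.GL_BLEND); // layers above the bottom one need an alpha channel
            gl.glBlendFunc(GL.GL_SRC_ALPHA, GL.GL_ONE_MINUS_SRC_ALPHA);
            // Draw each cached layer back-to-front as a screen-aligned quad.
            for (int i = 0; i < layerTextures.length; i++) {
                gl.glBindTexture(GL.GL_TEXTURE_2D, layerTextures[i]);
                gl.glBegin(GL.GL_QUADS);
                gl.glTexCoord2f(0, 0); gl.glVertex2f(-1, -1);
                gl.glTexCoord2f(1, 0); gl.glVertex2f( 1, -1);
                gl.glTexCoord2f(1, 1); gl.glVertex2f( 1,  1);
                gl.glTexCoord2f(0, 1); gl.glVertex2f(-1,  1);
                gl.glEnd();
            }
            gl.glDisable(GL.GL_BLEND);
            gl.glDisable(GL.GL_TEXTURE_2D);
        }

        public void reshape(GLDrawable drawable, int x, int y, int w, int h) {}
        public void displayChanged(GLDrawable drawable, boolean modeChanged, boolean deviceChanged) {}
    }

Whether that actually beats re-submitting the polygons every frame is exactly the question, so profile both.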
Oki dokie.
Thanks for the information.
I’ll leave things as they are for now, but I’ll have to do some performance analysis to see what’s actually dragging performance down.
Greg.
Is it possible to render 2 images to 2 textures using JOGL pbuffers, and then use the resulting textures in my main GLCanvas?
How would I pass along these textures? The Javadoc just says the pbuffer binds to its internal texture and, sadly, doesn’t say how I can access it afterwards. Or am I just reading it wrong?
Yes, this should be possible. You can create two subordinate pbuffers of a parent GLCanvas, assign each a texture ID generated in your main loop, and, while updating each of them, use glCopyTexImage2D to copy the rendered image into the appropriate texture. Later you can bind and use those texture IDs in your parent GLCanvas’s display callback (sketched below).
Render-to-texture functionality was never portable, and JOGL’s attempt at supporting it automatically through the GLCapabilities is very limited and, likewise, not portable. The new EXT_framebuffer_object extension is more powerful and general, but driver support for it is still limited; JOGL’s pbuffer support currently works on more platforms and cards.
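As a rough sketch of the glCopyTexImage2D approach just described (the class name and the RGB format are placeholders, and it assumes the pbuffers were created from the parent GLCanvas so the contexts share texture objects):

    import net.java.games.jogl.*;

    // Attached to one of the subordinate pbuffers: renders an image, then copies
    // the pbuffer's color buffer into a texture generated in the parent canvas.
    class TexturePassListener implements GLEventListener {
        private final int textureId; // generated with glGenTextures in the main loop
        private final int size;      // pbuffer width/height (square, for simplicity)

        TexturePassListener(int textureId, int size) {
            this.textureId = textureId;
            this.size = size;
        }

        public void init(GLDrawable drawable) {
            GL gl = drawable.getGL();
            // Set filters once so the texture is complete without mipmaps.
            gl.glBindTexture(GL.GL_TEXTURE_2D, textureId);
            gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MIN_FILTER, GL.GL_LINEAR);
            gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MAG_FILTER, GL.GL_LINEAR);
        }

        public void display(GLDrawable drawable) {
            GL gl = drawable.getGL();
            gl.glClear(GL.GL_COLOR_BUFFER_BIT | GL.GL_DEPTH_BUFFER_BIT);
            // ... render this pass's image into the pbuffer here ...

            // Copy the pbuffer's framebuffer into the shared texture object.
            gl.glBindTexture(GL.GL_TEXTURE_2D, textureId);
            gl.glCopyTexImage2D(GL.GL_TEXTURE_2D, 0, GL.GL_RGB, 0, 0, size, size, 0);
        }

        public void reshape(GLDrawable drawable, int x, int y, int w, int h) {}
        public void displayChanged(GLDrawable drawable, boolean modeChanged, boolean deviceChanged) {}
    }

In your main GLCanvas’s display() you then just glBindTexture those two IDs before drawing whatever geometry uses them.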
Thank you, kind sir.
Would the new extension be appearing in JOGL in the future?
The framebuffer extension is already available to you if you have an NVIDIA card and you are using the 75 or 76 series of drivers. I don’t think ATI has made that extension available in their public drivers yet.
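If you want to see whether your driver exposes it, a quick runtime check inside any GL callback should do it (a minimal sketch):

    GL gl = drawable.getGL();
    if (gl.isExtensionAvailable("GL_EXT_framebuffer_object")) {
        // Framebuffer objects are available; render-to-texture without pbuffers is an option.
    } else {
        // Fall back to pbuffers.
    }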