Anti-aliasing in JOGL

I was looking closely at some of my rendering and even though I think I have enabled all the right things, I am not seeing any anti-aliasing of my filled polygons. I have

	gl.glEnable(GL.GL_LINE_SMOOTH);
	gl.glEnable(GL.GL_POLYGON_SMOOTH);
	gl.glHint(GL.GL_POLYGON_SMOOTH_HINT, GL.GL_NICEST);

But this does nothing. Does one have to descend down in multi-sample pixel twiddling and using all those proprietary extensions? I am spelunking around and doing more reading, but thought I would post this question as well.

TIA

Those commands enable edge anti-aliasing, which IIRC requires blending to be set up before it has any visible effect. You also need something like:

	gl.glEnable(GL.GL_BLEND);
	gl.glBlendFunc(GL.GL_SRC_ALPHA, GL.GL_ONE_MINUS_SRC_ALPHA);

And since you’re using blending, you need to sort your geometry back to front for proper results.

There might be some helpful hints here…

http://www.java-gaming.org/index.php/topic,18119.msg142271.html

I think you also have to enable multisampling if it’s supported:

	if (gl.isExtensionAvailable("GL_ARB_multisample")) {
	    gl.glEnable(GL.GL_MULTISAMPLE);
	}

Ah, so simple. I hadn’t enabled the sampleBuffers in glCapabilities. RTFM! :wink:

Now it works fine. It seems like anti-aliasing should be on by default, but perhaps there is a good reason not to enable it by default.
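For anyone who hits this later, here is roughly what fixed it for me: request sample buffers on the GLCapabilities before creating the canvas. A minimal sketch (the sample count of 4 is just an example; the driver may clamp or ignore it):

```java
import javax.media.opengl.GLCanvas;
import javax.media.opengl.GLCapabilities;

// Request a multisample-capable framebuffer up front, before the
// GL context is created -- enabling GL_MULTISAMPLE later does nothing
// if the framebuffer has no sample buffers.
GLCapabilities caps = new GLCapabilities();
caps.setSampleBuffers(true); // ask for a multisampled pixel format
caps.setNumSamples(4);       // e.g. 4x MSAA

GLCanvas canvas = new GLCanvas(caps);
```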

That’s full-screen anti-aliasing, which is entirely different from the edge anti-aliasing commands you posted originally. You almost certainly don’t want both enabled at the same time; pick one and go with it.

Full-screen anti-aliasing is more robust (and doesn’t require sorting) and usually gives higher quality, but it’ll be slower and is supported on fewer cards.

;D

Understood. This has been quite interesting. It turns out as well that the driver for my card was old. The card is nothing special, an ATI X1900, but the driver was what came with the machine rather than an update from ATI. I installed the update from ATI, and it came with a lot of fine-grained choices. Among them was whether or not to allow applications to control anti-aliasing. As expected, with per-app AA disabled, it doesn’t matter what glCapabilities is set to. With per-app AA enabled, it does.

In either case, performance was the same no matter what settings I used, which probably just indicates that 3D rendering is not the bottleneck. Profiling shows that the GLU Tessellator is the gating factor, which makes sense to me given the complex 3D “stroking” that I am doing. It would be nice if the GL library offloaded tessellation to the GPU. Does anybody know if this can be done, perhaps via some extension that I don’t know about?
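For context, the CPU-side work showing up in my profiles is the standard GLU tessellation loop, roughly like this (a simplified sketch; `outline` and the callback wiring stand in for my actual stroking code):

```java
import javax.media.opengl.GL;
import javax.media.opengl.glu.GLU;
import javax.media.opengl.glu.GLUtessellator;
import javax.media.opengl.glu.GLUtessellatorCallbackAdapter;

// Tessellate one (possibly concave) polygon contour into triangles.
// All of this runs on the CPU every time the shape changes.
void tessellate(final GL gl, double[][] outline) {
    GLU glu = new GLU();

    GLUtessellatorCallbackAdapter cb = new GLUtessellatorCallbackAdapter() {
        public void begin(int type)     { gl.glBegin(type); }
        public void vertex(Object data) {
            double[] v = (double[]) data;
            gl.glVertex3d(v[0], v[1], v[2]);
        }
        public void end()               { gl.glEnd(); }
    };

    GLUtessellator tess = glu.gluNewTess();
    glu.gluTessCallback(tess, GLU.GLU_TESS_BEGIN,  cb);
    glu.gluTessCallback(tess, GLU.GLU_TESS_VERTEX, cb);
    glu.gluTessCallback(tess, GLU.GLU_TESS_END,    cb);

    glu.gluTessBeginPolygon(tess, null);
    glu.gluTessBeginContour(tess);
    for (double[] v : outline)
        glu.gluTessVertex(tess, v, 0, v); // coords and per-vertex data
    glu.gluTessEndContour(tess);
    glu.gluTessEndPolygon(tess);
    glu.gluDeleteTess(tess);
}
```

Even caching the tessellator output in a display list only helps if the geometry is static, which mine isn’t, hence the interest in pushing this onto the GPU.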

Thanks to everybody for the useful info.