line antialiasing question

the program that created the attached image utilizes OpenGL's built-in antialiasing via:

gl.glEnable( GL.GL_LINE_SMOOTH );
gl.glHint( GL.GL_LINE_SMOOTH_HINT, GL.GL_NICEST );

however, it doesn't seem to recognize calls to gl.glLineWidth( float lineWidth ) – all lines come out the same width.

the red book mentions that line smoothing is just a simple calculation of coverage. am i required to implement my own antialiasing routine in order to get better results?
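( for what it's worth, here's how i understand the red book's coverage idea, sketched in plain java – a hypothetical helper, not what the driver actually does: a pixel's alpha is roughly the fraction of the pixel the line covers, which falls off with distance from the line's center. )

```java
// a hypothetical sketch of coverage-based line antialiasing, not the
// driver's actual implementation: for a line of the given width, a
// pixel's alpha is approximated by how much of the 1x1 pixel the line
// covers, falling off linearly with distance from the line's center.
public class Coverage {
    static float coverage( float distanceFromCenter, float lineWidth ) {
        float halfWidth = lineWidth / 2.0f;
        // fraction of the pixel covered, clamped to [0, 1]
        float c = halfWidth + 0.5f - Math.abs( distanceFromCenter );
        return Math.max( 0.0f, Math.min( 1.0f, c ) );
    }

    public static void main( String[] args ) {
        // a pixel centered on a 1-pixel-wide line is fully covered
        System.out.println( coverage( 0.0f, 1.0f ) ); // 1.0
        // half a pixel off-center is half covered
        System.out.println( coverage( 0.5f, 1.0f ) ); // 0.5
        // well outside the line, no coverage
        System.out.println( coverage( 2.0f, 1.0f ) ); // 0.0
    }
}
```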

I’m not sure how many consumer cards implement antialiased lines, at least with varying width. I think this is one of the differences between consumer-grade and professional-grade cards.

You might want to consider using full-scene antialiasing instead, which is much more widely supported. Take a look at the source code for demos.multisample.Multisample in the jogl-demos workspace.
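The gist of the multisample setup, if it helps – a minimal sketch assuming JOGL 1.x (the `javax.media.opengl` package), done before the GLCanvas is created:

```java
import javax.media.opengl.GLCapabilities;

// Request a multisampled framebuffer up front; full-scene antialiasing
// must be asked for when the drawable is created, not afterward.
GLCapabilities caps = new GLCapabilities();
caps.setSampleBuffers( true ); // ask for a multisample-capable visual
caps.setNumSamples( 4 );       // e.g. 4x full-scene antialiasing
// then: GLCanvas canvas = new GLCanvas( caps );
// and in init(): gl.glEnable( GL.GL_MULTISAMPLE );
```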

awesome thanks

first, ken, thank you for pointing me in the right direction.

after experimenting with the settings and performance of the built-in opengl antialiasing, it's pretty darn good. however, my application will be working mostly with cubic bezier curves imported from adobe illustrator. i would like to achieve the antialiasing quality of adobe illustrator, even at the expense of some performance. i have searched in vain for any under-the-hood discussion of the techniques illustrator uses ( obviously 2D ).

please see attachment.

what makes illustrator’s aa so very good? it doesn’t appear to be stochastic. opengl’s is nice, but even with sixteen samples there are a lot of problems where the lines converge. am i destined to implement my own scheme to achieve similar results?

i have read through the aa sections in Advanced Graphics Programming in OpenGL, Realtime Rendering, and others, and they all seem to cover the same territory and techniques. anyone know what makes illustrator's aa so nice? even the name of a suggested technique would be a big help.

cheers

I’m not an expert in antialiasing techniques but this article may give you some ideas. I found this with a Google search for “antialiased lines opengl shaders”. You might find more help on the forums on opengl.org.

I'm using line antialiasing and variable widths without a problem here, and I'm running a regular GF 6600 via LWJGL. I find that line aa tends to give better results for thin geometry than full-screen aa, even at high sample counts (like the 16 you posted). What kind of hardware were you trying it on? I'd guess that you have a bug in your rendering somewhere – perhaps blending got accidentally switched off?

i checked my code and the aa seems to be set up correctly:

gl.glEnable( GL.GL_LINE_SMOOTH );
gl.glEnable( GL.GL_BLEND );
gl.glBlendFunc( GL.GL_SRC_ALPHA, GL.GL_ONE_MINUS_SRC_ALPHA );
gl.glLineWidth( 1.0f );
gl.glClearColor( 0.0f, 0.0f, 0.0f, 0.0f );

from the console:

GL.GL_LINE_WIDTH_GRANULARITY value is 0.125
GL.GL_LINE_WIDTH_RANGE values are 0.5, 10.0
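( those values came from queries along these lines – a sketch assuming JOGL 1.x's GL interface and an active context: )

```java
// query the supported line width range and granularity;
// assumes "gl" is a valid GL from the current context
float[] range = new float[ 2 ];
gl.glGetFloatv( GL.GL_LINE_WIDTH_RANGE, range, 0 );
float[] granularity = new float[ 1 ];
gl.glGetFloatv( GL.GL_LINE_WIDTH_GRANULARITY, granularity, 0 );
System.out.println( "GL_LINE_WIDTH_RANGE: " + range[ 0 ] + ", " + range[ 1 ] );
System.out.println( "GL_LINE_WIDTH_GRANULARITY: " + granularity[ 0 ] );
```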

it definitely works now. my card is a geforce 6600. i was finally able to get the code to recognize the line width value – i think i hadn't set up the blending correctly earlier. it's fixed now.

i think it's more of an aesthetic issue. i'm going to see what's out there in terms of high-quality opengl aa schemes, even lower-performance ones, and share what i find.

thanks ken and orangy