Performance woes under linux w/1.4.2

Newbie alert :).

I’ve got JOGL working OK (most recent stable binary for linux, not CVS version AFAIAA) with 1.4.2 under linux, and started experimenting. Following the advice in the quick-start guide, I’ve got a GLCanvas up (apparently that is hardware-accelerated, no?) and am rendering OK from my GLEventListener.

However, when I gave it the simple task of rendering a heightfield of 350k triangles, it renders fewer than 100k tris a second - on a GeForce2 Go (just like a GeForce2, but with 16 MB RAM) + P3-1GHz, with no light sources, nothing fancy - just smooth-shaded tris in tri-strips. The heightfield is almost square, and I've got a separate tri-strip per row.
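For reference, here's roughly how the per-row strips are laid out (a minimal sketch, not the poster's actual code - the grid size and the `stripIndices` helper are made up for illustration): one strip per pair of adjacent rows of a W x H grid, zig-zagging between the two rows, so each strip has 2W indices and 2(W - 1) triangles.

```java
import java.util.Arrays;

// Build triangle-strip indices for a W x H heightfield grid:
// one strip per pair of adjacent rows, each suitable for a
// GL_TRIANGLE_STRIP draw call.
public class HeightfieldStrips {

    // Indices for the strip covering rows `row` and `row + 1`,
    // alternating between the upper and lower row.
    static int[] stripIndices(int w, int row) {
        int[] idx = new int[2 * w];
        for (int x = 0; x < w; x++) {
            idx[2 * x]     = row * w + x;       // vertex on the upper row
            idx[2 * x + 1] = (row + 1) * w + x; // vertex on the lower row
        }
        return idx;
    }

    // Total triangles: (h - 1) strips, each with 2w vertices,
    // and a strip of n vertices holds n - 2 triangles.
    static int triangleCount(int w, int h) {
        return (h - 1) * (2 * w - 2);
    }

    public static void main(String[] args) {
        // A ~420 x 420 grid gives roughly 350k triangles.
        System.out.println(triangleCount(420, 420)); // prints 351122
        System.out.println(Arrays.toString(stripIndices(3, 0)));
    }
}
```

At under 100k tris/sec that grid would take over 3 seconds per frame, which is why the window-size dependence below is the interesting clue.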

I was sure performance should be better than that?

Curiously, reducing the size of the output window makes a big difference in performance, approximately linear in the number of pixels in the window (I've got no code in the reshape method of my GLEventListener, so I'm just letting it do whatever comes naturally).

FWIW, I started from Greg Pierce’s “Getting started with JOGL” and have just made minor modifications since…e.g. adding a one-time method to generate a heightfield.

Sounds like you're horribly fill-rate limited. What does JOGL say about your drivers?

System.out.println(gl.glGetString(GL.GL_VENDOR));
System.out.println(gl.glGetString(GL.GL_RENDERER));
System.out.println(gl.glGetString(GL.GL_VERSION));

My guess: you've got no hardware acceleration enabled and Mesa isn't up to the task.
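The GL_RENDERER string usually gives it away: an accelerated nVidia driver reports the card name (e.g. "GeForce2 MX/AGP/SSE"), while a software path on Linux of that era typically mentions Mesa or an indirect/software rasterizer. A quick heuristic check (hypothetical helper, not part of JOGL - and note it's only a heuristic, since Mesa strings have varied over the years):

```java
// Heuristic check of the GL_RENDERER string for a software GL path.
public class RendererCheck {

    // Software GL stacks on Linux usually identify themselves as Mesa,
    // or explicitly as a software or indirect renderer.
    static boolean looksSoftware(String renderer) {
        String r = renderer.toLowerCase();
        return r.contains("mesa") || r.contains("software") || r.contains("indirect");
    }

    public static void main(String[] args) {
        System.out.println(looksSoftware("GeForce2 MX/AGP/SSE")); // prints false
        System.out.println(looksSoftware("Mesa GLX Indirect"));   // prints true
    }
}
```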

Thanks. I get:

[INFO] NVIDIA Corporation
[INFO] GeForce2 MX/AGP/SSE
[INFO] 1.3.1 NVIDIA 31.23

Does this mean it's HW accelerated (I'm guessing I wouldn't be seeing GeForce mentioned if it were SW :))?

I’m using Nvidia’s linux drivers (since XFree86 drivers don’t work - they never got around to writing drivers for all GeForce cards… although this may have changed in the last 6 months or so)

AFAIAA I have certainly had HW acceleration working on this machine - I played AF on it a long time back, as well as some non-java OGL games.

Is there a benchmark for OGL performance on Linux? I know of the SPEC one, but that’s around 100Mb download and I don’t have broadband access for at least the next few weeks…

PS: those three calls all returned null when I put them in my GLEventListener's init method; I had to put them in display to get anything. Is that the way it's supposed to be?

[quote]PS those three calls all returned null when I put them in GLEventListener’s init method, I had to put them in display to get anything. Is that the way it’s supposed to be?
[/quote]
No, they should be working by that point. My only initial guess is to try requesting a really simple framebuffer; sometimes odd combinations of colour/depth/stencil bits can return a software framebuffer that still reports the nVidia vendor (though that behaviour might be specific to Windows machines). In particular, take out any alpha buffer requests - that consistently fails on all the Linux boxes I've tried it on :(
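Something along these lines (a minimal sketch assuming the original JOGL 1.x API in the net.java.games.jogl package - method names may differ slightly in your build): request plain RGB with a depth buffer and explicitly zero alpha and stencil bits, then create the canvas from those capabilities.

```java
import net.java.games.jogl.GLCanvas;
import net.java.games.jogl.GLCapabilities;
import net.java.games.jogl.GLDrawableFactory;

public class SimpleCanvas {
    // Request the plainest framebuffer we can get away with.
    static GLCanvas createSimpleCanvas() {
        GLCapabilities caps = new GLCapabilities();
        caps.setRedBits(8);
        caps.setGreenBits(8);
        caps.setBlueBits(8);
        caps.setAlphaBits(0);   // no destination alpha - the suspect setting
        caps.setStencilBits(0); // no stencil either
        caps.setDepthBits(16);
        caps.setDoubleBuffered(true);
        return GLDrawableFactory.getFactory().createGLCanvas(caps);
    }
}
```

If the vendor/renderer strings come back sane with this setup, add the fancier buffer bits back one at a time to find the combination that kicks you onto the software path.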