We were running a test of our renderer on Linux, 1.5 GHz AMD, GeForce FX 5600/AGP/SSE/3DNOW!
The test is running at an appalling 14 fps, while it runs at 110 fps on my Windows machine.
Any ideas? This is using JDK 1.4.2 and the latest JOGL binaries.
14 fps compared to 110… looks like hardware acceleration is disabled…
To be serious: I don’t know the answer unfortunately.
However, I’ve tried to disable hardware acceleration in JOGL on Win32 by calling GLCapabilities.setHardwareAccelerated(false), but it had no effect.
Does anybody know if it’s possible to disable it through OpenGL itself? That would be nice for tests.
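For reference, here’s roughly what that looks like with the net.java.games.jogl classes (a minimal sketch, not the actual test code). As far as I know the flag is only a preference for the pixel-format chooser rather than a hard requirement, which would explain why it can appear to have no effect:

[code]
import java.awt.Frame;
import net.java.games.jogl.GLCanvas;
import net.java.games.jogl.GLCapabilities;
import net.java.games.jogl.GLDrawableFactory;

public class SoftwareCanvasTest {
    public static void main(String[] args) {
        // Ask for a non-accelerated pixel format. This is a preference, not a
        // requirement, so the chooser/driver may still hand back an accelerated one.
        GLCapabilities caps = new GLCapabilities();
        caps.setHardwareAccelerated(false); // false = request software rendering
        GLCanvas canvas = GLDrawableFactory.getFactory().createGLCanvas(caps);

        Frame frame = new Frame("Software rendering test");
        frame.add(canvas);
        frame.setSize(300, 300);
        frame.setVisible(true);
    }
}
[/code]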
This is just a silly thought, but are you sure you have the NVidia accelerated drivers loaded and not the default “nv” driver?
Edit: BTW, it helps to know the version and distro of “Linux” you’re using. The Linux guys like to keep things confusing. :-/
On a hunch I changed the demo from using quads to using triangles. The frame rate jumped to 77 fps, which, while not great, is better.
Ok so now why is it running at 77 instead of 100+?
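For concreteness, the change amounts to something like this (an illustrative sketch, not the actual demo code; the vertex data here is made up):

[code]
import net.java.games.jogl.GL;

// The same unit square submitted as one quad vs. two triangles. Some drivers
// handle GL_QUADS noticeably worse than GL_TRIANGLES, since quads get split
// into triangles internally anyway.
public class QuadVsTriangles {

    static void drawAsQuad(GL gl) {
        gl.glBegin(GL.GL_QUADS);
        gl.glVertex2f(0f, 0f);
        gl.glVertex2f(1f, 0f);
        gl.glVertex2f(1f, 1f);
        gl.glVertex2f(0f, 1f);
        gl.glEnd();
    }

    static void drawAsTriangles(GL gl) {
        gl.glBegin(GL.GL_TRIANGLES);
        // first half of the square
        gl.glVertex2f(0f, 0f);
        gl.glVertex2f(1f, 0f);
        gl.glVertex2f(1f, 1f);
        // second half
        gl.glVertex2f(0f, 0f);
        gl.glVertex2f(1f, 1f);
        gl.glVertex2f(0f, 1f);
        gl.glEnd();
    }
}
[/code]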
It will be impossible to tell without knowing more about how your Linux distro is set up and whether or not you’ve downloaded the Linux drivers from nVidia and installed them.
I haven’t tried JOGL on Linux yet but…
I did set up a box to test my previous Java3D stuff, and it did take a bit of fiddling with the config to get proper 3D acceleration working on an nVidia card.
However, once I did, my apps ran comparably.
I would recommend testing some other C OpenGL demos to make sure it’s really JOGL that is the issue.
Good luck
That’s why I switched from Windows to OSX instead of Windows to Linux. I found myself wasting entirely too much time trying to figure out what state the OS was in.
Try a non-JOGL, but still OpenGL, app. If you play TuxRacer and get a lovely 5 fps, then you definitely didn’t install the nvidia drivers properly.
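Another quick check, from inside a JOGL app itself, is to print the GL strings and see which renderer you actually got. A minimal sketch, assuming the net.java.games.jogl package names (the class name is just for illustration):

[code]
import net.java.games.jogl.GL;
import net.java.games.jogl.GLDrawable;
import net.java.games.jogl.GLEventListener;

// Attach this to any GLCanvas. If GL_VENDOR/GL_RENDERER report Mesa or a
// software/indirect renderer instead of NVIDIA, the accelerated "nvidia"
// driver isn't the one actually being used.
public class DriverCheck implements GLEventListener {
    public void init(GLDrawable drawable) {
        GL gl = drawable.getGL();
        System.out.println("GL_VENDOR   = " + gl.glGetString(GL.GL_VENDOR));
        System.out.println("GL_RENDERER = " + gl.glGetString(GL.GL_RENDERER));
        System.out.println("GL_VERSION  = " + gl.glGetString(GL.GL_VERSION));
    }
    public void display(GLDrawable drawable) {}
    public void reshape(GLDrawable drawable, int x, int y, int width, int height) {}
    public void displayChanged(GLDrawable drawable, boolean modeChanged, boolean deviceChanged) {}
}
[/code]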
Just an FYI, it was my Linux box on which David had me run his test. My system is running SuSE Linux 8.2 Professional with the latest nvidia drivers installed and 3D acceleration enabled (man, nvidia has come a long way in making its Linux drivers easy to install!). As David pointed out, changing from quads to triangles made the framerate jump from 14 fps to 77 fps. It seems the reason the framerate doesn’t match David’s 110 could be something in the Linux JOGL, or perhaps nvidia’s Linux drivers are a bit behind in regard to the FX series cards (wouldn’t surprise me). Or perhaps the GeForce FX 5600 is not all it’s cracked up to be. I am able to run America’s Army perfectly fine, and Unreal Tournament 2003 is playable, but not great at around 30 fps.
Can you dump your XFree86 config file here? I would like to see if the proper drivers are getting loaded. (You should get WAY better frame rates than the ones you’re describing. Better than Windows, even.)
30FPS? That’s very odd. On my RedHat 9 box on a 2200+ with a GeForce 5200 I was getting close to 60.
Ok, well I thought it might be a bit much to dump my whole XF86Config file here, so I posted a copy to this link:
http://magicosm.servebeer.com/XF86Config
One thing I can’t help but wonder is if my cpu is a problem. When I bought my AMD Athlon processor, it was billed as a 1.6 GHz cpu. However, when I look at my /proc/cpuinfo file, it says this:
cpu MHz : 1050.034
Anyhow, I don’t know if that would make a significant difference either way.
FYI, the JOGL Linux implementation runs the jogl-demos demonstrations at the same frame rate as the Windows version, or faster in some cases because the Linux drivers apparently don’t mandate sync to vertical retrace. The Linux port was tested on a PIII 750 MHz, 256 MB RAM, Red Hat 7.3, GeForce FX 5800 Ultra.
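If you want to rule sync-to-vblank out of the comparison, you can request a swap interval of 0 on both machines, assuming the GL interface in your binaries exposes setSwapInterval() (treat that as an assumption; if it isn’t there, the nvidia driver can also be told on its side to ignore vblank). A sketch:

[code]
import net.java.games.jogl.GL;
import net.java.games.jogl.GLDrawable;

public class VsyncOff {
    // Call from a GLEventListener's init(). Assumption: this JOGL build's GL
    // interface exposes setSwapInterval(); 0 = don't wait for vertical retrace,
    // 1 = sync to it. Turning it off on both machines makes the fps numbers
    // comparable; otherwise the faster box may simply be capped at the
    // monitor's refresh rate.
    static void disableVsync(GLDrawable drawable) {
        GL gl = drawable.getGL();
        gl.setSwapInterval(0);
    }
}
[/code]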
[quote]One thing I can’t help but wonder is if my cpu is a problem. When I bought my AMD Athlon processor, it was billed as a 1.6 GHz cpu. However, when I look at my /proc/cpuinfo file, it says this:
cpu MHz : 1050.034
[/quote]
Was it billed as “1.6 GHz” or as a “1600”? The current AMD chips tend to do more than the Intel chips clock-for-clock, so they adopted the new naming convention to either (a) not lose sales to consumers who didn’t understand the issue, or (b) try to fool the community into believing their chips were better (delete as appropriate). That said, I’d expect a “1600” AMD chip to actually be 1.2-1.4 GHz, so that’s probably not the issue… :-/
Well, I bought the CPU off of a pricewatch site, and they advertised it as a 1.6 GHz, but that may have just been that store interpreting the 1600.
Also, I thought I should point out that my AGP slot is a 4X, not an 8X; I’m not sure if that makes a big difference or not.
I found that the GeForce FX 5600 Ultra seems more comparable to the GeForce4 Ti 4200 on Tom’s Hardware, and mine isn’t even the Ultra ($200); mine is the $160 version: http://www6.tomshardware.com/graphic/20030311/geforcefx-5600-5200-03.html#unreal_tournament_2003
I’m not sure if this is enough to account for the difference between David’s and my test results (77 fps vs. 110 fps). However, what this does tell me is that it might be best to stick with the Ti series of cards unless you want to shell out for an FX 5800 or better. It seems nVidia claims that while the FX 5600 might not have the horsepower you may expect for its price, it can take advantage of new features that future games will be using.
Your file looks OK, if a little hokey. My only possible suggestion is to comment out the DRI stuff. It may not be hurting anything, but NVidia doesn’t support DRI. BTW, are you using XFree version 4.3?
That XF86Config file was generated by SuSE’s SaX2, which overwrites it whenever you change something, so I’m not sure if editing it will work, but I can try. It is XFree version 4.3.