Xith Problems, JOGL and LWJGL

I have been working on a game recently using the Xith engine. However, I’m running into trouble with both JOGL and LWJGL, depending on what machines I try to run the game on.

When using JOGL on my machine, performance is terrible. I’m using a while loop that calls view.renderOnce(), and even though the loop executes many times per second, the screen doesn’t update more than about once a second. LWJGL, however, works fine.

The computer this is happening on is a PC running Windows XP, with an ATI Radeon 9200 SE.

On several other computers, using LWJGL, the game quits when view.renderOnce() is called, giving the following error messages:

org.lwjgl.opengl.OpenGLException: Invalid operation (1282)
at org.lwjgl.opengl.Util.checkGLError(Util.java:56)
at org.lwjgl.opengl.Display.update(Display.java:468)
at com.xith3d.render.lwjgl.CanvasPeerImpl.renderDone(CanvasPeerImpl.java:798)
at com.xith3d.render.lwjgl.CanvasPeerImpl.display(CanvasPeerImpl.java:984)
at com.xith3d.render.lwjgl.CanvasPeerImpl.render(CanvasPeerImpl.java:1061)
at com.xith3d.scenegraph.View.renderOnce(View.java:591)
at com.xith3d.scenegraph.View.renderOnce(View.java:524)
at eleconics.ViewWindow.startView(ViewWindow.java:242)
at eleconics.ViewWindow.main(ViewWindow.java:28)
java.lang.Error: org.lwjgl.opengl.OpenGLException: Invalid operation (1282)
at com.xith3d.render.lwjgl.CanvasPeerImpl.display(CanvasPeerImpl.java:1013)
at com.xith3d.render.lwjgl.CanvasPeerImpl.render(CanvasPeerImpl.java:1061)
at com.xith3d.scenegraph.View.renderOnce(View.java:591)
at com.xith3d.scenegraph.View.renderOnce(View.java:524)
at eleconics.ViewWindow.startView(ViewWindow.java:242)
at eleconics.ViewWindow.main(ViewWindow.java:28)
Caused by: org.lwjgl.opengl.OpenGLException: Invalid operation (1282)
at org.lwjgl.opengl.Util.checkGLError(Util.java:56)
at org.lwjgl.opengl.Display.update(Display.java:468)
at com.xith3d.render.lwjgl.CanvasPeerImpl.renderDone(CanvasPeerImpl.java:798)
at com.xith3d.render.lwjgl.CanvasPeerImpl.display(CanvasPeerImpl.java:984)
... 5 more
Exception in thread "main"

and

org.lwjgl.opengl.OpenGLException: Invalid enum (1280)
at org.lwjgl.opengl.Util.checkGLError(Util.java:56)
at org.lwjgl.opengl.Display.update(Display.java:468)
at com.xith3d.render.lwjgl.CanvasPeerImpl.renderDone(CanvasPeerImpl.java:798)
at com.xith3d.render.lwjgl.CanvasPeerImpl.display(CanvasPeerImpl.java:984)
at com.xith3d.render.lwjgl.CanvasPeerImpl.render(CanvasPeerImpl.java:1061)
at com.xith3d.scenegraph.View.renderOnce(View.java:591)
at com.xith3d.scenegraph.View.renderOnce(View.java:524)
at eleconics.ViewWindow.startView(ViewWindow.java:242)
at eleconics.ViewWindow.main(ViewWindow.java:28)
Exception in thread "main" java.lang.Error: org.lwjgl.opengl.OpenGLException: Invalid enum (1280)
at com.xith3d.render.lwjgl.CanvasPeerImpl.display(CanvasPeerImpl.java:1013)
at com.xith3d.render.lwjgl.CanvasPeerImpl.render(CanvasPeerImpl.java:1061)
at com.xith3d.scenegraph.View.renderOnce(View.java:591)
at com.xith3d.scenegraph.View.renderOnce(View.java:524)
at eleconics.ViewWindow.startView(ViewWindow.java:242)
at eleconics.ViewWindow.main(ViewWindow.java:28)
Caused by: org.lwjgl.opengl.OpenGLException: Invalid enum (1280)
at org.lwjgl.opengl.Util.checkGLError(Util.java:56)
at org.lwjgl.opengl.Display.update(Display.java:468)
at com.xith3d.render.lwjgl.CanvasPeerImpl.renderDone(CanvasPeerImpl.java:798)
at com.xith3d.render.lwjgl.CanvasPeerImpl.display(CanvasPeerImpl.java:984)
... 5 more

The first set of errors listed was generated on a PC with a GeForce2 MX.

The second set was from a Mac with the following specs:

Hardware Overview:

Machine Model: PowerBook G4 12"
CPU Type: PowerPC G4 (3.3)
Number Of CPUs: 1
CPU Speed: 867 MHz
L2 Cache (per CPU): 256 KB
Memory: 640 MB
Bus Speed: 133 MHz
Boot ROM Version: 4.5.5f4

GeForce4 MX:

Type: display
Bus: AGP
VRAM (Total): 32 MB
Vendor: nVIDIA (0x10de)
Device ID: 0x0179
Revision ID: 0x00a5
ROM Revision: 2030

The same error was generated on a PC with the following graphics specs:

SiS M650, 32 MB RAM, integrated chipset.
PC: 3.07 GHz, 480 MB RAM

The game, when built against JOGL, runs fine on all three of those machines.

Finally, the game runs fine with both LWJGL and JOGL on a PC with an ATI Radeon 9800.

In all of these cases, the LWJGL and JOGL demos that are provided on the respective sites, running outside of Xith, work fine.

Any suggestions as to how to fix these problems, or even hints as to what might be causing them, would be greatly appreciated.

LWJGL’s Display.update automagically checks glGetError (which you should be doing every frame anyway) and throws that exception. A properly structured OpenGL app should never trigger it, so something in either Xith or your code needs to change.
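
For instance, a minimal end-of-frame poll looks something like this (a sketch, assuming the org.lwjgl.opengl.GL11 binding seen in the traces):

// assumes: import org.lwjgl.opengl.GL11;
int err = GL11.glGetError();
if (err != GL11.GL_NO_ERROR) {
    // 1280 = GL_INVALID_ENUM, 1282 = GL_INVALID_OPERATION, ...
    System.err.println("GL error this frame: " + err);
}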

Unfortunately, glGetError is a graphics card driver call, so the behaviour can differ between graphics cards and driver versions. You’ll need to narrow it down to the single OpenGL call that’s triggering the error (copious use of LWJGL’s Util.checkGLError should nail it down pretty quickly if you can test on the machine in question). Then you can look the call up in the docs, which should show you what the root cause is.
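
The bisection looks something like this (suspectGLCall() is a hypothetical stand-in for whatever call you’re testing):

// assumes: import org.lwjgl.opengl.Util;
Util.checkGLError();  // passes: everything up to here is clean
suspectGLCall();      // hypothetical stand-in for the call under test
Util.checkGLError();  // if this throws, the call above is the culprit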

“Invalid operation” is fairly generic; I usually only trigger it by doing silly things like binding textures inside a glBegin/glEnd pair. The problem should be fairly obvious once you find the offending call.
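
For example, this classic mistake reliably produces error 1282 (a sketch; textureId is a hypothetical texture handle):

// assumes: import org.lwjgl.opengl.GL11;
GL11.glBegin(GL11.GL_QUADS);
GL11.glBindTexture(GL11.GL_TEXTURE_2D, textureId); // GL_INVALID_OPERATION (1282):
GL11.glEnd();                                      // binds are illegal inside begin/end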

Can you try running with the most recent build (1.1 b10) of JOGL? I think that will clear up the performance issues with your ATI Radeon, which are probably caused by multithreading problems in ATI’s drivers. 1.1 b10 forces all OpenGL work onto a single thread and is much more compatible across a wide range of OSs and graphics cards.

You can probably track down the bug in your OpenGL code by installing JOGL’s DebugGL pipeline. I don’t know if Xith3D provides access to the GLCanvas you’re rendering into, but if it does you can install the DebugGL pipeline when you create the canvas:


canvas.setGL(new DebugGL(canvas.getGL()));
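
If Xith3D only gives you a GLEventListener rather than the canvas itself, the usual JOGL idiom (a sketch, assuming the 1.x net.java.games.jogl API) is to install DebugGL in init(), since the GL instance belongs to the drawable and can be recreated:

import net.java.games.jogl.*;

public class DebugListener implements GLEventListener {
    public void init(GLDrawable drawable) {
        // Wrap the drawable's GL in DebugGL so every GL call is
        // followed by a glGetError check that throws immediately.
        drawable.setGL(new DebugGL(drawable.getGL()));
    }
    public void display(GLDrawable drawable) { /* render as usual */ }
    public void reshape(GLDrawable drawable, int x, int y, int w, int h) {}
    public void displayChanged(GLDrawable drawable, boolean modeChanged, boolean deviceChanged) {}
}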

[quote]Can you try running with the most recent build (1.1 b10) of JOGL?
[/quote]
I have a Mobility Radeon 7500; this version (1.1 b10) didn’t really help. But I remember all the Xith demos working very well a year ago. I’ll check against the proper CVS version soon.

After doing some more debugging (similar measures to those suggested by Orangy Tang), I have found the following.

Using LWJGL, the problem occurs in the TextureShaderPeer implementation, in the function “shade”.

In the following block:

switch (((TextureShader) shader).getUnit()) {
    case 0:
        ARBMultitexture.glActiveTextureARB(ARBMultitexture.GL_TEXTURE0_ARB);
        break;
    case 1:
        ARBMultitexture.glActiveTextureARB(ARBMultitexture.GL_TEXTURE1_ARB);
        break;
    case 2:
        ARBMultitexture.glActiveTextureARB(ARBMultitexture.GL_TEXTURE2_ARB);
        break;
    case 3:
        ARBMultitexture.glActiveTextureARB(ARBMultitexture.GL_TEXTURE3_ARB);
        break;
    case 4:
        ARBMultitexture.glActiveTextureARB(ARBMultitexture.GL_TEXTURE4_ARB);
        break;
}

The invalid enum error is thrown when any of cases 2-4 is executed. According to the online documentation, this is the specified behavior of glActiveTextureARB when the texture unit passed in is at or beyond GL_MAX_TEXTURE_UNITS_ARB.

However, when I make those cases fall back to one of the earlier texture units, only a white screen is displayed.

Can anyone illuminate what the purpose of this function is, or suggest a good way to limit it to the correct range of texture units?

glActiveTextureARB sets, surprisingly enough, the currently active texture unit for multitexturing. Depending on the card, you’ll have anywhere between 1 and 16 multitexturing units (simultaneous textures applied to one poly).

A properly written app will query glGet(GL_MAX_TEXTURE_UNITS) (I think that’s the correct enum) and never attempt to use anything above it. It sounds like Xith is either not checking (and using some kind of fancy multitexture effect regardless) or the value is being reported wrongly.

If possible, try calling glGet and finding out the reported value. Then you can look up what it should be on Delphi3d.net and see if it’s being reported right. If it is, it’s a Xith problem; if not, it’s an LWJGL problem.
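
For reference, here’s roughly what that query and clamp look like with the LWJGL API used above. A sketch only: the class and the activateUnit helper are my names, and this version of glGetInteger takes an IntBuffer (sized 16 to be safe):

import java.nio.IntBuffer;
import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.ARBMultitexture;
import org.lwjgl.opengl.GL11;

public class TextureUnits {
    // Hypothetical helper: only activate units the driver actually reports.
    public static void activateUnit(int unit) {
        IntBuffer buf = BufferUtils.createIntBuffer(16);
        GL11.glGetInteger(ARBMultitexture.GL_MAX_TEXTURE_UNITS_ARB, buf);
        int maxUnits = buf.get(0); // e.g. 2 on a GeForce2 MX

        if (unit < maxUnits) {
            // The GL_TEXTUREi_ARB enums are consecutive, so this one
            // line covers the whole switch block quoted earlier.
            ARBMultitexture.glActiveTextureARB(ARBMultitexture.GL_TEXTURE0_ARB + unit);
        }
    }
}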