Depth Buffer: Linux vs. Windows

Ok, I’m setting the Display in the following way:

gl = new GL(title, bpp, 1, 1, 1);

for fullscreen and

gl = new GL(title, x, y, width, height, bpp, 0, 0, 0);

Works great on Windows. However, the same machine booted into Red Hat 9.0 has MAJOR depth issues. Basically every depth artifact that can occur is occurring.

I don’t think I’m fully understanding how to use the alpha, depth and stencil parameters. For the life of me I can’t remember why I’m using 1,1,1 for fullscreen and 0,0,0 for windowed; I must have been told to do that at some point. I know there has been discussion on how to use this properly, but can someone take me through it one more time? How would I ensure that the alpha, stencil and depth buffers are set up properly for any given card? And why would it work on Windows but not on Linux on the same machine?

http://www.JavaGaming.org/cgi-bin/JGNetForums/YaBB.cgi?board=LWJGL;action=display;num=1054959018

Basically, the three parameters are the minimum allowed buffer sizes for depth, stencil and alpha. Specifying 0 for any of them allows LWJGL to return a window with no support for that particular buffer. So if you need one of the buffers, set its value to at least 1.

  • elias
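The matching rule elias describes can be sketched as a small predicate in plain Java. All names here are hypothetical stand-ins; LWJGL does this selection internally against the formats the driver offers:

```java
// Sketch of the rule above: each requested value is a *minimum* bit count,
// and requesting 0 means "no support required". Hypothetical names only;
// the real matching happens inside LWJGL / the driver.
public class PixelFormatMatch {
    /** true if an available format satisfies the requested minimums */
    static boolean matches(int availAlpha, int availDepth, int availStencil,
                           int minAlpha, int minDepth, int minStencil) {
        return availAlpha >= minAlpha
            && availDepth >= minDepth
            && availStencil >= minStencil;
    }

    public static void main(String[] args) {
        // An available format with a 16-bit depth buffer, no alpha, no stencil:
        System.out.println(matches(0, 16, 0, 0, 1, 0)); // depth-only request: true
        System.out.println(matches(0, 16, 0, 1, 1, 1)); // also wants alpha+stencil: false
    }
}
```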

Ok, when I set them to 1,1,1 for windowed, I get:

Could not find a matching pixel format

on Linux.

BTW: thanks for the link, I knew I saw that thread somewhere, but was unable to locate it.

You needed a depth buffer only, right? So what if you pass 1 for depth and 0 for the others?

BTW, what gfx card do you have?

BTW2, you can check out available mode combinations with ‘glxinfo’, thereby checking if LWJGL is doing something wrong.

  • elias
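On the machine itself you would just run `glxinfo` and read its visuals table. As a sketch of the kind of filtering that helps here, the snippet below greps a canned, hypothetical stand-in for two visuals-table rows (the 15th column standing in for the depth-buffer size) and keeps only visuals that actually offer a depth buffer:

```shell
# Sketch: filtering glxinfo-style output for visuals with a depth buffer.
# On the real machine, run `glxinfo` and inspect the visuals table; the
# $sample lines below are HYPOTHETICAL stand-ins for two of its rows,
# with field 15 playing the role of the depth-buffer ("dp") column.
sample='0x21 16 tc 0 16 0 r y . 5 6 5 0 0 16 0 0 0 0 0 0 0 None
0x22 16 tc 0 16 0 r y . 5 6 5 0 0 0 0 0 0 0 0 0 0 None'

# keep only visuals whose depth-buffer size (field 15 here) is non-zero
result=$(echo "$sample" | awk '$15 > 0 { print $1 }')
echo "$result"
```

On real output the column positions differ, so check the header `glxinfo` prints above the table before adapting the awk field number.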

Ok, I don’t have access to the machine at the moment, but I’ll try that ASAP. It has a GeForce 1 in it. If switching to 0,1,0 doesn’t work, I’ll check the glxinfo output for any weirdness.

Ok, when I set ONLY the depth buffer to 1, I get this error (on Linux):

ALLVARLIG: Failed to create display due to java.lang.Exception: Mode not supported by hardware

Which is happening with some windows machines as well. See: http://www.java-gaming.org/cgi-bin/JGNetForums/YaBB.cgi?board=Announcements;action=display;num=1057864152;start=30

Therefore, I think all the current errors in my app lead back to the same culprit… not setting the alpha, depth and stencil values correctly. If 1 causes an error, what is my other choice?

Just try a more “reasonable” value for depth if you need a depth buffer - 16 is a good choice.

Cas :slight_smile:
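Cas’s suggestion generalizes to trying progressively less demanding depth requests until one succeeds. A minimal sketch, where `supportsDepth()` is a hypothetical stand-in for an actual Display-creation attempt and the card is assumed to offer at most a 16-bit depth buffer:

```java
// Sketch: fall back through "reasonable" depth-buffer requests until the
// driver accepts one. supportsDepth() is a hypothetical stand-in for a
// real Display-creation attempt; a 16-bit-depth card is assumed here.
public class DepthFallback {
    static final int CARD_MAX_DEPTH = 16; // hypothetical hardware limit

    static boolean supportsDepth(int requested) {
        return requested <= CARD_MAX_DEPTH;
    }

    /** returns the first depth request the "driver" accepts, or -1 */
    static int pickDepth(int[] candidates) {
        for (int d : candidates) {
            if (supportsDepth(d)) return d;
        }
        return -1; // nothing matched: "could not find a matching pixel format"
    }

    public static void main(String[] args) {
        System.out.println(pickDepth(new int[] {32, 24, 16})); // prints 16
    }
}
```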

With my limited knowledge of graphics cards: will 16 cause any problems, or does every modern card support 16? Does the same go for pixel and alpha?

Alpha should be at most 8, and you probably don’t need it at all. The alpha bits you specify are not the number of alpha bits you can use in e.g. textures, but how many bits of alpha there are in the framebuffer. And you don’t need framebuffer alpha unless you do fancy multipass blending in it.

  • elias

ALLVARLIG: Failed to create display due to java.lang.Exception: Mode not supported by hardware

BTW, that “Mode not supported by hardware” exception doesn’t look like it’s from the Linux LWJGL. Are you sure you gave the right error?

You’re right, that’s from the Windows version. I didn’t have the error at hand; I just knew it was something similar. If I remember right, it was something about a bad display mode.

Ok, after further conversation with the person who is testing the Linux version for me, it turns out the error is actually caught by my application because no valid DisplayMode is found.

Write once, run anywhere. my ass.
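That “no valid DisplayMode” failure can be sketched like this: under X the desktop color depth is fixed, so every available mode reports the desktop’s current bpp, and asking for any other bpp filters the list down to nothing. All mode values below are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: under X the color depth can't be switched at runtime, so all
// available modes carry the desktop's current bpp. Requesting another bpp
// then leaves an empty candidate list. All values here are hypothetical.
public class ModeFilter {
    /** keep only the modes whose bpp matches the request; m = {w, h, bpp} */
    static List<int[]> modesMatching(List<int[]> available, int bpp) {
        List<int[]> out = new ArrayList<>();
        for (int[] m : available) {
            if (m[2] == bpp) out.add(m);
        }
        return out;
    }

    public static void main(String[] args) {
        List<int[]> modes = List.of(
            new int[] {640, 480, 24},
            new int[] {800, 600, 24},
            new int[] {1024, 768, 24}); // X: everything at the desktop depth
        System.out.println(modesMatching(modes, 16).size()); // prints 0
        System.out.println(modesMatching(modes, 24).size()); // prints 3
    }
}
```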

I’d imagine the “write once, run anywhere” idea goes out of the window as soon as you start using native libraries (like LWJGL). :stuck_out_tongue:

Kev

You can write once, run anywhere, but you have to understand that you can do it wrongly! The problem you’ve got here is video drivers being a bit underdeveloped.

Cas :slight_smile:

I’d say underdeveloped X. He’s probably running into the cannot-switch-colordepth-or-frequency-under-X problem.

  • elias

Can you explain XFree86 and X to me in about 4 sentences, and where we fit in?

Cas :slight_smile:

I could try:

XFree86 is a free X server; X is an ancient standard common on *nix. An X server is what the Win32 GUI is to you Win32 guys, controlling the mouse, the keyboard and, most importantly, the graphics card.

When we want to show an OpenGL window and fetch input from a message loop, I have to ask the X server for it. An OpenGL driver has to talk to the X server, and that’s why I’m unable to switch color depth and frequency on the fly: the X mode-switch extension simply does not supply API entries for it.

BTW, window-mode switching is itself an extension to the X standard, similar to the NV_* extensions in GL.

  • elias

The ancient nature of X is the big problem. It seems that it was designed to support the concept of an X terminal - a “dumb” terminal that supported not just text, but a GUI.

For that it works ok… but as the primary graphics interface on a desktop machine it is utter crap. (Note that modern desktop OSes such as BeOS and OS X do not run on top of X Windows, but rather support running an X server on top of their native GUI… it just works better that way around.)

elias:
Since you’re the resident Linux guy (am I correct?), maybe you could answer a simple question: based on this thread, I’ve come to a semi-conclusion that games written to run on Linux using LWJGL will more than likely run on Windows, but not vice versa, in the “unless you’re really careful” sense?

I’m in the process of switching completely back over to Linux (again), but would still like my “creations” to run on multiple platforms with a minimum of rebooting (although I can foresee that happening anyway…) Actually, now that I’ve pretty much got 2 systems, I’m probably just going to keep one a strictly Win2k box (mainly for Adobe Premiere and After Effects, but I can use it as a testbox) and the other strictly Red Hat Linux 9. LWJGL could always use another Linux tester, right? :slight_smile: