Problem with depth buffer on Linux

I have an application that runs fine under Win32, but when brought over to Linux no writes occur to the depth buffer (setting glDepthMask() does nothing). glGet(GL_DEPTH_WRITEMASK) returns 1 as expected. Flipping the glClearDepth() and glDepthFunc() values, which would normally produce a blank display, shows my incorrectly drawn objects.

Linux 2.4.20 with nVidia GeForce2 with nVidia driver 1.0-4349.
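For reference, here is roughly how I am checking the state at startup. This is a sketch only: GL_DEPTH_BITS and GL_DEPTH_WRITEMASK are standard OpenGL queries, but the exact glGetIntegerv binding (static vs. instance, and how buffers are passed) varies between LWJGL versions, so treat the calls below as assumptions:

    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;
    import java.nio.IntBuffer;

    // Sketch: confirm whether the context actually has a depth buffer.
    IntBuffer query = ByteBuffer.allocateDirect(4 * 16)
            .order(ByteOrder.nativeOrder()).asIntBuffer();

    GL.glGetIntegerv(GL.GL_DEPTH_BITS, query);
    System.out.println("depth bits: " + query.get(0));       // 0 would explain the symptoms

    GL.glGetIntegerv(GL.GL_DEPTH_WRITEMASK, query);
    System.out.println("depth writemask: " + query.get(0));  // returns 1 as expected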

Thoughts?

Well, as with going to the doctor, as soon as I posted I figured out what was amiss, and I feel a lot better.

For reasons unknown, it was not OK to pass 0 for the depth argument of the GL() constructor.
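For the record, requesting a real depth buffer up front fixed it. Something along these lines - note that the constructor argument list here is from memory and may not match your LWJGL version exactly:

    // Broken on my Linux/nVidia setup - a depth of 0 yields no depth buffer:
    // GL gl = new GL("My App", 640, 480, 16, 0, 0, 0); // width, height, bpp, alpha, depth, stencil

    // Working - ask for at least a 16-bit depth buffer explicitly:
    GL gl = new GL("My App", 640, 480, 16, 0, 16, 0);
    gl.create();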

Does any Linux user out there have this problem? And what in my X config would cause this?

I should also point out that all DisplayModes return with “freq” equal to 0. Is this common for Linux or just all part of the same problem?

Thanks.

It’s perfectly OK to pass 0 for the depth buffer size - if you don’t require depth, that is. The GL constructor will match your requirements as closely as possible and will not go under them. So if you specify 0, it can return any buffer size from 0 to 32 bits, as it sees fit.

The frequency stuff on Linux is deliberate - you can’t change the refresh rate from the application anyway, so I didn’t bother finding the correct number. I could just as well have passed 0 for bit depth too - that’s also not possible to change.

  • elias

(btw hello rgrzywinski - I’ve noticed you’ve posted some suggestions up on sourceforge’s bug list for us)

Cas :)

Heya Cas! Yeah, I’m a bit of a stickler for code consistency and quality. I’m not implying that you’re not producing good code – just pointing out a few potential gotchas.

Elias - the beef that I have with the buffer depth is that passing zero on Win32 provides me with a functional depth buffer (actual size unknown) whereas it does not on Linux. These inconsistencies are the root of all male coder early hair loss (not genetics, as originally thought).

As a follow-up question: if I pass in 32 for the buffer depth and the system only supports, say, 8, and the constructor “will match your requirements as closely as possible and will not go under them”, then what happens? Do I get the very RuntimeException that I have posted a bug about?

Thank you.

The pixel format behaviour is a side effect of the fact that you specify minimum requirements and let the system decide the best format. So the Linux behaviour is perfectly acceptable, and yes, specifying requirements that cannot be met will result in an exception. In any case, we can’t do much to change that behaviour, as it is pretty much built into OpenGL.

Actually, we tried returning a list of valid formats, like the display mode query, a while ago, but that was much harder to do right.

  • elias

I hear you on the minimum part …

But the question still stands:

Why does passing zero on Win32 provide me with a functional depth buffer (actual size unknown) whereas it does not on Linux?

Still because of the “minimum” part - specifying 0 as your requirement is a minimum. Therefore the driver can return any depth size, from 0 (non-working depth buffer) to 32 (working depth buffer). The case is special from the application’s point of view because it makes the difference between no depth buffer and some depth buffer, but from the driver’s perspective it’s perfectly consistent.
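So if you do pass 0 and still want portable behaviour, check what the driver actually gave you and adapt. A sketch, again assuming a glGetIntegerv(int, IntBuffer) style binding:

    // Sketch: after context creation, branch on the depth size the driver chose.
    IntBuffer depthBits = ByteBuffer.allocateDirect(4 * 16)
            .order(ByteOrder.nativeOrder()).asIntBuffer();
    GL.glGetIntegerv(GL.GL_DEPTH_BITS, depthBits);

    if (depthBits.get(0) > 0) {
        GL.glEnable(GL.GL_DEPTH_TEST); // we got a depth buffer, so use it
    } else {
        // No depth buffer: either recreate the window requesting depth >= 1,
        // or fall back to sorting and drawing back-to-front.
    }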

  • elias

sigh

For all of us out there who have male coder early hair loss, I will say that “from the driver perspective it’s perfectly consistent” is your (the LWJGL coder’s) problem, and “from the application perspective it’s perfectly consistent” is what you should be presenting to the application developer.

I mean no ill blood by this.

My whole “bunched up briefs” issue here is that the application interface should be consistent regardless of what the driver returns. If by default the Linux driver returns 0 and the Win32 driver returns >0, then it’s up to LWJGL to define a consistency between them so that the application developer does not have to concern themselves with the platform differences.

You could also not ship with examples that all have zero for the buffer depth ;)

Thank you for your quick responses.

Very well then, how would you like the interface to be? Consider that LWJGL is merely trying to be a simple interface to certain cross-platform technologies, not a brand spanking new and fancy interface defined in terms of those technologies.

  • elias

Again, no ill blood here. You guys have done a great job in a short amount of time.

Consistency. That’s all I want. I have chosen Java to help alleviate platform dependencies, and if LWJGL does not mask what a platform’s driver returns in a consistent fashion, then I am forced to perform platform detection and the vicious circle begins.

In this particular case, the depth buffer behaves in two different ways: on Linux, 0 gives you 0; on Win32, 0 gives you non-zero (perhaps the hardware’s limit). If you decide that sending 0 implies a minimum, then the Linux driver should be probed for its maximum (which may need to be done by trial and error, as below) and that should be used. If you decide that sending 0 implies zero, then Win32 should be set to 0.
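By “trial and error” I mean something like the following - same caveat as before, the GL constructor signature is an assumption:

    // Sketch: probe downwards until the driver accepts a pixel format.
    int[] candidateDepths = { 32, 24, 16, 8, 0 };
    GL gl = null;
    for (int i = 0; i < candidateDepths.length && gl == null; i++) {
        try {
            gl = new GL("My App", 640, 480, 16, 0, candidateDepths[i], 0);
            gl.create();
        } catch (RuntimeException e) {
            gl = null; // this depth is not available, try the next smaller one
        }
    }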

… or at least add a “NOTE” to the javadoc on the constructor that warns people of the inconsistent behavior and the consequences of choosing a value outside of the valid range. :)

Thanks again for your help and I’ll go bugger off now ;D

Well, actually, Win32 can just as well choose to return no depth for the value 0 if a different gfx card, driver, or even pixel format combination is used. But that’s just politics and notation.

And technically, we are indeed consistent with our specs. Minimum values are specified, and anything matching or exceeding those values can and will be used. That’s consistent behaviour on both Linux and Win32.

Now, if we chose to implement stricter consistency, you gave two possibilities:

  1. Force 0 to mean 0. That is, make 0 a hard value representing not a minimum but a “disable” of that particular buffer.
    That’s impossible to do, because it is perfectly acceptable for a driver to offer no pixel formats that lack a depth buffer. We would then be forced to fail creation even though the driver has a valid pixel format meeting your other requirements.

  2. Force the values to the maximum, or at least non-zero, where possible. This can be done, but it introduces the problem of wasted resources: the pixel format containing the depth buffer that isn’t needed might be slower than the one without. And nobody likes to know that they might be wasting precious resources - especially not in the OpenGL world. (See the sketch below.)
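To make the “match or exceed, never go under” rule concrete, the selection works conceptually like this. It is a simplification for illustration, not LWJGL’s actual selection code:

    // Conceptual sketch of minimum-requirement pixel format matching.
    class FormatChooser {
        static class Format {
            final int bpp, alpha, depth, stencil;
            Format(int bpp, int alpha, int depth, int stencil) {
                this.bpp = bpp; this.alpha = alpha;
                this.depth = depth; this.stencil = stencil;
            }
        }

        /** Returns a format meeting or exceeding every minimum, or throws. */
        static Format choose(Format[] available, Format minimum) {
            Format best = null;
            for (int i = 0; i < available.length; i++) {
                Format f = available[i];
                // Rule 1: never go under the requested minimums.
                if (f.bpp < minimum.bpp || f.alpha < minimum.alpha
                        || f.depth < minimum.depth || f.stencil < minimum.stencil) {
                    continue;
                }
                // Rule 2: among the survivors the driver picks as it sees fit;
                // this sketch arbitrarily prefers the smallest qualifying depth.
                if (best == null || f.depth < best.depth) {
                    best = f;
                }
            }
            if (best == null) {
                throw new RuntimeException("No pixel format meets the minimums");
            }
            return best;
        }
    }

Note that with a minimum depth of 0, a depth-0 format and a depth-16 format both qualify, and which one wins is entirely up to the driver - which is exactly the Linux/Win32 difference being discussed.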

  • elias

Your points are valid and well taken.

Thanks again for your prompt support.

Wherever possible we must provide the same behaviour as OpenGL normally would, and explain this in great detail, along with its ramifications. We are very consistent about how construction of GL works - it does exactly what it says. It’s up to you to go the last mile and figure out the strange issues that arise.

The behaviour you mention is by no means limited to the difference between Win32 and Linux - different drivers on Windows will produce different results. It’s just how OpenGL programming works. The good thing is that, rather than worrying about bugs in our Java binding, we can draw on the vast knowledge base of existing OpenGL programmers who already understand this behaviour.

Cas :)

Hi
I’m glad I found this thread, because I was getting worried. I’m looking at LWJGL for the first time, and the demos on Linux didn’t work right (because of the depth buffer being 0). I agree that 0 is a valid argument and that the functionality you get having passed 0 is up to the OS. What I would say is that for the examples it might be worth setting the depth to 1 in BaseWindow.java, so that others don’t spend a morning trying to compile LWJGL (I still can’t get it to build, but I don’t care now as it works).

Changing the default depth buffer to 1 in BaseWindow won’t cause a problem and will mean that you always get some depth buffer, whilst still handling small depth buffers on older cards (I don’t think there are/were any smaller than 8 bits, but you never know :)).
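i.e. in BaseWindow.java, something like this - the argument order is assumed to match the earlier examples in this thread:

    // Sketch: default to a minimum depth of 1 instead of 0 in the demos, so
    // every platform must hand back *some* depth buffer.
    GL gl = new GL("Base Window", 640, 480, 16, 0, 1 /* depth */, 0);
    gl.create();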

Cheers

Endolf