[opengl] Why are LWJGL enums as integers?

Not sure if this is the right thread, but why are LWJGL’s enumerators integers? It would have been easier to code them as Java enums, or does JNI not allow Java’s enums?

EDIT: Changed the title to ‘LWJGL’

I think you kinda got the reason.
It would at least complicate it a lot.

Why not just give the enums a parameter (probably an integer) holding the value they represent when passed through JNI?
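
For example, something along these lines (just a sketch; ClearBufferBit is a made-up name, and the hex values are the standard GL clear-bit constants):

// Sketch: a Java enum that carries the raw GL value it stands for.
public enum ClearBufferBit {
    COLOR(0x00004000),   // GL_COLOR_BUFFER_BIT
    DEPTH(0x00000100),   // GL_DEPTH_BUFFER_BIT
    STENCIL(0x00000400); // GL_STENCIL_BUFFER_BIT

    private final int glValue;

    ClearBufferBit(int glValue) {
        this.glValue = glValue;
    }

    public int glValue() {
        return glValue;
    }
}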

That would require extra steps, when the current system is just as readable.

CopyableCougar4

Possible, but not necessary.

In C the GL “enums” have values. To keep LWJGL consistent it makes sense to give them values, though a wrapper from the values to enums could be made relatively easily.
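
A reverse lookup from raw values back to constants could look roughly like this (sketch only; GLEnumLookup is a made-up name, and it reuses the hypothetical ClearBufferBit enum from above):

import java.util.HashMap;
import java.util.Map;

// Sketch of a value-to-enum wrapper built once per enum type.
public final class GLEnumLookup {

    private static final Map<Integer, ClearBufferBit> BY_VALUE = new HashMap<>();

    static {
        for (ClearBufferBit bit : ClearBufferBit.values()) {
            BY_VALUE.put(bit.glValue(), bit);
        }
    }

    /** Returns the enum constant for a raw GL value, or null if unknown. */
    public static ClearBufferBit fromGL(int rawValue) {
        return BY_VALUE.get(rawValue);
    }

    private GLEnumLookup() {
    }
}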

Worth pointing out that enums were not present in Java until 1.5. JNI is older. LWJGL is older. OGL itself is also older, and not Java.
OGL is also probably designed with older idioms, and in a way that keeps it interfaceable with other languages. That idiom is actually pretty common.

EDIT: ah, you changed the title.

If you miss some rare/new extension you can just use a raw integer.

AFAIK, you can’t bitwise-OR (“|”) enums in Java. That’s probably the reason why they used ints instead of enums.

offtopic:

Enum constants boil down to small ints (their ordinals) under the hood, and a bitwise-OR equivalent is available through EnumSet.of(…, …). That operation is supposedly about as fast as ORing ints, since an EnumSet is backed by a single long bit vector for enums with up to 64 constants (beyond that it falls back to a long[]).
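
To illustrate both forms (a sketch reusing the hypothetical ClearBufferBit enum from earlier in the thread; the plain-int version is what LWJGL actually accepts):

import java.util.EnumSet;

// Sketch: combining flags as raw ints vs. as an EnumSet of the hypothetical
// ClearBufferBit enum.
public class FlagCombining {

    public static void main(String[] args) {
        // Plain ints, as LWJGL takes them: a single bitwise OR.
        int mask = 0x00004000 | 0x00000100; // GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT

        // The enum equivalent: an EnumSet, internally a single long bit vector
        // for enums with up to 64 constants.
        EnumSet<ClearBufferBit> bits = EnumSet.of(ClearBufferBit.COLOR, ClearBufferBit.DEPTH);

        // To hand the set to a C API it still has to be folded back into an int.
        int folded = 0;
        for (ClearBufferBit bit : bits) {
            folded |= bit.glValue();
        }

        System.out.println(Integer.toHexString(mask) + " == " + Integer.toHexString(folded));
    }
}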

ontopic:

using enums would create a mess in the API, as every system theoretically supports a different subset of each enum for each GL call.

+1

I wouldn’t be able to mix JOCL (OpenCL) with LWJGL (a GL-shared display and another CL device, what the heck) without making a mess in my API code.

numbers are just numbers whatever context you’re in.

Having a GLEnum type would actually be at least something of a boon, to avoid mixing up ordinary integer arguments with what should be, well, GL enums. However, because of the way GL works (arbitrary extensions adding symbols on an ad-hoc basis at run time), you can’t really use Java enums the way they were intended; instead you’d have to make a GLEnum class that wasn’t actually an enum. Then there’d be the irritating need to extract the int value from that enum for every parameter in every API call. It’s probably not worth the effort.

Cas :slight_smile:
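
A rough sketch of the kind of non-enum GLEnum wrapper described above (GLEnum and define are made-up names here):

import java.util.concurrent.ConcurrentHashMap;

// Sketch: a typed handle around a raw GL integer that extensions could
// extend with new symbols at run time.
public final class GLEnum {

    private static final ConcurrentHashMap<Integer, GLEnum> INTERNED = new ConcurrentHashMap<>();

    // A couple of core symbols defined up front...
    public static final GLEnum COLOR_BUFFER_BIT = define(0x00004000);
    public static final GLEnum DEPTH_BUFFER_BIT = define(0x00000100);

    private final int value;

    private GLEnum(int value) {
        this.value = value;
    }

    /** ...while any extension can register a new symbol on an ad-hoc basis. */
    public static GLEnum define(int rawValue) {
        return INTERNED.computeIfAbsent(rawValue, GLEnum::new);
    }

    /** The raw int that would still have to be extracted at every GL call. */
    public int value() {
        return value;
    }
}

Every call site would then read something like glClear(GLEnum.COLOR_BUFFER_BIT.value() | GLEnum.DEPTH_BUFFER_BIT.value()), which is exactly the extraction overhead that makes it probably not worth the effort.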

Just take a look at the design of OpenTK. It uses enums for everything. Let’s see a simple example of clearing the screen in C/C++ OpenGL.


glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

The same in OpenTK translates to


GL.Clear(ClearBufferMask.ColorBufferBit | ClearBufferMask.DepthBufferBit);

Does it make OpenGL easier or tougher? I’d say it makes things confusing, because most of the tutorials are written against the native C/C++ API and translating them is a chore.

Can you guess what [icode]GL_VERSION[/icode] (a frequently used one while debugging) is translated to? It becomes [icode]StringName.Version[/icode], but we beginners would expect it to be something like GL.VERSION.

It’s not really useful.
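
For comparison, LWJGL keeps the C token names intact, so a tutorial line carries over almost verbatim. A minimal sketch (it assumes a current OpenGL context has already been created elsewhere, e.g. via a Display or GLFW window):

import static org.lwjgl.opengl.GL11.*;

// Sketch: with GL11 statically imported, the C names map over directly.
public class Gl11Names {

    static void printVersionAndClear() {
        String version = glGetString(GL_VERSION); // same token name as in C
        System.out.println("GL_VERSION = " + version);

        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); // same line as the C tutorial
    }
}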

One of the improvements in LWJGL 3 is javadoc generation for all bindings. This includes links to enums that an OpenGL function argument supports. Imho, it makes a huge difference in productivity and it’s not an issue anymore that GL enums are integers. I’ve tried to list all core GL enums and plan to keep them up-to-date as new versions are released. I’ve also included several extension enums, but that’s obviously a huge task and requires help from the community. Feel free to submit documentation patches at any time.

Great work, Spasi. I’m waiting for the release of LWJGL 3. Any estimate on the release date?

There will be a usable build for public testing after GLFW 3.1 is released.