Enums vs ints

I’ve written a simple microbenchmark (don’t flame me) to get a rough idea of the cost of using enums vs int constants.

My enum is defined like this:

public enum Constants {
    E1(10), E2(20), E3(30)…;

    private final int value;

    Constants(int value) {
        this.value = value;
    }
}

and the corresponding int constants are:
private static final int I1 = 10, I2 = 20, I3 = 30…

Then two loops each run a million times, computing things like E1.value + E2.value + E3.value …
and I1 + I2 + I3.
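Put together, the two loops might look something like this. This is a minimal sketch of the benchmark described above, not the original code: the class and method names are mine, and the enum is truncated to three members.

```java
public class EnumBench {
    // Enum with a final int payload, as defined above.
    enum Constants {
        E1(10), E2(20), E3(30);

        final int value;

        Constants(int value) {
            this.value = value;
        }
    }

    // The corresponding plain int constants.
    static final int I1 = 10, I2 = 20, I3 = 30;

    // Sum E1.value + E2.value + E3.value the given number of times.
    static long sumEnums(int iterations) {
        long sum = 0;
        for (int i = 0; i < iterations; i++) {
            sum += Constants.E1.value + Constants.E2.value + Constants.E3.value;
        }
        return sum;
    }

    // The same loop over the plain int constants.
    static long sumInts(int iterations) {
        long sum = 0;
        for (int i = 0; i < iterations; i++) {
            sum += I1 + I2 + I3;
        }
        return sum;
    }

    public static void main(String[] args) {
        int n = 1_000_000;

        long t0 = System.nanoTime();
        long enumSum = sumEnums(n);
        long t1 = System.nanoTime();
        long intSum = sumInts(n);
        long t2 = System.nanoTime();

        System.out.println("enums: " + (t1 - t0) + " ns (sum " + enumSum + ")");
        System.out.println("ints:  " + (t2 - t1) + " ns (sum " + intSum + ")");
    }
}
```

Note that a loop this simple is exactly the kind of thing the JIT can constant-fold away, which is what happens on the server VM below.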


Now the results:
HotSpot client: int constants are twice as fast as enums.
HotSpot server:

  • Everything seems to be precalculated by HotSpot (result time near 0) for the int constants. I’ll have to write a trickier benchmark.

  • Enums are now 4 times faster (making them 2 times faster than the int constants under HotSpot client).

I know the poor value of that kind of benchmark, but it shows two things:

  • enums are (for me) worth using to replace int constants in APIs like OpenGL (thus adding type checking to GL commands).

  • too bad HotSpot server doesn’t recognize that constant pattern (an enum IS constant and its member field IS final: it should be optimized just like those int constants).

Any thoughts ?

Lilian

  1. Hurray for Enums!
  2. You won’t be able to use them for GL because the presence or non-presence of extensions changes the set of constants and combinations that can be used anyway.

Cas :slight_smile:

Well, with the Java GL bindings I know, to use a particular extension one has to use the integer constant generated (or written) by the binding tool.

So what’s the difference with enums? When an extension is added to the set of managed extensions, the corresponding enums could be added at the same time.

Let’s look at a basic example :

gl.glClear(GL.GL_COLOR_BUFFER_BIT);

could be replaced by

public enum ClearConstants {
    COLOR_BUFFER_BIT(GL.GL_COLOR_BUFFER_BIT),
    DEPTH_BUFFER_BIT(GL.GL_DEPTH_BUFFER_BIT);

    final int value;

    ClearConstants(int value) {
        this.value = value;
    }
}
This would give us:
gl.clear(ClearConstants.COLOR_BUFFER_BIT);
or, with a static import:
gl.clear(COLOR_BUFFER_BIT);
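Behind that call, the wrapper would simply unwrap the enum back to the raw int before handing it to GL. A minimal sketch, assuming a hypothetical GLWrapper class with the raw values hard-coded (in a real binding they would come from the generated GL class):

```java
public class GLWrapper {
    // Raw GL-style constants, normally generated by the binding tool.
    // These are the actual OpenGL values for the two buffer bits.
    static final int GL_COLOR_BUFFER_BIT = 0x4000;
    static final int GL_DEPTH_BUFFER_BIT = 0x0100;

    enum ClearConstants {
        COLOR_BUFFER_BIT(GL_COLOR_BUFFER_BIT),
        DEPTH_BUFFER_BIT(GL_DEPTH_BUFFER_BIT);

        final int value;

        ClearConstants(int value) {
            this.value = value;
        }
    }

    // Typesafe front end: only ClearConstants are accepted...
    void clear(ClearConstants buffer) {
        glClear(buffer.value); // ...and unwrapped to the raw int here.
    }

    void glClear(int mask) {
        // would delegate to the native GL binding
    }
}
```

The caller pays one field read per call; the native side still sees exactly the same int it always did.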

Then (another day) GL_STENCIL_BUFFER_BIT appears. It is added to the bindings as another integer constant, and as an enum member STENCIL_BUFFER_BIT(GL.GL_STENCIL_BUFFER_BIT) in the typesafe enums.

What’s wrong with that? (Except the manual part of it: assigning each new GL constant to the right enum.)

Lilian

The issue is that the set of allowed constants is determined at runtime by GL, which means compile-time checking is mostly a complexity best avoided. It’s really, really complicated when you try to do it. The best you could do is put all of the GL constants in one GLenum class to prevent people from using ints (there are a few API calls where it’s easy to confuse the two and get strange results that are a pain to debug).
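A minimal sketch of that single-class idea, assuming a hypothetical GLenum wrapper with a couple of hard-coded values for illustration:

```java
// One wrapper type for all GL constants: a GLenum can never be confused
// with an ordinary int at a call site, which is the whole point.
public final class GLenum {
    public final int value;

    private GLenum(int value) {
        this.value = value;
    }

    // All GL constants exposed as GLenum instances, not bare ints
    // (only two shown here; a binding tool would generate the full set).
    public static final GLenum COLOR_BUFFER_BIT = new GLenum(0x4000);
    public static final GLenum DEPTH_BUFFER_BIT = new GLenum(0x0100);
}
```

Unlike a real enum this gives no per-call-site grouping (any GLenum is accepted anywhere), but it does stop the int/constant mix-ups described above.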

Cas :slight_smile:

Except sometimes you want to do things like GL_LIGHT0 + 4 and so on, and enums would break that code.

Aye, there’s a few places where you do that kind of thing. So boo! for GL enums!

Cas :slight_smile:

Enums can have methods, like LIGHT0.add(4).
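A minimal sketch of such a method, assuming a hypothetical Light enum; 0x4000 is the actual OpenGL value of GL_LIGHT0:

```java
public class Lights {
    static final int GL_LIGHT0 = 0x4000; // actual OpenGL value

    enum Light {
        LIGHT0(GL_LIGHT0);

        final int value;

        Light(int value) {
            this.value = value;
        }

        // LIGHT0.add(4) yields the raw int for the fifth light,
        // replacing the GL_LIGHT0 + 4 arithmetic on bare constants.
        int add(int offset) {
            return value + offset;
        }
    }
}
```

The trade-off is that add() returns a plain int again, so the type safety ends exactly where the arithmetic begins.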

:wink:

Lilian