C++/Java Engine without GC in graphics

So I’m thinking about building a C++ engine, but I really don’t like working in C/C++, so the idea is to have the core engine in C++ and then script the game in Java, very similar to Unity but with the rendering thread in C++. Has anything like this been done before?

Also, do you know if there are many other small open-source game engines like this one out there:

Use a language you are comfortable with. You won’t gain any immediate advantage just because you use C/C++. There will be some hurdles and problems that you will need to solve.

Don’t.


Well, you’ll probably gain an immediate disadvantage! :wink: Yes, well written C/C++ might outperform Java, but you’ve got to get there first - it’s easy to write C/C++ that underperforms Java.

And GC is probably not as much of an issue as you think it is.

As for “script” in Java … :-\

The issue of Java’s GC overhead has been around for a long time; any stuttering in a game is usually blamed on it automatically (even though it might be something else). However, recently there was a really interesting draft JEP for adding an option to the JVM to turn the GC off (i.e. an allocation-only, no-op collector).

This could be a particularly interesting option for games, as it could be used to test whether any lag is actually being caused by the GC, or to run code that doesn’t produce any garbage in the first place.
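
Even without that option, you can already get a rough answer from the standard GC MXBeans; something like this (just a sketch), sampled once per frame, will tell you whether a hitch lined up with a collection:

import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;
import java.util.List;

// Call sample() once per frame; it prints a line whenever one or more GC
// cycles ran since the previous frame, together with the time they took.
public final class GcProbe {
    private final List<GarbageCollectorMXBean> beans =
            ManagementFactory.getGarbageCollectorMXBeans();
    private long lastCount, lastTimeMs;

    public void sample() {
        long count = 0, timeMs = 0;
        for (GarbageCollectorMXBean b : beans) {
            count += b.getCollectionCount();
            timeMs += b.getCollectionTime();
        }
        if (count != lastCount) {
            System.out.println("GC ran " + (count - lastCount)
                    + " time(s), ~" + (timeMs - lastTimeMs) + " ms");
        }
        lastCount = count;
        lastTimeMs = timeMs;
    }
}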

Well, I can halve my frame rate just by turning on G1GC… it does indeed still have noticeable irritating effects when you start really pushing things a bit.

Cas :slight_smile:

Anyone can halve the rate of anything by turning on G1GC :persecutioncomplex: ;D

Seriously, I hope they improve things a lot by the time Java 9 is out the door.

Ok, so I figure I will make the rendering in C++ and only interact with it asynchronously from Java.

That way I can avoid GC pauses but still hot-deploy changes directly into the running engine, saving a lot of development time.
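
Roughly, the hot-deploy side in Java could look like this (untested sketch; Game is the class from my example below, and the paths/class names are placeholders for whatever the engine ends up exposing):

import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Path;

// Untested sketch: reload a freshly compiled Game subclass from a build
// directory without restarting the engine.
public final class HotDeploy {
    public static Game reload(Path classesDir, String className) throws Exception {
        URL[] urls = { classesDir.toUri().toURL() };
        // A new class loader per reload, so the old class can be discarded.
        URLClassLoader loader = new URLClassLoader(urls, Game.class.getClassLoader());
        Class<?> cls = Class.forName(className, true, loader);
        return (Game) cls.getDeclaredConstructor().newInstance();
    }
}

A file watcher on the build directory would call reload() and swap the running game object whenever a recompile finishes.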

And I need the performance of C++ because I want to build a VR MMO.

I already have the COLLADA-to-OpenGL skinned-mesh C++ code almost done (it works with a perfect example animation, but not yet with a real human-made animation, because those have unused bones and such)…

So the usage would look something like:


public class Meadow extends Game {
    Animal fox;
    public void init() {
        fox = new Animal("model/fox.dae", "texture/fox.tga");
        scene.add(fox);
        camera.distance(3f);
        camera.position(fox);
        camera.rotation(mouse.rotation());
    }
    public void tick() {
        if(key.up() && fox.velocity().x == 0) {
            fox.velocity().x = 1;
        }
        if(!key.up() && fox.velocity().x != 0) {
            fox.velocity().x = 0;
        }
    }
}

And the C++ thread would just render automatically.

So from the Java perspective it’s basically a very high-level control layer over the engine.

I’ll see where the physics will happen, probably in Java first, then I’ll move it when/if it becomes too slow.

So at least the system will use 3 cores, since it will have 3 threads interacting asynchronously.

Java -> Physics -> Rendering
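
Something like this is the shape I have in mind for the hand-offs between the threads (just a sketch with placeholder state types):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Sketch of the three-stage pipeline: the game (Java) thread produces input
// snapshots, the physics thread integrates them into world states, and the
// render thread consumes world states. The state types are just placeholders.
final class Pipeline {
    static final class InputState {}
    static final class WorldState {}

    final BlockingQueue<InputState> toPhysics = new ArrayBlockingQueue<>(2);
    final BlockingQueue<WorldState> toRender  = new ArrayBlockingQueue<>(2);

    void start() {
        Thread physics = new Thread(() -> {
            try {
                while (true) {
                    InputState in = toPhysics.take();  // wait for the game thread
                    toRender.put(new WorldState());    // one integrated physics tick from 'in'
                }
            } catch (InterruptedException e) { /* shut down */ }
        }, "physics");
        physics.setDaemon(true);
        physics.start();
        // The render side (C++ in my case) would drain toRender the same way,
        // only through a native queue instead of a BlockingQueue.
    }
}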

Some progress has been made:

Looks good :slight_smile: Has it paid off performance wise?

I wanna see some stress tests ;D

Performance is just one reason.

More important are:

  1. Portability: I realized that if you limit your Java code to the exposed C++ library, an equivalent of Unity’s C#-to-C++ converter (IL2CPP) would be really trivial to make. This way you can hot-deploy code in Java during development (see 2) and then release binaries for all platforms without bloat.

  2. GC and single threading: I’m still convinced that the biggest flaw in Unity is that the same thread that renders also executes the MonoBehaviours! You need an async API between the render thread and the game scripting thread (see the sketch after this list)! I’m unsure how/if it will work, but I’m going to try it in C++ first, and then, if it works well enough, I’ll add a Java server with a JNI interface for hot-deployment of game code. That will allow <1 second turnaround even after the game is released with a lot of resources, at which point Unity becomes unusable.
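
To make point 2 a bit more concrete, here is roughly the shape of the async API I have in mind on the Java side; the scripting thread only records commands, and a single native call hands the whole batch to the C++ render thread (everything here, including the native submitCommands entry point, is made up for illustration):

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Hypothetical sketch of the async scripting->render API. The game thread
// writes small fixed-size commands into a direct ByteBuffer and flips the
// whole batch to the native render thread once per tick.
final class RenderQueue {
    static final byte CMD_MOVE = 1;   // entity id + xyz position

    private final ByteBuffer commands =
            ByteBuffer.allocateDirect(64 * 1024).order(ByteOrder.nativeOrder());

    void move(int entityId, float x, float y, float z) {
        commands.put(CMD_MOVE).putInt(entityId).putFloat(x).putFloat(y).putFloat(z);
    }

    void flush() {
        commands.flip();
        submitCommands(commands, commands.remaining()); // hand the batch to C++
        commands.clear();
    }

    // Implemented in the C++ engine (made-up JNI method, not an existing API);
    // it copies the batch onto its own queue so the Java thread never waits
    // on the renderer.
    private static native void submitCommands(ByteBuffer batch, int length);
}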

So the point is not to avoid GC stutter, but rather to allow for a better programmer (and later content-creator) workflow even after you add a lot of content (code)!

Today we don’t build games and then throw them away; we build games that last and are maintained/improved for decades… eventually we will build games/worlds that never die, because with Moore’s law peaking, the return on new hardware becomes marginal!

Projects are more likely to fail because of technical debt and slow iteration times than because of obsolete technology.

Hmmm. I’m not sure I follow all the arguments here. I’m not an advocate for using the JVM for games (although I’m doing it exclusively myself), but I think you can get rid of all the negative things that were mentioned here.

  • GC: You can make your engine garbage-free. It’s hard (it’s easier in C++ because you can make and reuse your own heap), but it’s doable. The lack of value types is an issue, but it can be worked around with object pooling and/or heavy ByteBuffer usage (see the pooling sketch after this list).
  • Hot reloading: There are a ton of mechanisms to load, compile and run code at runtime. Most certainly not with zero garbage, but I doubt that matters too much for, say, a game editor. For the released game, all the assets can be compiled and packaged at build time.
  • Portability: I don’t think there’s a big difference, but with C++ you need to cross-compile, so it could be a disadvantage.
  • Maybe I’m totally biased, but weaving two platforms together is almost never a lot of fun, so staying either completely C++ or completely Java/JVM seems to mean more fun and less overhead.
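
To illustrate the pooling mentioned in the first bullet, something along these lines keeps per-frame temporaries out of the GC’s hands (minimal sketch; Vec3 is just a stand-in for whatever small math type you use):

import java.util.ArrayDeque;

// Minimal pooling sketch: per-frame temporaries are borrowed from a pool and
// returned at the end of the frame, so steady-state frames allocate nothing.
final class Vec3Pool {
    static final class Vec3 { float x, y, z; }

    private final ArrayDeque<Vec3> free = new ArrayDeque<>();

    Vec3 obtain() {
        Vec3 v = free.poll();
        return v != null ? v : new Vec3();   // only allocates while warming up
    }

    void release(Vec3 v) {
        v.x = v.y = v.z = 0f;                // reset so stale state can't leak
        free.push(v);
    }
}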

EDIT: I may not have the best implementation, but my hobby engine (quite advanced) uses triple buffering for the render state, where the state is mostly backed by either copyable objects or ByteBuffers… the renderer runs in its own loop, completely independent, passes said ByteBuffers to OpenGL (or your favourite API) and never allocates anything. Of course this is only possible if you model your render state in a way that is mostly mutable and reuses memory.
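
The triple buffering is roughly this shape: the simulation always writes into a free buffer and the renderer always grabs the most recently completed one (sketch only; the real thing backs the bulk data with ByteBuffers rather than the float arrays used here):

// Triple-buffered render-state handoff: the simulation thread writes into one
// buffer, the renderer reads the latest completed one, and the third buffer
// means neither side ever waits for the other.
final class TripleBuffer {
    private final float[][] buffers = { new float[1024], new float[1024], new float[1024] };
    private int writeIndex = 0;    // owned by the simulation thread
    private int readIndex  = 1;    // owned by the render thread
    private int pendingIndex = 2;  // most recently completed frame
    private boolean fresh = false;

    float[] beginWrite() { return buffers[writeIndex]; }

    synchronized void endWrite() {          // publish the frame just written
        int t = pendingIndex; pendingIndex = writeIndex; writeIndex = t;
        fresh = true;
    }

    synchronized float[] acquireRead() {    // grab the newest frame, if any
        if (fresh) {
            int t = readIndex; readIndex = pendingIndex; pendingIndex = t;
            fresh = false;
        }
        return buffers[readIndex];
    }
}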

What @h.pernpeintner said.

I see virtually no performance advantage to using C++ in games, because frankly I’m not at the level of being able to push a AAA-grade graphics pipeline full of graphics. At the level we’re operating at, there is no noticeable difference in performance between C++ and Java, but we don’t half get stuff working a lot faster. GC is only an issue when you screw up in some spectacular way (haha, remember once upon a time how Java2D would create tons of contexts with finalizers that would eventually stall a game’s main rendering loop while they got cleared…). Interfacing between C++ and Java is fraught with hassle and irritation, and almost never pays off.

Cas :slight_smile:

Here is a stress test:

That’s 3000 individual (non-instanced, so they can look different, have different animations and be controlled by separate players in a physical world) skinned-mesh animations at 60 FPS. Edit: 48 bones and 2500 triangles each.

Compare that to Unity: https://blogs.unity3d.com/2018/04/16/animation-instancing-instancing-for-skinnedmeshrenderer/ (apparently on iPhone 6 but still)

Now I like working in C/C++…

Yes, but what have you actually tested there?

(video: https://youtu.be/hraOubmmBK0)

There are 16,000 bots in this scene, which is being rendered at 60fps, though obviously only a couple of hundred are actually visibly rendered in the display. The terrain alone is 8 million triangles. The particles all bounce off the terrain, voxel-perfect. It’s all pure Java.

Unity is itself written in C++. The scripting component is generally only a trivial part of a frame’s rendering time.

Cas :slight_smile:

Also, the benchmark figures they gave for Unity there are for it running on an iPhone 6!

Cas :slight_smile:

Ah, that explains part of their bad numbers; still, my engine is probably an order of magnitude better.

That Java thing does not render individual (skinnable, animated and controllable) physics-enabled entities, right? A couple of hundred is not impressive at all. I can’t see the movie because Flash does not work on my phone.

Terrain can be one draw call; the challenge is animated skinned meshes, with bone matrix multiplications that need to be cache-friendly and SIMD-calculated.
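
For context, the per-vertex work I mean is essentially this (a simplified linear-blend-skinning sketch over flat arrays so the hot loop stays cache-friendly; shown in Java just for illustration, and the real code also handles normals):

// Linear blend skinning over flat arrays: every vertex is blended from up to
// 4 bone matrices. Keeping positions, weights and the 4x4 bone palette in
// plain float[]s keeps the loop cache-friendly and easy to vectorise.
final class Skinner {
    // positions: x,y,z per vertex; boneIds/weights: 4 per vertex;
    // palette: 16 floats (column-major 4x4) per bone, already in world space.
    static void skin(float[] positions, int[] boneIds, float[] weights,
                     float[] palette, float[] out) {
        int vertexCount = positions.length / 3;
        for (int v = 0; v < vertexCount; v++) {
            float px = positions[v * 3], py = positions[v * 3 + 1], pz = positions[v * 3 + 2];
            float ox = 0, oy = 0, oz = 0;
            for (int i = 0; i < 4; i++) {
                float w = weights[v * 4 + i];
                if (w == 0f) continue;                 // unused bone slot
                int m = boneIds[v * 4 + i] * 16;
                ox += w * (palette[m]     * px + palette[m + 4] * py + palette[m + 8]  * pz + palette[m + 12]);
                oy += w * (palette[m + 1] * px + palette[m + 5] * py + palette[m + 9]  * pz + palette[m + 13]);
                oz += w * (palette[m + 2] * px + palette[m + 6] * py + palette[m + 10] * pz + palette[m + 14]);
            }
            out[v * 3] = ox; out[v * 3 + 1] = oy; out[v * 3 + 2] = oz;
        }
    }
}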

I’m going to use (Joint Parallel) Java on the server and C+ (C syntax compiled with gcc) on the client.

The character has 48 bones and 2500 triangles; I have inlined the bone rotations and I’m completely blown away by the performance.

I think there is a part of the bigger picture that gets lost in the bragging race. When your client uses 200-300 watts, mass adoption kills the planet: PUBG with one million concurrent players uses roughly 1/100 of the output of the Three Gorges Dam!

My engine can run an MMO on a Raspberry Pi Zero at 1 watt! That’s 1 MW instead of 200-300 MW, more than two orders of magnitude less.

People will learn to respect orders of magnitude now that they are turning against us instead of working for us.

So now I’m not going to add a scripting layer, keeping it simple is first priority.

https://youtu.be/hraOubmmBK0 to watch it on YouTube.

With all 16,000 robots on the screen simultaneously walking about, it drops to 30fps… meh.

Cas :slight_smile:

(only a couple of hundred visible) - are you animating the bots that are not visible?

In my animated .gif, many of the 3000 are beyond the draw distance, but I’m still calculating the bone rotations on the CPU.

Yes, the invisible ones are also animated. The point, though, is that the GPU is the limiting factor here, not Java: my GPU is jammed at 100% with just a couple of hundred droids, leaving the CPU with plenty of spare cycles. C++ is rarely worth the effort for the amateur game developer! Just about any slightly higher-level language will hugely increase productivity, and rarely causes a noticeable decrease in the subjective quality of the finished article.

Cas :slight_smile: