Guardian [FINISHED]

Mojang's stats were discussed here, but the page itself is down now:
http://www.java-gaming.org/index.php?topic=26377.0

[quote]About 91% of Minecraft users have computers that support OpenGL 2.0+, meaning we can write games fully with the programmable pipeline (GLSL shaders) and start to safely forget about supporting or having a fallback for the old fixed function pipeline.

  • 51% of the Minecraft user base have computers with graphics cards capable of OpenGL 3.0+.
  • 38.8% of the Minecraft user base have computers with graphics cards capable of OpenGL 3.2+.
  • 34.2% of the Minecraft user base have computers with graphics cards capable of OpenGL 3.3+.
  • 19.6% of the Minecraft user base have computers with graphics cards capable of OpenGL 4.0+.
  • 8% of the Minecraft user base have computers with graphics cards capable of running the latest OpenGL version 4.2.
  • Intel cards are crap (yes everybody already knew that) and account for the majority of the 9% that don’t support OpenGL 2.0+.
  • Java 5 use has pretty much died with very few users still on that version of Java.
  • OS X 10.4 use is pretty much dead however OS X 10.5 still has significant market share.
[/quote]
And of course, OpenGL ES is pretty much just GL 2 with a few more features. This is why many of the “modern” GL 3+ tutorials are not great if you are looking to develop games for today’s casual market, and why I started my lwjgl-basics API and tutorial series: learning the programmable pipeline in a GL 2.0-compatible context.

Okay, OpenGL 2.1 it is then.

I’m having a break after Ludum Dare so will start again in a couple of days.

Okay, lots of the code in this project was for things I had never done before. Ever.
In my Ludum Dare entry, I rewrote the whole engine, and actually wrote BETTER code.
So, I am going to clean up the Ludum Dare entry, remove anything that was not part of the engine, and rebuild the game on top of that.
This will take a few days if things go smoothly.

Also: Switched rendering to VBOs only. Apparently, VBOs DO work well with constantly changing data. Hopefully that will fix the previously reported crashes when I release the next prototype.

Now I will go and seriously think about what this game is about…

:slight_smile:

Guardian is now Finished (more or less)

Download is in the opening post.

There is not much. Basically you just defend the tower for as long as possible, collecting coins to help.
If you have 50 coins, you can heal fully by pressing H. If you have 150 coins, you can summon another Guardian by pressing the character select buttons while you are alive. (1 = Warrior, 2 = Archer, 3 = Assassin)
If the goblins get enough coins, they can summon orcs and trolls.
Remember that you can steal coins from the goblins. :wink:

I will tweak a few things if people want, but no new content is planned, and if any does come, it may be a while.

Don’t worry. The art will continue!

At the start of this project, people loved the pixel art. I will not let them down.
I am not stopping work on the spritesheet (or the engine). I will continue making small similar games until the engine is polished enough to do a big project and the spritesheet is complete.

If anyone requests, I will make the spritesheet available for free and the engine open source, so that you can make the same kind of games (or just peek at the source) and use the art for your own projects. All I ask is some credit, because I don’t want the (rare, but still possible) situation where your game gets super-popular and I get fanboys screaming that I stole your art ;D
Have fun people, and my next project will be shown soon.

CAN A MOD/ADMIN PLEASE MOVE THIS TO SHOWCASE

If you want a more complete-feeling game with similar gameplay (made with what is now the new engine) with similar art, go here to play my Ludum Dare entry.

Oh, and can the people that had trouble running this before please try again, as I switched rendering over to VBOs. I hope it works now.

Same thing. GTX 580

#
# A fatal error has been detected by the Java Runtime Environment:
#
#  EXCEPTION_ACCESS_VIOLATION (0xc0000005) at pc=0x5c13b133, pid=4132, tid=3616
#
# JRE version: 7.0_09-b05
# Java VM: Java HotSpot(TM) Client VM (23.5-b02 mixed mode, sharing windows-x86 )
# Problematic frame:
# C  [nvoglv32.DLL+0x77b133]
#
# Failed to write core dump. Minidumps are not enabled by default on client versions of Windows
#
# If you would like to submit a bug report, please visit:
#   http://bugreport.sun.com/bugreport/crash.jsp
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#

Maybe time to show us some of your rendering code… :point:

EDIT: Testing it here on my Mac Air, seems very laggy. Maybe something to do with your VBO usage.

Entity rendering code:

An entity will always have changed between frames, so there is no point in keeping the buffer.


		// A fresh VBO is generated for this entity, every frame.
		int buffer = glGenBuffers();
		
		// Room for 48 floats (192 bytes): 6 vertices * (2 pos + 2 tex + 4 colour).
		FloatBuffer buf = BufferStorage.getBuffer(192);
		
		// Anchor the quad so the sprite sits at the entity's feet.
		Vector2Float pos = this.pos.clone().subtract(new Vector2Float(0F, size.y-0.5F));
		
		float[][] vertices = new float[][]{{pos.x-0.5F, pos.y+0.5F}, {pos.x+0.5F, pos.y+0.5F}, {pos.x-0.5F, pos.y-0.5F}, {pos.x+0.5F, pos.y-0.5F}};
		float[][] texCoords = QuadTransform.translate(animator.getTexCoords(), tcOffset);
		float[] color = {1, 1, 1, 1};
		
		// Two triangles per quad; positions first...
		buf.put(vertices[0]);
		buf.put(vertices[1]);
		buf.put(vertices[2]);
		buf.put(vertices[3]);
		buf.put(vertices[2]);
		buf.put(vertices[1]);
		
		// ...then texture coordinates...
		buf.put(texCoords[0]);
		buf.put(texCoords[1]);
		buf.put(texCoords[2]);
		buf.put(texCoords[3]);
		buf.put(texCoords[2]);
		buf.put(texCoords[1]);
		
		// ...then one colour per vertex (separate blocks, not interleaved).
		buf.put(color);
		buf.put(color);
		buf.put(color);
		buf.put(color);
		buf.put(color);
		buf.put(color);
		
		buf.flip();
		
		// Upload (STATIC_DRAW, even though this data changes every frame).
		glBindBuffer(GL_ARRAY_BUFFER, buffer);
		glBufferData(GL_ARRAY_BUFFER, buf, GL_STATIC_DRAW);
		
		// Byte offsets into the blocks: positions at 0, texcoords at 48, colours at 96.
		glVertexPointer(2, GL_FLOAT, 8, 0);
		glTexCoordPointer(2, GL_FLOAT, 8, 48);
		glColorPointer(4, GL_FLOAT, 16, 96);
		
		glDrawArrays(GL_TRIANGLES, 0, 6);
		glBindBuffer(GL_ARRAY_BUFFER, 0);
		
		// The FloatBuffer goes back to the pool; the VBO is deleted outright.
		BufferStorage.addBuffer(buf);
		glDeleteBuffers(buffer);

If anyone knows a better way of rendering constantly changing data, let me know, as I think this approach is extremely inefficient.

[icode]BufferStorage[/icode] stores an arraylist of unused buffers. [icode]getBuffer(int)[/icode] returns a buffer of the size requested, whether it finds one in the list, or creates one. [icode]addBuffer(FloatBuffer)[/icode] adds an unused buffer to the arraylist.
[icode]QuadTransform[/icode] just does things like translating, rotating etc. to a float[][].
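
Roughly, the pool works like this (a simplified sketch, not the exact source):

	import java.nio.FloatBuffer;
	import java.util.ArrayList;
	import org.lwjgl.BufferUtils;
	
	public final class BufferStorage {
		// Pool of buffers that are not currently in use.
		private static final ArrayList<FloatBuffer> free = new ArrayList<FloatBuffer>();
		
		// Returns a cleared buffer with at least the requested capacity,
		// reusing a pooled one if a big enough one exists.
		public static FloatBuffer getBuffer(int capacity) {
			for (int i = 0; i < free.size(); i++) {
				if (free.get(i).capacity() >= capacity) {
					FloatBuffer buf = free.remove(i);
					buf.clear();
					return buf;
				}
			}
			return BufferUtils.createFloatBuffer(capacity);
		}
		
		// Hands a buffer back to the pool once the caller is done with it.
		public static void addBuffer(FloatBuffer buf) {
			free.add(buf);
		}
	}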

World rendering code is generally the same, but it saves the buffer as the data is mostly the same throughout the game.

Have you got the full dump?

I hope it’s not too long

EDIT: Since you already looked at it, removed for better readability

Same as the other one I got.

There is nothing special (as far as I know) about Sky.render(), it just happens that the sky is the first to be rendered.

EDIT: I start to get lag after a few minutes as well. Really need to optimise that, somehow.

Okay, I uploaded a new version which renders all entities in one pass.

Hopefully that fixes lag issues, not sure what to do about crashes though…

Yup, you need to pass all entity data in one go to the GPU – aka create a “sprite batcher”.

Other optimizations: use GL_STREAM_DRAW, don’t create new float arrays every frame, and make sure your transformation code is not slowing anything down. Also make sure to glEnableClientState and glDisableClientState for position, texcoord and colour, since you are using the fixed-function pipeline. And I’m not sure why you are deleting the VBO each frame… ???
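
For example, something like this (a rough sketch, not a drop-in fix: Entity.putVertices and batchVbo are made-up names, and it assumes one interleaved VBO created at startup, with each quad written as 6 vertices of 2 position + 2 texcoord + 4 colour floats):

	// Collect every entity's quad into one FloatBuffer...
	FloatBuffer batch = BufferStorage.getBuffer(entities.size() * 48);
	for (Entity e : entities)
		e.putVertices(batch); // hypothetical: writes 48 interleaved floats
	batch.flip();
	
	// ...upload it once, and draw the whole lot with a single call.
	glBindBuffer(GL_ARRAY_BUFFER, batchVbo);              // created once, not per frame
	glBufferData(GL_ARRAY_BUFFER, batch, GL_STREAM_DRAW); // re-specified every frame
	
	glEnableClientState(GL_VERTEX_ARRAY);
	glEnableClientState(GL_TEXTURE_COORD_ARRAY);
	glEnableClientState(GL_COLOR_ARRAY);
	
	int stride = 8 * 4; // 8 floats per vertex, in bytes
	glVertexPointer(2, GL_FLOAT, stride, 0);
	glTexCoordPointer(2, GL_FLOAT, stride, 2 * 4);
	glColorPointer(4, GL_FLOAT, stride, 4 * 4);
	
	glDrawArrays(GL_TRIANGLES, 0, entities.size() * 6);
	
	glDisableClientState(GL_COLOR_ARRAY);
	glDisableClientState(GL_TEXTURE_COORD_ARRAY);
	glDisableClientState(GL_VERTEX_ARRAY);
	glBindBuffer(GL_ARRAY_BUFFER, 0);
	
	BufferStorage.addBuffer(batch);

The win is one buffer upload and one draw call per frame instead of one per entity.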

I only delete the VBOs for entities, as they are constantly changing.
Chunk VBOs stay until a tile changes.

Does glEnableClientState (or disable) affect anything?

Unless I’m mistaken, glDeleteBuffers is like glDeleteTextures. It tells OpenGL to delete the object when it’s no longer used. So in this case it’s a useless call, since you are still using the buffer every frame. Still, it might screw things up depending on how the driver implements it.

Just create one buffer (glGenBuffers) at the start of your game, give it data every frame, and then once you are done (i.e. when your game is closing), delete the buffer with glDeleteBuffers.
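
So the lifecycle looks roughly like this (placeholder names, not your actual code):

	int vbo;
	
	void init() {
		vbo = glGenBuffers();       // once, at startup
	}
	
	void renderFrame(FloatBuffer data) {
		glBindBuffer(GL_ARRAY_BUFFER, vbo);
		glBufferData(GL_ARRAY_BUFFER, data, GL_STREAM_DRAW); // replace the data each frame
		// ... set pointers and draw ...
		glBindBuffer(GL_ARRAY_BUFFER, 0);
	}
	
	void cleanup() {
		glDeleteBuffers(vbo);       // once, when the game closes
	}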

Oh, didn’t think of that.

Fixed. Now it just resets the data every frame, like I wanted.

Okay, new release available.

People who had trouble before may want to try again, as I have been fixing things.

Also some performance optimisations.

Congrats, it works now :smiley:

A game is never truly finished… you have alpha, beta, then live. Unless it’s put on some kind of media and shipped to a customer who has no ability to update, the game is never really done.

Still, nice work :smiley:

:slight_smile:

-Pickle

That’s wrongthink.