libGDX converting from LWJGL

Hi,

I have a world generator that works perfectly in LWJGL, and am now trying to port it to libGDX for many reasons,

here is the LWJGL code: http://pastebin.java-gaming.org/d5037981487

and here is my attempt at porting it to libGDX. I know I am not rendering it properly: it gets through everything, but when it reaches the render call it throws a NullPointerException.
http://pastebin.java-gaming.org/5037804278a

Any help is appreciated,
Thanks,

  • Dan

It’s pretty obvious… a NullPointerException means you’re trying to access an object that is null. The line number at which it is thrown would be nice.

Also, unless you’re planning on making the game for more than desktop, I would suggest using only LWJGL. From my experience at least, Libgdx is a mess to work with…

Don’t go around spreading false information; your ability to use it and its functionality are two different things.

There is nothing wrong with libgdx; this is the usual PEBKAC error.

Why do you think it’s a mess to work with?

He could be used to using something else. I tried LWJGL and felt like smashing the screen.

I just want to know why you think it’s a mess :wink:

I totally disagree.
Libgdx is very easy to use.

“public ShapeRenderer shapeRenderer;”

You didn’t instantiate it.

Did you?
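For reference, a minimal sketch of the usual fix (the class and field names here are hypothetical, not taken from the pasted code): the renderer has to be constructed in create(), once the GL context exists, not merely declared as a field.

```java
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.graphics.glutils.ShapeRenderer;

public class WorldGenGame extends ApplicationAdapter {
    public ShapeRenderer shapeRenderer; // declared, but still null until create()

    @Override
    public void create() {
        // Instantiate once the GL context is up; calling methods on it
        // while it is still null is exactly what throws in render().
        shapeRenderer = new ShapeRenderer();
    }

    @Override
    public void dispose() {
        shapeRenderer.dispose(); // ShapeRenderer holds native resources
    }
}
```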

I can make a little list of stuff that I really hated…

  • Multiple projects. You always have to wait 5 seconds after editing an image, or press refresh on the Android project.
  • Cannot access the original LWJGL GL classes. They put stuff into GLCommon, GL11, and GL20, but they didn’t put everything in.
  • Cannot use GL11 and GL20 at the same time.
  • You only get one method for updating your game: only render, no tick/update method. Cannot render at maximum speed; always capped at 60 fps.
  • I couldn’t find a way to measure performance.
  • Shaders are stupid or what? They give errors where LWJGL doesn’t. I mean, when I load a shader into Libgdx, it “works” differently than in LWJGL: uniforms that do not affect the output of the shader are not compiled into the shader. This is probably to boost performance, but it is really annoying. They could have just given warnings that some uniforms are unused.
  • The Sprite class doesn’t have anti-bleeding built in.

These are the most annoying things that are just keeping me from using Libgdx…
I’m not saying Libgdx is shit… I just don’t like it… It makes developing feel more like a drill to me. There is always that stupid something that doesn’t need to be there…
Of course, Libgdx has many more upsides than downsides… I just happen not to like it. It’s like choosing an operating system: you don’t choose it because it’s better, you choose it because you like it more.

Multiple projects is a feature. Having it all embedded into one project is way too messy. And pressing refresh is an Eclipse problem, not a libgdx one.

You choose a tool based on what it provides for you, i.e, based on your needs. Learning the peculiarities of a library/tool that provides the things you need is what most of programming is about - especially game development.

If you don’t like it, do it yourself. Perhaps in 10 years you will be able to reinvent the wheel.

Additionally, comparing libGDX to LWJGL is like:

No offence, but

  1. Multiple projects -> Eclipse’s problem; if you don’t use internal storage you don’t have to wait.
  2. I haven’t missed any GL stuff yet (for example, glPushMatrix isn’t available, but libgdx has its own matrices, and they are more powerful).
  3. “Cannot use GL11 and GL20 at the same time.” Why would you do that?
  4. Just wrong; you can have your own render loop.
  5. Count your fps.
  6. I don’t see the problem.
  7. That’s a problem of the texture you created and the texture filters you use.

You don’t have to use it, but don’t say that it’s messy if you haven’t really used it :slight_smile:

I might be in the wrong, but are you suggesting I make a multi-threaded game because Libgdx couldn’t provide two separate methods?
Did you try counting fps? You will never count higher than 60.
Well, which texture filters am I supposed to use to get rid of bleeding? I would be really willing to learn those…

As far as programming goes, use LWJGL.
As far as game development goes, use LibGDX.

To each their own.

cfg.vSyncEnabled = false;

Instant 1600 FPS, not that you need any more than the default 60, and excessive frame rate only burns up your graphics card.

Google libgdx vsync, and read the docs before making assumptions.
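For the record, a hedged sketch of what that looks like in a desktop (LWJGL backend) launcher of that era; MyGame is a placeholder, and foregroundFPS is the companion setting that can also cap the loop:

```java
import com.badlogic.gdx.backends.lwjgl.LwjglApplication;
import com.badlogic.gdx.backends.lwjgl.LwjglApplicationConfiguration;

public class DesktopLauncher {
    public static void main(String[] args) {
        LwjglApplicationConfiguration cfg = new LwjglApplicationConfiguration();
        cfg.vSyncEnabled = false; // stop syncing to the monitor refresh rate
        cfg.foregroundFPS = 0;    // 0 = uncapped foreground loop
        new LwjglApplication(new MyGame(), cfg);
    }
}
```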

Thanks I will try that.

EDIT------------
60fps with screen tearing :)))))

Why “multi-threaded game”? I think GL20 and GL11 are separate because of cross-platform performance.
If you really want to, you can add the LWJGL jars and access the interfaces directly.
As you can have your own render-game-whatever loop, you can get as many fps as you want.
Vsync is enabled by default because disabling it costs a lot of energy, and mobile batteries won’t last long.
If you have texture bleeding, you can reduce it by using the nearest texture filter instead of linear (if your texture isn’t scaled too strongly).
Of course it doesn’t solve the problem completely, but in most of my cases it helped, and if you add padding to your textures it works fine. In addition, the Sprite class is just made for drawing a TextureRegion; you have to take care of the accuracy yourself.
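As a hedged illustration of the filter suggestion (the file name is a placeholder):

```java
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.Texture;

// Nearest-neighbour sampling never blends in texels from a neighbouring
// atlas region, which is where the "bleeding" comes from with Linear.
Texture atlas = new Texture(Gdx.files.internal("atlas.png"));
atlas.setFilter(Texture.TextureFilter.Nearest, Texture.TextureFilter.Nearest);
```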

  • Multiple projects. You can have a single project if you target a single backend.

  • Always have to wait 5 seconds after editing image or press refresh on android project. Load your assets from the filesystem, not the classpath, to bypass this Eclipse refresh issue on the desktop. This is not a libgdx problem.

  • Cannot access original LWJGL GL classes. You can access the LWJGL GL classes directly. You aren’t trying.

  • Cannot use GL11 and GL20 at the same time. :slight_smile:

  • You only get 1 method for updating your game: only render, no tick/update method. Create your own method and call it.

  • Cannot render at maximum speed. Always capped at 60 fps. On what backend? See vsync.

  • I couldn’t find a way to measure performance. Gdx.graphics has an FPS counter. Otherwise you are complaining about something you failed to do.

  • Shaders are stupid or what? No idea.

  • Sprite class doesn’t have anti bleeding built in. Sprite doesn’t manage texture regions. See libgdx’s texture packer.

  • These are the most annoying things that are just keeping me from using Libgdx. I get the impression your issues aren’t with libgdx.

  • It makes developing more like a drill to me. There is always that stupid something that doesn’t need to be there. I’m pretty sure there are few things that are in libgdx for no reason. It’s OSS, contributions are welcome.

  • Of course, Libgdx has much more upsides than it has down ones… I just happen not to like it. Fair enough. :slight_smile: Good thing you didn’t pay anything for it. :smiley:

It’s not fair to compare LWJGL to LibGDX, but it’s also not fair to bash LWJGL. Of course it’s difficult to use; get over it or don’t use it. I’m quite sick of people bashing libraries and saying this one sucks, etc. Use what you want and get over it. Thank you.

@Nate I don’t want to look like a dude who is a noob and tries to look pro, but I tried turning off vsync and I still get 60 fps.

Of course I can make my own method for updating, but you didn’t get my point. I want to get 1000 fps and 60 updates per second, and I didn’t manage to do that. From my experience, it would only be possible by using multiple threads to run a single game.
That’s what I meant by multiple methods… the render method gets called 1000 times a second and update would get called 60 times…
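For what it’s worth, decoupling those two rates doesn’t need threads; the common single-threaded pattern is a fixed-timestep accumulator driven from the render callback. A minimal plain-Java sketch (the step and clamp values are illustrative):

```java
public class FixedStepLoop {
    public static final double STEP = 1.0 / 60.0; // 60 logic updates per second
    private double accumulator = 0;
    public int updates = 0; // counters, just for illustration
    public int frames = 0;

    // Called once per rendered frame with that frame's delta time,
    // e.g. Gdx.graphics.getDeltaTime() in libGDX's render().
    public void frame(double delta) {
        accumulator += Math.min(delta, 0.25); // clamp to avoid a spiral of death
        while (accumulator >= STEP) {
            update();               // fires ~60 times per second regardless of fps
            accumulator -= STEP;
        }
        render();                   // fires as often as the backend calls frame()
    }

    private void update() { updates++; } // fixed-rate game logic goes here
    private void render() { frames++; }  // drawing goes here
}
```

At 1000 rendered frames per second, frame() runs 1000 times with delta ≈ 0.001, but the while loop still only fires about 60 times per second, which is the 1000 fps / 60 updates split described above.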

As mentioned in the other thread, this is done by the shader compiler, which is part of the graphics driver, not of the library you are using (http://ogltotd.blogspot.de/2007/12/active-shader-uniforms-and-vertex.html). The fact that this might work under LWJGL is because you don’t check for your assignment going wrong, which libGDX seems to do for you. You can ignore this as long as you like, but why don’t you simply fix your stuff by not assigning values to uniforms you don’t use, instead of blaming the library?