Mac vs. PC

Okay, I have a simple question that’s been bugging me for a while. I’m new to Java game development and I made a simple game on a PC. All it does is let you control a little ship and move around shooting the alien ships.

When I compiled on PC it worked just as expected, but when I ran the jar file on my Mac, everything in the game was happening a lot faster!

One day when I make a full-blown real game, I would like it to run the same on both Mac & PC. So my question is, is there any way to have it run the same? Maybe detect what OS it’s running on and change the code accordingly? Any help would be appreciated. Thanks :slight_smile:

  • Mac

https://dl.dropbox.com/u/4826855/img.png

EDIT:
So what I’ve learned from y’all:

  1. [quote]The Mac is giving better performance than your Windows machine.
    [/quote]

  2. I could either use Delta or Fixed Rate.

  3. princec(Cas) makes [quote]hundreds of thousands of dollars! xD
    [/quote]

  4. [quote]Delta - complex (just try doing collision detection properly, go on), mercy of floating point, jerky movement. Absolutely horrible for 2D, in any form.
    Fixed rate - easy, can use integers all over the place, glassy smooth movement, perfect for 2D.
    [/quote]

  5. [quote]The basic idea is to link your game to real time, not computer execution speed.
    [/quote]
    Thanks very much for all your help :slight_smile:
    P.S. Any Code examples on how to actually do this would come in handy xD

  • Mac

The Mac is giving better performance than your Windows machine. You can fix this by using a delta, which you pass to your update methods; the game will then update at pretty much the same speed no matter how fast the computer is.

edit:
Delta is a variable that changes every frame: it’s the time that passed since the last frame, so it gets bigger the slower the computer is running. This means when you’re doing your movement speed you will need to use a double, something like 0.25 or so as the movement speed, and when you’re adding to x it will be like:

speed = 0.25

x += speed * delta
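A minimal, self-contained sketch of that idea (plain Java; SPEED and the frame times are made-up numbers, and a real game would measure delta with its frame timer):

```java
public class DeltaMovement {
    static final double SPEED = 0.25; // units per millisecond (made-up value)
    static double x = 0.0;

    // Scale the movement by the time that actually passed since last frame.
    static void update(double deltaMs) {
        x += SPEED * deltaMs;
    }

    public static void main(String[] args) {
        // Two 16 ms frames move the ship exactly as far as one 32 ms frame,
        // so a fast machine and a slow machine end up in the same place.
        update(16);
        update(16);
        double onFastMachine = x;

        x = 0.0;
        update(32);
        System.out.println(onFastMachine == x);
    }
}
```

That equal-distance property is the whole point of delta timing; the rest of the thread covers its downsides.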

Why does everyone insist on using delta for logic :emo: It’s far far harder to get your head around than fixed-rate logic.

Cas :slight_smile:

Another disadvantage of delta for logic is that it results in non-deterministic code.

With a fixed rate, you’ll be able to consistently reproduce state (or well: bugs) with every run of your code.

For fixed time steps you have to compensate by interpolating whatever “overhang” you have (e.g. 14 ms passed, your timestep is 10 ms; what do you do with the other 4 ms?). This can get rather involved. Delta time doesn’t need that.
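For reference, that “overhang” bookkeeping can be sketched as the usual accumulator-plus-interpolation loop (illustrative plain Java; the 10 ms step and the per-step speed are made-up numbers):

```java
public class FixedStepLoop {
    static final long STEP_MS = 10;         // fixed logic step length
    static final double SPEED_PER_STEP = 3; // units moved per logic step

    static long accumulator = 0;
    static double prevX = 0, currX = 0;     // the last two logic states

    static void logicStep() {
        prevX = currX;
        currX += SPEED_PER_STEP;
    }

    // Feed in real frame time; run whole fixed steps, then interpolate
    // the leftover fraction of a step for rendering.
    static double advance(long frameMs) {
        accumulator += frameMs;
        while (accumulator >= STEP_MS) {
            logicStep();
            accumulator -= STEP_MS;
        }
        double alpha = accumulator / (double) STEP_MS;
        return prevX + (currX - prevX) * alpha;
    }

    public static void main(String[] args) {
        // 14 ms frame: one 10 ms step runs, the other 4 ms becomes an
        // interpolation factor of 0.4, so we render at roughly 1.2 units.
        System.out.println(Math.abs(advance(14) - 1.2) < 1e-9);
    }
}
```

The logic stays deterministic because logicStep() only ever advances by whole fixed steps; the fractional alpha is used purely for drawing.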

I prefer fixed time steps plus interpolation myself exactly for the fact that it makes games deterministic as Cas and Riven said.

I usually smooth out my deltas too by just calculating the average delta. This gives constant deltas and thus deterministic results.
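For what that smoothing might look like, here is a rough moving-average sketch (plain Java; the window size is arbitrary, and averaging raw deltas still won’t be identical across machines):

```java
public class DeltaSmoother {
    static final int WINDOW = 30;                // arbitrary window size
    static final double[] samples = new double[WINDOW];
    static int count = 0;

    // Record the raw delta and return the average over the last WINDOW frames.
    static double smooth(double rawDeltaMs) {
        samples[count++ % WINDOW] = rawDeltaMs;
        int n = Math.min(count, WINDOW);
        double sum = 0;
        for (int i = 0; i < n; i++) sum += samples[i];
        return sum / n;
    }

    public static void main(String[] args) {
        for (int i = 0; i < 29; i++) smooth(16); // steady 16 ms frames
        double avg = smooth(30);                 // one 30 ms spike
        // The spike barely moves the average, so movement stays smooth.
        System.out.println(avg > 16 && avg < 17);
    }
}
```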

Huh? The deltas are still going to differ between machines and runs, so you won’t get deterministic results.

Unless you meant something else by averaging the delta?

I loves me a good delta vs fixed rate discussion! Here is my wisdom:

Delta - complex (just try doing collision detection properly, go on), mercy of floating point, jerky movement. Absolutely horrible for 2D, in any form.
Fixed rate - easy, can use integers all over the place, glassy smooth movement, perfect for 2D.

I make 2D games in Java, that make hundreds of thousands of dollars! But don’t listen to me, because I don’t want the competition.

Cas :slight_smile:

That’s why you didn’t put up code as an example :point:

A more careful look at my code will reveal numerous bugs and gotchas that will have you wasting loads of time trying to figure out what it’s doing only to discover it’s just broken :slight_smile: My code is a deadly trap. I leave it here as a warning to others.

Cas :slight_smile:

Now I’m scared :persecutioncomplex:

EDIT: Are you saying that instead of using delta-time I should just pin the frame-rate to a constant 60FPS and not use delta-time at all?

Finally some people are talking sense.
Seriously, most guys will PLEAD that you use delta timing… and it’s horrible.
I sat there for DAYS trying to get it to behave smoothly; it just won’t happen. No matter how smooth you think it is, any spike in the delta timing will jitter everything in a fast action game.
So on desktop I don’t use delta anymore.

On mobile you kinda have to watch out, because some devices will vsync at 20-30 or 50 or 60 FPS/hertz.

As for mobile, I never sync to the display :persecutioncomplex: (or has libGDX done that for us under the hood?! ::))

Yes. Or whatever constant rate you prefer. 60 is a good number for desktops, as 99% of all monitors out there nowadays are also locked to 60Hz.

In a 2D game, the best and simplest advice I can give about varying frame rates is… just make sure that your game actually nearly always achieves a reliable 60FPS on the hardware it’s targeted to run on. If it can’t … just let it run slower and tell people to upgrade, or write faster code / do less fancy effects.

Cas :slight_smile:

Now after looking for a way to cap the FPS in libGDX, all I find is this (libGDX wiki):

//Good advice right at start: You should never code anything dependent on the framerate! Instead it is a good idea to make it depend on the time e.g. one of these:

 System.currentTimeMillis();
 System.nanoTime();
 Gdx.graphics.getDeltaTime();

I guess there’s no “setTargetFramerate();” … I already miss Slick2D’s simplicity :frowning:
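Actually, libGDX’s LWJGL desktop backend does expose something close to a target frame rate through its launcher configuration. A sketch, assuming the LWJGL backend and a placeholder MyGame listener class (field names may vary between versions, so treat this as a config fragment to check against your libGDX version):

```java
import com.badlogic.gdx.backends.lwjgl.LwjglApplication;
import com.badlogic.gdx.backends.lwjgl.LwjglApplicationConfiguration;

public class DesktopLauncher {
    public static void main(String[] args) {
        LwjglApplicationConfiguration cfg = new LwjglApplicationConfiguration();
        cfg.title = "MyGame";
        cfg.foregroundFPS = 60;   // cap the render loop at 60 FPS while focused
        cfg.vSyncEnabled = true;  // and/or let vsync do the capping
        new LwjglApplication(new MyGame(), cfg); // MyGame: your ApplicationListener
    }
}
```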

You can make it dependent on the frame, not the frame rate.

I in fact have the graphics loop decoupled from the logic loop, but it’s fixed-rate logic, so a frame drop does not put you one frame behind. Since the logic loop requires very little CPU compared to the graphics loop, keeping it ticking over at a fixed and constant rate is pretty easy.

Also do have a look at LWJGL’s Display.sync(int fps) method (needs to be called every frame). Not sure if LibGDX exposes this through its API.
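For the curious, Display.sync(fps) is essentially sleep bookkeeping against the previous frame’s timestamp. A rough plain-Java equivalent (an illustrative sketch only; the real LWJGL implementation is considerably more precise about timer resolution and drift):

```java
public class FrameSync {
    static long lastFrame = System.nanoTime();

    // Sleep until roughly 1/fps seconds have passed since the previous call.
    static void sync(int fps) throws InterruptedException {
        long frameNanos = 1000000000L / fps;
        long remaining = lastFrame + frameNanos - System.nanoTime();
        if (remaining > 0) {
            Thread.sleep(remaining / 1000000, (int) (remaining % 1000000));
        }
        lastFrame = System.nanoTime();
    }

    public static void main(String[] args) throws InterruptedException {
        long start = System.nanoTime();
        for (int i = 0; i < 10; i++) {
            sync(60); // ten "frames" capped at 60 FPS
        }
        long elapsedMs = (System.nanoTime() - start) / 1000000;
        // Ten frames at ~16.7 ms each should take at least ~160 ms.
        System.out.println(elapsedMs >= 140);
    }
}
```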

Yes, fixed logic/simulation rates are a really, really wise route. Rendering rates… do whatever you want.

I don’t think this has been mentioned but with fixed rates you can do the easiest motion models and nothing gets hinky. Variable rate requires very complex motion models and they’ll always be hinky.

To answer what you want to do:

The basic idea is to link your game to real time, not computer execution speed.

For more methods on how to achieve this, read the thread again and pick up the frequently occurring keywords :slight_smile:

Added this to the render method:


long lastLoopTime = TimeUtils.nanoTime();
	final int TARGET_FPS = 60;
	final long OPTIMAL_TIME = 1000000000 / TARGET_FPS;

	@Override
	public void render() {
		System.out.println(Gdx.graphics.getFramesPerSecond());

		// ... update and draw the frame here ...

		// Sleep away whatever is left of this frame's time budget,
		// measured from when the previous frame finished.
		long sleepTime = (lastLoopTime - TimeUtils.nanoTime() + OPTIMAL_TIME) / 1000000;
		try {
			if (sleepTime > 0) {
				Thread.sleep(sleepTime);
			}
		} catch (InterruptedException e) {
			e.printStackTrace();
		}
		lastLoopTime = TimeUtils.nanoTime();
	}

and it runs at 60/61 FPS. Scrapped the delta time and let the entities move a straight 3 pixels (int) per frame. I have to say the outcome is pretty similar to before; it runs very smooth but I still seem to get a tiny jitter every second :frowning: