To drive the display of a GLCanvas I used a Timer running at roughly 60 frames per second, which works out to about a 16 millisecond delay (1000 ms / 60 ≈ 16.7 ms).
I created a custom class that extends Timer:
public class GameTimer extends Timer{...
and a class that extends TimerTask:
private class GameTimerTask extends TimerTask{...
which has a run method:
public void run(){...
which calls another method that currently does nothing except call GLCanvas.display(), which in turn only draws a single square on the screen.
then:
scheduleAtFixedRate(new GameTimerTask(), 0, 16);
which actually starts the timer.
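For reference, here is a minimal sketch of how those pieces fit together (the constructor, the canvas field, the update() method name and the JOGL import are my own additions; the package path depends on your JOGL version):

import java.util.Timer;
import java.util.TimerTask;
import com.jogamp.opengl.awt.GLCanvas; // adjust to your JOGL version

public class GameTimer extends Timer {

    private final GLCanvas canvas;

    public GameTimer(GLCanvas canvas) {
        this.canvas = canvas;
    }

    // Fires roughly every 16 ms once scheduled
    private class GameTimerTask extends TimerTask {
        public void run() {
            update();
        }
    }

    // Currently just repaints the canvas
    private void update() {
        canvas.display();
    }

    public void start() {
        scheduleAtFixedRate(new GameTimerTask(), 0, 16);
    }
}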
It worked, but then I noticed that running it chews up an entire CPU and very slowly increases the memory usage.
So I switched to using a thread with a loop and a sleep:
import com.jogamp.opengl.awt.GLCanvas; // same GLCanvas as above

public class GameTimer extends Thread {

    // the canvas to repaint and a flag that lets the loop be stopped
    private final GLCanvas canvas;
    private volatile boolean run = true;

    public GameTimer(GLCanvas canvas) {
        this.canvas = canvas;
    }

    // The run method
    public void run() {
        while (run) {
            canvas.display();
            try {
                Thread.sleep(16);
            } catch (InterruptedException e) {
                return;
            }
        }
    }
}
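Starting it then looks roughly like this (assuming the constructor shown above, with canvas being whatever GLCanvas you are rendering to):

GameTimer loop = new GameTimer(canvas);
loop.start(); // kicks off the render loop thread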
It works fine: running at about 60 fps it only has about 2% CPU usage, and the memory usage stays roughly constant.
They both do exactly the same thing, but the Timer just hogs the CPU. Am I doing something wrong with the Timer that's causing this, or is the Timer just not that fast?
I did leave out some code to keep it as short as possible, but the basics are there.