retainAll() vs. custom code thingy

Right, so at some point I found this piece of code here on the forum for maintaining a list of active elements (I currently use this code):


for(Entity e : this.entities) {
	if(e.isActive()) {
		e.update();
		
		if(e.isActive()) {
			this.stillActive.add(e);
		}
	}
}
	
ArrayList<Entity> tmp = this.entities;
this.entities = this.stillActive;
this.stillActive = tmp;
this.stillActive.clear();

The stuff that happens in the for loop is pretty straightforward, but I'm not sure I get the last bit with the pointer swapping and whatnot (at least, that's what I assume it does: switching the elements from stillActive over into entities).

Today, however, I ran into the retainAll() method, which is defined in the Collection interface and seems to do exactly the same thing.

So instead of the above code, I could write:

for(Entity e : this.entities) {
	if(e.isActive()) {
		e.update();
		
		if(e.isActive()) {
			this.stillActive.add(e);
		}
	}
}
		
this.entities.retainAll(this.stillActive);
this.stillActive.clear();

which IMO is a lot more readable and understandable. My question is: which method is better?

Cheers! :slight_smile:

Actually, it just hit me that retainAll() must be a lot slower, since it compares every element in A against every element in B, and B can contain elements that aren't in A to begin with. With two ArrayLists, retainAll() ends up calling contains() (a linear scan) once per element, so it does roughly n*m comparisons.

However, I'd still like someone to explain how the whole pointer swapping in the first snippet actually works, because I'm not quite sure about it. :stuck_out_tongue: (Though I know what pointers are and what they do; I've meddled with them in C and C++.)

Here’s how I guess it works:

  • First the pointer for entities is copied over into tmp
  • Next the pointer for entities is overwritten with the pointer for stillActive
  • Then the pointer for stillActive is overwritten with the one from tmp (i.e. the original pointer to entities)
  • Then stillActive is cleared (which also empties the list tmp points to, since they're the same object)

… And writing this out, I think I just understood how exactly it works… Assuming it works as I just wrote.

Any confirmations on that? :slight_smile:
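
As a quick sanity check that it really is just names being swapped, not elements being copied, here's a standalone snippet (not from the original code; Entity is the same type as above):

ArrayList<Entity> entities    = new ArrayList<Entity>();
ArrayList<Entity> stillActive = new ArrayList<Entity>();

ArrayList<Entity> tmp = entities;   // tmp now refers to the same list object as entities
entities = stillActive;             // the name entities now refers to stillActive's list
stillActive = tmp;                  // the name stillActive now refers to the original entities list
stillActive.clear();                // empties that original list; tmp sees the same empty list

System.out.println(tmp == stillActive);      // true: one list object, two names
System.out.println(entities == stillActive); // false: still two distinct list objects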

It would make things easier if you just dropped the global stillActive collection (sketched below):
-create a new collection
-put all active entities in this new collection
-set the entity collection reference to the new collection
-finish
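
A minimal sketch of that approach (survivors is just an assumed local name; the Entity API is the one from the OP's code):

ArrayList<Entity> survivors = new ArrayList<Entity>(entities.size());
for (int i = 0; i < entities.size(); i++) {
    Entity e = entities.get(i);
    if (e.isActive()) {
        e.update();
        if (e.isActive()) {
            survivors.add(e);
        }
    }
}
entities = survivors;   // the old list simply becomes garbage; no stillActive field needed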

Some people always try to reuse objects, but I think this kind of premature optimization is just evil :slight_smile:
Reusing objects is so error-prone, especially if there is a lot of state that needs to be reset.

The garbage collector on Android is not that fast, so object reuse can be a valid optimization on that platform.
But remember we are talking about one single object here, so I would argue that this optimization is useless even on Android.

PS: In Java, pointers are called references.

Just use a for(int i…) loop and delete stuff on the fly?
I don't know what exactly you're doing, but it's a lot faster than your version.


Entity e;
for(int i = this.entities.size() - 1; i >= 0; i-- ) {
   e = this.entities.get(i);
   if(e.isActive()) {
      e.update();
      
      if(!e.isActive()) { 
          this.entities.remove(i);
      }
   }
}


Great if you can iterate backwards through your entities, and don’t have too many of them.
The OP has the optimal solution right there, for all platforms and architectures.

Cas :slight_smile:

Awesome! :slight_smile: And my understanding of how it works is right too, right? :stuck_out_tongue:

Could you explain, please?
I thought using an iterator was really slow, and filling a list with each active element is extra slow on top of that.
Why would that solution be optimal?

Ok here goes…

Firstly, it's true that the absolute fastest thing to do is not to use an iterator: use an ArrayList and scan through it with a plain for loop. It's about 10% faster even on the server VM on the desktop, so why not? :slight_smile:

Secondly, the way CPUs and memory buses work means that memory accessed sequentially takes advantage of the CPU prefetching data it thinks it will need next. By scanning linearly through memory, as you will be doing here, you are using memory incredibly efficiently, and you only look at it once. Likewise for the writes.

So what you’re doing is:

  • scan through your list of things sequentially. As absolutely efficient as can possibly be on any architecture.
  • perform whatever operation you normally do on the thing
  • check to see if it’s dead. If it’s not dead, copy it sequentially into a second list. As super efficient as it is possible to be, again.
  • When you’re done, clear the original list, and then swap the references to the lists around, so next time, you’re running against that list you built up. Rinse, repeat. The swap is virtually instantaneous, consisting of swapping 4 bytes with another 4 bytes.

This algorithm a) makes the best possible use of the cache on any architecture, b) is as fast at removing every single element in the list in one go as it is at removing only one element, c) does everything in one pass, and d) retains the order of the things in the list (often important). Its only disadvantage is using 2x the memory for the list, which for a list containing 1 million elements is a pathetic extra 4MB.
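
Pulling those steps together, a minimal sketch (assuming the same Entity.update()/isActive() API as the OP's snippet, with the plain index loop instead of a for-each):

private ArrayList<Entity> entities    = new ArrayList<Entity>();
private ArrayList<Entity> stillActive = new ArrayList<Entity>();

public void updateEntities() {
    // scan sequentially, update each entity, copy survivors over in order
    for (int i = 0; i < entities.size(); i++) {
        Entity e = entities.get(i);
        if (e.isActive()) {
            e.update();
            if (e.isActive()) {
                stillActive.add(e);
            }
        }
    }

    // swap the two list references and clear the old list, ready for next frame
    ArrayList<Entity> tmp = entities;
    entities = stillActive;
    stillActive = tmp;
    stillActive.clear();
}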

Cas :slight_smile:

So at first you say the loop

for(int i = this.entities.size() - 1; i >= 0; i-- ) {

is faster and more efficient.
So the iterator loop could be replaced by this loop for maximum performance?

If I understood it correctly, the ArrayList uses an array to store its data.
I understand removing can use loads of resources, because the array contents need to be shifted on every delete.
But when adding values, does the backing array need to be resized on every addition, or is extra capacity cached too?

Don’t count backwards, do it like this:


for (int i = 0; i < entities.size(); i++) {
}

You need to check entities.size() every iteration in case a new entity is added by one of the entities already in the list, e.g. a player firing a bullet. Going backwards is likely to be wrong for just that sort of thing too. Using an Iterator, furthermore, will actually crash with a ConcurrentModificationException when you try to spawn the bullet. Bad! Slow! Avoid!
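
To make that concrete, here's a tiny standalone demo (Strings standing in for entities; the class name and output text are made up):

import java.util.ArrayList;
import java.util.List;

public class SpawnDuringLoop {
    public static void main(String[] args) {
        List<String> entities = new ArrayList<>();
        entities.add("player");

        // The for-each loop uses an Iterator under the hood; adding to the
        // list while iterating trips the fail-fast check on the next step.
        try {
            for (String e : entities) {
                entities.add("bullet spawned by " + e);
            }
        } catch (java.util.ConcurrentModificationException ex) {
            System.out.println("iterator loop blew up: " + ex);
        }

        // The plain index loop, re-reading entities.size() every iteration,
        // handles the same situation and even processes the new entity.
        entities.clear();
        entities.add("player");
        for (int i = 0; i < entities.size(); i++) {
            if (entities.get(i).equals("player")) {
                entities.add("bullet");
            }
        }
        System.out.println("index loop finished, size = " + entities.size());
    }
}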

Removal in ArrayLists can get very slow for large lists, especially if there are quite a few removals near the start of the list (everything after the removed index has to be shifted down). Conversely, increasing the size of an ArrayList happens very infrequently and costs only about as much as a single removal might. After just a few iterations of your game loop your ArrayLists are likely to be the biggest they will ever need to be, and will remain that size. Or just size them properly ahead of time in the constructor.
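
That last suggestion in a sketch (EXPECTED_MAX_ENTITIES is an assumed constant, not anything from the thread):

// Give both lists their full capacity up front so the backing array never grows mid-game.
private static final int EXPECTED_MAX_ENTITIES = 4096;

private ArrayList<Entity> entities    = new ArrayList<Entity>(EXPECTED_MAX_ENTITIES);
private ArrayList<Entity> stillActive = new ArrayList<Entity>(EXPECTED_MAX_ENTITIES);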

Cas :slight_smile:

Thanks, very handy to know.
I didn't really think about the inner structure, but it's really clear now.

I’m glad I asked this question, it cleared up a lot of things for me too and it’s always good to know how and why stuff works as it does!

Thanks a bunch for the clear answers, Cas! :slight_smile: (And of course thanks to all you other guys that bothered answering in here! :slight_smile: )

Sweet, I am doing it this way and was wondering the same kind of thing.

Actually… I was doing it almost word for word the way princec described… :persecutioncomplex:

Edit:
I don't know how much a method call costs, but couldn't you have update() return whether or not the object/entity is still active, instead of calling a separate method? That way it could be

for(Entity e : this.entities) {
   if(e.update()) {
         this.stillActive.add(e);
   }
}

then swap the list references around as before, so entities points at the new list, and clear the spare one.

You could do that, yes, though it sort of pollutes the purpose of your update() method, and you'll still need isAlive() or isDead() so that entities can check whether references they may hold are still alive during their update.
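
For instance (a purely hypothetical entity, assuming Entity declares the boolean update() variant discussed above plus isActive()):

// Hypothetical: a homing missile holds a reference to its target, so it still
// needs a query like isActive() to notice the target died this frame.
class Missile extends Entity {
    private Entity target;

    @Override
    public boolean update() {
        if (target != null && !target.isActive()) {
            target = null;      // stop homing; maybe acquire a new target later
        }
        // ... move towards the target, check collisions, etc.
        return isActive();
    }
}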

Cas :slight_smile:

Here's my code; I handle the array myself… this is as optimal as it gets, I think:


Entity[] entities;
int size;

public void update() {
  int alive = 0;
  for ( int i = 0; i < size; i++ ) {
    Entity e = entities[i];
    e.update();
    if ( e.isActive() ) {
      entities[ alive++ ] = e;
    }
  }
  // if you want to clear the references..
  while ( alive < size ) {
      entities[ alive++ ] = null;
  }
}

That's pretty nifty. I suspect most people, including myself, would get away with doing something like that if mutation of the entities array wasn't a problem during iteration (I suspect it isn't, but I've always played it safe).

Cas :slight_smile:

The only problem as I see it is that you need to know exactly how many entities you'll have at max at any given time, or else you'll need to add code for changing the length of the array. :slight_smile: But that code is a lot easier to comprehend than the other code. :slight_smile:

And I think I might just implement something like that in my particle emitter, as it has a max number of particles defined. ^^

That’s exactly where I pulled this code from, from my ParticleEffect class =P

Well it wouldn’t hurt to replace the array with an ArrayList but use the same technique.

Cas :slight_smile:

Probably not (given the extra method calls will probably get inlined), but it's also easy enough to expand the array on demand the way ArrayList does, in only a couple of lines!
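
For example, a grow-on-demand add in the spirit of what ArrayList does internally (the method name and doubling factor are assumptions, and it assumes the array starts with a nonzero capacity):

public void add(Entity e) {
    if (size == entities.length) {
        // double the backing array when it's full, copying the old contents over
        entities = java.util.Arrays.copyOf(entities, entities.length * 2);
    }
    entities[size++] = e;
}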

@ClickerMonkey - nice example. I was initially trying to understand the size field, then realized I think you're missing setting size to the number of live objects prior to nulling the dead ones. Won't this cause an NPE on the next run otherwise?
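
If that is the issue, a minimal fix along those lines might be (assuming size is meant to track the live count between frames):

public void update() {
  int alive = 0;
  for ( int i = 0; i < size; i++ ) {
    Entity e = entities[i];
    e.update();
    if ( e.isActive() ) {
      entities[ alive++ ] = e;
    }
  }
  int newSize = alive;            // remember the live count before the null loop reuses 'alive'
  while ( alive < size ) {
      entities[ alive++ ] = null; // drop dead references so they can be garbage collected
  }
  size = newSize;                 // shrink the logical size for the next frame
}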