retainAll() vs. custom code thingy

I think that for performance reasons, just using a plain array is a better fit.

This is add() in the ArrayList class:



public boolean add(E e) {
    ensureCapacity(size + 1);  // Increments modCount!!
    elementData[size++] = e;
    return true;
}

public void ensureCapacity(int minCapacity) {
    modCount++;
    int oldCapacity = elementData.length;
    if (minCapacity > oldCapacity) {
        Object oldData[] = elementData;
        int newCapacity = (oldCapacity * 3) / 2 + 1;
        if (newCapacity < minCapacity)
            newCapacity = minCapacity;
        // minCapacity is usually close to size, so this is a win:
        elementData = Arrays.copyOf(elementData, newCapacity);
    }
}
 

This one might be quicker:


  entities[ alive++ ] = e;

Premature optimization, guys ^^

I would bet that the update method of the entities is much heavier than the loop plus the deleting. Besides, how many entities will you have? One or two thousand?

You’re looking at the wrong method - try set(…). Still has some extra overhead though, as it returns the old value.
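To make that concrete, here’s a minimal sketch of the same compact-in-place idea using set() on an ArrayList - Entity, update() and isActive() are placeholders borrowed from the array example later in the thread:

import java.util.ArrayList;

void compact(ArrayList<Entity> entities) {
   int alive = 0;
   for (int i = 0; i < entities.size(); i++) {
      Entity e = entities.get(i);
      e.update();
      if (e.isActive()) {
         entities.set(alive++, e); // set() returns the old element, which we just discard
      }
   }
   // delete the dead tail from the back, so no elements need shifting
   for (int i = entities.size() - 1; i >= alive; i--) {
      entities.remove(i);
   }
}

Removing from the back means each remove() is O(1), since there are no elements left to shift.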

Well, any “repeated in a large loop every frame” processing is a prime candidate for optimization.

Especially when it comes to things like particles or other graphical gimmicks.

“add” would take care of incrementing the index counter for you. That’s why I assumed it would have been chosen.

Not premature, really - this is just a trivial way of doing things that is no more complicated than any other way, but happens to be really easy to understand and use, and is the fastest possible way to do it. And let’s not forget that when you’re on a 700MHz ARM chip with a 16-bit bus running a crappy Dalvik VM, every cycle is important.

It should probably be noted that if you are dealing with particles in this way, you might well have a few thousand of them as well.

Cas :slight_smile:

It isn’t the index counter!

OK, size counter… whatever is used to determine the next entry.

I agree with you - when you’re handling particles it’s quite a different story.
What I think the big problem with premature optimization is, is that so much time gets wasted thinking about better ways of doing a simple thing.
And this thread is still going on, even though a good-enough solution was already found ^^

You’re still not getting it! :stuck_out_tongue:

This moves live objects earlier in the list, over the dead ones - it doesn’t append at the end of the list. Obviously, read “array” for “list” in the original code.

You’d then clean up at the end by calling (ideally) removeRange() - which, since it’s protected, for some reason requires subclassing ArrayList ???
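For reference, removeRange(int, int) really is declared protected in java.util.ArrayList, so calling it from outside takes a tiny subclass - a minimal sketch (the class name ExposedArrayList is made up):

import java.util.ArrayList;

// Tiny subclass that does nothing but widen removeRange() to public.
class ExposedArrayList<E> extends ArrayList<E> {
   @Override
   public void removeRange(int fromIndex, int toIndex) {
      super.removeRange(fromIndex, toIndex);
   }
}

An entities.removeRange(alive, entities.size()) then drops the whole dead tail with a single internal arraycopy instead of one remove() per element.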

That code snippet was from memory, and was a little wrong… here’s something correct, with the adding code included. I have many other things going on in my update logic, but here are the bare bones:


import java.util.Arrays;

Entity[] entities = new Entity[32];
int size = 0;

public void add( Entity e ) {
   if ( size >= entities.length ) {
      entities = Arrays.copyOf( entities, size + (size >> 1) ); // grow by 50%
   }
   entities[ size++ ] = e;
}

public void update() {
   int alive = 0;
   for (int i = 0; i < size; i++) {
      Entity e = entities[i];
      e.update();
      if ( e.isActive() ) {
         entities[ alive++ ] = e; // compact live entities toward the front
      }
   }
   while ( size > alive ) {
      entities[ --size ] = null; // null out the dead tail so the GC can collect them
   }
  // I also have downsizing logic where once I go down to X% free space I start keeping track of time,
  // once I hit Y seconds I know I can safely downsize the array. Ideally you should know exactly the
  // number of entities you want to cap it at so you don't have to resize or downsize.
}
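
For what it’s worth, here is a rough sketch of that downsizing idea - the half-empty threshold and the 5-second delay are invented placeholders for the X% and Y seconds, not values from the original code:

long shrinkSince = -1L; // when the array first became a shrink candidate

public void maybeDownsize() {
   if ( size >= entities.length / 2 ) {
      shrinkSince = -1L; // usage climbed back up, reset the timer
   } else if ( shrinkSince < 0 ) {
      shrinkSince = System.currentTimeMillis(); // start the clock
   } else if ( System.currentTimeMillis() - shrinkSince > 5000L ) {
      // stayed mostly empty long enough: shrink, keeping some headroom
      entities = Arrays.copyOf( entities, Math.max( 32, size + (size >> 1) ) );
      shrinkSince = -1L;
   }
}

You’d call this once per frame after update(); since the timer resets whenever usage climbs back up, a brief dip in entity count doesn’t trigger a shrink.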

And yes, I’ve optimized this code because I always end up having thousands, often tens of thousands, sometimes a hundred thousand entities being managed at once…