Scenario:
I’m giving the Wikipedia compression contest a try, where you have to compress 100MB of text down to the absolute minimum. Not so much to try to win, but to see what problems come up in this field.
As you can imagine, the program builds a lot of data structures, moves data between them, and analyses patterns in the words. It uses a lot of RAM, mainly for byte[], int[] and plain objects (no Buffers).
I can run the application several times, and it crashes with several different kinds of errors. This is worrying, because only one thread works on the data, so if there were a bug, it should fail the same way on every run. Furthermore, the errors don’t make any sense: copying from an int[] to an int[] with a for loop sometimes results in an ArrayStoreException, sometimes in a NullPointerException, occasionally in a native JVM crash (!), and about 10% of the time the program runs just fine.
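The failing code is nothing more complicated than a plain element-by-element copy, roughly like this sketch (the names are made up for illustration; the real arrays hold the model’s statistics):

    // Sketch of the kind of copy that fails; names are illustrative only,
    // the real arrays hold frequency/statistics data from the model.
    public class CopyExample {
        static int[] copy(int[] src) {
            int[] dst = new int[src.length];
            for (int i = 0; i < src.length; i++) {
                dst[i] = src[i]; // sometimes ArrayStoreException, sometimes NullPointerException here
            }
            return dst;
        }

        public static void main(String[] args) {
            int[] copied = copy(new int[1000000]);
            System.out.println(copied.length);
        }
    }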
These errors start to occur when the program begins swapping to the hard disk, i.e. when the heap is completely full.
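One way to see how close the heap is to its limit at that point is a small probe using the standard java.lang.Runtime calls, along these lines (just an illustration, not code from my program):

    // Illustration only: log how full the heap is, using standard
    // java.lang.Runtime calls, to correlate the crashes with memory pressure.
    public class HeapProbe {
        public static void log(String where) {
            Runtime rt = Runtime.getRuntime();
            long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
            long maxMb = rt.maxMemory() / (1024 * 1024);
            System.err.println(where + ": heap used " + usedMb + " MB of " + maxMb + " MB max");
        }

        public static void main(String[] args) {
            log("startup");
        }
    }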
I tried both Java 5.0 and 6.0; same thing.

I am currently sitting at a compressed size of 25MB, but this is using only rudimentary frequency-based statistical models. I am hoping that with the more complex models I am developing I can get down to 20MB; that is my goal at the moment. Still, it is already better than simply using ZIP (34.7MB).
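As a rough idea of what such a frequency model looks like, here is a simplified order-0 sketch (symbol counting feeding an entropy coder) — an illustration of the idea, not my actual contest code:

    // Simplified order-0 frequency model: count byte occurrences so an
    // entropy coder (Huffman / arithmetic) can give frequent symbols short
    // codes. Illustration only, not the actual contest code.
    public class FrequencyModel {
        private final long[] counts = new long[256];
        private long total;

        public void update(byte b) {
            counts[b & 0xFF]++;
            total++;
        }

        // Probability estimate the coder would use for the symbol b.
        public double probability(byte b) {
            return total == 0 ? 1.0 / 256 : (double) counts[b & 0xFF] / total;
        }

        public static void main(String[] args) {
            FrequencyModel m = new FrequencyModel();
            for (byte b : "the quick brown fox".getBytes()) {
                m.update(b);
            }
            System.out.println("P('o') = " + m.probability((byte) 'o'));
        }
    }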