profiling rules!

I'd like to share a small success story about profiling. I am currently working on phys2d (see this topic) because I need it for a small project of mine called towship. From the very start I did think about efficiency in the bigger picture: for example, testing all edges of both polygons against each other would be very expensive (n*m), so I first select collision candidates (there is still quite some room for improvement there). But I didn't care much about smaller efficiencies like avoiding square roots wherever possible and that kind of thing.
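For illustration only, a broad phase like that can be as simple as an axis-aligned bounding-box overlap check before any edge-vs-edge work. The sketch below uses made-up names and is not the actual phys2d code:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of collision-candidate selection (broad phase):
// only pairs whose bounding boxes overlap go on to the expensive
// edge-vs-edge tests. Illustrative names, not phys2d's real classes.
final class BroadPhase {

    static final class AABB {
        float minX, minY, maxX, maxY;

        boolean overlaps(AABB o) {
            return maxX >= o.minX && o.maxX >= minX
                && maxY >= o.minY && o.maxY >= minY;
        }
    }

    /** Returns index pairs of bodies whose boxes overlap; each test is very cheap. */
    static List<int[]> candidatePairs(AABB[] boxes) {
        List<int[]> pairs = new ArrayList<>();
        for (int i = 0; i < boxes.length; i++) {
            for (int j = i + 1; j < boxes.length; j++) {
                if (boxes[i].overlaps(boxes[j])) {
                    pairs.add(new int[] { i, j });
                }
            }
        }
        return pairs;
    }
}
```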

I decided to use a profiler to find the pieces of code where optimizations were really needed, although I thought I already knew more or less which parts would be problematic. Well, I couldn't have been more wrong. The first bit of very expensive code was the line-line intersection code, which I implemented with 4 cross products (where I only calculated the Z component!). Rewriting those 10 lines of code immediately doubled the frame rate (from 1 to 2 fps in the profiler :P).
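For context, a segment-segment intersection test built from four 2D cross products (only the z component is ever computed) typically looks something like this. This is a generic sketch of the technique, not phys2d's code before or after the rewrite:

```java
// Segment intersection via four 2D cross products. Only the z component
// of each cross product is needed, and no square roots or divisions are
// required for a plain yes/no answer. Collinear edge cases are ignored here.
public final class SegmentIntersection {

    // z component of (b - a) x (c - a)
    private static float cross(float ax, float ay, float bx, float by,
                               float cx, float cy) {
        return (bx - ax) * (cy - ay) - (by - ay) * (cx - ax);
    }

    /** True if segment (x1,y1)-(x2,y2) properly crosses segment (x3,y3)-(x4,y4). */
    public static boolean intersects(float x1, float y1, float x2, float y2,
                                     float x3, float y3, float x4, float y4) {
        float d1 = cross(x3, y3, x4, y4, x1, y1);
        float d2 = cross(x3, y3, x4, y4, x2, y2);
        float d3 = cross(x1, y1, x2, y2, x3, y3);
        float d4 = cross(x1, y1, x2, y2, x4, y4);
        // Each segment's endpoints must lie on opposite sides of the other segment.
        return ((d1 > 0 && d2 < 0) || (d1 < 0 && d2 > 0))
            && ((d3 > 0 && d4 < 0) || (d3 < 0 && d4 > 0));
    }
}
```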

The next bit of code was even more unexpected. I used a HashSet of Integer, which also turned out to be quite expensive; apparently it has a lot of overhead. I tried several alternatives. First I found a specialized IntegerSet library which was much faster but had some licensing issues (I wanted to include the source code itself). So I moved on and tried the Java BitSet (although I was worried about the memory overhead in that case), which was faster than the HashSet but a little slower than the IntegerSet. Finally I tried a very simple hand-coded singly linked list. It worked like a charm and was the fastest solution so far!
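For reference, a hand-coded singly linked list of primitive ints can be as simple as the sketch below; the names are illustrative and this is not the actual phys2d class. A HashSet<Integer> boxes every value and computes hashes, so for small collections a plain linear scan over unboxed ints can easily come out ahead:

```java
// Minimal singly linked list of primitive ints, avoiding the boxing and
// hashing overhead of HashSet<Integer>. Illustrative sketch only.
public final class IntList {

    private static final class Node {
        final int value;
        final Node next;
        Node(int value, Node next) { this.value = value; this.next = next; }
    }

    private Node head;

    /** Prepends a value in O(1); duplicates are simply allowed here. */
    public void add(int value) {
        head = new Node(value, head);
    }

    /** Linear scan; cheap as long as the list stays small. */
    public boolean contains(int value) {
        for (Node n = head; n != null; n = n.next) {
            if (n.value == value) return true;
        }
        return false;
    }

    public void clear() {
        head = null;
    }
}
```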

So, in roughly two hours' work and only about 200 lines of code rewritten or changed, I increased performance by 400%! :o

Next time I won't hesitate for a second to use a profiler, and if any of you have doubts … well … just change your mind!

Reminds me of what Jeff said, that bottlenecks are almost never where you think they will be.

Well, but profilers often tell a lot of garbage too. I have used the NetBeans profiler over the last year, and what I've seen is that you have to take the results with a grain of salt. At least on my PCs, NetBeans seems to over-estimate the importance of small, often-called methods.
It told me that most of the time was spent in encryption code, which is … simply not true, because that code can decode 40 MB/s whereas my app has no more than ~1 MB/s of throughput, since it's limited by other factors. Just because a method is called millions of times does not mean it's expensive ^^

In short: NetBeans seems to over-estimate the importance of often-called, small, cheap methods. They aren't as expensive as it tells you :wink:

Regards, Clemens

It injects code into every method; small methods suffer from this much more than large methods do.

I really love the ‘embedded’ -Xprof profiler because it doesn’t seem to have this flaw.

If you set your profiler to sampling instead of code instrumentation, you shouldn’t have that problem.