I’d never really thought about it before, but having recently purchased a 6-core Phenom II it dawned on me that javac makes almost no use of those extra cores.
This seems rather archaic, and not at all in keeping with “a platform that has been designed from the ground up to support concurrent programming.”
I wonder - what do enterprise customers do? Do multi-threaded Java compilers exist?
Surely large code bases and multi-core processors are at the heart of their business - it’d be extraordinary if they’d been lumbered with this limitation for so long and done nothing about it!
A few Google searches turned up some investigation into the issue back in 2008, but it looks like the work was abandoned for one reason or another.
This lack of interest seems very odd to me, as from my rudimentary understanding compilation appears to be eminently suitable for distribution across any number of cores - or, for that matter, across distributed networks - since most source files can be parsed and analysed largely independently of one another.
This liability is obviously only going to get worse with the inevitable growth of parallel computing, and the correspondingly larger, more complex programs that will take advantage of those resources.
Are the powers that be simply too afraid to rewrite the compiler for fear of introducing bugs?
