Nope, this ignores the fact that Exceptions are (or at least should be) nested/inherited - they form a tree structure in the most frequent case. Yes, there are problems with checked exceptions, but it’s not as bad as he implies. Often, a developer who has many catches in a row is unskilled, not thinking in Java, or incompetent (and the first two are not in any way derogatory, just factual: I wouldn’t expect a high-school student to be a skilled systems architect, and likewise I don’t expect every Java programmer to know how to develop Java properly).
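For instance, something like this trivial sketch - catching the IOException superclass stands in for whichever branch of the hierarchy you actually care about:

    import java.io.*;

    class ConfigReader {
        static void readConfig(File f) {
            try {
                DataInputStream in = new DataInputStream(new FileInputStream(f));
                // FileNotFoundException, EOFException, etc. all extend IOException,
                // so one catch of the superclass replaces a whole row of narrow catches
                int version = in.readInt();
                in.close();
                System.out.println("config version " + version);
            } catch (IOException e) {
                // a single handler covers the whole IO subtree of the hierarchy
                System.err.println("could not read config: " + e.getMessage());
            }
        }
    }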
NB: I’d fire any coder who declared “throws Exception” simply because they were too lazy to catch anything more specific (one of the cases he cites). That’s just bad programming. I’ve seen (very rarely) coders who type most/all arguments as “Object” (not necessarily in Java, but certainly in typed languages) so they don’t have to worry about types; that in itself is not a valid argument for a typeless language.
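For contrast, a minimal sketch (UserNotFoundException and DataStoreException are made-up names, purely for illustration):

    // Hypothetical exception types, shown only for contrast.
    class UserNotFoundException extends Exception {
        UserNotFoundException(String id) { super("no such user: " + id); }
    }

    class DataStoreException extends Exception {
        DataStoreException(String msg, Throwable cause) { super(msg, cause); }
    }

    class UserDao {
        // Lazy: tells callers nothing and forces a catch-all on everyone upstream.
        String loadUserLazily(String id) throws Exception {
            throw new Exception("something, somewhere, went wrong");
        }

        // Better: the signature documents exactly what can fail, per case.
        String loadUser(String id) throws UserNotFoundException, DataStoreException {
            throw new UserNotFoundException(id);
        }
    }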
NB: Equally, the claim that you usually want to propagate exceptions to be handled by your outer environment only shows that the speaker has limited experience of application design - that is entirely true of certain types of apps, but certainly not of all.
IMHO the real problem with Java’s checked exceptions only occurs when you have to catch exceptions coming from different, incompatible APIs (he highlights this), where the exceptions from each API haven’t been incorporated into a common type hierarchy (or, less often, where they have, but you want a different hierarchy).
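A minimal illustration of what I mean, using IOException and SQLException as the two incompatible APIs (the helper methods are made up):

    import java.io.IOException;
    import java.sql.SQLException;

    class ImportJob {
        // Stand-ins for two unrelated APIs.
        static String readLineFromFile(String path) throws IOException { return "row"; }
        static void insertRow(String row) throws SQLException { }

        static void importOne(String path) {
            try {
                insertRow(readLineFromFile(path));
            } catch (IOException e) {
                // IOException and SQLException share no useful common ancestor
                // below Exception itself, so each needs its own catch block
                // (or an over-broad catch of Exception)
                System.err.println("file layer failed: " + e.getMessage());
            } catch (SQLException e) {
                System.err.println("database layer failed: " + e.getMessage());
            }
        }
    }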
AFAICS this is not really a problem with checked exceptions in particular; it’s a “bug” in Java’s brand of OO programming. There are other instances of this kind of problem in Java, and all of them would be neatly solved in one fell swoop if we could use a dynamic type hierarchy (so you could change the types of things at will). It’s also a frequent problem in game programming, where you have very complicated game logic that needs to be changed frequently - you can’t use the OO inheritance system because it’s too rigid, and far, far too hard to modify at a later date.
You can work around this for exception handling (I have in the past) and make life simple - you have to be a bit cunning, though, to avoid being screwed by being unable to alter the type hierarchy, and it produces a piece of ugly, insane code (encapsulated in a private class so no-one need ever see it). You often have to do this kind of yucky thing in Java when writing third-party libs, so it’s nothing new :(.
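Roughly the shape of what I mean - names and details are made up, and the real thing is uglier, but the point is that the translation is hidden in one place and the rest of the code only ever sees one exception type:

    import java.io.IOException;
    import java.sql.SQLException;

    // The one exception type the rest of the code ever has to know about.
    class ImportFailedException extends Exception {
        ImportFailedException(String msg, Throwable cause) { super(msg, cause); }
    }

    class SafeImporter {
        // All the ugliness lives in this private helper; nobody outside need ever see it.
        private interface Step { void run() throws Exception; }

        private static void translate(String what, Step step) throws ImportFailedException {
            try {
                step.run();
            } catch (IOException e) {
                throw new ImportFailedException(what + ": file layer failed", e);
            } catch (SQLException e) {
                throw new ImportFailedException(what + ": database layer failed", e);
            } catch (Exception e) {
                throw new ImportFailedException(what + ": unexpected failure", e);
            }
        }

        void importFile(final String path) throws ImportFailedException {
            translate("import " + path, new Step() {
                public void run() throws Exception {
                    // calls into the two incompatible APIs would go here
                }
            });
        }
    }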
Having seen the huge differences checked exceptions can make when deployed appropriately, I can only say that this reflects on your own coding style rather than anything else. I would hazard that your coding style is not suited to Java (or vice versa); this may in turn be forced on you by your app design, rather than you having the luxury of choosing your style.
I’m going to disagree on whether methods should be virtual by default or not (it depends how much OO dev you do, and how you use OO - for instance, if virtual weren’t the default, I’d have to explicitly mark 90% or more of all the methods I ever write as virtual). Wherever virtual-by-default would be a problem, I generally know at design time and can use finals; I suspect I’m largely lucky in that my niche has classes for which this is generally true, rather than it being generally the case. But there are certainly cases where virtual-by-default is the better of the two options…
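E.g. a trivial sketch of the “known at design time” case:

    class Account {
        private long balanceInCents;

        // I know at design time that overriding this would break the class's
        // invariants, so it gets marked final up front.
        public final void deposit(long cents) {
            if (cents < 0) throw new IllegalArgumentException("negative deposit");
            balanceInCents += cents;
        }

        // Left virtual (Java's default) because subclasses legitimately customise it.
        public String describe() {
            return "balance: " + balanceInCents + " cents";
        }
    }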
There are also often cases where you can’t make something final but you need to partially restrict overriding, and the best you can do in Java is a big, in-yer-face Javadoc comment (e.g. “don’t override this unless you also …”). I would appreciate a better system than either virtual or non-virtual by default, actually…?
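I.e. something like this, which is about as much “enforcement” as the language lets you express:

    class Renderer {
        /**
         * WARNING: if you override this you MUST call super.prepare() first,
         * and you MUST NOT touch the frame buffer until it has returned.
         * The language gives me no way to enforce either restriction.
         */
        protected void prepare() {
            // set up default render state
        }
    }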
Just my 2 cents worth…