So… I’ve had a fairly long look at Swift to see where “the minds” are going with the next decade or so of programming in a reasonably large ecosystem (Apple). Quite apart from all the suspicious fishy smells of lock-in and half-assed justifications about how it fits Apple’s ancient creaky legacy APIs so-that’s-why-they-did-it-this-way, I’ve a few observations about it all.
Firstly, the language appears to have been designed by egos and personal preferences rather than some modicum of analysis as to what sort of problems occur in the wild. So we've got the genius of the `...` and `..<` range operators, which with a single hard-to-spot character immediately introduce a whole class of tricky little off-by-one bugs. Similarly we've got the == and === operators, bane of JavaScript, behaving in almost completely different ways from any other C-like language. Which is fine if you're going to be a Swift programmer your entire life, but most of us won't be, and we'll be coming from other C-like languages; apart from the subtle redefinition of ==, the difference between == and === is, again, a single character with a radically different meaning and no other indication that something might be amiss, leading to another whole class of subtle runtime errors. There are various strangely thought-out decisions as well, such as making optional rather a lot of the twiddly syntax bits we're otherwise all used to, like the parentheses after an if and the semicolon that terminates a statement. Which is great if typing them has annoyed you all these years, but it's going to make for some fairly ugly differences in code style between programmers and lead to pointless arguments. Think about the squabbling we already endure about where {} brackets go and indentation. You get my drift. So a whole raft of small lexical nuances that lead to stress and bugs. Great.
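To make that concrete, here's a minimal sketch (in current Swift syntax, with made-up values) of both gripes: the one-character difference between the two range operators, and === meaning identity rather than the equality most C-family programmers would assume.

```swift
let items = ["a", "b", "c", "d"]

// Closed range: visits indices 0, 1, 2, 3 (includes the end value).
for i in 0...3 {
    print(items[i])
}

// Half-open range: visits indices 0, 1, 2. One character's difference, one element fewer.
for i in 0..<3 {
    print(items[i])
}

// Pick the wrong one and you get the classic off-by-one (here, a crash):
// for i in 0...items.count { print(items[i]) }   // traps on items[4]

// === is identity, not equality.
class Box { let value: Int; init(value: Int) { self.value = value } }

let a = Box(value: 1)
let b = Box(value: 1)
let c = a

print(a === b)   // false: two distinct instances, even with equal contents
print(a === c)   // true:  the very same instance
// a == b won't even compile unless Box is made Equatable; on value types ==
// means value equality, so the two operators look alike but mean very different things.
```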
Secondly, there are no exceptions. Just when somebody has finally figured out a great way to handle unexpected things happening in a clean way and provide some actually useful information to developers, the designers of Swift have thrown the entire lot out in favour of… well, it's every man for himself. This decision baffles me rather a lot. There are, it turns out, programmers who think that exceptions are a bad idea and that they're "slow" or "cause undefined runtime behaviour". These programmers are luddites who like being paid to fix their own mistakes all day long. Hats off to them if they can hold down a job and keep the wool pulled over management's eyes. It ain't fooling me.
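For what "every man for himself" ends up looking like, here's a minimal sketch under the assumption that failure gets signalled through an optional return value; parseAge is a made-up function, not anything from Apple's APIs:

```swift
// Without exceptions, failure collapses into "no value", with no reason attached.
func parseAge(_ text: String) -> Int? {
    guard let value = Int(text), value >= 0 else {
        return nil
    }
    return value
}

// Every call site has to remember to check, and when it fails you can't say why:
if let age = parseAge("forty-two") {
    print("Age is \(age)")
} else {
    print("Something went wrong, but no idea what or where")
}
```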
Thirdly, it uses "automatic reference counting" plus some hand-annotated weak references to cope with strong circular references, in place of actual garbage collection. Again the reasoning appears to be unresearched - not even a cursory look at the state of the art found in Java, for example. The reasoning goes that GC is too heavyweight for ARM-class devices such as phones. Which, of course, is something of a complete joke, as we know; GC is incredibly fast now that they know how to do it, and indeed very lightweight, using very little CPU and relatively little overhead. Instead there's a half-assed ref counting scheme going on which is acknowledged to break under the circular reference scenario, and so the programmer has to mark references as weak or unowned to stop that occurring. Which sounds like doing the garbage collector's job by hand, and is rather more prone to trouble. The rest of the reasoning behind why it is the way it is appears to be that Apple's ancient creaky APIs are built upon reference counting, and switching to proper GC appears to have freaked them out a bit, so they're not going to bite the bullet. Anyways… the concept does not appear to be built into the actual language spec itself but rather another hideously grafted-on afterthought.
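A minimal sketch of the cycle problem and the manual fix, assuming ARC as it behaves today; Parent and Child are made-up types for illustration:

```swift
class Parent {
    var child: Child?
    deinit { print("Parent deallocated") }
}

class Child {
    // With a plain `var parent: Parent?` back-reference the two objects keep each
    // other alive forever and neither deinit ever runs. The runtime does not detect
    // this; the programmer has to declare the back-reference weak to break the cycle.
    weak var parent: Parent?
    deinit { print("Child deallocated") }
}

func makePair() {
    let p = Parent()
    let c = Child()
    p.child = c
    c.parent = p
}

makePair()   // with `weak`, both deinit messages print here; with a strong back-reference, neither would
```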
Fourthly, there appears to be some fairly shitty inconsistency between what objects are, what references are, how arrays and strings are treated, what gets passed by value and what gets passed by reference, and so on. There is essentially no consistency at all in the language or the data model or the object model. It looks like the designers of PHP have tried to invent a new language. It's a complete mess.
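To illustrate the kind of split being complained about, here's a minimal sketch (Counter and Point are made-up types) of how assignment means two different things depending on whether you're holding a class instance, a struct, or an array:

```swift
class Counter {            // classes are reference types
    var count = 0
}

struct Point {             // structs are value types, as are Array, String and Dictionary
    var x = 0
}

let c1 = Counter()
let c2 = c1                // copies the reference
c2.count = 10
print(c1.count)            // 10: both names point at the same object

var p1 = Point()
var p2 = p1                // copies the value
p2.x = 10
print(p1.x)                // 0: p1 is untouched

var a1 = [1, 2, 3]
var a2 = a1                // arrays copy on assignment too...
a2.append(4)
print(a1)                  // [1, 2, 3]: ...so they behave like values, unlike class instances
```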
Fifthly, it’s got some half-assed concept of collection types baked into the language. All well and good if you’re a newbie programmer, but take a quick look at the Java collections APIs and you’ll soon come to the conclusion that it’s a massive mistake to attempt to build collections into the language, when there are in fact so many different use cases for different sorts of collections that they require a ton of different interfaces and implementations to achieve the correct semantics and behaviours. And this is all built on top of the creaky, inconsistent treatment of objects and references. It has some form of generics or templating here too, but again it’s a half-assed, half-baked, ill-conceived, incomplete feature. The whole thing is a bit of a facepalm.
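For reference, this is roughly what the built-in collection syntax and the generics look like in practice; a minimal sketch, with largest being a made-up function rather than anything from the standard library:

```swift
var scores: [String: Int] = ["alice": 3, "bob": 5]   // dictionary literal, language-level syntax
var names: [String] = ["alice", "bob"]                // array literal, language-level syntax

scores["carol"] = 1
names.append("carol")

// Generics: one function over any Comparable element type.
func largest<T: Comparable>(_ values: [T]) -> T? {
    var best: T? = nil
    for v in values {
        if best == nil || v > best! {
            best = v
        }
    }
    return best
}

print(largest([3, 1, 4, 1, 5]) ?? 0)    // 5
print(largest(["b", "a", "c"]) ?? "")   // c
```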
On the positive front:
It’s got optionals and not-null-by-default baked right into the language, which IMHO is a good thing, and removes one of the most common runtime errors. Yay!
It’s got tuples, though they are immutable for some inexplicable reason.
You can give parameters to method calls names and optionally use those names at the call site (there’s a quick sketch of all three of these positives below).
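As promised, a minimal sketch of those three positives; findUser and resize are made-up functions, purely for illustration:

```swift
// Optionals: "no value" is part of the type, so the compiler makes you check.
func findUser(id: Int) -> String? {
    return id == 1 ? "alice" : nil
}

if let name = findUser(id: 1) {
    print("Found \(name)")       // only runs when there genuinely is a value
}

// Tuples: bundle a few values together without declaring a type for them.
let point = (x: 3, y: 4)
print(point.x + point.y)         // 7

// Named parameters: the call site reads like a sentence.
func resize(width: Int, height: Int) {
    print("Resizing to \(width)x\(height)")
}
resize(width: 640, height: 480)
```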
All in all, the whole thing leaves me perplexed. I cannot honestly think of one good reason that anyone would use Swift… unless they were forced to.
Cas