Sure, it’s not bad as a source language, warts like the need for === aside. But it’s still not a very good compiler target: it doesn’t easily map back to source lines (there’s nothing even like C’s #line directive, let alone a debugger format), it lacks typed operations the source compiler could take advantage of, and it itself requires some fairly fancy rules to parse and evaluate.
Compiling straight to bytecode might be too low-level, but JavaScript just isn’t intended to be a backend for other languages.
You just missed my sarcasm. Taking your reasoning, one could argue that Python is better than Java because it supports lists and dictionaries in a more primitive form. With Java, you have to use the collections API that was built with the language. I might as well quit programming now because it’s absurd that you have to define your own functions and types to get what you want.
I’m stretching a little bit, but it’s the same argument you’re making. JavaScript must be terrible because it exposes the tools to support language extensions. Because of this, they didn’t need to explicitly add support; someone else would do it. If JavaScript had a richer runtime library like Java’s, one could imagine that class support, or jQuery-like support, would be included for those who wished to use it. But you can just add a script tag, which is about the same as an import or include, and ta-da, you’re good.
And now it’s my turn to troll. You seem to like Common Lisp. I find it to be the most difficult language to read, it has terrible support for type definitions, and the need for macros clearly shows that it has flaws in its language model.
Hey, just because one silly crank likes a language doesn’t make it bad. The use of macros doesn’t mean the model is flawed, it means it doesn’t presume its model is the only one that can encompass all programs now and forever. There’s a macro to CPS-transform its forms – how cool is that? It sucks that macros aren’t as first-class as they could be and that they’re not hygienic, but I’m not too keen on Scheme’s complicated approach either.
I second Cas on what he said a few posts earlier - the problem is not JavaScript, it is indeed the fact that it is the standard for web development these days. But remember that good folks have been relieving us of most of its problems by providing nice libraries, such as jQuery, Prototype, Scriptaculous… which handle nicely most of what you’d want to do in a web application these days.
I can’t quite think of what strange chain of events must have occurred to force this situation on programmers, really. Since the dawn of computing we’ve used whatever language fits the problem best and whatever we’re most comfortable with, and we’ve also largely been able to create new languages as and when necessary. Then this happens.
I think it’s one of those situations where the market and availability to the end user weigh in. JS was there in the early browsers, and at that time, “web development” was about writing PHP and handling client-side validations and navigation with scripts. It was only later that people began questioning how to keep state in web applications, and how flawed the pure hypertext request/response model was for achieving that.
Thing is, even ordinary idiots like me saw this coming years ago. So I wonder why the powers that be didn’t settle on some sort of common bytecode engine for scripting and start getting that into all their browsers, allowing people to use any language and compile it down to that? I mean, Java was an obvious candidate, but various others like the CLR and LLVM have come along in the last decade.
Bad example. It’s very difficult to write specs for that stuff and it’s also very difficult to verify the correctness of a particular implementation.
Something like a VM is a lot easier. People know how to write specs for this kind of thing and you can describe every detail very precisely. Furthermore, you can easily and accurately verify the correctness of an implementation - you can write tests for everything.
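To illustrate the point, here’s a minimal sketch (hypothetical opcodes, not any real VM’s spec) of why a VM is so much easier to specify and verify than an HTML parser: each instruction has one precise, observable effect on the stack, so the spec for an opcode is a one-liner and its conformance test is too.

```javascript
// Toy stack machine: a program is a list of [opcode, operand?] pairs.
// Opcodes "push", "add", and "mul" are invented for this sketch.
function run(program) {
  const stack = [];
  for (const [op, arg] of program) {
    if (op === "push") stack.push(arg);
    else if (op === "add") stack.push(stack.pop() + stack.pop());
    else if (op === "mul") stack.push(stack.pop() * stack.pop());
    else throw new Error("unknown opcode: " + op);
  }
  return stack.pop();
}

// Conformance tests fall straight out of the spec:
console.log(run([["push", 2], ["push", 3], ["add"]]));                        // 5
console.log(run([["push", 2], ["push", 3], ["add"], ["push", 4], ["mul"]]));  // 20
```

Contrast that with trying to write a test suite for “render this tag soup the way every browser happens to render it.”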
Even Microsoft’s JScript thing was pretty accurate and they didn’t have any specs for that. They reverse engineered it. I kid you not.
Runtimes and VMs are really a zillion times more straightforward than document formats, HTML parsing (with its million gotchas), or CSS.
That’s a lot of work to do right, and you’ve then got a far more complex ecosystem, which needs far more time to mature.
If Netscape had provided a bytecode engine instead, I think the reality is that Microsoft would have just pushed VBScript more for the web (since it is a client-side language), and everyone would have ended up using that instead.
Ok, bad example indeed. But the case with JS was not the first time that standards and sanity have been ignored for the sake of quick delivery to market, a (supposed) business model advantage, and ultimately, profit. Yes, designing a spec for a common, standard VM would have been best at the time, but it is easy to understand why it wasn’t.
This is not true for a well-designed IR (high- or low-level). The low-level IR spec of LLVM can hit tons of final targets and is flexible enough to support a huge spectrum of front-end language features. In my “fantasy” case of a high-level, multi-paradigm IR, most people would never want to look at a source-code version of it. That’s pretty much limited to front-end language designers and extreme masochists. It would probably need to look like massively annotated s-expressions (think LISP) or m-expressions.
Totally agree. From a pure expressiveness standpoint, prototype-based languages are significantly more powerful than class-based OO. They were developed because the strict class hierarchies of OO are a big ugly chain on design.
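For anyone who hasn’t seen it in action, here’s a minimal sketch of what prototype-based objects look like in JavaScript (the object names are made up for illustration): no class declarations, just objects delegating to other objects, reshapeable at runtime.

```javascript
// A plain object acts as the "prototype" - there is no class.
const animal = {
  describe() {
    return this.name + " says " + this.sound;
  }
};

// Object.create makes a new object whose prototype is `animal`.
const dog = Object.create(animal);
dog.name = "Rex";
dog.sound = "woof";

// `describe` is found on the prototype via delegation at call time.
console.log(dog.describe()); // "Rex says woof"

// The "class" can be modified after the fact: adding a property to the
// prototype is immediately visible through every delegating object.
animal.legs = 4;
console.log(dog.legs); // 4
```

Whether that runtime malleability is a feature or a footgun is, of course, exactly what this thread is arguing about.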
Lots of languages have separate comparators for “is equal to” == and “is identical to” ===. Not supporting both would be officially bad.
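For the record, the difference in JavaScript is that == applies type coercion before comparing, while === compares both value and type with no coercion:

```javascript
// `==` coerces operands to a common type first; `===` does not.
console.log(1 == "1");   // true  - the string is coerced to a number
console.log(1 === "1");  // false - different types, no coercion

console.log(0 == "");    // true  - both coerce to the number 0
console.log(0 === "");   // false

// null and undefined are `==` to each other, but to nothing else.
console.log(null == undefined);  // true
console.log(null === undefined); // false
```

The complaint upthread isn’t that both exist, it’s that the coercion rules behind == are surprising enough that you end up typing === everywhere by default.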
So, where does C++ fit into this revisionist version of CS history?
That’s because: “Standards are great! Everyone should have one.” always seems to rule the day.
I don’t see any huge flaws in the JS specification. But to carry on with the theme of the comment, this isn’t limited to direct/indirect profit motives. In fact it’s better to get out an imperfect product in a timely manner than to spend an excessive amount of time/resources attempting to create a perfect one…which is a forever moving target. To turn the microscope on Java: the bytecode spec is “designed” to be interpreted and makes no nods to the needs of a back-end compiler (ouch). The original bytecode verification code was a complete snake pit. At the high level, Java isn’t a pure OO language, so to my mind it would have been completely obvious that mutable primitive wrapper classes should have been a part of version 0.01, and they were not added until…wait! They still aren’t there! Some very clever folks with experience creating languages worked on all of this, but ultimately time constraints have to weigh in and you can’t get everything perfect…you’re lucky if you can hit pretty good.