Weirdest Practical Programming Language

The weirdest ones I’ve actually tried, in order of weirdness:

  1. APL - If you type a list of random characters, including punctuation, there’s a fair chance it will compile.
  2. Forth - Everything is in Reverse Polish Notation and programmed in fixed-length frames.
  3. LISP - Some interesting associative programming functions intended for rule-based AI (but they are slow).

All of these are pretty old and are from the days when programming languages were in their infancy. APL was even more confusing than using regular expressions, and I didn’t use it for anything actually useful. Forth was better, but the frame concept (memory was very limited in those days) was very awkward.

If I’m allowed to consider weird CPU architectures, resulting in strange assembler programming, the following are pretty weird, and as a bonus, I’ve actually used these professionally:

  1. TMS570 HET Assembler - A 96-bit-instruction RISC CPU specialised for counter/timer operations, included with a number of Texas Instruments ARM-core-based CPUs. There are no data areas, only instructions. Instructions can modify themselves by design. Every output has a programmable delay, allowing 10 ns timing. Hard to code, but capable of some amazing stuff. Well worth the effort.
  2. RCA 1802 - 16 general-purpose registers, any of which can be the program counter or the stack pointer. This made gluing together code from different sources, which used different PC and SP conventions, really challenging. Very much like the thunks you had with early 16-bit/32-bit Intel x86 programming. I think the idea was to make ISRs really efficient, but frankly, trying to work out which register was the PC and which was the SP from a code fragment was hard. You could implement multiple SPs too, for added confusion.

I would imagine the transputer would go here too, but I never got to use it.

Slightly off topic, as these are not programming languages, but formal methods.

  1. Z - I successfully used this to prove part of an algorithm. It was a proof by induction. It works well for small, contained problems, but gets increasingly unwieldy for large ones.

There are other methods such as B.

Whitespace… WTF :emo:

Yeah, I can’t work out why it’s useful. Also, where do the values come in?

I was going to say that you could pretty much answer this question by listing all the useful languages whose syntactic style or major paradigms you haven’t been exposed to. So I’d say stuff like logic languages. All these “wankery” languages don’t really fall into the category of practical to me.

FORTH: One of the most overlooked languages…it’s actually really nice.


: ( 41 word drop ; immediate
( The previous line defines the comment word, which we can now type like this. Neat, huh? )

Ruby is not weird or bad. It’s just totally different, as if the designers of the language went out of their way to make it exactly the opposite of C. The language and platform have some powerful features, some of which should only be used by frameworks and not application code. For example, the ability to add and change methods on an INSTANCE of an object. A nice way to make your colleagues go crazy if you use that in your own code :slight_smile:

JavaScript pretty much allows the same:


function Stuff() {}                        // assuming a constructor like this exists
var stuff = new Stuff();
stuff.tick = function() { ... };           // instance-ish: only this one object gets the new tick
Stuff.prototype.tick = function() { ... }; // class-ish: every Stuff instance gets it

Ruby is just an awful abomination. I seem to recall even the original creator of Ruby didn’t take it seriously.

Cas :slight_smile:

I tried to learn Ruby (at least as far as passing codeschool’s course) but couldn’t find it useful.

I am watching this thread, because I am searching for my 2nd language so I can become a fully fledged CS operator ;D Ruby and JS seem kinda out for me.

There are two completely different reasons to learn a new language:

  1. Because it is (or might be) directly useful to know.
  2. Because it teaches you something new about programming.
And it’s pretty infrequent that you’ll run across one that fills both of these categories at a given time.

I have seen, but not used, a computer which implemented SK-combinator calculus in hardware.
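
For anyone who hasn’t run into it: the entire instruction set there is just two combinators, K x y = x and S x y z = (x z)(y z). Here’s a toy Java sketch of the idea (my own encoding with curried lambdas, nothing to do with how that machine actually worked):

import java.util.function.Function;

public class SKDemo {
    // In SK-combinator calculus everything is a function from term to term.
    interface Term extends Function<Term, Term> {}

    // K x y = x
    static final Term K = x -> y -> x;

    // S x y z = (x z)(y z)
    static final Term S = x -> y -> z -> x.apply(z).apply(y.apply(z));

    public static void main(String[] args) {
        // S K K behaves as the identity combinator: (S K K) t = t
        Term id = S.apply(K).apply(K);
        Term probe = t -> t;                          // any term will do
        System.out.println(id.apply(probe) == probe); // prints: true
    }
}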

The weirdest language I’ve tried has got to be Gezel. It’s a hardware description language, like VHDL, but no one uses it.

I had to implement a CPU in it for a class at uni. ^^

On a side note, why on earth is PHP a weird language? O.o

Intel released a CPU that does garbage collection in hardware.

You can even run a Java app without a main method.

class App { static { System.out.println("hello world"); } }

I can’t wrap my head around Ada and why it’s so useful. Tasks and rendezvous make no sense to me <.<

FORTH LOVE IF HONK THEN

… truthfully though, I never liked ANS FORTH’s syntax for IF. PostScript’s syntax is much more elegant.

QFT. But it’s usually the case that option 2 eventually has benefits that help you with option 1. Pure functional idioms for example can help you write better code in imperative languages. But it’s definitely a long-term investment with a delayed payoff.
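
To make that concrete, here’s a minimal Java sketch (my own toy example, nothing more) of the kind of habit that carries over: no loop counters, no mutated accumulator, just a pipeline of pure transformations over an immutable list.

import java.util.List;
import java.util.stream.Collectors;

public class FunctionalHabits {
    public static void main(String[] args) {
        List<String> names = List.of("ada", "forth", "apl", "lisp"); // Java 9+ immutable list

        List<String> shouted = names.stream()
                .filter(n -> n.length() > 3)     // keep the longer names
                .map(String::toUpperCase)        // pure transformation, no side effects
                .collect(Collectors.toList());

        System.out.println(shouted);             // [FORTH, LISP]
    }
}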

What I constantly hear from functional programmers is that once you start to learn and understand a functional language, there is no reason to go back… ;D

[quote=“Regenuluz,post:31,topic:41201”]
PHP, a fractal of bad design
[/quote]

Cas :slight_smile:

That was an interesting read. Some of it I knew already, but that list was looong. xD

And it’s not even possible to really argue against it. Made me consider trying out Python or Perl to make web stuff. :stuck_out_tongue:

Though, the only thing there that makes it a “weird” language is the fact that there are no clear naming conventions. The rest just makes it a crap language. :slight_smile:

Edit:

Didn’t know about the query string thing though. Seems insane to add something like that. :confused: (And judging by the link he gives, it seems Perl does the same.)

Edit edit:
Seems like the Perl version is purely a joke xD

Nope, the JVM will crash complaining that it can’t find a main method before it runs any code.