We must now know what our Jay thinks of different languages.
I haven't used all of them, so we might have to suspend the T-shirt rule for this post.
HTML/CSS. Not really a language as computer scientists would use the term, because it isn't Turing-complete
[1]. It's considered a markup language in the sense that it attaches properties to static data in a dynamically invariant way. As I said, Clavius is hand-coded HTML, without CSS. So in the sense that someone can build and maintain a reasonably large web site whose source remains human-readable, HTML/CSS is a clear win.
However, as with all things, people can't resist trying to "own" it. Hence one of the shortcomings is the lack of a clear, persistent, enforceable standard for the language, leading to the colossal inability to parse it deterministically across all clients, and the reliance on vendor-specific "extensions" to the language in the grasp for market share. So while it's elegant in its simplicity and utility, it will likely remain hopelessly polluted until it is deprecated and replaced.
CSS (style sheets) arose from the inevitable compromise between graphic designers and the architects of the worldwide web. The original intent was that plain text would be acceptable on the web (as was the norm for the bulletin boards it was designed to replace), but the author could add optional "markup" to suggest how an intelligent client could display it more meaningfully. But within only a few years, this had degenerated into the need to be "pixel-compatible" among Netscape and Internet Explorer. And efforts to interest professional graphic artists in web content production fell flat until the technology allowed them to practice their art. "Well, it might display differently in a different browser," got a universal middle finger from artists. Hence CSS was born in the artistic sense to give content developers more control over the layout engine. But from the computer-science standpoint, CSS also provides the property of inheritable data classes. This offers the clear win of having a single point of truth in a data set: e.g., I'm going to specify the typeface, color, and size of text for things like headings, and subheadings can inherit and use as much of that as they want. If you change your heading color to robin's egg blue, you only have to change it in one place.
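The single-point-of-truth idea can be sketched in a few lines of CSS (the selectors and the color are hypothetical, just for illustration): one shared rule holds the common heading properties, so changing the color there changes every heading at once.

```css
/* Hypothetical sketch: shared heading properties live in exactly one rule. */
h1, h2, h3 { color: #00a8c6; font-family: Georgia, serif; } /* single point of truth */
h1 { font-size: 2em; }
h2 { font-size: 1.5em; }  /* picks up color and typeface from the shared rule */
```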
The syntax and meaning of CSS statements are clear and mostly self-evident, so CSS would be considered a transparent
[2] language. HTML and CSS are meant to go hand-in-hand, so I lumped them together.
Javascript. File under "Probably Should Not Exist." This language was born inside Netscape as the result of internal turmoil over how best to support rich content on the WWW. The company's master plan was to use Java as the language for the "serious" rich-content engine, and the lighter-weight Javascript as the casual method. Hence the name, even though the language bears little if any resemblance to Java and is controlled by completely different groups. Back then, the philosophy was still that everyone should be able to make a basic web page using only simple text editors and hand-written code, even if authoring tools would develop later.
Integrating Java with the web browser proved as practical as integrating an elephant with a Cadillac, so the notoriously buggy and resource-intensive Java never really caught on for rich WWW content. Further, the decision was made early on that Java rich content would be restricted to "applets" and that it would interact with browser internals only through the plug-in architecture. This made it difficult to mix rich and static content, such as is now provided in the Document Object Model. Hence Javascript was lighter, easier to develop in, and better integrated with the page content as the content authors perceived it. Therefore, for all its faults, it better fit what authors were looking for. And with much use comes the demand for a fuller feature set, which led to an organic and disorderly expansion of the language.
Today Javascript is aimed largely at making a web page seem more like an interactive desktop program. It's clearly what users want out of the web experience some 20 years after its inception, but the undisciplined approach to expansion exposes a risk. Everyone wants faster cars, but no one wants to wear seat belts. So the problem is not so much with Javascript as a language as it is with the architecture that makes something like Javascript necessary.
Java. One has to carefully distinguish between Java the language and Java the phenomenon. As a language it leaves much to be desired. It was designed when object-oriented programming was considered to be the final evolution of computational paradigms. This would not be so bad if classes weren't the only available aggregation of data and functionality. Everything is a class in Java, even things that shouldn't be. Further, Java's object model is crippled. There is no multiple inheritance, leading to notoriously unnavigable class hierarchies that do little more than work around the object model. There is no guaranteed destruction of objects. And polymorphism is poorly supported.
One of the things the Java language gets right is introspection, the ability to programmatically inspect and traverse an anonymous data object. It was one of the first languages to support it and thus one of the first languages to effectively use local/remote transparency
[3]. And in contrast to the broad failures of its object model, a finer-grained expression of the inheritance it does support makes class hierarchies and roles somewhat clearer. One can, for example, define an interface and an implementation class.
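The interface/implementation split is not specific to Java; here is a minimal sketch of the same idea in Python (the class names are hypothetical), with an abstract base class playing the role of a Java interface.

```python
from abc import ABC, abstractmethod

class Shape(ABC):                      # plays the role of a Java interface
    @abstractmethod
    def area(self) -> float: ...

class Square(Shape):                   # the implementation class
    def __init__(self, side: float):
        self.side = side

    def area(self) -> float:
        return self.side * self.side

# Client code programs against the interface, not the implementation,
# which is what makes the roles in the hierarchy clear.
def report(shape: Shape) -> float:
    return shape.area()
```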
Java the phenomenon bears more attention. There's good and bad. Among the good is a vast infrastructure of solid commercial code that can be relatively easily deployed to implement a business computing model. And at the other end is the emergence of virtual machines compact enough to run on mobile devices; Java's resource requirements were previously legendary.
Foremost among the bad is the personality cult. Those who use Java a lot often fall under the spell of thinking that because Java is good for some things, it should be used for all things. This has led to some highly inappropriate uses of Java in contexts like system programming, and to some frankly anti-engineering activity in companies that rely heavily on software: Java evangelists push the use of Java where it is clearly contraindicated. Strings in Java are inexcusably mis-implemented. And then you have the chronically broken interoperability promise: Java has had 20 years to become a "write once, run everywhere" language, but has suffered for the same length of time from brittle dependency on its runtime environment.
C++. C++ is three of my favorite languages. As the first object-oriented language intended for commercial use, it suffers from trying to be everything to everyone simply because no one yet knew what they wanted. It got the object model right in the sense that not everything has to be an object. You can write largely unmodified C and have it work also as C++; it's an almost pure superset of its parent language. And it provides a simple inheritance model that includes multiple inheritance.
The problem with any object-oriented language is how strongly to type it. C++ typing is based largely on C typing, and that leads to severe problems in polymorphism that C++ could solve only with the unholy kludge of templates. But in a remarkable turnaround, the Standard Template Library turned that weakness into a strength and gave the language a set of robust built-in data structures. Only with the advent of the STL does it make sense to write an application in C++.
C#. I've only used it for a few toy programs. I see it as a language that tries valiantly to correct many of the flaws in Java. It's a detail, I know, but I consider one of its major strengths to be first-class support for "get" and "set" methods that can have programmatic content. Let's say you represent a mechanical pocket feature as a class. A uniform pocket has a profile, a depth, a draft angle, and a fillet radius. If you're going to machine the pocket out of aluminum, the fillet radius can't be smaller than your smallest round mill bit. If you're going to injection-mold it, the draft angle can't be 90 degrees. C# allows you to assign values to these data members, but allows code to be called in specific cases to do things like validate the input. In other languages you have to use the data visibility constructs to restrict access to those data members, and the client program has to know what "setter" method to call.
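Python's @property offers a similar facility, so the pocket example can be sketched there (the class, the limit, and the units are hypothetical): assignment looks like plain data access, but validation code runs on every set.

```python
class Pocket:
    MIN_FILLET_RADIUS = 1.5  # hypothetical: smallest round mill bit, in mm

    def __init__(self, depth: float, fillet_radius: float):
        self.depth = depth
        self.fillet_radius = fillet_radius   # routed through the setter below

    @property
    def fillet_radius(self) -> float:
        return self._fillet_radius

    @fillet_radius.setter
    def fillet_radius(self, r: float) -> None:
        # Validation runs on every plain assignment; the caller never
        # has to know that a "setter" method exists.
        if r < self.MIN_FILLET_RADIUS:
            raise ValueError("fillet radius smaller than smallest mill bit")
        self._fillet_radius = r
```

The client simply writes `pocket.fillet_radius = 2.0`; an out-of-range value raises an error instead of silently producing an unmachinable part.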
French. The Fortran of natural languages -- old as hell, way too many dialects, and impossible to parse. Further, they seem to have banished all their consonants to Germany. This would be a usable language if the pronunciation didn't overload to so many written forms, all with different grammatical meanings. How many ways can you spell "/uh/"? And how many things can it mean? Also gets minus points for having its own personality cult.
Perl. The way it's used today tempts me to file it under "Probably Should Not Exist," but in fact Perl arose as a good idea. It has just gotten way, way out of hand. In Unix there is a "shell" program that implements the command-line interface. The shell gives you a usable (but somewhat syntactically opaque) Turing-complete function set, so that in addition to typing single commands you can write looping and decision constructs and use variables. Unix also provides a program called
awk, named after the initials of its authors, that provides extensive pattern-matching on text, and implements a rule-based approach: if you recognize this pattern in the input, do this action. Perl attempts to unify those highly useful functions under one roof.
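The pattern-action model that awk pioneered (and Perl absorbed) can be sketched in a few lines of Python; the patterns and actions here are hypothetical, just to show the shape of a rule-based text processor.

```python
import re

matched = []  # record of (rule name, line) pairs, for demonstration

# awk-style rules: (compiled pattern, action to run on a matching line)
rules = [
    (re.compile(r"^ERROR"),   lambda line: matched.append(("alert", line))),
    (re.compile(r"(\d+) ms"), lambda line: matched.append(("timing", line))),
]

def process(lines):
    """For each input line, run the action of every rule whose pattern matches."""
    for line in lines:
        for pattern, action in rules:
            if pattern.search(line):
                action(line)
```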
However, to say that Perl has evolved organically from its roots as the "super" shell to something unholy and despicable is the understatement of the year. A large amount of very useful code has been written in Perl. And there is no doubt that it's an extremely powerful language for system programming. But Perl evangelists don't see why their adage, "There's always more than one way to do it," ends up proliferating all those various ways to do it such that no one is really sure what the code does anymore. It's hard to write transparent Perl. Further, the Perl cult has at times lauded code obfuscation, smugly illustrating the most obscure and arcane way to do something as examples of the power of their preferred tool. As I tell my clients: "Don't be clever; be clear." So as a result, "expert" Perl programmers are more apt to write code that no one but they can maintain. This is rarely commercially viable. I've seen companies have to throw out hundreds of thousands of lines of Perl because the developer left and none of the other Perl programmers could figure the code out, since each of them typically does things one of the many other ways.
Perl's type system is a disorganized mess. Billed as a weakly-typed language, it instead introduces a new concept of contexts, which is simply an implicit fiasco of type conversions and interpretations that the programmer must learn and obey. And the same identifiers behave differently in different contexts, which are created in part by different prefix symbols: $ for scalars, @ for arrays, and % for association lists. And its storage model is also a disorganized mess. The runtime system handles memory allocation by means of garbage collection, but the introduction of an anonymous reference data type creates ambiguity in the reference counting, leading to huge memory-leak problems in persistent (i.e., long-running) Perl programs. In short, in attempting to avoid the problems of static and strong typing and explicit memory handling, Perl has simply duplicated them in a way that makes it harder to debug.
In case it's becoming obvious, I don't admire language evangelism. I consider it to be counterproductive and childish.
VBA. Never used it. With that said, I've used a few dialects of Basic simply because I studied compiler design under Tom Pittman, a principal contributor to Tiny Basic. It was "mandatory." Once you strip Basic of line numbers, the resulting structured Basic is a highly usable language, and one of those that have survived the test of time. I personally have never found the need to program for anything running a modern Microsoft operating system
[4].
NOTES

[1] Computer science pioneer Alan Turing proposed a theoretical model for computation based on a few simple abilities. These abilities can be extended to programming languages, since a language implies an underlying computational model. One of them is the ability to execute different program steps based on the value of stored data. Another is the ability to change the contents of that same stored data. This results in useful constructs such as decision structures and repetition structures which are deemed essential to any non-trivial computation.
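Concretely, those two abilities amount to conditional branching plus mutable storage; any language that has both can express repetition, as in this tiny Python sketch.

```python
# Decision structure (the while test) reads stored data;
# the loop body changes that same stored data.
n, total = 10, 0
while n > 0:        # branch on the value of stored data
    total += n      # mutate the stored data
    n -= 1
# total now holds 10 + 9 + ... + 1
```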
[2] Transparency is a desirable property of programming languages and the programs written in them: one can look at them and get a reasonably correct idea of what they're trying to do. For a program to be transparent means that one who knows the syntax of the language can read the code and say, "Oh, I see, it's checking to make sure an account balance has not gone negative." For a language to be transparent means that the syntax and the semantics are closely related and reasonably self-evident.
[3] Local/Remote transparency is the pattern where two programs running on different computers communicate over a network and exchange data, but the details of that communication are hidden at the application logic level. So a program may say, "Create a mechanical pocket feature out of this profile curve and some additional information," and obtain a description of the resulting geometry. But under the hood the program may have packaged up the profile curve and other data, sent it to a remote computation server over a network, and obtained the result back over the network. This naturally requires the two programs to agree on the format of exchanged data. In programming languages that allow introspection of data objects, one can write a general function to traverse any object and render it in a form suitable for network transport. Non-introspective languages such as C++ implement local/remote transparency only with great difficulty; either the data objects have to be specified in a meta-source, or tedious marshalling and unmarshalling functions have to be written manually for each object.
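The introspection trick can be sketched in Python (the classes here are hypothetical stand-ins for the pocket-feature data): one generic function walks any object's attributes and renders it for transport, with no per-class marshalling code.

```python
import json

def to_wire(obj):
    """Generic marshaller: introspect (via vars) to traverse any object."""
    if hasattr(obj, "__dict__"):
        return {k: to_wire(v) for k, v in vars(obj).items()}
    if isinstance(obj, (list, tuple)):
        return [to_wire(v) for v in obj]
    return obj  # assume a JSON-friendly primitive

# Hypothetical data objects; note they carry no serialization code at all.
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

class Profile:
    def __init__(self, points):
        self.points = points

wire = json.dumps(to_wire(Profile([Point(0, 0), Point(1, 2)])))
```

The same `to_wire` works for any future class, which is exactly the property the hand-written marshalling functions of non-introspective languages lack.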
[4] Surprisingly, DOS and Windows 3.x live on as embedded operating systems, even in life-safety applications. Star Trek: The Experience used Windows 3.1 on its motion-base controller. Science and engineering have moved in large measure to Linux, which is also the OS of choice in the Internet infrastructure. The financial market is still based on old IBM operating systems.