Every year in IT something comes along that is ‘super-cool’, i.e. at least two steps beyond what’s generally considered “bleeding edge”. “Bleeding edge” is called that for good reason, and what makes the “super-cool” accessible only to the deeply initiated is that it has generally been developed by some mad genius who feels no need either to make the code comprehensible or to provide any reasonable documentation. For most of us in IT, for whom the field is a living and even a personal interest, but not the entire focus of our work, social and family life, involvement in the “super-cool” means not merely bleeding but, at minimum, a slashed artery or two.
Of course, some of the most mainstream languages, frameworks and concepts used in IT started out that way: Java, when it was still known as Oak; Haskell, when “functional programming” still made most developers think “C”; Eclipse, when it was still known as “IBM Workbench”, was not yet open sourced, and the idea of incrementally compiling code in memory gave most developers the cold sweats; MVC, when it was first released from ignominy as part of the Smalltalk ecosystem and used in web frameworks such as Struts (MVC was still giving .NET developers the cold sweats ten years after that); and unit testing frameworks such as the now ubiquitous xUnit family, X being whatever language or framework you happen to want to unit test: JUnit, JSUnit, NUnit, etc. (come to think of it, that all started in Smalltalk as well, with SUnit).
A large number of such “super-cool” ideas never really make it beyond that stage, however. And of course there are those that make it some way past that stage and form a niche that sticks around but never quite becomes mainstream. The latter include things like Ruby/Rails and its successors: super-cool in the early-to-mid 2000s, niche in the later 2000s, not really considered all that important today. Haskell itself could be considered a ‘mainstream niche’ if you like, much like LISP or PROLOG or Smalltalk itself before it.
So what differentiates Oak/Java, which now has the distinction of having more code written in it than perhaps any other programming language, from Ruby/Rails, or even Haskell/LISP/Smalltalk? Is Java really that good?
The answer to the last question is, of course, as any developer familiar with other languages will tell you, a resounding no. I’m not saying it’s terrible, but I wouldn’t classify it as better than decent at most tasks. For any specific task there is probably a more efficient language, but Java’s sheer market presence, in terms of available frameworks, libraries and, most importantly, skilled developers, makes choosing something else a difficult and usually unpopular decision. That, of course, assumes it somehow differentiated itself sufficiently in the first place to build that market dominance.
Answering the question of why Java has been so successful requires first asking why other languages and frameworks have not enjoyed similar success. Why does a given “super-cool” new idea fail to gain enough traction either to become a stable niche or to go further and become mainstream?
The most immediate barrier is the bleeding-edge nature of most new ideas: a lack of decent tools, and arcane syntax, whether in the language itself, in the APIs in the case of frameworks and libraries, or in the configuration files of many languages and frameworks. This puts off anyone who doesn’t spend all of their non-work social and family time discussing the newest tech invention and figuring out how to use it and, as importantly, what to use it for, if anything.
Earlier in the history of computing, above-average resource requirements, above-average cost if the idea was proprietary, unfamiliar syntax, or lack of availability on the most commonly available machines were often among the biggest barriers to acceptance. Some combination of the above was the reason for the niche status of Smalltalk, HyperCard (had HyperCard been available on PCs, it’s doubtful that HTML would ever have taken off), Objective-C, LISP, and other innovative ideas.
Today a new idea has the advantage of portable low-level compilers that allow it to be made available pretty much everywhere; a developer base accustomed to a wider variety of syntax in language structures; hardware sufficiently cheap and powerful to run most ideas on machines that people actually have; and a recognition by companies that they have to build a market before they can make money off a new idea.
So the old issues that plagued things like HyperCard or Objective-C are largely gone. The question, then, becomes primarily what can be done with the new idea that the idea makes significantly easier, more efficient, or in some other way noticeably advantageous over using a mainstream technology. As a case in point we can look at Ruby/Rails, which garnered enough interest to become a significant niche, but a niche that, by all appearances, is slowly losing adherents year by year.
The strength of Ruby/Rails was simple: if you wanted to create a basic, CRUD-based web application without writing a lot of rote code to handle simple things like page navigation, the Rails generator essentially created all that scaffolding code and configuration for you, as the sketch below illustrates. That was a big advantage at the time over manually writing Struts configuration files, or those of whatever Java framework took your fancy, and it made the creation of decently dynamic web applications much cheaper in terms of developer effort.
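As a rough illustration, the one-liner that did the heavy lifting looked something like this (the exact invocation varies by Rails version; this is the later `rails generate` form, and `Post` with its fields is just a hypothetical model):

```
# Generate a model, migration, controller, views and routes
# for a hypothetical Post resource in one step:
rails generate scaffold Post title:string body:text
rake db:migrate   # apply the generated database migration
```

So what prevented it from taking over the mainstream?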
Ruby was a new language, and while similar in many respects to Smalltalk (where the Rails concept had a precursor in Seaside), it had some issues. In trying to create a “Smalltalk without Smalltalk”, i.e. one without the need for a Smalltalk-style VM to do anything at all, Ruby is not properly virtualized. Plenty of Ruby gems (prepackaged libraries) require calls to specific C libraries that may or may not be present on a given platform, making cross-platform development more difficult than it should be.
The lack of a Ruby “environment” (analogous to a Smalltalk environment) meant that the tools were relatively primitive and difficult to develop further. Most of this could be worked around easily enough as long as the generated application did most of what was needed; but it was precisely when trying to extend a Rails app with more complex business logic that the relative scarcity of libraries, and the lack of proper virtualization in those that did exist, made Ruby apps suddenly increase dramatically in cost.
Going back to Java, of course Java had some similar issues, and plenty more besides, so why did it take off to such a massive degree?
A huge part of the answer was simply timing. Java came along precisely when the limitations of CGI for building web applications were becoming apparent, and the growing reach and interactivity of those applications was creating demand for a more capable server platform. Although Java as a client-side language has never had more than limited success, the introduction of server-side web technologies such as Servlets and JSP, and frameworks such as Struts, made Java the most attractive platform for web application server development.

The addition of the J2EE technologies, many of which were only necessary in very specific niches (EJBs for distributed transactions, for instance), gave Java the capability of going far beyond where 99% of applications needed to go at the time. But because Java supported those niches while ASP and other competing technologies didn’t, a company that needed niche functionality in one application was likely to choose Java as the platform of choice for all the others as well. Plenty was done using ASP and later Microsoft technologies, and plenty using PHP and related technologies, but neither supported the broadest spectrum of requirements at the largest companies, or did so only much later, by which time Java was too entrenched for .NET or PHP to make major inroads other than in specific use cases that didn’t need significant integration with the already large base of Java server applications in use.
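For a sense of what this looked like in practice, here is a minimal sketch of a servlet using the classic javax.servlet API of that era (the class name and parameter are illustrative, and deployment would also have required a web.xml mapping): instead of spawning a process per request as CGI did, the container keeps one servlet instance alive and dispatches each HTTP request to it as a method call.

```java
import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// A minimal servlet: the container instantiates this class once
// and calls doGet() for each incoming GET request, avoiding the
// per-request process fork that made CGI expensive.
public class HelloServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        response.setContentType("text/html");
        PrintWriter out = response.getWriter();
        // "name" is an illustrative query parameter, e.g. /hello?name=World
        out.println("<html><body>Hello, "
                + request.getParameter("name") + "</body></html>");
    }
}
```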
And of course Java has not stood still itself: where other languages and frameworks have shown its limitations to be just that, the Java ecosystem has had the resilience to create something either as good or at least workable, so that a significant advantage of a new idea becomes at most a minor one. And the tendency for new ideas to be incorporated into the Java ecosystem (for instance Jython, which compiles Python to Java bytecode so it can take advantage of the wealth of Java server features) tends to keep those ideas in niche areas while simultaneously strengthening Java as the mainstream go-to language.
To shift gears a little, I’m going to look next at another once “super-cool” new idea, node.js, and what has happened to it on its way towards becoming a mainstream toolkit.