I have been doing some work lately teaching myself some of the basics of Ruby, Python and a couple of other languages. As I was working with these languages, I was suddenly hit with a question – why? Over the course of my career, I have programmed in a lot of languages (somewhere around 20 that I have actually used to do useful work, I think). I am sure many of you can say the same thing. And do you know what? They all suck in one way or another! I have seen languages’ popularity come and go. I have seen arguments in person, in newsgroups, and all over the web which bordered on religious fanaticism. Even as I write this, a good discussion continues in response to The Next Big Language.
Again, I ask myself “why?”
Looking back over projects in which I have participated, led, observed, or otherwise been involved, I cannot think of one where the success or failure (or degree of success – failure is not usually absolute) of the project was due to the limitations of the programming language. Nor has the programming language been the determining factor in the cost of the project, or the quality or maintainability of the code.
There are so many factors widely accepted to have much greater impact on the course of a project than the choice of language/technology – requirements, architecture, realistic planning and tracking, and proper resourcing, to name a few – that I find the whole debate around programming languages to be somewhat meaningless in the real world (actually, I find it more annoying than meaningless).
This is not to say that we should stop innovating and inventing new ways of doing things (including programming languages). It does mean, however, that it is highly unlikely that any of these language advancements (or The Next Big Language, whatever it turns out to be) will make a significant difference in software development in either the corporate IT or the commercial product development world – at least not any time soon.