Programming, at its core, is the willful manipulation of metaphor. This may sound like a lesson more appropriate for an English literature class than a column on the nature of coding, but that statement nonetheless describes not only the sum of software evolution over the last fifty years but likely also the arc of computing over the next fifty. Metaphors are lazy tricksters, convenient mnemonics that become new realities as people forget the reason for the mnemonics in the first place. The shifting of electrical states within transistors becomes tokenized as numeric codes, which in turn receive the first level of nomenclature as short assembler instructions.
Yet assembler does not occur in isolation - patterns emerge, and those patterns can be named and codified in turn, providing the second level of abstraction. We build compilers to translate these abstractions into the appropriate machine instructions, and the compilers then define languages such as C … but only once systems become fast and efficient enough that the compilation process makes sense. Lines of code form patterns which get resolved as functions, and functional programming in turn creates libraries of code that pave the way toward the first level of object-orientedness. Languages such as C++ emerge, and Java, and in time others as well. Yet even here the levels of abstraction begin to fail when the complexity of the frameworks becomes too large, too all-encompassing, for any one person to ever completely articulate.
Abstraction involves two facets: formal codification of the patterns that emerge in a language, and a drive toward simplification, so that a minimal number of patterns forms the next level of abstraction. The driving principle behind nearly every language produced in the last half century has been to create something simpler than what preceded it. Declarative programming emerged from the document world and from the academic realms where graduate students worked to create their own abstractions, based not upon the verb “to do” but upon the verb “to be”. Action is always the easier solution in the short run, but action without structure only creates chaos. It is perhaps not surprising that every major change in imperative programming ultimately came about through the introduction of some declarative formalism into imperative languages.
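The distinction between the two verbs can be made concrete with a small sketch (the sum-of-squares task here is my own invented illustration, not tied to any particular language): the imperative version prescribes steps that mutate state, while the declarative version simply states what the result is.

```python
numbers = [1, 2, 3, 4]

# Imperative - "to do": prescribe the steps, mutating state as you go.
total_imperative = 0
for n in numbers:
    total_imperative += n * n

# Declarative - "to be": state what the value is, not how to build it.
total_declarative = sum(n * n for n in numbers)

assert total_imperative == total_declarative == 30
```

Both arrive at the same value; the difference is that the declarative form leaves no intermediate state for the reader (or the runtime) to track.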
The rise of XML has not occurred in a vacuum. It is declarative formalism at its most fundamental, a language for the statement of state itself. It is a mechanism for abstraction that is, not all that slowly, changing the nature of most imperative programming yet again. The future looks increasingly XML-like … whether expressed in the pointy angled brackets of formal XML or articulated as the lightweight object notation of JSON (which nonetheless satisfies, with one fairly minor exception, all of the requirements of an XML Infoset). The languages that manipulate this XML are, oddly enough, themselves weakly typed, with type more often an arbitrary assignment overlaid on a given declarative structure as an afterthought than an integral part of the language. Yet even here the dance between the order of the declarative and the chaos of the imperative continues … like an L.E. Modesitt novel, the true wizards are the ones who realize that both are necessary and must balance one another, or death - from stasis or entropy, it makes no difference - will ultimately occur.
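The equivalence between the two notations can be sketched in a few lines (the person record is invented purely for illustration): the same state, declared in angle brackets or in braces, recovers as the same structure.

```python
import json
import xml.etree.ElementTree as ET

# One piece of state, two declarative notations.
xml_doc = "<person><name>Ada</name><age>36</age></person>"
json_doc = '{"person": {"name": "Ada", "age": "36"}}'

root = ET.fromstring(xml_doc)
from_xml = {root.tag: {child.tag: child.text for child in root}}
from_json = json.loads(json_doc)

assert from_xml == from_json  # the same tree of state either way
```

The sketch deliberately sticks to simple element content; the corners where the two models diverge are exactly where the fairly minor exception lives.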
During periods of transition, the new abstraction is inevitably used first as yet another mechanism for performing the previous level of abstraction’s activities. This should not be unexpected … the human psyche is designed to repeat patterns that work until it becomes obvious that they don’t, and only then will the mind begin to explore the edges of the new metaphorical web. What we are seeing with Web 2.0 satisfies this observation well - we broke with the past by breaking, first tentatively and then with increasing impunity, the dictum that the client should be dumb, should refresh its whole state at once. This was the vision of the web circa 1995, yet old habits die hard - twelve years later, the bulk of web sites are still monolithic, and the AJAX “movement” is rapidly becoming a sea of competing, mutually incompatible “standards”, even as those who are most vested in their web presences sit on the sidelines and scratch their heads at the carnage.
During the late 1990s, the pundits popularized the phrase “paradigm shift”, perhaps pushing the metaphor that each new “paradigm” could be achieved simply by moving the stick shift up a gear. In reality, paradigm shifts seldom occur without pain and bloodshed, without losers as well as winners (and usually far more of the former). There is seldom a definitive point that marks the beginning or end of such shifts, even though the pundits and digerati will attempt to create their heroes and villains, the red-letter day that kicked things off. Technological change typically takes place in a dozen or a hundred places simultaneously, as the level of energy in the abstraction reaches the edge of the potential well and jumps to a new quantum level.
Indeed, quantum mechanics and technological innovation share much in common - yet another metaphor that can be carried quite a ways before it breaks down. Transitions are not smooth: both must overcome potential barriers of energy (money, ideas, communication, caffeine), and until those barriers are breached a seeming stasis prevails. The more energy in the system, the smaller the next quantum step and the smaller the consequences of the change. In time, emergent phenomena occur: the electrons leave the confines of the atomic wells entirely and become a free-flowing plasma, with behaviors that seem nonsensical at the lower levels but make consistent sense at the upper ones - the level of abstraction has changed.
It is said that programming (or technology in general - the same was true in the age of radio and the age of the telegraph) is the province of the young. This is largely correct, but not because the young are better able to handle the rigors of all-nighters. The young, in general, have grown up with newer metaphors, and as such have a smaller distance to jump as the paradigm shifts around them. As they get older, their metaphors become increasingly dated, increasingly tied to the cooling, solidifying layers of infrastructure upon which the next generation builds its piece of the cathedral. Occasionally there is a last hurrah, a Y2K that brings out the builders lower down to shore up the structure, to fix problems that became apparent only long after they had ceased their labors.
Perhaps this is why we tend to venerate elegance as we get older; whether in bridges or server software, the elegant solutions are the ones that last, the ones that stand most effectively under the stresses of that which is built on top of them. Too many of the major problems in today’s infrastructures exist because elegance was replaced by expediency. This is why I suspect that, in the long run, open source infrastructure will end up becoming dominant - such projects typically have not had the pressure of quarterly-earnings-driven deadlines, which in turn replace elegance with expediency. In the short run, expediency wins, but the emerging web is increasingly built on elegance. That should say something profound about the relative merits of the two.
This post originally came from my new personal blog site The Metaphorical Web, a site where I will likely deal more with the metaphorical than with software issues per se, looking at the bigger issues of design and architecture, responsibility, and perhaps even art and beauty … for poetry, like programming, emerges from metaphor, and when looked at in the right light, sometimes it’s hard to tell one from the other.