Most of you, dear readers, are likely technical people, dealing with computers, networks, and the Internet. Many programmers tend, when wishing to pad out a resume (and I’m not above this myself), to call themselves systems analysts. In most cases, however, these systems are remarkably self-contained: the workings of applications and daemons within a computer; a client/server system which might possibly have a database backend but is otherwise a closed loop; or, in the most extreme cases, networks of computers where the interaction is examined primarily at the level of individual nodes, because trying to take in the whole system is complicated and messy for most linear thinkers.
In short, for the most part what we know of systems (many largely autonomous nodes interacting via connections) is the linear portion of the graph: those areas where, assuming traffic is relatively low, we can reasonably make approximations about the behaviors the system will exhibit.
I have a bachelor’s degree in physics. I’m actually a pretty lousy physicist, however, because while in college I wasn’t really interested in what went on in the neatly describable domain of most textbooks. Physics textbooks generally prefer staying in the realm of the linear, because once you throw enough bodies into the equations they become unpredictable. Lorenz is best known for “strange attractors” - the quasi-periodic entities that occur when chaos begins to creep into a system - but to me the far more telling graph is the period-doubling bifurcation diagram, which shows how a single laminar stream will, at a certain water velocity, split into two streams, then four, then eight, and so forth, until there are so many streams that the flow becomes totally turbulent.
It is an oddly beautiful and powerful graph, yet it should also be permanently etched on systems analysts’ eyeballs for what it represents. The horizontal axis of this graph can be thought of as a measure, more or less, of the energy (or load) within a system, while the vertical axis represents the number of states that must exist in order to support that load. Certain aspects of the graph are fascinating from a mathematical standpoint - the fact that there is a convergent ratio between bifurcation points, the Feigenbaum number (equal to 4.669201…), for instance, is critical to understanding that chaos is not the same thing as randomness - yet the implications from a systems standpoint are just as important.
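As a quick sanity check on that convergent ratio, the r-values at which the first few period doublings of the logistic map occur are well tabulated; the values below are standard published figures (my own sketch, not part of the original argument), and a few lines of Python show the ratios of successive bifurcation intervals closing in on 4.669:

```python
# Standard published r-values for the logistic map at which cycles of
# period 2, 4, 8, and 16 first appear.
r = [3.0, 3.449490, 3.544090, 3.564407]

# The ratio of successive bifurcation intervals converges on the
# Feigenbaum constant, delta = 4.669201...
for k in range(len(r) - 2):
    ratio = (r[k + 1] - r[k]) / (r[k + 2] - r[k + 1])
    print(round(ratio, 4))
```

With only four tabulated points the convergence is rough, but the trend toward 4.669 is already visible.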
Before digging into this, I want to make a distinction. We humans tend to like easily definable systems, and so have a natural prejudice for believing that order is inherently better than chaos. In general, however, systems are usually at their most robust when they are moderately chaotic; when they are highly ordered, that is usually an indication that the system has largely collapsed. Thus it’s worth clearing our minds of any preconceptions about the moral superiority of one over the other.
In the logistic map diagram (so called because the curve is defined by the recursive relationship xₙ₊₁ = r·xₙ(1 − xₙ), known as the logistic relationship because it occurs so often in resource-management settings) there’s an interesting phenomenon: as you pump more and more energy into the system, the number of states increases to the point where they form a surprisingly dense cloud of points. It becomes harder to model the characteristics of the system, because the system can be in so many states.
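To make the period doubling concrete, here is a minimal Python sketch (my own illustration, not from the original article) that iterates the logistic relationship and counts how many distinct long-run states survive at a given load r:

```python
def logistic_orbit(r, x0=0.5, transient=1000, keep=64):
    """Iterate x_{n+1} = r * x * (1 - x), discard the transient,
    and return the distinct settled states (rounded for comparison)."""
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    states = set()
    for _ in range(keep):
        x = r * x * (1 - x)
        states.add(round(x, 6))
    return sorted(states)

# As the "energy" r rises, the system needs 1, then 2, then 4, then 8
# states to support the load - the bifurcation cascade.
for r in (2.8, 3.2, 3.5, 3.55):
    print(r, "->", len(logistic_orbit(r)), "state(s)")
```

Push r just past roughly 3.5699 and the count stops settling at all: that is the dense cloud of points described above.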
However, when you pump in energy past a certain critical threshold, you end up with voids in which the states cluster around only a handful of possible values. It took me a while to realize that this isn’t a matter of the number of states decreasing - they don’t go away - but that the states are all so close in value that, for all intents and purposes, the bifurcations start over as if from a considerably simpler part of the graph.
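Those voids show up numerically, too. Here is a small self-contained sketch (again my own illustration; r = 3.83 sits in the well-known period-3 window) showing an orbit deep in the chaotic regime collapsing back onto just three states:

```python
# Inside the chaotic region, near r = 3.83, lies the famous period-3
# window: the orbit settles onto only three values, as if the cascade
# had restarted from a much simpler part of the graph.
r, x = 3.83, 0.5
for _ in range(5000):          # let the transient die out
    x = r * x * (1 - x)

orbit = set()
for _ in range(30):
    x = r * x * (1 - x)
    orbit.add(round(x, 6))

print(sorted(orbit))           # three settled states
```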
Computer networks probably operate on the same principle: as you increase the traffic moving through a network, the system develops bottlenecks until new nodes or paths are added. So far, this has been a linear process - a simple client/server relationship does require that you throw in more servers as demand on the network rises, but in most cases this does not change the underlying characteristics of the network. However, the web has also begun to promote the establishment of standards (whether open or proprietary is largely, though not completely, irrelevant) and open source development that shift the relationship away from simple client/server systems toward more networked systems. I realize that many people would likely object to this contention, but I’ll make it nonetheless:
As the load increases on the system of the Internet, the system responds by rewarding structures that shift the load into a more asynchronous, distributed architecture.
Now, this sounds as though I’m ascribing to the Internet some level of volition or consciousness. That’s absolutely not the case. Rather, what you see here is that as you move up the logistic map, some standard will emerge that is more efficient than the one that preceded it, and the pressures of the network will tend to push other technologies and providers in the direction of that standard. Note that this does not, in fact, say anything about what the exact standard is - though if you can analyse the network at the appropriate scope, the dominant characteristics of the most effective standards will turn up in whatever actually gets written. The Web doesn’t dictate the standard; it only provides broad hints about what that standard should be.
I definitely see AJAX playing into this model, because it replaces tightly bound, synchronous services with loosely coupled, declarative, asynchronous services on the client, and in doing so also shapes the server (and in time, servers’) development process. It’s a regime change, and will likely end up becoming the dominant strange attractor until the system pushes it out in favor of even more effective solutions. A website no longer has a single “home”, but is instead made up of components that pull their data from a wide variety of different websites and web services, running them through a transformation stage on the client to paint the interfaces.
I think that AJAX itself will eventually fall by the wayside, replaced by declarative databound objects, possibly along the lines of XForms or a related technology. This, however, is simply a refinement of a larger-scale movement away from client/server dyads and toward multiply connected nexuses that just happen to have the characteristic of being displayable (and manipulable) by people. This will change the “energy” of the web as well.
Jumping away from AJAX for a bit and returning to the weather: the Earth’s weather system is just that - a complex system that shuttles around energy. In the US, especially among certain political factions, global warming is considered to be something of a red herring, something that either 1) doesn’t exist, or 2) will generally have more beneficial effects than harmful ones. There is an argument against global warming that seems rational on the surface: how can you tell whether what is going on now is in fact outside the normal bounds of variability, given that people tend to cry global warming over any kind of unusual weather? Indeed, in some places, such as northern Europe and Russia, far from any warming, this winter has brought some of the coldest temperatures in years.
On a local level, this is certainly true - distinguishing variable weather from climate change is one of the thornier conundrums that climatologists have faced for years. However, there are indications that the changes that have occurred, including some of the extreme weather of the last decade, are indicative of an increasingly heavy load being placed on the Earth’s system.
There’s a Rube Goldberg characteristic to the weather (Goldberg was a master of chaos theory) that sometimes tends to obscure the pieces at play, but ultimately what you are dealing with are feedback systems that are normally quasi-stable, yet are being pushed increasingly toward what look like bifurcation points. Something is causing the global heat content to increase. I’m not going to dive into the cause (for now), but rather look at the effects this has on the system. Global warming is causing glaciers to melt - from the Andes to the Rockies to the Alps, glaciers are disappearing at an alarming rate. Kilimanjaro’s glaciers, which provide a major source of water for the region, have all but disappeared. This is alarming from a purely local standpoint, because it will significantly reduce water in various watersheds, driving up the price of this most basic of resources.
However, it is the melting of the ice caps that may in fact be more worrisome. The edges of the polar ice caps, both north and south, are calving at a significantly higher rate than normal, and glaciers in the region are picking up speed. This speed increase is due to meltwater forming beneath the ice, acting as a lubricant that shifts the sheets of ice down glacial valleys. Curiously, the depth of polar ice near the poles themselves may actually be increasing, though I have to wonder whether this is because the warming ice is also expanding. The melt from these ice caps introduces cold fresh water into the normally high-salinity waters of the polar seas, especially in the North Atlantic, which has the largest exposure to the Arctic sea.
Dissolved salt lowers the freezing point of water - this is the reason salt is used to de-ice roads - and it also makes seawater denser, which means that diluting the polar seas with fresh meltwater disrupts the sinking of cold, salty water in the North Atlantic. Why is this so important? Well, here it’s worth understanding a basic principle of physics: nature abhors a gradient. If one region of the sea is warm and an adjacent region is cold, the warm water will flow toward the cold water, which in turn pushes the cold water back toward the warm, especially in the presence of Coriolis forces (from the rotation of the earth). You get “rivers” within the ocean. One of these rivers has long been known - it’s the Gulf Stream, which comes from the Sargasso Sea and the Caribbean, a natural “bowl” of relatively still water near the equator that consequently absorbs a lot of heat from sunlight. The Gulf Stream carries some of that heat northwards, dragging along with it a jet of warm air that, among other things, makes it possible for palm trees to live quite happily on the west coast of Scotland, which sits at roughly the latitude of St. Petersburg.
Disrupt this circulation in the Arctic and North Atlantic seas, and something bad happens. The North Atlantic begins to heat up, decreasing the flow of the Gulf Stream. The cold water that would come back south as part of this process stops moving as well, which means that the Caribbean loses its ability to dissipate heat, and the temperature of that body of water also begins to rise unchecked.
Take a quick jog over to North Africa. Global warming has also been slowly desiccating this region, expanding the Sahara desert. The Sahara has grown more than 10% in the last century and, like other systems out of control, has exceeded the initial feedback mechanisms that kept the desert in check. Deserts are a lot like cancers - up to a certain size they form an equilibrium with their environment, but when that equilibrium breaks, the weather patterns inherent in desertification tend to amplify as the desert grows. This manifests itself as a slowly moving cyclonic air mass parked more or less permanently over the Sahara, one that periodically gets goosed by local weather conditions into creating sandstorms. These sandstorms then form a hot jet that streams out toward the mid-Atlantic. Fortunately, when the weather is cool enough, this jet encounters the wall of air created by the Gulf Stream and dumps its contents into the mid-Atlantic, the warm air adding to the tropical jet that keeps northwest Europe warm.
With the Gulf Stream slowing (down roughly 30% in the last decade), this wall no longer exists. The Saharan stream of hot, dry air encounters the mid-Atlantic’s cold, wet air, and cyclones form … big ones. Normally this has happened only in late summer and early autumn, but the storm season is getting longer because the waters are warmer … the systems thing again. This means that hurricanes are entering the Caribbean with more strength than they had even a decade ago. Typically such hurricanes lose a fair amount of strength when they hit the continental shelf - shallower water contains less energy - but this speed bump is proving to be less of a barrier, given both the strength of the storms and the fact that once the hurricanes do get into the Caribbean they’re spoon-fed superhot water (and air) that turbocharges them. This means that storms like Katrina are not aberrations - they are increasingly going to be the norm.
Meanwhile, in the North Atlantic, the fading Gulf Stream is having another effect. The Gulf Stream typically pushes the arctic air so that it stays to the north, spilling out instead into Canada and Siberia. Without it, the cold air is streaming south into Europe, bringing with it extraordinarily cold temperatures - I heard figures like -40°C in Bonn last week, a new record low. Convection currents shift. The Canadian Arctic is warmer than it’s been in decades, to the extent that there is now actually a Northwest Passage in what had been, until a decade ago, ice-locked seas - and Inuit settlements are collapsing into the melting tundra. Similar problems plague Alaska and Siberia.
The graph is bifurcating; the states are changing. In the South Pacific, you have another heat pump at work, driven by the melting Antarctic and the stagnant regions of the South Pacific. Normally, this heat pump involves the sloshing back and forth of a quarter of the Pacific’s volume as a region of heat, in what is now known as the El Niño cycle. This affects us in the Pacific Northwest for two reasons. In the La Niña part of the cycle, the pattern drives a cyclonic circulation that sends a stream of hot, wet air toward the North American west coast from California north. In the El Niño part, the pattern reverses itself, and this area goes into drier conditions.
Until comparatively recently, this had a cycle of about seven years, and the cyclonic activity was relatively mild. Lately, however, the cycle has been accelerating - the water there is moving faster, not slower as it is in the Atlantic, because the heat pump is somewhat simpler. This in turn is increasing the amount of energy in the cyclones, in this case resulting in longer, hotter, more intense, and wetter jets. It has also meant that snow is becoming rarer in these parts because the air is warmer (I’m nervous about what happens when we get into the dry part of this cycle).
The interesting thing here is that cyclones are indicative of turbulence, and become more common the closer a system moves toward a bifurcation point. The environment is trying to reach a new point of stability, but there’s no way to ground out the feedback loops. I’m not sure I buy the “Day After Tomorrow” scenario, but I do think that the situation we’re in now is essentially a transition between long-term climatic stability points - we’re switching from one loop to the other in a strange attractor, and the ride’s likely to get violent.
So if your AJAX systems begin to show signs of incipient frost, run, don’t walk, to the nearest equatorial nation.
Kurt Cagle is an author and software architect, and spends entirely too much time glued to the Weather Channel.