The Continuum of Abstraction
Universal at Both Ends
Collisions between rocks are governed by the universal, inexorable laws of physics. Collisions between wolves and moose are governed by specific local constraints that result from the historical process of natural selection. Newton’s laws are universal; teeth and tendons are not. A quite different form of universality is found in the algorithmic rules of the Turing Machine, which are sufficiently universal to compute any function that can be computed.
How can it be that physics and computation are both universal, when they are so clearly different? “The problem and the attraction of physics and computation as bases for models is that they are both universal, but complementary, modes of description,” writes Howard Pattee. “Everything must obey physical laws, we assume, even if our descriptions of these laws always have some inaccuracies. Computers are universal and conventional; everything can be described by a computer convention if it can be described by any other convention” (emphasis his).1
The connection between the two is as old as computation itself. One of the field’s pioneers, Ada Lovelace, notes that “in enabling mechanism to combine together general symbols in successions of unlimited variety and extent, a uniting link is established between the operations of matter and the abstract mental processes.”2 This was in 1843.
Physics is the quintessence of rate dependence and computation of rate independence, physics of the concrete and computation of the abstract. Imagine, then, a continuum of abstraction anchored at each extreme by one of these universal models: physics at one end and computation at the other, or, as Lovelace would say, the operations of matter at one end and the abstract mental processes at the other.
Arrayed between the universal extremes on the continuum are the particulars of systems of sequences—Lovelace’s general symbols in successions of unlimited variety and extent—as they have evolved in the living world and human civilization. While at one end we find the pure rate dependence of physics and at the other the pure rate independence of computation, at intermediate points rate dependence and rate independence are more or less entangled. The more entangled they are, the more they resemble dynamic physical systems. Conversely, as they become more abstract, their rate-dependent and rate-independent elements become easier to distinguish, with the rate-independent element more closely resembling computation.3
Along the continuum between physics and computation reside three broad groupings: allosteric constraints, entangled sequences, and control hierarchies, with allosteric constraints near the physics terminus and control hierarchies near the computation terminus. Traveling from left to right, we see rate dependence slowly giving way to rate independence, Michael Tomasello’s drift to the arbitrary.4
Leftmost is physics, governed by laws which are universal, inexorable, and incorporeal, and which exhibit behaviors that are rate-dependent. We accept the laws with the understanding that, in this universe at least, they apply everywhere and cannot be other than what they are. Strip away every arbitrary constraint in the living world, every element of what we would recognize as improbable but coherent behavior, and you are left with good old physics, the foundation in which all systems of sequences are grounded.
Moving to the right, allosteric constraints include things like traffic lights, alarm calls of monkeys, trail pheromones of ants, icons used in signage, gestures like pointing, and regulatory molecules that bind to allosteric enzymes. These are not linear patterns, but they nonetheless behave as specific constraints within—and only within—a larger system of sequences. The function of an allosteric constraint is to configure an interactor, to get the monkey to evade a predator or the ant to follow a different path, to redirect the gaze to a particular affordance, to activate or repress an enzyme’s function. Allosteric constraints can only configure interactors, and only reversibly; they cannot construct or replicate them. Thus, they rely on a population of prefabricated interactors.
Examined in isolation, allosteric constraints can be explained in physical terms. Out of context, a regulatory molecule binding to an allosteric enzyme looks like ordinary chemistry. In the absence of drivers to be constrained, a traffic light is just electromagnetic radiation. In the absence of ants to be diverted, a pheromone is just a molecule. “A molecule becomes a message,” says Pattee, “only in the context of a larger system of physical constraints which I have called a ‘language’ in analogy to our normal usage of the concept of message” (emphasis his).5
Moving another step to the right on the continuum we find entangled sequences, in which there is no clear division of labor between dynamic interaction and sequential storage. The rate-dependent and rate-independent elements cannot always be readily distinguished. Two examples of entangled systems are the RNA world and preliterate human culture. The ribozyme can interact and replicate, but it cannot do both at the same time. In human speech, sequential information is accompanied by dynamic elements of paralanguage and gesture.

FIGURE 9.1
Neither system supports scalable random access, which requires stable, long-term storage of one-dimensional patterns. And neither can generate orders-of-magnitude improvement in either the precision of pattern recognition or the power of catalytic manipulation. Systems of entangled sequences have evolved a limited degree of abstraction, but their rate independence is always infused with some rate-dependent dynamics. This limits how large, coherent, and complex they can become. Immediately to their right, John von Neumann’s threshold of complication remains uncrossed.
To cross the threshold, we need control hierarchies in which the rate-independent and rate-dependent elements are decisively separated. As with cells and our literate technological civilization, the storage/replication element has evolved to be independent of the interactive/functional element. However, even these systems fail to completely escape entanglement with their physical mechanisms. They depend on lots of improbable equipment, like ribosomes, spliceosomes, and human animals, as well as ineffable processes like protein and chromosome folding and whatever goes on in the brain.
The division of labor between the rate-dependent and rate-independent allows these systems to cross von Neumann’s threshold, making them capable of open-ended evolutionary creativity, Lovelace’s unlimited variety and extent. They can scale up and diversify, and there is no upper limit to their complexity; further reclassification is always possible. Here we get our first glimpse of Francis Crick’s Central Dogma: the division of labor between replication and interaction entails the one-way constraint of interactors by sequences.
When we arrive at the right-hand pole, we find the formal sequence systems of computation. Defined relationships within and among the rate-independent linear patterns comprise the entire system; there is no entanglement. Sequences are manipulated and rewritten according to formal rules, which are sequences themselves. Meaning is not a system requirement. Algebra students can solve for x without knowing what x stands for.
Semantics in computation is deliberate, in L. S. Vygotsky’s sense; if there is to be any meaning, it must be assigned from outside the system.6 And unlike control hierarchies, which require an elaborate and improbable hardware platform in order to operate, the sequence processing mechanism for formal sequences is arbitrary. There are many ways to instantiate a Turing machine.
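The claim that there are many ways to instantiate a Turing machine can be made concrete with a toy sketch. The code below is a hypothetical illustration, not something from the text: a rule table, itself just a sequence of (state, symbol) rewritings, is executed by a mechanism whose physical substrate is arbitrary; the same table could be run on silicon, relays, or pencil and paper, and the result would be identical.

```python
# A minimal Turing machine sketch (hypothetical example for illustration).
# The rules are pure rate-independent sequence rewriting: what matters is
# the pattern of symbols, not the speed or substrate of the mechanism.

def run_turing_machine(tape, rules, state="start", head=0, max_steps=1000):
    """Apply transitions (state, symbol) -> (state, write, move) until halt."""
    cells = dict(enumerate(tape))  # sparse tape; "_" is the blank symbol
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, "_")
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# A rule table that flips every bit, then halts at the first blank.
flip_rules = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run_turing_machine("0110", flip_rules))  # -> 1001
```

Note that the program assigns no meaning to "0" and "1"; any semantics, as Vygotsky would have it, must come from outside the system.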
Like every model, the continuum is an oversimplification. Nonetheless, it provides a quick way to envision the evolution of complexity. At one end is symbolic information, at the other end physical systems, and arrayed between them are the mechanisms that have emerged to allow the former to actually get control of the latter.