Complexity Miscellany for Beginners: Part II
“The head of Matter did that which the energy-beings could do no longer and it wept for all humanity, and for the fragile beauty of the bodies they had once given up, a trillion years ago.” —Isaac Asimov.
Unlike physics, biology is concerned less with quantifying objects than with describing processes. Although it is possible to define this field as the study of individuality, the central paradigm for biologists is not to define life as such, but to study its diverse varieties and consequences. Some theorists, such as David Krakauer, seek to axiomatize the science, while others, such as W. Brian Arthur, focus on carrying its methodologies into other branches of knowledge, such as economics.
Autopoiesis
In 1973 the Chilean biologists Humberto Maturana and Francisco Varela proposed a neologism whose main objective was to define the chemistry of self-maintenance in living cells: autopoiesis. In their book “De Máquinas y Seres Vivos” (“Of Machines and Living Beings”) they explain that autopoiesis is the capacity of a system to maintain its structural stability, by absorbing energy from the environment and continuously regulating itself, despite never being in strict thermodynamic equilibrium.
Note that the definition of autopoiesis is not restricted to living beings; autopoietic entities in general are those capable of maintaining their autonomy and a continuity in their patterns. Nor is the idea entirely new: when we describe the feedback processes at work in galactic formation, we are describing a kind of autopoietic process. The concept had a great impact on general systems theory; the German sociologist Niklas Luhmann, for example, extended autopoiesis to characterize societies as self-organized structures.
Despite the novelty of this concept, autopoiesis does not as such explain the origin of life. In 1952, Alan Turing published “The Chemical Basis of Morphogenesis”, a paper explaining how the symmetry of an initially uniform chemical mixture could be spontaneously broken by the diffusion of various substances through the mixture. At first glance, Turing’s proposal seems counterintuitive: diffusion normally smooths differences away rather than creating them. The key to resolving this apparent contradiction, a diffusive process building patterns instead of destroying them, is catalysis.
Turing deduced that patterns can emerge in a mixture of two chemicals A and B if the catalyst A promotes not only the production of more A, but also that of the second compound B, an inhibitor whose effect is to slow down the rate at which A is produced; crucially, the inhibitor must diffuse through the mixture faster than the activator. It was not until 1968 that this simple mechanism described by Turing was recognized in the Belousov–Zhabotinsky (BZ) reaction; this oscillating chemical reaction is today one of the most classic examples used to explain the order behind chaos.
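To make the mechanism concrete, here is a minimal sketch of a one-dimensional activator-inhibitor simulation in Python. It uses a Gierer-Meinhardt-type system, one standard realization of Turing's idea; all parameter values are illustrative assumptions, not taken from Turing's paper.

```python
import numpy as np

# A 1D activator-inhibitor system of the Gierer-Meinhardt type.
# a: Turing's autocatalyst A; h: the inhibitor B that A produces.
# The essential ingredient is that the inhibitor diffuses much
# faster than the activator (Dh >> Da).
n, steps, dt, dx = 200, 100_000, 0.005, 1.0
Da, Dh = 0.5, 25.0            # diffusion coefficients, Dh/Da = 50
mu, nu, rho = 0.5, 1.0, 1.0   # decay rates and production rate

rng = np.random.default_rng(0)
a = 2.0 + 0.01 * rng.standard_normal(n)  # tiny noise around the
h = 4.0 + 0.01 * rng.standard_normal(n)  # uniform steady state

def laplacian(u):
    # second spatial derivative with periodic boundaries
    return (np.roll(u, 1) - 2.0 * u + np.roll(u, -1)) / dx**2

for _ in range(steps):
    # A catalyzes itself (a**2) and is held back by the inhibitor h;
    # A also drives the production of h, which decays at rate nu.
    da = rho * a**2 / h - mu * a + Da * laplacian(a)
    dh = rho * a**2 - nu * h + Dh * laplacian(h)
    a += dt * da
    h += dt * dh
    h = np.maximum(h, 1e-9)  # keep the division well defined

# Diffusion has not erased the initial noise: it has amplified it
# into regularly spaced activator peaks, a 1D "stripe" pattern.
print("activator min/max:", a.min().round(3), a.max().round(3))
```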
The greatest triumph of Turing’s mechanism concerns the way in which markings such as stripes and spots develop on the skin or fur of mammals and, more generally, the way certain patterns form on the surfaces of other animals such as fish or insects. If you are interested in studying Turing patterns, you can peruse the work of Maini and Murray published in 1988.
Criticality
Decades later, in the 1980s, the physicist Per Bak became interested in the complex behavior of systems hovering at the edge of chaos. Bak found that power laws and pink noise appear again and again in large structures composed of many parts. He also realized how important it is for such processes to be open, i.e. to have a supply of energy from the outside. This is essential for the existence of a state that came to be called self-organized criticality.
We have alluded to this abstraction before; to continue our journey through the landscapes of complexity, we will illustrate the concept once and for all with an analogy that may cost you a good scolding. Go for the nearest salt shaker and pour half of its contents onto a clear table; the spilled salt should form a small mountain full of slopes and ridges. Now take some salt with your fingers and gradually sprinkle the grains over the top of that salty mountain.
The salt continues to pile up in a heap until the slope of the pile reaches a critical value; if we keep adding grains, a landslide occurs and the heap partially collapses. If you have not yet been discovered by your roommates (if any), you can use the remaining salt to continue the process: the average amount of salt in the pile now stays the same, because the salt lost in collapses balances the salt being added.
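Our salt pile has an idealized counterpart that is easy to simulate: the Bak-Tang-Wiesenfeld sandpile model, the original toy model of this phenomenon. Below is a minimal Python sketch; the grid size and number of grains are arbitrary choices.

```python
import numpy as np

# The Bak-Tang-Wiesenfeld sandpile: our salt pile on a grid.
L = 50
grid = np.zeros((L, L), dtype=int)
rng = np.random.default_rng(42)
avalanche_sizes = []

for _ in range(20_000):
    # sprinkle one grain at a random site
    x, y = rng.integers(0, L, size=2)
    grid[x, y] += 1

    # any site holding 4 or more grains topples, sending one grain to
    # each neighbor; grains falling off the edge of the table are lost
    size = 0
    while (grid >= 4).any():
        for i, j in np.argwhere(grid >= 4):
            grid[i, j] -= 4
            size += 1
            for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
                if 0 <= ni < L and 0 <= nj < L:
                    grid[ni, nj] += 1
    if size:
        avalanche_sizes.append(size)

# Once the pile reaches the critical state, avalanche sizes have no
# typical scale: mostly tiny slides, occasionally enormous ones.
sizes = np.array(avalanche_sizes)
print("largest avalanche:", sizes.max(), "| mean size:", sizes.mean().round(2))
```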
In this case we say that the configuration is in a state of self-organized criticality: the addition of a single grain of salt can trigger an avalanche of any size, and the system, continually fed by the flux of newly dropped grains, always stays close to the critical point. Whatever the origins of life, theories involving networks and self-organized criticality provide powerful new insights into how living structures function once they have arisen. So what does self-organized criticality look like in a real biological scenario?
Interdependence
Stuart Kauffman’s work on random Boolean networks, begun in 1969, exemplifies much of what we have just stated. He was interested in how the machinery inside the cell works. Today we know that all the relevant information of the cell lies in the genetic code: the instructions for making proteins and giving them a function are found in the DNA, whose primordial units are four types of nucleotide bases. Adenine (A), thymine (T), cytosine (C) and guanine (G) form huge chains that carry the cell’s knowledge; combining these four letters, nature can specify any living entity there is. The source code of nature is written in base 4!
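As a quick back-of-the-envelope check of that base-4 claim, a strand of length $n$ admits $4^n$ distinct sequences:

```python
# With a four-letter alphabet, a strand of length n admits 4**n distinct
# sequences; n = 3 already gives the 64 possible codons, comfortably
# enough to encode the 20 standard amino acids plus start/stop signals.
for n in (3, 10, 100):
    print(f"length {n}: {4**n} possible sequences")
```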
The Human Genome Project has shown that roughly 20,000 to 25,000 genes suffice to specify a human creature, far fewer than the 30,000 to 100,000 once estimated. All of them are present in the DNA of any cell in the human body, but not all of them are active at the same time. Cells therefore specialize: in a fully developed Homo sapiens sapiens there are, by some counts, over 1,200 different types of specialized cells.
Genes are not independent of each other; activating or deactivating one gene can affect the state of other genes, which is the biggest headache in genetic engineering. We can imagine the cellular machinery as a network in which each gene is a node and the links represent the interactions between genes. Since the number of genes involved is around 20,000, the problem is difficult even with the help of a computer.
Kauffman uses an analogy in which the nodes of this intricate directed network are light bulbs, resembling Christmas lights. Each bulb can be in one of two states, on or off, so if there are a total of $N$ bulbs in the string, the number of possible states is $2^N$. If we start the system in some random initial state, where some of the bulbs are on and some are off, the lights will flicker and change according to a particular set of Boolean rules.
Because the number of possible states is finite, the dynamics must sooner or later revisit a state it has seen before, and from then on it repeats, running the same cycle from one state to another forever. These recurring sequences are called state cycles and act as attractors of the system; in some cases they are so powerful that, regardless of the initial configuration, sooner or later we reach the same recursion.
Kauffman and his colleagues showed that the only networks that behave in a way complicated enough to be interesting, yet stable enough to understand, are those with exactly two connections per node. In this type of structure each cycle has a length roughly equal to the square root of the number of nodes, which led Kauffman to the following reasoning: “If there are around 20,000 genes in the human genome, and there are over 1,200 different cell types in the human body, could it be that each cell type represents a particular cycle of states in the human genome?”
The answer is yes; in fact, Kauffman showed that there is a power law with exponent $1/2$ relating the amount of DNA in a species to the number of different cell types its organisms can have. Thus, governed by the chemical machinery, in each cell type only a minority of the genes blink, switching on and off in sequence and forming a state cycle a few hundred steps long; this cycle completely determines the function of the cell.
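To see the argument in action, here is a minimal Python sketch of such a random Boolean network with $K = 2$ inputs per bulb; the network size, seed and step limit are illustrative assumptions.

```python
import numpy as np

# A Kauffman random Boolean network with K = 2 inputs per bulb.
def random_network(n, k, rng):
    inputs = rng.integers(0, n, size=(n, k))    # which bulbs feed each bulb
    rules = rng.integers(0, 2, size=(n, 2**k))  # a random truth table each
    return inputs, rules

def step(state, inputs, rules):
    # read each bulb's K inputs as a binary index into its truth table
    idx = (state[inputs] * (1 << np.arange(inputs.shape[1]))).sum(axis=1)
    return rules[np.arange(len(state)), idx]

def cycle_length(n, k, rng, max_steps=5_000):
    inputs, rules = random_network(n, k, rng)
    state = rng.integers(0, 2, size=n)
    seen = {}
    for t in range(max_steps):
        key = state.tobytes()
        if key in seen:              # a state has repeated: a state cycle
            return t - seen[key]
        seen[key] = t
        state = step(state, inputs, rules)
    return None                      # cycle longer than our patience

rng = np.random.default_rng(1)
n = 400
lengths = [c for c in (cycle_length(n, 2, rng) for _ in range(50)) if c]
# Kauffman's classic estimate: typical cycle lengths scale like sqrt(N)
print("median cycle:", np.median(lengths), "| sqrt(N):", round(n**0.5, 1))
```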
For Kauffman, living organisms emerged when the network of connections between the chemical substances in the primordial broth underwent a phase transition into an autocatalytic set of reactions, one that has sustained, sustains and will sustain itself. If the network is insufficiently connected, there are no living beings; but it is enough to add one or two more connections for life to become not only possible, but inevitable. Like that pile of salt, life is also a system with self-organized criticality, at the edge of chaos.
After all of the above, we hope it is clear to you that the complexity of living beings rests on a profound and elegant simplicity. To conclude the biological part of our trilogy, we will approach the concept of evolution, since it will allow us to see that evolutionary processes also exhibit criticality.
Evolution
Evolution is a fact. Just as the law of universal gravitation, developed by Isaac Newton in the second half of the 17th century, explains how the planets come to describe elliptical orbits, the theory of natural selection, formulated by Charles Darwin in the second half of the 19th century, is simply a mechanism that offers an explanation of how evolution occurs.
The essence of Darwinian theory can be summarized in three steps. First, characteristics are inherited from one generation to the next. Second, the process that copies characteristics from one generation to transmit them to the next is not perfect, so there are slight variations among the individuals of each generation. Finally, more individuals are born in each generation than survive to become adults and reproduce.
We can summarize the above in a single statement: the best adapted individuals are those that survive and reproduce to transmit their characteristics to the next generation, a process that fits the members of a species ever more closely to their ecological niches. However, species do not evolve in isolation; they must compete with other species as they adapt to their environment, and when any species in this chain of interactions becomes extinct or multiplies exponentially, it directly or indirectly affects the dynamics of the other species.
In coevolution, a process in which all the species woven into an ecological network undergo changes when one of them changes, each species acts to maximize its own evolutionary fitness. This naturally drives ecosystems to the edge of chaos, so the ecological network is also an architecture with self-organized criticality. The sizes of mass extinctions follow a power law; in other words, the extinction of one or several species can occur without the help of any external intervention.
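A classic toy model of this kind of coevolutionary criticality is the Bak-Sneppen model; the following minimal Python sketch (with illustrative population size and step count) shows an ecosystem organizing itself toward a critical state.

```python
import numpy as np

# The Bak-Sneppen model: each site on a ring is a species with a fitness
# between 0 and 1. At every step the least fit species "goes extinct" and
# is replaced by a mutant, dragging its two neighbors along with it.
rng = np.random.default_rng(7)
n_species = 200
fitness = rng.random(n_species)

for _ in range(200_000):
    worst = int(fitness.argmin())
    # replacing the weakest species perturbs its neighbors in the web
    for i in (worst - 1, worst, (worst + 1) % n_species):
        fitness[i] = rng.random()

# The ecosystem self-organizes: fitness values pile up above a critical
# threshold (about 0.667 in this 1D version), and bursts of successive
# replacements, the "extinction avalanches", come in all sizes.
print("fraction of species above 0.6:", (fitness > 0.6).mean().round(2))
```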
Under this line of argument, just as tensions build up in a seismic zone, evolution generates more and more tension in the ecological network until it breaks and the whole network, or a part of it, collapses. That is why several mass extinctions have occurred throughout history, and why our species will sooner or later take part in the next one.
Can we do anything about it? Just as there is no way to predict whether the next earthquake to shake Mexico City will be intense or mild, there is no way to know in advance whether the next wave of extinctions to sweep the ecological web will be large or small. Sooner or later the end will come. You now know that, across an extremely wide range of conditions and scales, whatever state we start from and whatever impacts we apply to living things, whether external, internal or both, we arrive at the self-organized critical state at the edge of chaos, where even a small trigger can sometimes produce a very broad change in the system as a whole.
