Evolution: On Living Systems in General

Why does life arise – and, in particular, intelligent life? Is it merely a cosmic accident, or an inevitability dictated by fundamental laws? I believe we have the answers to these questions, and they stem from concepts we are already familiar with.

Thermodynamics, information theory, and complex systems theory suggest that life and intelligence are natural outcomes of the universe's evolution and become increasingly probable over time. Humanity's collective intelligence is not an anomaly; our consciousness is deeply embedded in the very logic of the universe's existence, highlighting the profound interconnections among entropy, information, and semantic structures. Furthermore, the intelligence that may succeed ours – whose foundations we are so persistently building – is just as inevitable a stage...

Let's examine this in greater detail from the standpoint of modern science. I should note that much of what has been said so far is fairly intuitive and does not strictly require appeals to scientific theory; nonetheless, the scientific framing adds formal rigor.

Living Systems and Entropy

Regarding the fundamental drivers behind the emergence of life in general and intelligent life in particular, I share the position of Jeremy England, who proposed a theory connecting the emergence of living systems to the laws of thermodynamics – specifically, to energy dissipation and the resulting increase in entropy [1]. He mathematically demonstrated that any system (for example, a collection of atoms forming molecules) that absorbs energy from an external source (such as light) and dissipates it into the surrounding environment (for example, as heat), thereby increasing the entropy of its surroundings, will inevitably reorganize itself to maximize the efficiency of this process. The implication is that self-replicating – that is, living – systems are statistically favored by nature: in essence, the more copies such systems produce, the more efficiently they dissipate energy. Thus, fundamental physical principles steer evolution toward the emergence of life. As England himself puts it, "life exists because the law of increasing entropy drives matter to acquire life-like physical properties."
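
For the formally inclined reader: the central result of [1] is a generalization of the second law for transitions between coarse-grained states. In slightly simplified notation (a schematic rendering for orientation, not a quotation of the paper's exact formula), it looks roughly like this:

$$
\beta \langle \Delta Q \rangle_{I \to II} \;+\; \ln\frac{\pi(II \to I)}{\pi(I \to II)} \;+\; \Delta S_{\mathrm{int}} \;\ge\; 0
$$

Here β⟨ΔQ⟩ is the heat released into the environment during the transition from state I to state II (in units of kT), π(...) are the probabilities of the forward and reverse transitions, and ΔS_int is the change in the system's internal entropy. The more irreversible the transition – and efficient self-replication is highly irreversible – the more heat the system must dissipate, which is the sense in which replicators are "paid for" by entropy production.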

From this, the next step naturally follows: intelligent living systems are even more "advantageous" to nature, as they are vastly more effective at dissipating energy and increasing the entropy of their environment. Indeed, this is the very core of technological progress: to consume more and more energy from external sources to perform work, which inevitably results in the transformation of that energy into heat. Agriculture, industrial production, and information technologies all serve as powerful catalysts accelerating the growth of entropy on our planet.

Here, by the way, we should clearly define the connection between England's theory and the second law of thermodynamics. The law states that the entropy of a closed system never decreases, so the system is driven steadily toward thermal equilibrium. England's theory, by contrast, describes how open subsystems (living organisms) within a closed system (the organisms plus their environment) help fulfill this law. Within these subsystems, local entropy can decrease – the order essential for their functioning emerges – but by dissipating energy into the surroundings, they raise the entropy of the environment more than enough to compensate. A good analogy is a refrigerator: it lowers the entropy inside (cooling the food, increasing order) but ejects even more heat outside, raising the overall entropy. Essentially, England's theory quantitatively describes how open subsystems can evolve by directing energy flows, accelerating entropy production and thereby becoming statistically favored.
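
The refrigerator analogy can even be made numerical. Below is a minimal sketch with invented (but physically sensible) numbers; the only point is that the entropy lost inside the box is more than repaid by the entropy dumped into the room:

```python
# Toy entropy bookkeeping for the refrigerator analogy.
# All numbers are illustrative; only the sign of the total matters.

T_inside = 275.0   # K, temperature inside the refrigerator
T_room   = 295.0   # K, temperature of the kitchen
Q_cold   = 1000.0  # J, heat extracted from the food
W        = 300.0   # J, electrical work done by the compressor
Q_hot    = Q_cold + W   # J, heat rejected into the room (energy conservation)

dS_inside = -Q_cold / T_inside   # entropy removed from the cold interior (< 0)
dS_room   = Q_hot / T_room       # entropy added to the warm surroundings (> 0)
dS_total  = dS_inside + dS_room

print(f"dS inside: {dS_inside:+.3f} J/K")   # about -3.64 J/K: local order increases
print(f"dS room:   {dS_room:+.3f} J/K")     # about +4.41 J/K: larger in magnitude
print(f"dS total:  {dS_total:+.3f} J/K")    # positive, as the second law requires
```

A living organism plays the same game: it buys local order at the price of a larger entropy export into its environment.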

Now, let's move from physical entropy to informational entropy and examine Melvin Vopson's hypothesis [2], which he termed the "second law of infodynamics." It states that, while the physical entropy of any system increases over time, its informational entropy (in the sense introduced by Claude Shannon [3]) tends to decrease.

Informational entropy can be read, loosely, as a measure of redundancy – the amount of "extra" information a description carries beyond what is actually needed. According to Vopson's hypothesis, the universe constantly optimizes its content, striving to minimize the data needed to describe itself. This is linked to stability: systems with lower informational entropy are more stable and consequently more likely to persist.

Vopson's idea is not mainstream, but it is taken seriously by the scientific community. It resonates with me – all the more so because its mathematical foundation, as in the case of England's theory, is quite robust. If the second law of infodynamics proves valid, then it adds another argument supporting the view that the universe statistically gravitates toward viable, intelligent life forms. Our intelligence is a highly potent mechanism for compressing information through abstraction and generalization, which is, in fact, the essence of cultural and scientific development. We continually absorb high-entropy data and transform it into structured, meaningful, low-entropy knowledge.
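
To make the notion of "low informational entropy" concrete, here is a minimal sketch (with invented distributions) comparing the Shannon entropy of unstructured and structured observations over the same four-symbol alphabet:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four possible observations.
uniform    = [0.25, 0.25, 0.25, 0.25]   # no structure: every outcome equally likely
structured = [0.85, 0.05, 0.05, 0.05]   # strong regularity: one outcome dominates

print(f"H(uniform)    = {shannon_entropy(uniform):.3f} bits")    # 2.000 bits
print(f"H(structured) = {shannon_entropy(structured):.3f} bits") # about 0.848 bits
```

A system that has discovered the regularity behind the second distribution can describe its observations with fewer bits – that is the sense in which abstraction and generalization compress information.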

Combining both "entropic" concepts, we observe that:

- Nature favors structures that maximize thermodynamic entropy (by dissipating energy);

- Nature also favors structures that minimize informational entropy (via information compression).

This points to a statistical bias toward evolutionary paths that lead to the emergence of intelligent life and the expansion of its cognitive capabilities. Intelligent systems uniquely fulfill both roles – amplifying energy flow while reducing informational disorder. The universe "wants" us, intelligent beings, to arise and develop. It "wants" us to grow smarter and to transcend ourselves – technologically and culturally. And it drives us to spare no effort in creating new forms that intellectually surpass us – artificial intelligences, hybrid entities, post-biological systems, and so on.

[1] England J. (2013). "Statistical physics of self-replication." J. Chem. Phys. 139(12): 121923.

[2] Vopson M., Lepadatu S. (2022). "Second law of information dynamics." AIP Advances 12(7): 075310.

[3] Shannon C. (1948). "A Mathematical Theory of Communication." Bell System Technical Journal 27: 379–423, 623–656.


Living Systems and "Variational Energy"

The theories described above address why living and, especially, intelligent systems are bound to emerge in the universe. Now let's consider the underlying principle that governs their further development and ever-increasing complexity – that is, how they prolong their existence and more fully realize their evolutionary potential. In my view, the most universal explanatory framework is Karl Friston's Free Energy Principle (FEP) [4,5].

The FEP explains the behavior of organisms through the minimization of variational free energy. "Free energy" here is not a thermodynamic quantity but a mathematical measure, built on the Kullback-Leibler divergence [6], of the discrepancy between an organism's internal model (for example, how a person represents the surrounding reality) and the sensory data arriving from the external world. In practice, this means a living system continuously updates its internal probabilistic models and takes actions that reduce the mismatch between its own predictions and external signals.
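
As a toy illustration (deliberately simpler than Friston's actual variational formalism), the sketch below measures the mismatch between an agent's beliefs and the empirical distribution of its observations with the Kullback-Leibler divergence, and shows that nudging the beliefs toward the data reduces that mismatch; all numbers are invented:

```python
import math

def kl_divergence(p, q):
    """D_KL(p || q) in bits: the cost of modeling data p with beliefs q."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Empirical distribution of three possible sensory outcomes (the "world").
world = [0.7, 0.2, 0.1]

# The agent's internal model before and after a simple update
# that moves its beliefs halfway toward the observed frequencies.
prior = [1/3, 1/3, 1/3]
posterior = [b + 0.5 * (w - b) for b, w in zip(prior, world)]

print(f"D_KL(world || prior)     = {kl_divergence(world, prior):.3f} bits")     # ~0.43
print(f"D_KL(world || posterior) = {kl_divergence(world, posterior):.3f} bits") # ~0.11
```

Updating beliefs is only half of the story; as discussed below, the mismatch can also be reduced from the other side, by acting on the world until it matches the predictions.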

Friston himself put it this way: "life exists because it is able to reduce the gap between expectations and reality." Any living system must remain stable under constantly changing circumstances. To do this, it needs to "guess" what will happen next and test its predictions against experience. If the world doesn't behave as expected, the system adapts its behavior to reestablish stability.

Fundamentally, life is an endless game of "minimizing surprise." An organism builds an internal map of reality, and the more accurate this map, the greater its capacity for survival. Humans, with our consciousness and reason, are the most striking example: we constantly anticipate future events, interpret external signals, and when we get it wrong, we change either our opinion (e.g., we stop believing the rain will end in a minute) or the environment (e.g., we turn on the air conditioner when it gets hot). Our brain is not a passive "info-processor," but a tireless forecaster that learns from its prediction errors or adjusts the world to meet its expectations. This dynamic integrates perception, thought, and behavior into a single feedback loop, ensuring our existence and continuous development.
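
That feedback loop can be caricatured in a few lines of code: an agent repeatedly predicts a scalar quantity, observes it through noisy senses, and corrects its belief in proportion to the prediction error. This is a minimal sketch of "minimizing surprise," not a model of any real organism:

```python
import random

random.seed(0)

belief = 10.0         # the agent's current estimate of the quantity it tracks
true_value = 25.0     # the actual state of the world
learning_rate = 0.3   # how strongly prediction errors update the belief

for step in range(10):
    observation = true_value + random.gauss(0, 1.0)   # noisy sensory input
    prediction_error = observation - belief           # the "surprise"
    belief += learning_rate * prediction_error        # update the internal model
    print(f"step {step}: belief = {belief:5.2f}, |error| = {abs(prediction_error):5.2f}")
```

The same loop can be closed in the opposite direction: instead of revising the belief, the agent can act on the world until observations match its predictions – turning on the air conditioner rather than accepting the heat.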

The connection between the FEP and the theories of England and Vopson is quite direct. To effectively dissipate energy (thereby increasing thermodynamic entropy), an organism must first acquire that energy. If a system is constantly "surprised," scrambling in search of food, it won't find it – or it will itself become food for a better-adapted competitor. Minimizing Friston's "free energy" (that is, prediction error) thus goes hand in hand with maximizing entropy production.

The same is true for the reduction of informational entropy: an adequate internal model of the world is, in effect, correctly compressed information about it. To build such a model, the organism must filter out the noise and preserve the useful essence of the signal – its meaning and causal connections. In other words, it must derive a low-entropy "digest" from high-entropy data.
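
A crude but concrete proxy for this "digest" is ordinary lossless compression: data with regular structure compresses far better than noise, precisely because a short description of it exists. The sketch below uses Python's standard zlib module on invented data:

```python
import os
import zlib

size = 10_000

structured = b"sunrise,sunset;" * (size // 15)   # repetitive, rule-governed "signal"
noisy      = os.urandom(size)                    # incompressible, high-entropy "noise"

for name, data in [("structured", structured), ("noisy", noisy)]:
    compressed = zlib.compress(data, level=9)
    print(f"{name:10s}: {len(data):6d} bytes -> {len(compressed):6d} bytes")
```

An organism's internal model plays an analogous role: it retains the short description – the regularities and causal structure – and discards the incompressible noise.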

Thus, we can say that the FEP describes a universal mechanism (the prediction-action cycle) that allows living systems to pursue the imperatives set by nature. Internally (for example, within the human brain), minimizing mismatches optimizes probability distributions, reducing Shannon entropy. Externally, the system's actions "burn" acquired energy, increasing physical entropy. The "entropic" theories explain the evolutionary pull towards intelligent systems, and the FEP reveals how intelligent systems align with this pull.

[4] Friston K., Kilner J., Harrison L. (2006). "A free energy principle for the brain." J. Physiol. Paris 100(1–3): 70–87.

[5] Holmes J. (2022). "Friston's free energy principle: new life for psychoanalysis?" BJPsych Bull. 46(3): 164–168.

[6] Kullback S., Leibler R.A. (1951). "On Information and Sufficiency." Annals of Mathematical Statistics 22(1): 79–86.