Alexis Krywula Joannidis

Entropy, Biological Computers

It seems she is, as we say, bugged at this idea of computers acting like people... you can just as well turn that around, and talk about human behavior like a program fed into an IBM machine.

Thomas Pynchon, Entropy (1960)

The first law of thermodynamics observes that energy within a closed system is conserved; it can be neither created nor destroyed. This is a fundamental property with far-reaching consequences, though it reveals nothing of the direction of energy flow - of its spontaneity. That is the domain of entropy, the property introduced by thermodynamics' second law. Often called the arrow of time, entropy is a measure of the disorder of a system, and with it the second law declares that a closed system will spontaneously move from order to disorder - famously predicting the eventual heat-death of the universe.

Smaller in scale, but no less within entropy's domain, is biology, with its complicated slurries of processes, sometimes viewed as thermodynamically set apart from the inanimate by the gift of life. Ordered and complex, life seems to defy the second law, yet it commits no violation: the decrease in entropy arising from its order is more than compensated by the increase it imparts to its surroundings as it dissipates energy. But does life merely obey the second law, or could the law itself be the driving force for its origin? And can the entropic rules of biology be wielded to our benefit, as we once leveraged entropy to extract work from the steam engine?

-

The discovery of entropy arose from the industrial age and the drive to extract more work from the steam engine. Sadi Carnot identified that work could be extracted from heat gradients, positing that the only two properties of an engine important to its theoretical work efficiency are the temperatures of its hot source and cold sink. Lord Kelvin and Rudolf Clausius furthered the concept, and ultimately the classical definition of entropy was born:

dS = δQ_rev / T
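As a minimal numerical sketch (the quantities are illustrative, not from any particular engine): for a reversible isothermal process the temperature T is constant, so integrating dS = δQ_rev / T collapses to ΔS = Q_rev / T.

```python
def delta_s_isothermal(q_rev: float, temperature: float) -> float:
    """Entropy change (J/K) when heat q_rev (J) is absorbed reversibly
    at a constant temperature (K): delta_S = Q_rev / T."""
    return q_rev / temperature

# Illustrative numbers: 1000 J absorbed reversibly at 300 K.
ds = delta_s_isothermal(1000.0, 300.0)  # ~3.33 J/K
```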

Though this was crucial in maximizing mechanical engine work, it only provided a macroscopic understanding. Ludwig Boltzmann sought to explain the concept at the particulate level, quantifying a statistical entropy - defined by the number of microstates Ω available to a system in thermodynamic equilibrium - consistent with the classical entropy:

S = k_B ln Ω
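Boltzmann's formula can be made concrete with a toy system (the system and numbers here are invented for illustration): N independent two-state particles have Ω = 2^N microstates, so S = k_B · N · ln 2.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def boltzmann_entropy(omega: int) -> float:
    """Statistical entropy S = k_B * ln(Omega) for Omega equally likely microstates."""
    return K_B * math.log(omega)

# Toy system: 100 two-state particles -> Omega = 2**100 microstates,
# so S should equal K_B * 100 * ln(2).
s = boltzmann_entropy(2**100)
```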

Entropy is ubiquitous in the chemical engineering discipline: it governs thermodynamic cycles such as the Carnot and Rankine cycles, still used today to convert chemical energy to mechanical energy through combustion, and it finds application in predicting the behaviours of mixtures and the spontaneity of reactions.
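Carnot's insight - that only the two reservoir temperatures bound an engine's theoretical efficiency - fits in one line (the temperatures below are illustrative, not from a specific plant):

```python
def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Maximum fraction of heat convertible to work between two
    reservoirs at absolute temperatures t_hot and t_cold (K)."""
    return 1.0 - t_cold / t_hot

# Illustrative steam cycle: ~800 K source, ~300 K sink.
eta = carnot_efficiency(800.0, 300.0)  # upper bound of 62.5%
```

No real engine reaches this bound; it is the ceiling the second law imposes regardless of mechanical ingenuity.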

-

But what of entropy's influence on biology? Take the energy landscape, a mapping of the possible states of a system. Applied to molecular entities - say, a protein chain - the landscape represents all possible folding configurations1. These are vast, yet nature dictates that only very few configurations exist in the relaxed, native form. The landscape gives a surface with a free-energy altitude, and its wells correspond to the correct configurations - the ones with the lowest free energy, itself a function of entropy.

The energy landscape is a map of nature's protein-folding algorithm, one in which configurations are sampled and the landscape traversed until energy wells are found. It is also one which, although present in the most basic lifeforms, even the most sophisticated supercomputers cannot match in predictive power. Yet the biological energy landscape closely resembles the gradient descent algorithm employed in silico by neural networks to optimize their classification accuracy2. In a sense, the cells involved act as biocomputers with a thermodynamic driving force. Protein folding is but one example; in the future, could we frame other computationally complex problems in formats that could be acted on and solved by biological processes following entropic compulsions? Or at least, could existing computation methods be optimized with entropy-governed biological processes?

[Figure: top-down visualization of a gradient descent landscape]
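The analogy can be sketched in a few lines. Here a hypothetical one-dimensional "energy landscape" E(x) = (x - 3)^2, whose single well at x = 3 plays the role of the folded state, is descended by repeatedly stepping against the gradient (landscape, learning rate, and step count are all invented for illustration):

```python
def gradient_descent(grad, x0: float, lr: float = 0.1, steps: int = 200) -> float:
    """Follow the negative gradient downhill until (near) an energy well."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Toy landscape E(x) = (x - 3)**2, so dE/dx = 2*(x - 3); the well sits at x = 3.
x_well = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

Real protein landscapes are rugged and high-dimensional, with many local wells; the smooth single-well case above only illustrates the shared downhill logic.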

To make these fantasies plausible, consider for example the gradient descent algorithm already discussed. Large bacterial swarms have been shown to collectively follow this very algorithm to find food, by sensing chemical signals associated with nutrients and following the direction with the greatest gradient (the steepest increase in nutrient concentration) - a process called chemotaxis3. For the bacteria to process this information, there is an entropic transaction, and it has been established that the increase in entropy is maximized by the greatest gradient4. Hence, the algorithm maximizes the entropy, in accordance with the maximum entropy principle. Is this entropy increase actually the driving force for this microbial problem-solver?
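A caricature of this gradient-following can be coded directly (the nutrient field, step size, and eight-direction sensing are all invented for illustration): an agent samples the concentration in eight compass directions and moves along the steepest increase.

```python
import math

def concentration(x: float, y: float) -> float:
    """Hypothetical nutrient field, peaking at the origin."""
    return -(x**2 + y**2)

def chemotaxis_step(x: float, y: float, step: float = 0.5) -> tuple:
    """Move to whichever of 8 compass-direction candidates has the
    highest nutrient concentration (the steepest sensed gradient)."""
    candidates = [(x + step * math.cos(a), y + step * math.sin(a))
                  for a in (k * math.pi / 4 for k in range(8))]
    return max(candidates, key=lambda p: concentration(*p))

pos = (10.0, 10.0)  # start far from the nutrient source
for _ in range(200):
    pos = chemotaxis_step(*pos)
# pos ends up within one step of the origin, the concentration peak
```

Real E. coli cannot sense a spatial gradient across its tiny body; it compares concentrations over time while running and tumbling, so the spatial sensing above is a deliberate simplification of the same ascent.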

One proposed answer is implicated in a more daunting question: why did life assemble in the first place? Physicist Jeremy England's theory of dissipation-driven adaptation would assert that yes - or more precisely, that the dissipation of energy arising from the bacterial information processing, with which the entropy increase is associated, drives the gradient descent. The theory, in fact, asks whether this is the driving force for the genesis of life: a spontaneous assemblage, formed for its capacity to dissipate the energy thrust upon it, increasing entropy for the system at large5.

England's lab has published supporting simulations, demonstrating that simple chemical systems are able to modify their bonding to facilitate dissipation of incident energy6,7. These findings support the theory as a novel statistical-mechanics interpretation of non-equilibrium systems. To validate the theory as a driving force for the origin of life is a far greater and more significant challenge - a reaffirmation that entropy is ubiquitous, and perhaps an encouragement to seek it out in unexpected applications.

-

“The idea of dissipation of energy depends on the extent of our knowledge”, predicted Maxwell. To his contemporaries, this may have seemed a challenge to squeeze more work from their steampunk contraptions. Today it reads more as an invitation to search for the energetic and entropic underpinnings of the spontaneous flows in our surroundings, where new paradigms in computing, and in other fields, may be there for the taking. And parallel to all this coding, our understanding of ourselves may take on profound shifts too, blurring the distinction between the programmers and the programmed.

  1. Onuchic, J. N., Luthey-Schulten, Z., & Wolynes, P. G. (1997). Theory of protein folding: the energy landscape perspective. Annual Review of Physical Chemistry, 48, 545-600.
  2. Ruder, S. (2016). An overview of gradient descent optimization algorithms.
  3. Falke, J. J., Hazelbauer, G. L., & Parkinson, J. S. (2015). Signaling and sensory adaptation in Escherichia coli chemoreceptors: 2015 update. Trends in Microbiology, 23, 257-266.
  4. Endres, R., & Micali, G. (2015). Bacterial chemotaxis: information processing, thermodynamics, and behavior. Current Opinion in Microbiology, 30.
  5. England, J. L. (2013). Statistical physics of self-replication. Journal of Chemical Physics, 139.
  6. England, J. L., Kachman, T., & Owen, J. A. (2017). Self-organized resonance during search of a diverse chemical space. Physical Review Letters, 119.
  7. England, J. L., & Horowitz, J. M. (2017). Spontaneous fine-tuning to environment in many-species chemical reaction networks. PNAS, 114.

2022