This chapter presents the idea of biological hypercomputation (BH) and discusses how and why it entails degrees of freedom. Crossing biological and computational points of view, the claim is made that living beings cannot be considered machines in any sense of the word, the arguments are provided, and the consequence is drawn: the complexity of life is the very process by which living beings gain degrees of freedom. The leading thread for the analysis here is the relationship between matter, energy, and information.
Biological hypercomputation (BH) is the name that summarizes the fact that living systems are not machines, and hence do not process information the way any machine does, no matter what. The leading thread, thereafter, is to understand the very way in which living systems process information. For a living being, processing information is, after all, a matter of life or death.
A common understanding of the issue would claim that living systems receive information from the environment and then process it, so that they are able to learn and adapt to the changing environment. This text claims that this is a mistaken take. Instead, it will be argued that there is no information before the very act of processing, and, as a consequence, no information after the processing. Reality is the outcome of the very process of information processing.
To be sure, ranging from the various Turing Machines (TM)—u-TM, o-TM, d-TM, and many others—up to the various levels of life, say, from bacteria to living cells to plants to animals, etc., there is a growing process by which new and previously unforeseen degrees of freedom are reached. Reality, nature, and the world can be seen as an increasing process of degrees of freedom. This view, however, should not be understood hierarchically. Rather, the architecture is to be seen as a fractal organization. As the architecture of life grows, an amplification of the previous lower layers is produced that neither obliterates nor supersedes those levels.
To be sure, the process by which the living systems gain degrees of freedom coincides with the entire weave of networks, fractal architectures, and interdependencies among layers, planes, and contexts. As a result, life and nature can be seen as a fantastic interweaving of times, layers, and interdependencies.
A number of prior works make the approach of this text possible. The various authors, takes, and approaches will be discussed along the text, which provides a fresh state of the art on biological hypercomputation and the weave of information processing in nature.
Throughout this chapter the basic argument is that life can be seen as a large cooperative game in which the increasing complexity of life rests on ever more complex processes of information processing.
2. A serious problem: what is information (one more time!)
In order to explain nature and the universe, the eighteenth century invented or created a physical concept, namely mass or matter. Thanks to it, the entire universe could be explained in terms of three basic but elegant laws. This was I. Newton’s achievement. Anything that is or happens is the outcome of what happens to mass: relationships of action and reaction, whereby two bodies act on each other with forces equal in magnitude and opposite in direction; and the mutual attraction between massive bodies, i.e., gravity.
In the framework of the study of heat, a brand new science originating in both physics and chemistry arose in the nineteenth century that introduced a quite different concept to understand and explain nature and reality, namely energy. Whereas mass or matter is a univocal concept, energy shows a manifold of ways of existence: caloric energy, kinetic energy, potential energy, and others. The contributions of scientists such as Fourier, Carnot, Boltzmann, Lord Kelvin, and others resulted in the identification of the three laws of thermodynamics. Those three laws were sufficient to explain not only the dynamics of the universe and nature, but also the arrow of time and the challenges it raised.
Now, the twentieth century discovered still another physical concept, far more complex than and different from the two original ones. This was the concept of information, originally brought out by Shannon and Weaver, which, however, was to know still further developments, mainly due to research on cryptography, quantum theory, quantum computation, and quantum science. The crux here is that information is a physical and yet non-tangible, non-material concept—in contrast with mass and energy.
Now, the relation among the three physical concepts can be adequately identified as follows:
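The display itself can plausibly be rendered as an ordering in explanatory depth and scope, reconstructed from the sentence that follows (the notation is mine, not the original typography):

```latex
\text{matter} \;\subset\; \text{energy} \;\subset\; \text{information} \tag{1}
```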
Meaning that energy explains better and deeper what matter is supposed to explain, and that information explains still better and deeper what energy was supposed to explain. Thus, the story from classical mechanics to thermodynamics to information theory is the very story by which a higher and more achieved understanding of reality is accomplished. This, however, should not imply by any means a linear and necessarily cumulative process, but a story of improvement, of gaining more knowledge and wisdom about the universe, nature, and society. The story of knowledge can very appropriately be grasped in terms of bifurcations.
The amount of information of a system depends on how probable an event is. In other words, the higher the probability that an event takes place, the lower the amount—and the quality—of the information. And vice versa: the lower the probability that an event happens, the higher—and better—the information of that event. Briefly said, information is inversely proportional to the probability of an event. This means that information is a measure of how surprising something is.
As can be seen, the characterization I have provided of information does not necessarily coincide with the standard view—say, Shannon’s—that identifies information and entropy. Rather, entropy can be adequately grasped as the quantification of the randomness of a system.
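The inverse relation between probability and information, and entropy as expected surprisal, can be sketched in a few lines (a minimal illustration; the function names are mine):

```python
import math

def surprisal(p: float) -> float:
    """Information content of an event with probability p, in bits.
    Rarer events carry more information: I(p) = -log2(p)."""
    return -math.log2(p)

def entropy(dist: list) -> float:
    """Shannon entropy: the expected surprisal of a distribution,
    i.e., a quantification of its randomness."""
    return sum(p * surprisal(p) for p in dist if p > 0)

# A certain event carries no information; a rare one carries a lot.
print(surprisal(1.0))     # 0.0 bits
print(surprisal(0.5))     # 1.0 bit
print(surprisal(1/1024))  # 10.0 bits

# A fair coin is maximally random over two outcomes; a biased one less so.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # ≈ 0.469
```

Note that entropy (an average over a distribution) and surprisal (a property of a single event) come apart exactly as the paragraph above suggests.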
Therefore, it can safely be stated that the amount of information in the universe as a whole can only increase, which means that what was considered an expanding universe in terms of the transition from energy to matter is really an expansion of information. This coincides with Formula (1) presented above. Information is the concept or reality that encompasses and explains energy and matter (mass). Whereas matter presents a rather disconnected view of the world (indeed, it is the laws that gather and unify matter, according to Newton), and energy unfolds and transforms as a set of expressions and processes explained in terms of Fourier’s and Boltzmann’s laws, information sets out a convergent, yet in-process, reality that allows for evolution. Formula (2) expresses the unity of matter, energy, and information, showing hence that there are not three things but only one—read thereafter in terms of information, according to Formula (1).
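Since Formula (2) is said to express the unity of the three concepts, a plausible rendering of it (the notation is a reconstruction) is:

```latex
\text{matter} \;=\; \text{energy} \;=\; \text{information} \tag{2}
```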
Formula (2) serves as a synthesis that claims that matter, energy, and information are one and the same thing that were historically discovered or brought out by stages. That story is exactly the story that leads from classical mechanics to thermodynamics to information theory.
Now, summarized, information theory today means cryptography, quantum information, entanglement, and teleportation. These are different faces of one and the same development—an achievement that crosses the most conspicuous technologies nowadays, as well as spearhead science in any domain you wish. By and large, information currently pervades both science and culture.
A short survey of the history of computation may be very helpful here. Indeed, information processing encounters four main levels of development, thus:
Linear, sequential, top-down information processing. Broadly, this is the largely predominant paradigm in computation and computing science. Ever since the first computers—the ENIAC and its successors—up until now, computers and computations have been defined by the von Neumann architecture and the Church-Turing thesis. A programmer writes a program and the computer follows its indications and directions. Design, parametrization, and control are the key operations and ways of working and dealing with computers and computation.
Dynamic and evolving information processing. Genetic algorithms, time series, and the first steps in artificial intelligence (AI), such as the works by Ray and Conway and neural networks, can be said to define the second layer of information processing. A shift toward bottom-up concerns and interests takes place that largely enhances computation as a dynamic process. Eventually, modeling and simulation come along with a literal explosion of programming languages aimed at a variety of goals and abilities.
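A minimal sketch of this bottom-up style of computation is one update step of Conway’s Game of Life, where global patterns emerge from purely local rules (the implementation details are mine, for illustration):

```python
from collections import Counter

def life_step(alive: set) -> set:
    """One generation of Conway's Game of Life on an unbounded grid.
    `alive` is the set of (x, y) coordinates of living cells."""
    # Count live neighbors for every cell adjacent to a living cell.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in alive
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Survival with 2 or 3 live neighbors; birth with exactly 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in alive)}

# A vertical "blinker" flips to horizontal and back: period-2 dynamics
# that no single cell "plans".
blinker = {(0, -1), (0, 0), (0, 1)}
print(life_step(blinker))                         # {(-1, 0), (0, 0), (1, 0)}
print(life_step(life_step(blinker)) == blinker)   # True
```

No program dictates the blinker’s oscillation from above; it is produced bottom-up by the local rule, which is the point of this second layer.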
Biologically inspired information processing. More recently, bio-inspired computation takes as a model or guide the behavior of living systems—say, DNA, proteins, ant colonies, etc.—and develops programs that come closer to the way living beings behave. This kind of information processing has so far shed brand new light on the very way in which computing science had previously been conceived. Thus, computation and biology at large come closer and learn from each other, making computation more flexible and more robust at the same time. Adaptive computation comes to the fore.
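As a toy instance of ant-colony-style computation, the classic double-bridge setting can be simulated: ants choose between two branches in proportion to pheromone, and the shorter branch, being reinforced faster, ends up dominating (all names and parameter values here are mine, for illustration):

```python
import random

def double_bridge(short_len=1.0, long_len=2.0, ants=100, rounds=50,
                  evaporation=0.1, seed=0):
    """Stigmergic choice between two paths, as in real ant colonies:
    ants pick a branch proportionally to its pheromone, and shorter
    branches accumulate pheromone faster (deposit ∝ 1/length)."""
    rng = random.Random(seed)
    pheromone = {"short": 1.0, "long": 1.0}
    length = {"short": short_len, "long": long_len}
    for _ in range(rounds):
        deposits = {"short": 0.0, "long": 0.0}
        for _ in range(ants):
            total = pheromone["short"] + pheromone["long"]
            branch = ("short" if rng.random() < pheromone["short"] / total
                      else "long")
            deposits[branch] += 1.0 / length[branch]
        # Old trails evaporate; fresh deposits reinforce.
        for b in pheromone:
            pheromone[b] = (1 - evaporation) * pheromone[b] + deposits[b]
    return pheromone

p = double_bridge()
print(p["short"] > p["long"])  # True: the colony settles on the shorter path
```

No ant computes the shortest path; the colony does, through the positive feedback of the trail—which is what makes this style of computation adaptive rather than programmed.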
Biological hypercomputation. On the basis of the third layer, trying to understand the way in which living beings process information clearly entails a departure from classical computation, no matter what. Such computation has been named biological hypercomputation. This is the core of this text.
It is clear that throughout the four mentioned layers the very technological capabilities are developed and enhanced, for example via the so-called Moore’s law (after G. Moore, co-founder of Intel), which deals with the speed of change of the processors in computers. Undoubtedly, information processing has been changing the very understanding of computing and information, and it has been the subject of both software engineering and hardware engineering—the two basic domains of computation, roughly said.
From the standpoint of quantum theory, information sets out the ground to unify epistemology and ontology. The link that allows such a connection is the wave function. Quantum science does not allow for the distinction, and even less the hierarchy, between ontology and epistemology, in either direction. Instead, physics is about what we can say about nature—no longer about what nature is.
To be sure, the concept-behavior that allows for the unity of ontology and epistemology is entanglement, a concept first introduced by Schrödinger (Verschränkung), but truly put to the test by J. Bell. When entangled, the unity of two or more particles becomes more important than the particles individually considered. Entanglement is not really entanglement of entities but of information. Exactly in this sense, too, teleportation is not so much the transportation of a physical entity as of what makes that entity what it is, namely information. It is information, indeed, that is teleported, not the thing as such, in its materiality. Matter is information, which brings us back to Formulas (1) and (2).
Rightly understood, we can safely say that there is information processing in nature. Briefly said, nature can be understood in terms of information.
3. Living beings process information: a complexity understanding
Living beings do not just read the environment; moreover, they unceasingly create brand new information that did not exist before. In other words, living beings read the environment, but they also, at the same time, write on the very environment they are reading—if the analogy is permitted here.
Whereas any Turing Machine (TM) processes information top-down, sequentially, linearly, and mechanistically, according to the prevailing von Neumann architecture of TM-computers, living beings, to say the least, process information in a non-classical way if the framework is the Church-Turing thesis.
This text argues that the processing of information among living beings consists not only—and not so much—in rightly reading and interpreting the environment and the surroundings, but also—and mainly—in introducing or creating brand new information into the world, correspondingly.
Thus, over against a popular understanding of the issue, information processing does not have anything to do with things such as “following a thread,” “analyzing,” or even “understanding.” Processing information can rather be grasped in biological or medical terms as metabolizing, i.e., changing one thing—say A—into another, B. Good metabolizing, hence, transforms a nice meal into a poem or a scientific paper, or a nice rest in the evening into insight and strength, for example.
For living beings, processing information is a matter of life or death—in that bad information processing may entail danger, peril, or death. Biologically or culturally speaking, information processing might entail identifying a good male or female, a good territory, the presence of a friend or an enemy, who truly loves you and who just pretends to, which food is healthy and which poisonous, for instance. Living beings that process information rightly may encounter better circumstances for adaptation and learning, and be literally the fittest. Evolution and adaptation are, after all, it appears, a matter of good information processing.
Yet, life is not just an emergent property of the universe. A number of authors have claimed that the universe herself is alive and conscious [8, 9, 10]. The claim is hence about consciousness not as an epiphenomenon of the universe, nor life as a merely emergent instance. In this take, life is an essential feature of the universe that can be traced back, precisely, to the very processing of information. For the sake of brevity I shall put such a claim into brackets here and focus on the way in which living systems process information.
While living beings—from bacteria to cells, from organisms to biomes, for example—continuously read the environment and the surroundings, they simultaneously create brand new information that is brought into the world in a variety of forms, thus: as acts and actions, as behaviors, in some cases as the creation of tools, as the use of tools as devices aimed at a certain goal, definitely as language and communication, and as forms of organization, whether individual or collective. This very creation of new information in the world is what is usually grasped as adaptation. Adaptation is the biological concept that is originally the outcome of information processing.
As it can be easily seen, living beings compute. Moreover, life without computation is inconceivable.
Computation is therefore a concept that truly means interaction—with the environment, and with other living beings. Natural computation is much more than a metaphor; it is the distinguishing feature of living beings. Life is a large weave of computation that takes place in a manifold of ways simultaneously, as follows:
There is classical computation in nature in one form or another of a Turing Machine. At the same time there is parallel computation, multilevel computation, non-local computation as well as distributed computation, fuzzy and random computation, very much as also quantum computation and emergent computation, not to mention interactive computation. A typology of computation and computational models can be seen in . A metaphor can be introduced here, namely: the variety of computations corresponds to the diversity of life and living beings. It appears that ecology and biology go hand in hand with information theory and computing science, even though the two latter are more recent than the former two.
To be sure, biological hypercomputation is the way in which living beings process information—for living beings are anything but machines. However, a number of explanations within the health sciences and the life sciences still operate mechanically. Thus, for instance, the functioning of the brain is many times explained in terms of “on” and “off” switches or operations, and the very functioning of the heart is usually conceived as a pumping machine, period. Many other cases could be introduced here. The crux is that those explanations still owe a great deal to the past and fall very short vis-à-vis more contemporary and spearhead explanations. Computing science is one such frontline explanation, under the proviso that it not be a Turing-Machine one, in any respect.
Biological hypercomputation (BH) has been introduced to mean that life and the living processes cannot be understood from lower stances: in other words, life is to be explained as it happens: life is an incompressible “program.” The most crucial and basic problems of living beings are certainly not tackled in terms of mathematical functions, whatsoever. Instead, the problems living beings face are normally solved in terms of non-classical logics, a field that has not received the attention its value deserves.
More fundamentally still, computationally speaking, the difference between software and hardware is irrelevant. More exactly, it does not exist. The phenotype expresses the genotype according to evolution and the environment. Epigenetics, for example behavioral epigenetics or even symbolic epigenetics, becomes central here, for it most adequately expresses the idea that, very much as there is no difference between software and hardware, in the same vein there is no difference between nature and culture. They are closely intertwined and cannot be divided or split.
As a consequence, computing (= BH) is, for life, one and the same thing as evolution. Evolving and computing are two faces of one and the same coin. Evolution is the process through which living beings gain degrees of freedom.
4. Living beings gain degrees of freedom
Information processing among living beings takes place as an unceasing creation of information that is introduced into the world. Thus, information processing is not just reading the environment and adequately interpreting it, but, moreover, correspondingly, bringing new information into the world in the form of actions, processes, behaviors, or achievements.
There is no information before the information processing by living beings, very much as there is no information after it. Information exists insofar as it is processed, that is, both created and changed. In other words, there is no reality previous to the processing of information, but neither is there any reality “outside” it. Yet, the universe becomes increasingly complex precisely thanks to the unceasing processing of information, i.e., the reduction of entropy in the universe. Life is that system that creates order and exhibits the best form of order possible.
As a consequence, the universe, it appears, is alive. A number of interpretations in this sense can be mentioned that challenge the traditional view of the universe, which goes as follows:
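Given that Formula (3) is described below as a physics-first translation of Formula (1), a plausible rendering of it (the notation is a reconstruction) is:

```latex
\text{physics} \;\rightarrow\; \text{chemistry} \;\rightarrow\; \text{biology} \;\rightarrow\; \text{life} \tag{3}
```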
I argue that Formula (3) is erroneous because it rests on the predominance of physics over biochemistry or biology, for example. Formula (3) is really a translation of Formula (1), and thus it sets out the ground for a reductionist view of nature and the universe. A clearly reductionist approach serves as the ground for that formula.
The increasing complexity of reality and the universe is one and the same process as the increasing complexity of information processes [13, 14]. We have gained enormous ground in understanding what life does and how it is made possible. Thus, for instance, regarding the realm of plants, Refs. [15, 16] have shed solid light on how they think, literally. Down the scale, the way in which information processing takes place among bacteria has been steadily studied. Furthermore, information processing has additionally been studied in social insect networks. The range and scope of such studies grow ever wider and deeper, encompassing animals at large, until of course we reach the level of human beings, and computers and computation.
The concept of “degrees of freedom” can be understood in a wide sense as the number of parameters of a system that may vary independently (Wikipedia), up to the calculation of the various scenarios of a system, or the number of dimensions needed to determine a full vector. In complexity science or theory the concept is pivotal and allows for crossing different approaches, sciences, disciplines, and interpretations aiming at making explicit what complexity is all about.
We can safely say that the higher the degrees of freedom a system has, the more complex it is. And vice versa: the lower the degrees of freedom, the less complex it is. Formula (4) synthesizes this:
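A plausible rendering of Formula (4), reconstructed from the preceding sentence (the notation is mine), is:

```latex
\text{complexity of a system} \;\propto\; \text{its degrees of freedom} \tag{4}
```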
Now, living beings are what they do. And the best they do is, exactly, information processing. The better a living system processes information, the fitter it is. As a consequence, the more degrees of freedom it gains. Hence, living is a matter of gaining degrees of freedom. Death, it seems, is the complete loss of degrees of freedom. All possibilities are then closed.
Therefore the claim can be made that information plays an “active” role in living systems, and a “passive” role in classical physics, analogously to what happens with any Turing Machine: information is something that “happens” in a processor conceived as a black box. The audience or the researcher knows what enters (input) and what comes out (output), but ignores what happens, and how, inside the black box. Classical computation seems to reduce the degrees of freedom by accepting or assessing the processing of information as a “black box.” On the contrary, BH is the process by which new degrees of freedom are gained insofar as new adaptations, networks, and behaviors are made possible, for instance.
Information processing does not stand as a condition for evolution and adaptation. On the contrary, it is the very process of evolution, both of living beings and of the environment. Coevolution is the name that best suits both stances in their entanglement. Living is nothing else than coping with the information of the surroundings and accordingly changing that information into brand new information brought into the world.
The better a living system processes information, the better it copes with challenges, solves problems, and adapts to a continuously changing environment. Life ends when it “overdoses” on information and the living being is unable to process it. A saturation point is reached and the system can no longer process any further information.
Summarizing, life consists in the process by which information processing is carried out and hence unceasingly new degrees of freedom are reached. The system can then be said to be “young.” By contrast, death happens when the processing of information is no longer possible—it becomes slow, saturated, new information cannot be processed accordingly, and the “screen” slows down in bringing out the information required.
Life is a matter of “controlling” the saturation of information, and thereafter, of information processing. Software and hardware are one and the same thing.
5. The weave of life
Information is a continuous process that emerges as a result of intricate non-linear interactions of living beings among themselves and with their environment, which both creates nature and the world as they happen and at the same time changes them unceasingly. This is exactly the story of learning, adaptation, and coevolution.
As is well known, living beings evolve in rugged adaptive landscapes—a concept originally introduced by S. Wright in 1932 under the name fitness landscape. Life is a wonderful complex weave that has no center, but many hubs, clusters, and nodes, continuously changing in the ever-changing environment on earth. Only, the timescales in geology are vastly deeper than those of living beings, not to mention human beings. Indeed, whereas the basic time-scale of life is the decade, or even the month for some species, the basic time-scale in geology is 1,000,000 years. Against all odds, this is the ultimate time-scale of life on earth.
Life is a large and robust weave of cooperation, commensalism, and mutualism, rather than of competition and predation. A great contribution on this subject has been the notion of symbiogenesis: a large network (instead of a chain) of mutual interdependency among living beings, so that each organism and species benefits from others, and vice versa.
The story of life is a story of increasing complexity, i.e., biodiversity. Such diversity is said to exist at three levels, thus: genetic, natural, and cultural. The countries that have all three kinds of biodiversity are called “megadiverse,” and to date they number 17—all of them placed on or very near the equator.
The story of life pivots around six fantastic moments of increasing diversity followed by six massive extinctions. At each stage, life has made of herself a wonderful asset of possibilities, forms, shapes, structures, behaviors, and characteristics corresponding with dynamics through which, as a whole, amazing degrees of freedom have been attained or reached. And yet, that story is non-teleological. In other words, life has no ends or goals, and evolution has no purpose whatsoever. Such was indeed the scandal that Darwin’s The Origin of Species by Means of Natural Selection (1859) meant, particularly vis-à-vis the cornerstone of western civilization, namely teleology. The western world needs to believe that there is a telos, and that the telos is necessary.
To the discomfort of those who believe in goals and ends, whether their own or imposed or suggested, the theory of evolution—namely the best theory ever developed to think about change and transformation—introduces the idea of the absence of goals or ends (telos). Well, the theory of symbiogenesis comes to complete, so to speak, the theory of evolution by showing, first, the various stages of complexity of life, and second, that the whole weave of life naturally tends toward increasing complexity and hence toward cooperation and interdependency.
As a consequence, living beings are symbionts: theirs is not only a cooperative interaction but also a prolonged one among different species and/or organisms. Mutualism, and not competition, is the rule in nature, it appears. Nature is the realm of freedom par excellence.
We have three concepts: matter, energy, and information. However, a right understanding of them brings to the fore the fact that they are not three, but one and the same concept: matter-energy-information. Moreover, information can rightly be understood as fundamental, i.e., as grounding energy and mass.
What happens to matter is truly a matter of energy and of relations among various types of energy. And still, the very processes, dynamics, and configurations of energy are dynamics in and of information. Formula (5) expresses this understanding.
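Formula (5) can plausibly be rendered as an ordering in explanatory power, consistent with the gloss that follows (the notation is a reconstruction):

```latex
\text{matter} \;\prec\; \text{energy} \;\prec\; \text{information} \tag{5}
```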
Formula (5) simply means that matter explains less (and worse) what energy does explain, and furthermore, that information allows for a deeper and better understanding of what energy is meant to explain.
Now, when looked at under the light of life and how life is possible, the three concepts are transformed into one process, namely biological hypercomputation. In the processing of information among living beings, matter, energy, and information become one single unity: life. Thus, life allows for the overcoming of three different layers of reality (physics, thermodynamics, and information); such overcoming is also the non-differentiation between software and hardware. Processing information, for life, consists in gaining degrees of freedom.
Living beings process information much more than (just) matter and energy, which they definitely also process. Evolution can be seen as the process of information processing by living beings through which the world and nature are complexified. Information allows for the rejection of a reductionist approach to life and nature.