Article

Physical Computation as Dynamics of Form that Glues Everything Together

School of Innovation, Design and Engineering, Mälardalen University, 721 23 Västerås, Sweden
Information 2012, 3(2), 204-218; https://doi.org/10.3390/info3020204
Submission received: 1 March 2012 / Revised: 14 April 2012 / Accepted: 18 April 2012 / Published: 26 April 2012
(This article belongs to the Special Issue Information: Its Different Modes and Its Relation to Meaning)

Abstract

A framework is proposed in which matter relates to energy in the way structure relates to process and information relates to computation. In this scheme matter corresponds to structure, which corresponds to information. Energy corresponds to the ability to carry out a process, which corresponds to computation. The relationship between the two complementary parts of each dichotomous pair (matter/energy, structure/process, information/computation) is analogous to the relationship between being and becoming, where being is the persistence of an existing structure while becoming is the emergence of a new structure through the process of interactions. This approach presents a unified view built on two fundamental ontological categories: Information and computation. Conceptualizing the physical world as an intricate tapestry of protoinformation networks evolving through processes of natural computation helps to make more coherent models of nature, connecting the non-living and living worlds. It presents a suitable basis for incorporating current developments in the understanding of biological/cognitive/social systems as generated by the complexification of physicochemical processes: the self-organization of molecules into dynamic adaptive complex systems through morphogenesis, adaptation and learning—all of which are understood as information processing.

1. Introduction: The Question of Substance

“There is some elementary but not widely understood theory that glues the whole thing together.”
S. B. Cooper [1]
The question of the relationship between substance and form is old. For Plato the ideal forms were the governing principles of the world, while material objects were their imperfect copies. Aristotle (Metaphysics, Book VII) maintained that a specific substance is a combination of matter and form. In Book VIII of the Metaphysics, Aristotle concludes that the matter of the substance is its substratum [2].
Gregory Bateson takes difference to be one more fundamental element besides form and substance and claims that “information is a difference that makes a difference” [3] (p. 318). This paper will argue that we can basically manage with only one structural principle, because matter, form and difference for an agent all boil down to information: information understood as structure, as related data. The relations among the data are established by an agent, and the first distinction is made between the agent (subject, the entity acting on its own behalf) and the rest of the world. Even though an agent is made of the same “stuff” as the rest of the world, protoinformation [4,5], it distinguishes itself from the rest of the world by being an autopoietic self [6].
Since Aristotle’s time we have learned more about the world from the sciences. As a consequence, an agent-dependent understanding of nature is emerging, including an agent-dependent idea of “matter”. Agent-dependency means that we make explicit what sort of system an agent/observer is, and what sensors, actuators and information-processing mechanisms he/she/it possesses. It does not imply that the independent existence of the physical world is denied, nor does it claim that the physical world appears arbitrary or subjective to cognitive agents. The physical world evidently possesses stable structures and follows physical laws, but knowledge of the world (understood as structured information) is agent-dependent. Maturana and Varela identify life with cognition, so any living organism possesses some degree of cognition [6]. A slime mold has a set of inputs, vital processes and possible kinds of interactions with the world different from those of an insect, and different again from those of a mammal. The conceptualizations and the ways of interacting with the physical world, including other living beings, are different for different organisms. We humans have successively become aware of many levels of organization of the physical world, many more than are directly detectable by the five sense organs of the human body.
Especially valuable for our understanding of cognitive functions in humans are the newly learned lessons from computing, with its variety of relevant fields such as artificial intelligence, artificial life, robotics, data representations and processing, networks, physical computation, memory, virtual machines, formal languages, natural language processing, computational linguistics, multi-agent systems, internet of things, etc. Constructing intelligent machines and robots helps us see human cognitive capacities and modes of knowledge production, decision-making and behavior in a broader context of cognitive agencies—biological and artificial.
As already mentioned, in the Aristotelian world things were made of “matter” which had a form. Modern physics however shows that what appears as matter on one level of organization becomes form (structure) on the next lower one. An example is that a rubber ball is understood in Aristotelian terms as matter (rubber) having a form of a ball. But on a more basic level, rubber is a hydrocarbon polymer and that is a form made of carbon rings with hydrogen attached, so the polymer structure is a form and the atoms are the matter it is made of. Atoms too are structures and their “material” constituents are nucleons and electrons. And so the process of analysis can go on.
Is there any lower limit to what we can find as structures in nature? No! By investing huge amounts of energy we can “provoke” the noumenon to produce as yet unknown phenomena. Is nature really made of all those strange particles? The question is equivalent to asking what the noumenon really is when we do not interact with it, and it is an ill-posed question.
However, we know from the accumulated common experience of humanity that the world exists and is remarkably stable on human time scales. That is why we are able to reproduce physical experiments under given conditions. The stability of the world governed by “natural laws” is the basis for every epistemology and indeed a precondition for life as well.
What we call “matter” appears to be a recursive structure of the Russian doll down to elementary particles. But elementary particles and other quantum objects are unusual instances of “matter” that can turn into energy and transmute into other types of matter. Quantum mechanical objects have no definitive place in space and exhibit strange quantum-mechanical behavior such as entanglement. Searching for “matter” going deeper and deeper in length scales and levels of organization of physical objects we find ourselves in a really odd microscopic world.
On the large macroscopic scale things get even more perplexing: The universe seems to contain less than 5% ordinary matter (everything ever observed), roughly 70% dark energy and 25% dark matter, and we do not really know what the latter two are (according to NASA’s web page Dark Energy, Dark Matter). Similar numbers appear in [7].
It is worth underlining that the structures we find on smaller and smaller scales are not only dependent on how nature, the Kantian Ding an sich or “thing-in-itself”, is, but also on how we interact with it. If we look at an object in infrared light, it will appear different from the same object in ordinary light or in X-rays. Both the object and the observing/interacting agent, with its specific types of interaction, make a difference when it comes to what view of nature becomes visible. The world is more than what we know—with new types of experimental and theoretical tools, through interactions with other cognitive agents and by restructuring information in already existing knowledge, we will learn more and learn differently. We evolved in the natural world as a part of that world, which we only partially know and successively learn more about. So the idea of the noumenon (as a potentiality and a resource) is a very useful concept in epistemology.
We may call this world noumenon [thing as such] or “matter-energy” (with space-time as their attributes), but we can also call it “proto-information” or “potential information”. We create new knowledge about the world based on previous knowledge with the help of instruments—machinery and theoretical tools and in the context of social environments.
Epistemologically, differences are central, both in governing the behavior of a living being and in the construction of knowledge, since all information about the physical world is obtained through the interaction of living agents with the world; it is about the relation of the cognizing agent (which is a specially organized subset of the world) with the world and with itself. All an organism has to go on, to react to and to adapt to, are differences. This is a generalization of the basic approach of relational biology [8].
“This suggests that in general there is nothing ‘physical’ about the information content per se. It depends entirely on relations, and the relations can change. In other words, besides material properties, we have to speak about informational properties now, or, going even beyond that, we may realize there exists only the latter, as perhaps the material properties themselves can be conceived as instances of some permanent relation or mode of interaction. In short, instead of single and well-defined causes and actions, we are left with an intricate web of modalities that jointly evoke or define a dominant trait or observed action.” [9].
As Kant rightly argued, we cannot say what Ding an sich really is, but we can explore its many different facets (phenomena) through agent-dependent interactions. In order to really deeply understand our agency in the world we have to learn the constructive mechanisms that connect living beings with the inanimate nature. Deacon [10] provides a good account of that special hierarchical organization of a subset of the world, which is a biological cognizing agent, starting with abiogenesis through self-organization of biological structures, which are used as building blocks in the subsequent construction of increasingly complex organizations.

2. Informational Nature of Matter

A number of recent books suggest a significant movement towards an informational universe: Information: The new language of science (von Baeyer), Decoding the Universe: How the New Science of Information is Explaining Everything in the Cosmos, from Our Brains to Black Holes (Seife), Programming the Universe: A Quantum Computer Scientist takes on the Cosmos (Lloyd), Every Thing Must Go: Metaphysics Naturalized (Ladyman et al.), Decoding Reality: The Universe as Quantum Information (Vedral) and Information and the Nature of Reality: From Physics to Metaphysics (Davies and Gregersen) to name but a few [11,12,13,14,15,16].
When saying that the fabric of the universe is made of information, we say that Kant’s noumenon can be identified as proto-information/potential information. As soon as a cognitive agent interacts with the noumenon of the physical world, he/she/it constructs actual information, which—after processing in the organism or artificial cognitive agent—represents a phenomenon for that agent. This model of a phenomenon, which the agent constructs autonomously, is then negotiated in the cognitive network with other agents—a process nowadays hugely enhanced by ICT.
It is widely believed that the “materiality” of the world is the necessary precondition for every scientific world-view; and thus it is very important to understand what this materiality actually amounts to nowadays.
In his new book A Universe From Nothing [17], Krauss makes an interesting claim that there is a physically plausible scenario for our entire universe to have developed from “nothing”. Of course, the idea of “nothing” has a long philosophical history and can be discussed, but in this case it coincides with what physicists call the quantum mechanical “vacuum”. Vedral proposes an information-based creation ex nihilo based on von Neumann’s algorithm. The empty set { } is a collection that contains nothing and has cardinality 0. The mathematician von Neumann (1923) invented a method, known as the von Neumann hierarchy, which can be employed to generate the natural numbers from the empty set as follows: Step 0: ∅ (the empty set); Step 1: {∅} (the set containing the empty set); Step 2: {∅, {∅}} (the set containing the previous two sets); and so on. Starting from an empty set, an infinite sequence of numbers can bootstrap its way into existence. This algorithm amounts to a data compression, which is a result of insights into the structure of the data set [15,18]. Both examples of creation ex nihilo proceed in a constructive manner, building up from basic elements. In short: Everything is about relationships, about both information (structure) and its opposite, entropy (lack of structure) [19].
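As a concrete, minimal sketch of this constructive bootstrap (an illustration added here, not taken from the cited sources), the following Python snippet builds the first von Neumann ordinals from the empty set; each natural number n is simply the set of all previously constructed numbers, so its cardinality is n.

# A minimal sketch (added illustration) of the von Neumann construction:
# each natural number is the set of all previously constructed numbers,
# starting from the empty set. Python frozensets stand in for pure sets.

def von_neumann_naturals(n):
    """Return the first n von Neumann ordinals as frozensets."""
    ordinals = []
    current = frozenset()              # Step 0: the empty set, cardinality 0
    for _ in range(n):
        ordinals.append(current)
        current = frozenset(ordinals)  # next step: the set of all previous steps
    return ordinals

for k, number in enumerate(von_neumann_naturals(4)):
    print(k, "is represented by a set of cardinality", len(number))
# 0 is represented by a set of cardinality 0
# 1 is represented by a set of cardinality 1
# ... and so on: the numbers bootstrap themselves out of "nothing".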

3. Computational Nature of Process

In its primordial form as proto-information/potential information, information represents the fabric of the universe, the noumenon. As a difference which makes a difference for an agent, it becomes a physical/material phenomenon. Its processing lies behind all our biological and cognitive functions and is fundamental as a basis for all knowledge. Information for an agent is created by the process of differentiation and shaped by natural computation. It is very important to observe that this generalized idea of computation as a physical process is much wider than the computation performed by our computing machinery or represented by the Turing machine model; it is Stepney’s “neglected pillar of material computation” (Stepney 2008). For presentations of natural computing and its relationship to conventional computing see [20,21,22].
Computation always appears in tandem with information (structure, data) [23]. Abramsky [24] emphasizes that “in a computational perspective, it makes little sense to talk about static data structures in isolation from the dynamic processes that manipulate them”. This is in agreement with van Benthem who declares that “structure should always be studied in tandem with a process!” and “No information without transformation!” [25].
A special kind of computational system constructed to perform natural computation and mimic structures and processes in a biological cell is proposed by Kampis [9]. He claims that the Church-Turing thesis applies only to simple systems and that complex biological systems must be modeled as component-systems which are self-referential, self-organizing and self-generating and whose behavior is computational in a general sense which goes beyond the Turing machine model:
“A component system is a computer which, when executing its operations (software) builds a new hardware.... [W]e have a computer that re-wires itself in a hardware-software interplay: The hardware defines the software and the software defines new hardware. Then the circle starts again.” [9].
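The following toy sketch (a deliberately crude illustration added here, not Kampis’s actual formalism) conveys the flavor of such a hardware-software interplay: the rule table is itself data, and executing a rule can add new rules to the table, so the system that runs the next step is no longer quite the system that ran the previous one.

# A toy illustration (not Kampis's formalism): the rule table is data that
# the rules themselves can extend, so the "software" rewrites the "hardware"
# it runs on.

def run_component_system(state, rules, steps):
    """Apply every rule each step; rules may append new rules to the table."""
    for _ in range(steps):
        new_rules = []
        for rule in list(rules):
            state, created = rule(state)
            new_rules.extend(created)   # a rule may generate further rules
        rules.extend(new_rules)
    return state, rules

def doubler(state):
    """Doubles the state; once the state is large enough, spawns a new rule."""
    created = []
    if state > 8 and not doubler.spawned:
        created.append(lambda s: (s - 3, []))   # new "hardware" appears
        doubler.spawned = True
    return state * 2, created
doubler.spawned = False

state, rules = run_component_system(1, [doubler], steps=6)
print(state, "reached with", len(rules), "rules in the table")  # 61 ... 2 rules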
Even though DNA is seen as an information code and its function is often compared to our current models of (von Neumann-Turing) computation, the information processing involved in living cells is indeed a more complex material computation:
“The notion of code implies that there should be a well-defined reference frame to which the information content can be related or mapped. If, however, the molecules have an information content that depends on the other molecules that surround them, this means for information theory that there is no external reference frame in the first place, and in fact the code for the molecular information content is partially contained in the other molecules that interact with the given molecule. Moreover, because there can be many molecular components involved in the ‘coding’ for the properties of a molecule, and also because this code-determination game takes place on a mutuality basis for every molecule in a reaction network, it is proper to say that what we deal with in a molecular Self Modifying Systems is a distributed code system. The distributed code systems are likely to have new information-theoretic properties, to be mapped by future research.” [26].
Basically, for a process to be a computation, a model must exist such as an algorithm, a network topology, a physical process or in general any mechanism which ensures definability of its behavior [22].
In computer science, besides the classical Turing machine model, other types of models of computation have appeared in the past few decades, such as process models (Petri nets, process algebra and agent-based models). Formal methods in systems biology have followed this development and include rule-based modeling of signal transduction, process algebras, abstract interpretation, model checking, agent-based modeling of cellular behavior, Boolean networks, Petri nets, state charts and hybrid systems. At the same time, concurrency models have emerged in a bottom-up fashion in order to tackle present day networks of computational systems, and it will take some years before they enter the shared world view as standard computational tools of thinking.
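As a small illustration of one of the formalisms listed above, the following sketch simulates a synchronous Boolean network; the three-gene wiring is an invented toy, not a model from any of the cited works. Each gene is a Boolean variable whose next state is a logical function of the current states, and repeated updating reveals the network’s attractors.

# A minimal synchronous Boolean network: genes are on/off variables and each
# gene's next state is a Boolean function of the current global state.

def step(state, update_functions):
    """One synchronous update of all genes."""
    return {gene: fn(state) for gene, fn in update_functions.items()}

update_functions = {
    "A": lambda s: not s["C"],           # A is repressed by C
    "B": lambda s: s["A"],               # B is activated by A
    "C": lambda s: s["A"] and s["B"],    # C needs both A and B
}

state = {"A": True, "B": False, "C": False}
for t in range(6):
    print(t, state)
    state = step(state, update_functions)
# The trajectory returns to its starting configuration after five updates:
# a cyclic attractor, the Boolean-network analogue of a stable expression pattern.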
According to pancomputationalism (naturalist computationalism)—for more details about this framework see [5,27] and [22]—one can view the time development (dynamics) of physical states in nature as information processing, and learn about its computational characteristics. Such processes include self-assembly, developmental processes, gene regulation networks, gene assembly in unicellular organisms, protein-protein interaction networks, biological transport networks, and the like. Natural computing has specific criteria for the success of a computation. Unlike the Turing model, natural computation does not focus on the halting problem, but instead on the adequacy of the computational behavior. An organic computing system, for example, adapts dynamically to the current conditions of its environment by self-organization, self-configuration, self-optimization, self-healing, self-protection and context-awareness. “(O)ur task is nothing less than to discover a new, broader, notion of computation, and to understand the world around us in terms of information processing.” [21].
One of the frequent criticisms of computational approaches applied to living organisms, and especially to the mind, is based on the understanding that computation must always be discrete and that some continuous processes, such as the human mind (and the weather), can never be adequately represented by computational models. Here, it is important to realize that natural computing/physical computing includes both discrete and continuous computational processes. Moreover, a continuum can arise as a result of interactions of asynchronous communicating systems. For more details, see [28].

4. Gluing it all Together: Information/Computation—Matter/Energy—Structure/Process in an Organic Whole

If computation is understood as a physical process, if nature computes with physical bodies as objects (informational structures) and physical laws govern the process of computation, then computation necessarily appears on many different levels of organization. Natural sciences provide such a layered view of nature. One sort of computational process is found on the quantum-mechanical level of elementary particles, atoms and molecules; yet another on the level of classical physical objects. In the sphere of biology, different processes (computations = information processing) go on in biological cells, tissues, organs, organisms, and eco-systems. Social interactions are governed by still another kind of communicative/interactive process. If we compare this to physics, where specific “force carriers” are exchanged between elementary particles, here the carriers can be complex chunks of information, such as molecules or sentences, and the nodes might be organisms or groups—which shows how wide the difference is.
In short, computation on a given level of organization is the realization/actualization of the laws that govern interactions between constituent parts. Consequently, at every next layer of organization a new set of rules governing the system takes over. Exactly how this process unfolds remains to be learned. Simulation tools are currently being developed which allow the study of the behavior of complex systems modeled computationally. For the analysis of the time development of dynamic systems, various simulation techniques are being developed, from purely mathematical approaches, e.g., equation-based modeling simulated by iterative evaluations, to formal modeling approaches, such as Petri nets and process algebra combined with object-oriented and agent-oriented simulation methods based on the emulation of constituent system elements.
In agent-based models, which are a class of computational models for simulating the behavior of interacting networks of autonomous agents, and thus especially suitable as models of complex systems, not only the notion of an agent but also the idea of their interactions is generalized compared to basic cellular automata. What is exchanged during communication between agents can vary and is in general not necessarily words or written symbols. Moreover, memory can be added to the system, which changes its behavior substantially [29,30]. Cellular automata are synchronously updated, which according to Sloman makes them computationally less expressive than systems with asynchronous interactions. The agent-based models currently being developed are generalizations of cellular automata and can avoid those limitations. They are essentially decentralized, bottom-up and, in general, asynchronous models. (Synchronous communication, where agents exchange information all at the same time, is a special case of asynchronous information exchange.) The behavior is defined at the individual agent level, and the global behavior emerges as a result of the interaction among numerous individuals communicating with each other and with the environment.
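A minimal sketch of this modeling style is given below (an added illustration with invented dynamics, not a model from the cited works): behavior is defined per agent, each agent keeps a short memory of past encounters, updates are asynchronous since one randomly chosen pair interacts at a time, and any global pattern emerges only from the accumulation of local exchanges.

import random

# A toy agent-based model: asynchronous pairwise interactions, per-agent
# memory, and global behavior that emerges from local exchanges.

class Agent:
    def __init__(self, opinion):
        self.opinion = opinion          # the agent's current binary state
        self.memory = []                # what it has "heard" from others

    def interact(self, other):
        self.memory.append(other.opinion)
        recent = self.memory[-5:]       # memory changes the dynamics
        if recent.count(True) != recent.count(False):
            self.opinion = recent.count(True) > recent.count(False)

random.seed(1)
agents = [Agent(random.random() < 0.5) for _ in range(50)]
for _ in range(2000):                   # asynchronous updates, one pair at a time
    listener, speaker = random.sample(agents, 2)
    listener.interact(speaker)

print(sum(a.opinion for a in agents), "of", len(agents), "agents hold opinion True")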
How does one connect info-computational models with real-world physical systems? Matter can be seen as related to energy in the way structure relates to process and information relates to computation. Matter corresponds to structure, which corresponds to information. Energy corresponds to the ability to carry out a process, which corresponds to computation. The relationship between the two complementary parts of each dichotomous pair is analogous to the relationship between being and becoming [31], where being is the persistence of an existing structure while becoming is the emergence of a new structure through the process of interactions. What we see as matter at one level of organization appears as a structure on the next, more basic level of organization. A solid rock on the macroscopic level appears as atoms and empty space on the level beneath. From astrophysics we learn that the universe can pop into existence from a quantum vacuum. At the bottom there is a lot of space, “empty” space. Does that mean that what we know as “matter” from physics books will dissolve into nothingness? No. Empirical results in physics hold, of course.
The info-computational framework [23] proposes a unified view based on the two complementary ontological principles of information and computation. Conceptualizing the physical world as an intricate tapestry of protoinformation networks organized in layers and evolving through processes of natural computation helps to make more accurate models of nature, connecting the non-living and living worlds. It presents a suitable framework for incorporating current developments in the understanding of (meta-)biology, self-organizing processes, morphogenesis, adaptation, learning, the role of an observer, hierarchy, causality, dynamic adaptive self-organization and knowledge production, amongst others.

5. The Nature of the Relationships

In order to understand the nature of our relations with the physical world including human cognitive structures and behaviors, external as well as internal, we must understand from the beginning the hierarchical chain of processes of self-organization of “matter”: Morphogenetic processes [5] and teleodynamic processes that lead from inorganic matter to the simplest organisms to human beings and societies; from syntax to semantics of “matter”, which on a fundamental level is information.
Deacon [10] distinguishes between the following three forms of information:
  • Information 1 (Shannon) (data, pattern, signal) (data communication) [what it exhibits—syntax]
  • Information 2 (Shannon + Boltzmann) (intentionality, aboutness, reference, representation, relation to object or referent) [what it conveys—semantics]
  • Information 3 ((Shannon + Boltzmann) + Darwin) (function, interpretation, use, pragmatic consequence) [what it’s for—pragmatics]
In the framework of info-computationalism, all three types of information are considered as different levels of organization of information. They start from proto-information, which through interaction with an agent becomes the difference that makes a difference: data communicated through the nodes of a communication network until, in an agent, they first take the form of actual information by the process of connecting (relating) to the agent’s existing informational structures. In the next step this information is used to act upon, and thus it exhibits meaning as Deacon’s ((Shannon + Boltzmann) + Darwin) Information 3. However, it is worth noticing that all three of Deacon’s information types differ only in structure, and not in their fundamental constituents, which are basic data or Bateson’s differences. Deacon’s three types of information parallel his three levels of emergent dynamics, which in Salthe’s notation look like:
[1. Thermo- [2. morpho- [3. teleo-dynamics]]]
with corresponding mechanisms
[1. Mass-energetic [2. self-organization [3. self-preservation (semiotic)]]]
and corresponding Aristotle’s causes
[1. Efficient cause [2. formal cause [3. final cause]]]
(while Aristotle’s material cause is supposed to form stable attractor dynamics of a system.)
In the above, the thermodynamic and semiotic layers of organization are linked via the intermediary layer of morphodynamics (spontaneous form-generating processes), and thus do not communicate directly (so it looks as if mind communicates with matter via form).
Teleodynamic processes emerge from the mutual organization of processes performing morphodynamic work relative to each other. An example of a teleodynamic process is the coupling of autocatalysis and self-assembly, where autocatalysis produces supporting boundary conditions for self-assembly and vice versa. The result is the “autocell”, in which one or more molecular products of an autocatalytic cycle self-assemble into a container, thus preventing diffusion of these catalysts. Autocell lineages can evolve, and the individuals of a lineage become the basis for evolutionary adaptation [10].
Biochemical self-assembly is spontaneous, like crystallization, in which molecules combine and form different structures while releasing kinetic energy and increasing the entropy of the environment [32,33]. Life consists of dissipative, self-organizing, entropy-maximizing processes driven by thermodynamics [34,35].
A number of authors have contributed to the line of thought presented by Deacon, among many others [6,9,26,31,32,33,36,37,38,39,40,41,42,43].
One very interesting outcome of teleodynamic processes is the formation of selves, Maturana and Varela’s autopoietic systems [6], or autonomous agents. Connecting to Minsky’s Society of Mind [44], this framework shows how higher-order modes of teleological phenomena such as subjectivity and preferences can have causal power consistent with the laws of physics.

6. Open Systems and Non-Equilibrium Processes: Complex Dynamical Systems Theory beyond Mechanicism

Classical Newtonian physical systems are analyzed in isolation, thus essentially closed to influences from the environment, and typically on one distinct level of organization of their constituents; thus nuclear, atomic, particle, molecular, solid-state and astrophysics constitute distinct research fields. The classical Newtonian approach is not suitable for modeling complex systems, which typically consist of strongly interacting parts on several levels of organization (length scales) and are often strongly context dependent, i.e., dependent on communication with the environment. If the constituent parts of the system are not strongly coupled, no complex structure will emerge. Thus, the appropriate theory for strongly interacting systems cannot be of the Newtonian type; instead the appropriate tools come from complex dynamical systems theory, agent-based modeling and network theory, which are built upon relationships, and not on isolated individual agents/nodes/constituents.
Along with connectivity and openness, a further interesting property of complex systems is that the same dynamics often appears self-similarly on multiple levels of organization, which makes scalability an important topic of complexity. Strongly interactive networks (such as biological systems) give rise to non-linear processes and organize into structures that exhibit order on all levels of organization [45,46]. Self-organized criticality is found to be a hallmark of such systems, as the regime on the “edge of chaos” is the one with maximal informational capacity. Both completely ordered systems and completely random systems are structurally simple: the information in an ordered system can be efficiently compressed, while a random system can be summarized as pure noise; the richest structure lies between the two extremes [47].
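A rough numerical illustration of the compressibility remark follows (added here as a sketch, using compressed size as a crude stand-in for algorithmic information content; it is not taken from the cited sources).

import random
import zlib

# Compressed size as a crude proxy for information content: a completely
# ordered string compresses drastically, a random string hardly at all, and a
# partially ordered string falls in between; structural richness is greatest
# between the two extremes.

def compressed_size(bits):
    return len(zlib.compress(bits.encode()))

random.seed(0)
n = 10_000
ordered = "01" * (n // 2)                                    # complete order
noise = "".join(random.choice("01") for _ in range(n))       # complete randomness
structured = "".join(                                        # order with defects
    b if random.random() > 0.1 else random.choice("01") for b in ordered)

for name, s in [("ordered", ordered), ("structured", structured), ("random", noise)]:
    print(f"{name:10s} compresses to {compressed_size(s):5d} bytes")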
Furthermore, unlike Newtonian systems, which are typically memoryless and invariant to time reversal, complex dynamical systems are sensitive to their initial conditions. Complex dynamical processes are therefore essentially historical; as Prigogine said, “they carry their history on their backs” [48]: their history can be read off their structures.
Moreover, Juarrero questions the prevailing ideal of scientific theory, which in the classical mechanistic approach expects explanations to be proof-like. In complex dynamical systems, such as those studied in action theory, it is obvious that a different approach to explaining actions is necessary, such as historical narrative instead of covering-law explanations in the form of deductive inference. It should be added that a similar open-ended new type of explanatory tool is being developed in the form of simulations and other (interactive) computer experiments, as presented by Wolfram [49] and Epstein [50].

7. Fully Fledged Causation—Bottom-up, Top-down and Back. Deacon’s “Absentials”

Through the study of action theory, Juarrero comes to the conclusion that the traditional mechanistic model of cause (“push-pull”) is insufficient as it reduces all causes to the Aristotelian efficient cause [51,52]. She argues that a new causal framework is necessary for complex adaptive systems, which can account for all four types of Aristotelian causes. Causes, seen as dynamical constraints, connect in a convincing and logical way bottom-up and top-down causal relations, including intentional causes. Similar arguments are advanced by Deacon [10].
Deacon [10] defines an absential as a phenomenon “whose existence is determined with respect to an … absence” and argues that time, memory, beliefs, norms and anticipations are all absentials; they represent constraints or boundary conditions which apparently influence our decisions and actions in the physical world. One of the most important “absentials” is information. The view that absentials can exhibit causal power differs from the Newtonian mechanical picture of nature, where only that which is present can cause any effect.
Absentials represent constraints, and that is how "that which is not" affects that which is. All teleodynamic systems are defined by different constraints. The constraints are evident in the directed development of organisms or the limited patterns of behavior they may exhibit: Living matter is limited to certain developmental trajectories. Deacon’s account emphasizes that mind does not emerge from matter but from the constraints on matter [that govern the dynamics of the processes which we know as mind]. Evolutionarily, constraints (boundary conditions) lead to the emergence of higher-level properties (processes realized in structures).
But these constraints on matter cannot exist without matter! The blue color of the ocean is not a property of a single water molecule but an emergent property of a huge number of such molecules together. The gravitational force of a star is imperceptible in a single helium atom but emerges from a huge number of such atoms bound together. To say that mind does not emerge from matter is not the whole story, as it obviously does not emerge from the absence of matter either. In other words, Deacon’s “absentials” only make sense in conjunction with “existentials”. This view is confirmed by the following explication:
“The one thing common to all examples [where] something absent is causally significant is the presence of a habit, or a regularity, with respect to which something missing can stand out.” [53].
This can also be translated into a description based on process and structure, where the process dynamically traverses the space between the “existential” and the “absential”, the actual and the potential. Absentials stand for dialectic relationships such as those described by Brenner’s Logic in Reality [54].
Rosen talks about “interaction sites” [8,55] which also are absentials in Deacon’s sense, and if we think about pattern, it too is defined both by what is present and what is missing—the relationship between figure and ground.
If we ever had the feeling that it is possible to reconstruct the world by knowing only about the foreground, about “existentials”, without mentioning the “absentials”, that is only because it was tacitly assumed that it is obvious and beyond question what those “absentials” are and what they do.
Related is the question of anticipation [55], which also leads to “absentials”. The basis of anticipation is memory—the ability of an agent to connect past events with the present, based on the ability of its body to keep a trace of past events by changing its morphology (informational structures). Memory is a central feature of intelligent agents, and yet it is not well understood. But it is evident that both memory and anticipation play major roles in an organism’s ability to survive.

8. Conclusions

“I believe that consciousness is, essentially, the way information feels when being processed.” [56].
Traditional materialism today appears as a doctrine in need of reconstruction. Mechanistic models do not work for complex systems, and especially not for living organisms. That definitely does not mean that space opens up for mythopoetic models of the universe. Physics that has worked well until now continues to work well. But one thing we have to understand better is the nature of our relationship with the world as observers and agents. One of the learning strategies is to turn the focus inwards and learn who we are as cognizing biological agents: what we can say about the world given the structure we possess, the structure that defines the processes governing us as agents and our agency in the world. We should understand the relationship (and that is the central thing) between humans as agents in the world and their physical interactions with the world. We need to understand that proverbial “observer” as an agent in the world, not a “material point” but an agent with its structures, in a context. That is important as we are becoming able to construct artificial cognitive agents, which need not have exactly the same cognitive characteristics as humans. We are optimized by evolution to survive in the world as it appears to us on our everyday level of organization. It is not impossible to imagine intelligent cognitive systems optimized according to some other principles, which might be more intelligent, capable of uncovering structures and relationships, and maybe even able to anticipate things that we are unaware of.
The developments supporting info-computational naturalism are expected from a variety of sources, amongst others complexity theory, theory of computation (organic computing, unconventional computing), cognitive science, neuroscience, information physics, agent-based models, information sciences, bioinformatics and artificial life, as well as theoretical biology. Deacon’s book can serve as an exoskeleton that supports the body of knowledge being developed at the moment, as well as smoothly relating to the existing knowledge developed over the past several decades in the work of Maturana and Varela, Rosen, Kauffman, Juarrero, Collier, Matsuno, Salthe, Ulanowicz, Logan, Thompson and many others.
When it comes to the opposition between organisms and machines, which is often emphasized [23], one thing should not be forgotten: Machines are not something given once and for all; they are constantly changing. New developments (based on our increased ability to handle information and to compute, which is used in control) aim at machines with self-* capacities (self-organization, self-repair, self-control (autonomy), etc.). In other words, we are learning from natural organisms how to cope with the complexity of the physical world. The chances are that we will soon have nanomachines with self-* capacities, and will thus face new phenomena apart from mechanisms and organisms: Mechanical organisms and organic mechanisms.
Galileo-Newtonian physics, a 370-year-old science, has presented an ideal of exact science for centuries and remains the framework most widely used even today. However, paradigm shifts, indirectly caused by the unprecedented development of computational technology, have initiated developments in many fields completely inaccessible to Galileo, Newton and the generations of scientists who followed, because they demand heavy observational, experimental and computational resources that have become available only recently. That is how the field of non-linear adaptive dynamical systems started to flourish on the basis of info-computational technology: Both the resources for massive calculations and the possibility to communicate results and to search vast databases and other resources on the web. This development is expected to continue, and hopes are high that new computational devices will be instrumental in reaching even higher levels of info-computational proficiency.
Along the way, the development of artificial intelligence, such as robots, has taught us that our ideas of intelligence were misguided, that biological intelligence is embodied, and that vision demands much more computational resources than symbol manipulation in deductive reasoning. We also learned that chess, which was traditionally considered to demand sharp intelligence, can be programmed and implemented using “brute force” algorithms commonly considered not to be intelligent at all. Even the recent victory of the Watson machine over the most skilled human players in Jeopardy presents a similar occasion, when people can hardly believe that (computer) memory combined with quick search algorithms and a bit of elementary logic can outperform humans. From AI we learned that life itself is a more intriguing and more complex phenomenon than intelligence. For comparison, the basic timeline of a 4.5-billion-year-old Earth, with very approximate dates: 3.8 billion years of simple cells (prokaryotes), 3 billion years of photosynthesis, 2 billion years of complex cells (eukaryotes), 1 billion years of multicellular life, 600 million years of simple animals, 550 million years of complex animals, 400 million years of insects and seeds, 300 million years of reptiles, 200 million years of mammals, 150 million years of birds, 65 million years since the non-avian dinosaurs died out, 2.5 million years since the appearance of the genus Homo, 200,000 years of anatomically modern humans. (From Wikipedia, Timeline of the evolutionary history of life) Even though this timeline is by no means precise, the evolutionary time needed for the development of different life forms can be used as a measure of the complexity of the change.
Interesting topics remaining to be analyzed include how exactly the proposed mechanisms of complex system organization, powered by energy produced by metabolism, can be modeled (simulated) in practice, and what exactly representation is and how it is realized and embodied in a cognizing agent. How do we learn from interactions with the world in connection with our memory? What is the role of Bayesian statistics in knowledge generation? What is memory and how did it develop evolutionarily? And so on.
But it appears to me that it is almost equally important and necessary to unlearn the habit of thinking in terms of traditional models in the domains where they definitely do not apply.

Acknowledgements

The author would like to thank three anonymous reviewers for their valuable comments and suggestions.

References

  1. Cooper, S.B. Turing’s Titanic machine? Commun. ACM 2012, 55, 74–83. [Google Scholar] [CrossRef]
  2. Robinson, H. Substance. In The Stanford Encyclopedia of Philosophy; Zalta, E.N., Ed.; Stanford University: Stanford, CA, USA, 2009. [Google Scholar]
  3. Bateson, G. Steps to an Ecology of Mind: Collected Essays in Anthropology, Psychiatry, Evolution, and Epistemology; University Of Chicago Press: Chicago, IL, USA, 1972; pp. 448–466. [Google Scholar]
  4. Dodig-Crnkovic, G. Information and computation nets: Investigations into info-computational world. In Information and Computation; Vdm Verlag: Saarbrucken, Germany, 2009; pp. 1–96. [Google Scholar]
  5. Dodig-Crnkovic, G. Info-computationalism and morphological computing of informational structure. In Integral Biomathics; Simeonov, P.L., Smith, L.S., Ehresmann, A.C., Eds.; Springer: Heidelberg, Germany, 2012. [Google Scholar]
  6. Maturana, H.; Varela, F. Autopoiesis and Cognition: The Realization of the Living; D. Reidel Publishing Co.: Dordrecht, The Netherlands, 1980. [Google Scholar]
  7. Turner, M.S. Quarks and the cosmos. Science 2007, 315, 59–61. [Google Scholar]
  8. Rosen, R. Life Itself: A Comprehensive Inquiry into the Nature, Origin, and Fabrication of Life; Columbia University Press: New York, NY, USA, 1991. [Google Scholar]
  9. Kampis, G. Self-modifying systems: A model for the constructive origin of information. BioSystems 1996, 38, 119–125. [Google Scholar]
  10. Deacon, T.W. Incomplete Nature: How Mind Emerged from Matter; W. W. Norton Company: New York, NY, USA, 2011. [Google Scholar]
  11. von Baeyer, H. Information: The New Language of Science; Harvard University Press: Cambridge, MA, USA, 2004. [Google Scholar]
  12. Seife, C. Decoding the Universe: How the New Science of Information Is Explaining Everything in the Cosmos, from Our Brains to Black Holes; Viking Adult: New York, NY, USA, 2006. [Google Scholar]
  13. Lloyd, S. Programming the Universe: A Quantum Computer Scientist Takes on the Cosmos, 1st ed; Knopf: New York, NY, USA, 2006. [Google Scholar]
  14. Ladyman, J.; Ross, D.; Spurrett, D.; Collier, J. Every Thing Must Go: Metaphysics Naturalized; Clarendon Press: Oxford, UK, 2007; pp. 1–368. [Google Scholar]
  15. Vedral, V. Decoding Reality: The Universe as Quantum Information; Oxford University Press: Oxford, UK, 2010; pp. 1–240. [Google Scholar]
  16. Davies, P.; Gregersen, N.H. Information and the Nature of Reality from Physics to Metaphysics; Cambridge University Press: Cambridge, UK, 2010. [Google Scholar]
  17. Krauss, L. A Universe from Nothing: Why There is Something Rather Than Nothing; Free Press: New York, NY, USA, 2012. [Google Scholar]
  18. Chaitin, G. Epistemology as information theory: From Leibniz to Ω. In Computation, Information, Cognition—The Nexus and The Liminal; Dodig-Crnkovic, G., Ed.; Cambridge Scholars Publishing: Newcastle, UK, 2007; pp. 2–17. [Google Scholar]
  19. Ulanowicz, R. Towards quantifying a wider reality: Shannon Exonerata. Information 2011, 2, 624–634. [Google Scholar]
  20. Denning, P. Computing is a natural science. Commun. ACM 2007, 50, 13–18. [Google Scholar] [CrossRef] [Green Version]
  21. Rozenberg, G.; Kari, L. The many facets of natural computing. Commun. ACM 2008, 51, 72–83. [Google Scholar]
  22. Dodig-Crnkovic, G. Significance of models of computation from turing model to natural computation. Mind. Mach. 2011, 21, 301–322. [Google Scholar]
  23. Dodig-Crnkovic, G.; Müller, V. A Dialogue concerning two world systems: Info-Computational vs. mechanistic. In Information and Computation; Dodig-Crnkovic, G., Burgin, M., Eds.; World Scientific: Singapore, 2011; pp. 149–184. [Google Scholar]
  24. Abramsky, S. Information, processes and games. In Philosophy of Information; van Benthem, J., Adriaans, P., Eds.; North Holland: Amsterdam, The Netherlands, 2008; pp. 483–549. [Google Scholar]
  25. van Benthem, J. Logical Dynamics of Information and Interaction; Cambridge University Press: Cambridge, UK, 2011; pp. 1–384. [Google Scholar]
  26. Kampis, G. Self-modifying systems: A model for the constructive origin of information. BioSystems 1996, 38, 119–125. [Google Scholar]
  27. Dodig-Crnkovic, G. The Info-computational nature of morphological computing. In Theory and Philosophy of Artificial Intelligence; Müller, V.C., Ed.; Springer: Berlin, Germany, 2012. [Google Scholar]
  28. Dodig-Crnkovic, G. Dynamics of information as natural computation. Information 2011, 2, 460–477. [Google Scholar]
  29. Alonso-Sanz, R. A structurally dynamic cellular automaton with memory. Chaos Soliton. Fractal. 2007, 32, 1285–1295. [Google Scholar]
  30. Evers, F.T.; Raupp, M.S. Building artificial memory to autonomous agents using dynamic and hierarchical finite state machine. In Proceedings of the Computer Animation; IEEE Computer Society: Washington, DC, USA, 2002; p. 164. [Google Scholar]
  31. Prigogine, I. From Being to Becoming: Time and Complexity in the Physical Sciences; W. H. Freeman: San Francisco, CA, USA, 1980. [Google Scholar]
  32. Kauffman, S. The Origins of Order: Self-Organization and Selection in Evolution; Oxford University Press: Oxford, UK, 1993. [Google Scholar]
  33. Kauffman, S.; Logan, R.; Este, R.; Goebel, R.; Hobill, D.; Shmulevich, I. Propagating organization: An enquiry. Biol. Phil. 2008, 23, 27–45. [Google Scholar]
  34. Ulanowicz, R.; Hannon, B. Life and the production of entropy. Proc. Royal Soc. Lond. B 1987, 232, 181–192. [Google Scholar]
  35. Salthe, S. The natural philosophy of entropy. SEED 2002, 2, 29–41. [Google Scholar]
  36. Maturana, H. Biology of Cognition; Defense Technical Information Center: Fort Belvoir, VA, USA, 1970. [Google Scholar]
  37. Prigogine, I.; Stengers, I. Order out of Chaos: Man’s New Dialogue with Nature; Flamingo: London, UK, 1984. [Google Scholar]
  38. Matsuno, K. How can quantum mechanics of material evolution be possible? Symmetry and symmetry-breaking in protobiological evolution. BioSystems 1985, 17, 179–192. [Google Scholar]
  39. Matsuno, K. Protobiology: Physical Basis of Biology; CRC Press: Boca Raton, FL, USA, 1989. [Google Scholar]
  40. Rössler, O. Endophysics: The World as an Interface; World Scientific: Singapore, 1998. [Google Scholar]
  41. Collier, J. Hierarchical dynamical information system with a focus on biology. Entropy 2003, 5, 100–124. [Google Scholar]
  42. Thompson, E. Mind in Life; Harvard University Press: Cambridge, MA, USA, 2010. [Google Scholar]
  43. Salthe, S. Development (and evolution) of the universe. Found. Sci. 2010, 15, 357–367. [Google Scholar]
  44. Minsky, M. The Society of Mind; Simon and Schuster: New York, NY, USA, 1986. [Google Scholar]
  45. Kurakin, A. Scale-free flow of life: On the biology, economics, and physics of the cell. Theor. Biol. Med. Model. 2009, 6, 6–28. [Google Scholar] [CrossRef]
  46. Kurakin, A. The self-organizing fractal theory as a universal discovery method: The phenomenon of life. Theor. Biol. Med. Model. 2011, 8, 4:1–4:66. [Google Scholar]
  47. Flake, G.W. The Computational Beauty of Nature: Computer Explorations of Fractals, Chaos, Complex Systems, and Adaptation; MIT Press: Cambridge, MA, USA, 1998. [Google Scholar]
  48. Juarrero, A. Dynamics in Action: Intentional Behavior as a Complex System; MIT Press: Cambridge, MA, USA, 1999. [Google Scholar]
  49. Wolfram, S. A New Kind of Science; Wolfram Media: Champaign, IL, USA, 2002. [Google Scholar]
  50. Epstein, J.M. Generative Social Science: Studies in Agent-Based Computational Modeling; Princeton University Press: Princeton, NJ, USA, 2007. [Google Scholar]
  51. Juarrero, A. Complex Dynamical Systems and the Problems of Identity. Emergence 2002, 4, 94–104. [Google Scholar]
  52. Juarrero, A. Dynamics in action: Intentional behavior as a Complex system. Emergence 2000, 2, 24–57. [Google Scholar]
  53. Deacon, T. Shannon-Boltzmann-Darwin: Redefining information. Part 1. Cogn. Semiotic 2007, 1, 123–148. [Google Scholar]
  54. Brenner, J. Logic in Reality; Springer: Dordrecht, The Netherlands, 2008. [Google Scholar]
  55. Rosen, R. Anticipatory Systems—Philosophical, Mathematical and Methodological Foundations; Pergamon Press: New York, NY, USA, 1985. [Google Scholar]
  56. Tegmark, M. We’re Not Insignificant After All. Edge World Question Center: Seattle, WA, USA, 2007. Available online: http://www.edge.org/q2007/q07_7.html (accessed on 25 April 2012).
