Proceeding Paper

What We Can Discover from Dimensional Analysis of the Information Concept †

John Collier

School of Religion, Philosophy and Classics, University of KwaZulu-Natal, Durban 4041, South Africa
Presented at the IS4SI 2017 Summit DIGITALISATION FOR A SUSTAINABLE SOCIETY, Gothenburg, Sweden, 12–16 June 2017.
Proceedings 2017, 1(3), 68; https://doi.org/10.3390/IS4SI-2017-03931
Published: 8 June 2017

Abstract

Dimensional analysis is a technique used by scientists and engineers to check the rationality of their calculations, but it can also be used to determine the nature of the quantities used. Information is usually measured in bits, or binary digits, but it could be measured using any other base. I will be arguing that, given the possibility of an objective measure of information in terms of asymmetries, and the relation of information to order, Schrödinger’s suggestion that negentropy was an appropriate measure should be taken seriously. After clarifying this notion, I use dimensional analysis to show that negentropy has units of degrees of freedom, and that this is a sensible unit of information.

1. Introduction

I will approach the problem of the nature of information through the method of dimensional analysis, a technique long used by scientists and engineers to compare units of physical quantities, mostly to make sure that they are being used correctly. From Wikipedia, dimensional analysis is “analysis using the fact that physical quantities added to or equated with each other must be expressed in terms of the same fundamental quantities (such as mass, length, or time) for inferences to be made about the relations between them”. One of the most basic definitions of information is due to Gregory Bateson [1], who says that it is “a difference that makes a difference”. Reading Donald MacKay [2], I came to the conclusion that his idea of information was the similar “a distinction that makes a difference”. This suggests that the units of information are differences, or distinctions. The question naturally arises, though, “makes a difference to whom?” This suggests the necessity of an observer, although some authors allow inanimate observers as well, which I prefer to call interactors, or, more generally, the set of all possible interactors. The range of possible interactors with something (an object or a property) can tell us what differences can be made, and consequently what distinctions are intrinsic to the thing. The number of possible distinctions is the amount of information in the thing. Each distinction is grounded in an asymmetry in the thing [3]. The logic of distinctions [4] is binary, and justifies the usual counting of information in bits. Propositional calculus can be derived from the logic of distinctions, and vice versa [4,5], so true and false apply to whether a given distinction holds [4,5]. We can then see the amount of information in something as a string of yes/no (or 1, 0) answers to a series of questions about the thing that completely specify it [6]. This string has a maximally compressed length, and this length, with all redundancies removed, is the amount of information, in bits, intrinsic to the thing [7]. Information can be seen as a measure of order, and entropy as a measure of disorder, information being the complement of the entropy of a system, the system negentropy, as explained below. If we follow this line, dimensional analysis tells us that information has units of degrees of freedom. I will finish with some observations that seem to me to follow from this approach to information.
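As a rough illustration of the “maximally compressed length” reading, the sketch below compresses two invented strings of yes/no answers with an ordinary general-purpose compressor. This is only an upper bound on the algorithmic measure (Kolmogorov complexity is uncomputable, and zlib is standing in for an ideal compressor); the point is simply that a highly patterned answer string carries far fewer bits than a patternless one of the same length.

```python
import random
import zlib

def compressed_bits(answers: str) -> int:
    """Upper bound, in bits, on the information in a yes/no answer string:
    its length after a general-purpose compressor removes redundancy."""
    return 8 * len(zlib.compress(answers.encode("ascii"), level=9))

patterned = "10" * 500  # 1000 answers with an obvious redundancy
random.seed(0)
patternless = "".join(random.choice("01") for _ in range(1000))  # 1000 unpatterned answers

print("patterned:  ", compressed_bits(patterned), "bits")
print("patternless:", compressed_bits(patternless), "bits")
```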

2. The Asymmetry Principle of Information

My student Scott Muller [8] used Jaynes’s Maximum Entropy Principle and his idea of an IGUS (an information gathering and using system), basically a system that interacts with information. Scott generalized this to the intersection of all possible IGUSs to get a unique and objective measure of the information in a system. Following my connection between information formation and symmetry breaking [3], Scott used group theory to show that asymmetries imply information content. So the information intrinsic to something is the sum of its asymmetries, irrespective of observer. I am quite convinced of this, and I no longer have much patience with those who argue that information is intrinsically relative, though I accept that transmitted information is relative to an observer’s capacities for observation. This quantitative justification of intrinsic information, however, leaves the nature of information open: Scott’s approach gives us a way to count information in terms of asymmetries, but it does not tell us what information is.
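Muller’s group-theoretic machinery is not reproduced here, but the direction of the connection, namely that more symmetry means fewer distinctions and hence less intrinsic information, can be seen in a toy count. The sketch below (an illustration only, not Muller’s formalism; the bead and colour numbers are arbitrary) uses Burnside’s lemma to count the configurations of a small cyclic arrangement that remain distinct once its rotational symmetry is taken into account.

```python
from math import gcd, log2

def distinct_under_rotation(n_beads: int, n_colours: int) -> int:
    """Burnside's lemma: the number of colourings of a ring of beads that remain
    distinct under rotation is the average, over all rotations, of the number of
    colourings each rotation leaves fixed."""
    fixed = sum(n_colours ** gcd(n_beads, k) for k in range(n_beads))
    return fixed // n_beads

n, c = 4, 2
free = c ** n                                # distinctions ignoring the symmetry
distinct = distinct_under_rotation(n, c)     # distinctions that survive the symmetry
print(f"ignoring symmetry: {free} configurations = {log2(free):.2f} bits")
print(f"with rotational symmetry: {distinct} configurations = {log2(distinct):.2f} bits")
```

The more symmetric the arrangement, the fewer distinctions it supports; whatever information it does carry is carried by its asymmetries.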

3. Order and Disorder; Entropy and Negentropy

My method here is to start with the dimensional analysis resulting from identifying information with negentropy, as suggested by Schrödinger [9] and developed by Brillouin [10]. This might offend those who strongly believe that information must involve meaning, but I doubt that they can come up with a measure of meaning that allows dimensional analysis. Once I have this basic measure, I will apply it to other uses of the information concept. Schrödinger was primarily concerned with the order found in biological systems, but we can generalize. Peter Landsberg [11] and others have argued that order and disorder can increase together in an expanding phase space, such as in an expanding universe, or in a growing system. The entropy of the system is a measure of its disorder when it is in a given state. If we were to relax all constraints except those determining the system, then it would have a unique maximal entropy $S_{\max}$. The difference between this and the actual entropy $S$ is the system negentropy, its order.
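A minimal numerical sketch of the $S_{\max} - S$ reading, using the Shannon analogue for a hypothetical four-state system (the thermodynamic case would substitute Boltzmann entropies, but the bookkeeping is the same):

```python
from math import log2

def shannon_entropy(p) -> float:
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(q * log2(q) for q in p if q > 0)

p_actual = [0.7, 0.1, 0.1, 0.1]        # the system as constrained into its actual state
p_relaxed = [0.25, 0.25, 0.25, 0.25]   # all non-defining constraints relaxed: uniform

S = shannon_entropy(p_actual)          # actual entropy S
S_max = shannon_entropy(p_relaxed)     # maximal entropy S_max
print(f"S = {S:.3f} bits, S_max = {S_max:.3f} bits, negentropy = {S_max - S:.3f} bits")
```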
Formally, the entropy is $S = k_B \ln \Omega$, where $k_B$ is Boltzmann’s constant and $\Omega$ is the number of microscopic configurations compatible with the macrostate, assuming all are equally probable (that is, they are not coordinated). Boltzmann’s constant gives the entropy dimensions of energy divided by temperature. In classical thermodynamics, following Clausius, the entropy is defined as $\Delta S = \int \delta Q_{\mathrm{rev}}/T$, which also has dimensions of energy over temperature. Given that temperature is average kinetic energy per degree of freedom, dimensional analysis gives us energy divided by energy per degree of freedom. So entropy has dimensions of degrees of freedom. It might now be obvious that this has something to do with order and disorder. Note that the dimensional analysis makes order and disorder relative to degrees of freedom. With more constraints, the amount of either will be decreased. Thus constraints simplify both our analysis of order and our analysis of disorder. This is an expected outcome. Note also that we can look at order and disorder in different sets of degrees of freedom, in which they may behave quite differently.
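The dimensional bookkeeping can be written out explicitly. The display below assumes only the reading of temperature just given, as average kinetic energy per degree of freedom:

```latex
[S] \;=\; \frac{[\text{energy}]}{[\text{temperature}]},
\qquad
[T] \;=\; \frac{[\text{energy}]}{[\text{degree of freedom}]}
\quad\Longrightarrow\quad
[S] \;=\; \frac{[\text{energy}]}{[\text{energy}]/[\text{degree of freedom}]}
\;=\; [\text{degrees of freedom}].
```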
It should be noted that, when entropy is written in terms of probabilities, the logarithm of a probability or fraction is negative, and the minus sign in that form of the definition makes the entropy positive. Information, though, is usually thought of as positive as well, even though it is the complement of entropy: the negentropy is the difference between the equilibrium entropy (uniquely defined, ideally, with all constraints released, though the equilibrium entropy may be path dependent in actual processes away from equilibrium) and the actual entropy. However, this is just the order in the system, which is typically taken as positive, just as disorder is. I conclude that it is the absolute values that matter, not the signs.
From this I think it is clear that binary digits are the natural units for information. One binary digit has one degree of freedom. Two binary digits have two degrees of freedom. And so on. This is also in line with the idea that distinctions are the basis of information (at least the ones that can make a difference); they are either there or they are not. Furthermore, there is the connection to propositional logic and to the measurement of information as the compressed length of the string of answers to sufficient yes/no questions.
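The claim that each free binary digit contributes one degree of freedom, and that constraints remove them, can be checked by brute enumeration. The sketch below (the three-digit system and the parity constraint are chosen purely for illustration) counts equally possible configurations and takes the base-2 logarithm:

```python
from itertools import product
from math import log2

def information_bits(configurations) -> float:
    """Information, in bits, carried by a set of equally possible configurations."""
    return log2(len(configurations))

n = 3
unconstrained = list(product("01", repeat=n))                        # every digit free
even_parity = [c for c in unconstrained if c.count("1") % 2 == 0]    # one constraint added

print(information_bits(unconstrained), "bits")  # 3.0: three degrees of freedom
print(information_bits(even_parity), "bits")    # 2.0: the constraint removes one
```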
If each asymmetry in a system is a degree of freedom, then the dimensional analysis converges on Scott Muller’s approach. Note also that the complexity of an idea depends (apart from any obscurity in how it is represented) on its degrees of freedom, so the dimensional analysis seems to work more generally, at least with ideas. In general, a more complex system will have fewer constraints and more degrees of freedom, so it will require more information to describe, tying the degrees of freedom to the message-length definitions. I conclude that there is one basic idea of information, from physics through to language.

Acknowledgments

This work was partially supported by an HCID grant from the National Research Foundation of South Africa from 2005 to 2020.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Bateson, G. Steps to an Ecology of Mind: Collected Essays in Anthropology, Psychiatry, Evolution, and Epistemology; University of Chicago Press: Chicago, IL, USA, 1972.
  2. MacKay, D.M. Information, Mechanism and Meaning; MIT Press: Cambridge, MA, USA, 1969.
  3. Collier, J. Information originates in symmetry breaking. Symmetry Sci. Cult. 1996, 7, 247–256.
  4. Spencer-Brown, G. The Laws of Form; Allen & Unwin: London, UK, 1969.
  5. Cull, P.; Frank, W. Flaws of Form. Int. J. Gen. Syst. 1979, 5, 201–211.
  6. Collier, J. Information, causation and computation. In Information and Computation: Essays on Scientific and Philosophical Understanding of Foundations of Information and Computation; Dodig-Crnkovic, G., Burgin, M., Eds.; World Scientific: Singapore, 2012.
  7. Chaitin, G.J. Algorithmic Information Theory; Cambridge University Press: Cambridge, UK, 1987.
  8. Muller, S.J. Asymmetry: The Foundation of Information; Springer: Berlin, Germany, 2007.
  9. Schrödinger, E. What Is Life? The Physical Aspect of the Living Cell; Cambridge University Press: Cambridge, UK, 1944.
  10. Brillouin, L.N. Science and Information Theory; Academic Press: London, UK, 1962.
  11. Landsberg, P.T. Entropy and order. In Disequilibrium and Self-Organization; Kilmister, C.W., Ed.; Springer: Dordrecht, The Netherlands, 1986.