Editorial

Conference Theoretical Information Studies Berkeley 2019 †

Mark Burgin
Department of Mathematics, University of California, Los Angeles, 520 Portola Plaza, Los Angeles, CA 90095, USA
Conference Theoretical Information Studies, Berkeley, California, USA, 2–6 June 2019.
Proceedings 2020, 47(1), 2; https://doi.org/10.3390/proceedings2020047002
Published: 13 May 2020
(This article belongs to the Proceedings of IS4SI 2019 Summit)

Abstract

This paper has a two-fold goal. In the first part, the area of theoretical information studies is described. In the second part, contributions to the conference “Theoretical Information Studies” (Berkeley 2019) are characterized.

1. Information Studies as a Field of Research and Domain of Knowledge

Information studies encompass the following basic fields:
  • Information science,
  • Information philosophy,
  • Information methodology, and
  • Information logic.
There are also fields directly associated with information studies:
Computer science is associated with information studies because it studies information processing by technical devices.
Linguistics is associated with information studies because it studies information processing by natural and artificial languages.
Semiotics is associated with information studies because it studies symbolic information processing.
Psychology is associated with information studies because it studies information processing by people.
Pedagogy is associated with information studies because it studies information transmission and knowledge acquisition.
Artificial intelligence is associated with information studies because it studies information processing by artificial systems.
There are also fields which are conceptually associated with information studies, including:
Information-oriented and information-based physical theories (cf. for example, [1,2,3,4,5,6,7,8,9]).
Information-oriented and information-based biological theories (cf. for example, [10,11,12,13,14,15,16,17,18]).
Information-oriented and information-based economic theories (cf. for example, [19,20,21,22,23,24,25,26]).
Information-oriented and information-based sociological theories.
Information-oriented and information-based anthropological theories.
Information science, like any natural science, consists of three components:
  • Theoretical information science,
  • Experimental information science, and
  • Applied information science.
Information theory constitutes the basic part of theoretical information science, which also includes subtheoretical fragments, i.e., research components situated on a lower level than the level of a theory. Although many think that information theory is Shannon’s statistical information theory [27], which was originally called communication theory, there are now many information theories: statistical information theory [27], algorithmic information theory [28], semantic information theory [29], pragmatic information theory [11], economic information theory [30], qualitative information theory [31], and the general theory of information [32,33]. One might suppose that the elaboration of a unified theory of information would eliminate the need for all other information theories. However, this is not the case. First, a unified theory of information has already been created: it is called the general theory of information [32]. Second, the role of this general theory is not the elimination of other special information theories but the unification of the scientific field called theoretical information studies in general and theoretical information science in particular. Other information theories become subtheories of the general theory of information, yet they remain necessary because they provide a more exact representation and study of information in specific fields, supply more adequate means for solving specific problems, and allow a better understanding of the role of information in different spheres of the world.
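As a small illustration of why several information theories remain useful side by side, the following Python sketch (purely illustrative, not part of the formal apparatus of any cited theory) compares two descriptions of the same strings: the per-symbol Shannon entropy of the empirical symbol distribution, as in statistical information theory, and the length of the zlib-compressed string, which is often used as a crude upper bound on algorithmic (Kolmogorov) complexity.

```python
import math
import random
import zlib
from collections import Counter


def shannon_entropy(text: str) -> float:
    """Per-symbol Shannon entropy (in bits) of the empirical symbol distribution."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())


def compressed_size(text: str) -> int:
    """Length (in bytes) of the zlib-compressed text, a crude upper bound
    on its algorithmic (Kolmogorov) complexity."""
    return len(zlib.compress(text.encode("utf-8"), level=9))


if __name__ == "__main__":
    random.seed(0)
    regular = "ab" * 500                                            # highly regular string
    irregular = "".join(random.choice("ab") for _ in range(1000))   # random string

    for name, s in (("regular", regular), ("irregular", irregular)):
        print(f"{name:9s}  entropy = {shannon_entropy(s):.3f} bits/symbol,"
              f"  compressed = {compressed_size(s)} bytes")
```

Both strings have the same zero-order entropy (one bit per symbol), yet the regular string compresses to a few dozen bytes while the random one does not; the statistical and algorithmic theories capture different aspects of the same object.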
Examples from mathematics and physics show that the existence of a general theory coexists with the active functioning of its subtheories. For instance, in mathematics the theory of groups contains subtheories—such as the theory of Abelian groups, the theory of nilpotent groups, and the theory of ordered groups—which are actively growing. In physics, Newton’s dynamics coexists with Kepler’s theory of planetary movement, which is a subtheory of Newton’s dynamics.
The goal of experimental information science is to study information using physical and mental experiments. It must be acknowledged that experimental information science is not yet sufficiently developed: so far, experiments have been conducted only within statistical information theory and algorithmic information theory. For instance, Shannon devised an experiment aimed at determining the amount of statistical information (entropy) carried by a single letter of English text.
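To make this concrete, a minimal sketch of such an estimate is given below. Shannon’s actual experiments relied on human subjects predicting successive letters of a text; the Python fragment below computes only the much simpler zero-order estimate from observed single-letter frequencies, and the short sample string is assumed here solely for illustration.

```python
import math
from collections import Counter


def letter_entropy(text: str) -> float:
    """Zero-order estimate of the entropy per letter (in bits),
    using only the observed frequencies of the letters a-z."""
    letters = [ch for ch in text.lower() if "a" <= ch <= "z"]
    counts = Counter(letters)
    n = len(letters)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())


if __name__ == "__main__":
    # Any sufficiently long English sample can be used; this short excerpt
    # merely illustrates the calculation.
    sample = ("Information studies encompass information science, information "
              "philosophy, information methodology, and information logic.")
    print(f"Estimated entropy: {letter_entropy(sample):.2f} bits per letter")
```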
It is possible to classify applied or practical information science by the domain of its application. This gives us three classes:
Theoretical applications, i.e., applications of information science to the theoretical areas of science and humanities, such as physics, biology, or economics.
Philosophical applications, i.e., applications of information science to philosophical teachings or methodological systems.
Practical applications, i.e., applications of information science in practical areas such as engineering, linguistics, or cryptography.
These considerations allow us to discern the following areas of theoretical applications:
Application of information theory to theoretical physics.
Application of information theory to theoretical computer science.
Application of information theory to the theory of complexity.
Application of information theory to mathematics.
Application of information theory to theoretical linguistics.
Application of information theory to economics.
Application of information theory to pedagogy.
Application of information theory to sociology.
Application of information theory to anthropology.
Application of information theory to semiotics.

2. Presentations at the Conference

The conference Theoretical Information Studies was organized as part of the IS4SI Summit 2019 in Berkeley, California, USA. It included on-site and off-site presentations. The goal of the conference was to bring together academics and researchers, providing beneficial conditions for presenting and discussing recent achievements and problems of information theory and its applications to theoretical issues in science and the humanities. The participants came from nine countries and four continents.
Here, we publish the proceedings of selected presentations from the conference Theoretical Information Studies containing innovative results and ideas, some of which are at the forefront of theoretical information science.
In their presentation, Gianfranco Basti, Antonio Capolupo, and Giuseppe Vitiello explored semantic information in quantum mechanics and quantum field theory.
In his presentation, Paul Benioff described how local mathematics and number scaling provide information about physical and geometric systems. His work is aimed at the unification of physics and mathematics.
In his presentation, Mark Burgin explicated and discussed two classes of information—potential and impact information. This approach is based on the inherent relations between physics and information theory.
In his presentation, Jaime F. Cárdenas-García presented his work with Timothy Ireland on analyzing and further developing Bateson’s approach to information theory.
In their presentation, Gordana Dodig-Crnkovic and Mark Burgin analyzed the presentation of information in recent books, delineating the emergent academic field of the study of information.
In his presentation, Wolfgang Hofkirchner explored interrelations between people and artificial systems such as computers and computer networks.
In another presentation, Wolfgang Hofkirchner discussed the origin of system thinking.
In their presentation, Stefan Leijnen and Fjodor van Veen presented the taxonomy of neural networks, which have become an important and very popular tool in information processing with the advance of deep learning.
In his presentation, Vladimir Lerner analyzed the natural origin of information and the natural encoding of information.
In his presentation, Rafal Maciag described and analyzed the ontological basis of knowledge in the theory of discursive space and its consequences.
In his presentation, Marcin Schroeder described and scrutinized a variety of important theoretical tools for the study of information, such as equivalence and cryptomorphism.
In his presentation, Paul Zellweger described the Branching Data Model, which he developed for data management and information visualization using the theory of named sets.
In his presentation, Yixin Zhong analyzed concepts of information in the context of artificial intelligence research, specifying two types of information—ontological and epistemological information.
In his presentation, Rao Mikkilineni described the advancement of computation beyond the boundaries of the Church–Turing thesis, based on structural machines, digital genes, and digital neurons developed in the context of future AI.
In her presentation, Annette Grathoff explored the structural and kinetic components of physical information from an evolutionary perspective.
In their presentation, Mark Burgin and Kees De Vey Mestdagh discussed their research on the complexity of the information components of legal systems and processes.
In their presentation, Mark Burgin and Gordana Dodig-Crnković introduced and explored the novel scheme of modeling computing devices by information operators.
In his presentation, Jose Monserrat Neto described and analyzed the birth and evolution of symbolic information.
In another presentation, Jose Monserrat Neto considered the diversification of symbolic systems.
In his presentation, Mark Burgin outlined the problem-oriented foundations of intelligence in the context of superintelligence, suggesting a mathematical approach to the formalization of intelligence.

Funding

This research received no external funding.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Frieden, R.B. Physics from Fisher Information; Cambridge University Press: Cambridge, UK, 1998. [Google Scholar]
  2. Harmuth, H.F. Information Theory Applied to Space-Time Physics; World Scientific: Singapore, 1992. [Google Scholar]
  3. Chiribella, G.; D’Ariano, G.M.; Perinotti, P. Informational derivation of quantum theory. Phys. Rev. A 2011, 84, 012311. [Google Scholar] [CrossRef]
  4. Chiribella, G.; D’Ariano, G.M.; Perinotti, P. Quantum theory, namely the pure and reversible theory of information. Entropy 2012, 14, 1877–1893. [Google Scholar] [CrossRef]
  5. D’Ariano, G.M. Physics without physics: The power of information-theoretical principles. Int. J. Theor. Phys. 2017, 56, 97–128. [Google Scholar] [CrossRef]
  6. Fuchs, C. Quantum mechanics as quantum information (and only a little more). arXiv 2002, arXiv:quant-ph/0205039v1. [Google Scholar]
  7. Goyal, P. Information physics—Towards a new conception of physical reality. Information 2012, 3, 567–594. [Google Scholar] [CrossRef]
  8. Lee, J.-W. Quantum mechanics emerges from information theory applied to causal horizons. Found. Phys. 2011, 41, 744–753. [Google Scholar] [CrossRef]
  9. Clifton, R.; Bub, J.; Halvorson, H. Characterizing quantum theory in terms of information-theoretic constraints. Found. Phys. 2003, 33, 1561–1591. [Google Scholar] [CrossRef]
  10. Pattee, H.H. The Physics of Autonomous Biological Information. Biol. Theory: Integr. Dev. Evol. Cogn. 2006, 1, 224–226. [Google Scholar] [CrossRef]
  11. Weinberger, E.D. A theory of pragmatic information and its application to the quasi-species model of biological evolution. Biosystems 2002, 66, 105–119. [Google Scholar] [CrossRef]
  12. Loewenstein, W.R. The Touchstone of Life: Molecular Information, Cell Communication, and the Foundation of Life; Oxford University Press: Oxford, UK; New York, NY, USA, 1999. [Google Scholar]
  13. Barbieri, M. The Organic Codes: An Introduction to Semantic Biology; Cambridge University Press: Cambridge, UK, 2003. [Google Scholar]
  14. Burgin, M.; Simon, I. Information, Energy, and Evolution, electronic ed.; Preprint in Biology 2359, Cogprints; 2001. Available online: http://cogprints.ecs.soton.ac.uk (accessed on 12 May 2020).
  15. Lerner, V.S. Information Macrodynamic Approach for Modeling in Biology and Medicine. J. Biol. Syst. 1997, 5, 215–264. [Google Scholar] [CrossRef]
  16. Lerner, V.S. Macrodynamic cooperative complexity of biosystems. J. Biol. Syst. 2006, 14, 131–168. [Google Scholar] [CrossRef]
  17. Lerner, V.S.; Talyanker, M.I. Informational Geometry for Biology and Environmental Applications. In Proceedings of the International Conference on Environmental Modeling and Simulation, San Diego, CA, USA, 10–14 January 2000; pp. 79–84. [Google Scholar]
  18. Reinagel, P. Information theory in the brain. Curr. Biol. 2000, 10, 542–544. [Google Scholar] [CrossRef]
  19. Arrow, K.J. Collected Papers of Kenneth J. Arrow; Volume 4: The Economics of Information; Harvard University Press: Cambridge, MA, USA, 1984. [Google Scholar]
  20. Boisot, M.H. Knowledge Assets: Securing Competitive Advantage in the Information Economy; Oxford University Press: Oxford, UK, 1998. [Google Scholar]
  21. Laffont, J.J. The Economics of Uncertainty and Information; Bonin, J.P., Bonin, H., Translators; MIT Press: Cambridge, MA, USA, 1989. [Google Scholar]
  22. Lerner, V.S. An elementary information macrodynamic model of market economic system. J. Inf. Sci. 2006, 176, 3556–3590. [Google Scholar] [CrossRef]
  23. Marschak, J. Towards an economic theory of organization and information. In Decision Processes; John Wiley: New York, NY, USA, 1954; pp. 187–220. [Google Scholar]
  24. Marschak, J. Economics of information systems. J. Am. Stat. Assoc. 1971, 66, 192–219. [Google Scholar] [CrossRef]
  25. Marschak, J. Economic Information, Decision, and Prediction: Selected Essays, Volume II, Economics of Information and Organization; Theory and Decision Library; D. Reidel Publishing Co.: Dordrecht, The Netherlands; Boston, MA, USA, 1980. [Google Scholar]
  26. Stigler, G.J. The economics of information. J. Polit. Econ. 1961, 69, 213–225. [Google Scholar] [CrossRef]
  27. Shannon, C.E. The Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef]
  28. Chaitin, G.J. Algorithmic information theory. IBM J. Res. Dev. 1977, 21, 350–359. [Google Scholar] [CrossRef]
  29. Bar-Hillel, Y.; Carnap, R. Semantic Information. Br. J. Philos. Sci. 1953, 4, 147–157. [Google Scholar] [CrossRef]
  30. Arrow, K.J.; Marschak, J.; Harris, T. Economics of information and organization: Optimal inventory policy. Econometrica 1951, 19, 250–272. [Google Scholar] [CrossRef]
  31. Mazur, M. Jakosciowa Teoria Informacji; PAN: Warszawa, Poland, 1970. (In Polish) [Google Scholar]
  32. Burgin, M. Theory of Information: Fundamentality, Diversity and Unification; World Scientific: New York, NY, USA; London, UK; Singapore, 2010. [Google Scholar]
  33. Burgin, M. The General Theory of Information as a Unifying Factor for Information Studies: The noble eight-fold path. Proceedings 2017, 1, 164. [Google Scholar] [CrossRef]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
