Selected Papers from "FIS 2010 Beijing"

A special issue of Information (ISSN 2078-2489).

Deadline for manuscript submissions: closed (30 June 2011) | Viewed by 128152

Special Issue Editor

Grupo de Bioinformación / Bioinformation Group, Instituto Aragonés de Ciencias de la Salud, Centro de Investigación Biomédica de Aragón (CIBA), 50009 Zaragoza, Spain
Interests: multidisciplinary research; systems biology; biology & information; scientomics; sensory-motor approach; laughter research; social information; information science; information philosophy

Special Issue Information

Conference website: http://www.sciforum.net/conf/fis2010

Published Papers (16 papers)

Editorial

Editorial
Introduction to the Special Issue on Information: Selected Papers from “FIS 2010 Beijing”
by Raquel del Moral and Pedro C. Marijuán
Information 2012, 3(1), 16-20; https://doi.org/10.3390/info3010016 - 04 Jan 2012
Viewed by 5375
Abstract
During the last two decades, a systematic re-examination of the whole information science field has taken place around the FIS—Foundations of Information Science—initiative. On the occasion of its Fourth Conference in Beijing 2010, a group of selected contributors and leading practitioners of these fields were invited to contribute to this Special Issue. What is the status of information science today? What is the relationship between information and the laws of nature? Is information merely “physical”? What is the difference between information and computation? Has the genomic revolution changed contemporary views on information and life? And what about the nature of social information? Cogent answers to these questions, and to many others, are attempted in the contributions that follow.
(This article belongs to the Special Issue Selected Papers from "FIS 2010 Beijing")

Research

Article
Emergence and Evolution of Meaning: The General Definition of Information (GDI) Revisiting Program—Part 2: The Regressive Perspective: Bottom-up
by José M. Díaz Nafría and Rainer E. Zimmermann
Information 2013, 4(2), 240-261; https://doi.org/10.3390/info4020240 - 30 May 2013
Cited by 7 | Viewed by 8392
Abstract
In this second part of our inquiry into the emergence and evolution of meaning, the category of meaning is explored from the manifestation of reality at its corresponding level of interaction towards the interpretation of such reality (the first part deals correspondingly with an appropriate top-down approach). Based on the physical constraints of manifestation through electromagnetic waves, which constitutes the basis of animal vision, we analyze the limits of the meaning-offer of such a manifestation. This allows us, on the one hand, to compare the efficiency of natural evolution in the reception of such meaning-offers and, on the other, to analyze the conditions for developing an agency able to acknowledge the reality underlying its manifestation. Regarding the complexity of such an agency and its related pragmatic response, we distinguish different levels, which allow us to develop the General Definition of Information (GDI) properly with respect to interpretation, as advanced in the first part, throughout nature. As we show at the end, our approach provides new grounds for the Unified Theory of Information (UTI) Program, as well as the possibility of bridging other approaches in the converging fields of information, meaning, computation, and communication.
(This article belongs to the Special Issue Selected Papers from "FIS 2010 Beijing")
Article
Emergence and Evolution of Meaning: The General Definition of Information (GDI) Revisiting Program—Part I: The Progressive Perspective: Top-Down
by Rainer E. Zimmermann and José M. Díaz Nafría
Information 2012, 3(3), 472-503; https://doi.org/10.3390/info3030472 - 19 Sep 2012
Cited by 5 | Viewed by 9797
Abstract
In this first part of the paper, the category of meaning is traced starting from the origin of the Universe itself, as well as from its very grounding in pre-geometry (the second part deals with an appropriate bottom-up approach). In contrast to many former approaches in theories of information and also in biosemiotics, we will show that the forms of meaning emerge simultaneously with information and energy. Hence, information can be visualized as always meaningful (in a sense to be explicated), rather than meaning being viewed as a later specification of information arising only within social systems. This perspective has two immediate consequences: (1) We follow the GDI as defined by Floridi, though we modify it somewhat with respect to the aspect of truthfulness. (2) We can conceptually solve Capurro’s trilemma. Hence, what we actually do is follow the strict (i.e., optimistic) line of UTI in Hofkirchner’s sense. While doing this, we treat energy and information as two different categorial aspects of one and the same underlying primordial structure. We thus demonstrate in some detail the presently developing convergence of physics, biology, and computer science (as well as the various theories of information), and sketch a line of argument eventually leading up to the further unification of UTI and biosemiotics.
(This article belongs to the Special Issue Selected Papers from "FIS 2010 Beijing")
Article
Epistemic Information in Stratified M-Spaces
by Mark Burgin
Information 2011, 2(4), 697-726; https://doi.org/10.3390/info2040697 - 16 Dec 2011
Cited by 7 | Viewed by 6459
Abstract
Information is usually related to knowledge. However, the recent development of information theory has demonstrated that information is a much broader concept, being actually present in and virtually related to everything. As a result, many unknown types and kinds of information have been discovered. Nevertheless, information that acts on knowledge, bringing new knowledge and updating existing knowledge, is of primary importance to people. It is called epistemic information, and it is studied in this paper on the basis of the general theory of information, further developing its mathematical stratum. As a synthetic approach that reveals the essence of information, organizing and encompassing all the main directions in information theory, the general theory of information provides efficient means for such a study. Different representations of information dynamics use tools from mathematical disciplines such as category theory, functional analysis, mathematical logic, and algebra. Here we employ algebraic structures for the exploration of information and knowledge dynamics. In the Introduction (Section 1), we discuss previous studies of epistemic information. Section 2 gives a compressed description of the parametric phenomenological definition of information in the general theory of information. In Section 3, anthropic information, which is received, exchanged, processed and used by people, is singled out and studied based on the Componential Triune Brain model. One of the basic forms of anthropic information, called epistemic information and related to knowledge, is analyzed in Section 4. Mathematical models of epistemic information are studied in Section 5. In the Conclusion, some open problems related to epistemic information are given.
(This article belongs to the Special Issue Selected Papers from "FIS 2010 Beijing")
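As a minimal illustration of the abstract's central idea, namely information that acts on knowledge by adding new items and updating existing ones, the following Python sketch treats a knowledge state as a set of propositions and an epistemic information item as an operator on such states. The class names and update rule are illustrative assumptions, not Burgin's M-space formalism.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class KnowledgeState:
    """A toy knowledge state: a frozen set of accepted propositions."""
    propositions: frozenset = field(default_factory=frozenset)

@dataclass(frozen=True)
class EpistemicInfo:
    """A toy epistemic information item: adds some propositions, retracts others."""
    adds: frozenset = field(default_factory=frozenset)
    retracts: frozenset = field(default_factory=frozenset)

    def apply(self, state: KnowledgeState) -> KnowledgeState:
        # Information acts on knowledge: new state = (old - retracted) + added.
        return KnowledgeState((state.propositions - self.retracts) | self.adds)

# Example: new evidence retracts two beliefs and adds two others.
state = KnowledgeState(frozenset({"the sample is pure", "mass = 10 g"}))
info = EpistemicInfo(adds=frozenset({"mass = 9.8 g", "trace impurity present"}),
                     retracts=frozenset({"the sample is pure", "mass = 10 g"}))
print(info.apply(state).propositions)
```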
Article
From Genomics to Scientomics: Expanding the Bioinformation Paradigm
by Raquel del Moral, Mónica González, Jorge Navarro and Pedro C. Marijuán
Information 2011, 2(4), 651-671; https://doi.org/10.3390/info2040651 - 09 Nov 2011
Cited by 12 | Viewed by 7270
Abstract
Contemporary biological research (particularly in systems biology and the “omic” disciplines) is factually answering some of the poignant questions associated with the information concept and the limitations of information theory. Here, rather than emphasizing and persisting on a focalized discussion about the i-concept, an ampler conception of “informational entities” will be advocated. The way living cells self-produce, interact with their environment, and collectively organize multi-cell systems becomes a paradigmatic case of what such informational entities consist of. Starting with the fundamentals of molecular recognition, and continuing with the basic cellular processes and subsystems, a new interpretation of the global organization of the living cell must be assayed, so that the equivalents of meaning, value, and intelligence will be approached along an emerging “bioinformational” perspective. Further insights on the informational processes of brains, companies, institutions and human societies at large, and even the sciences themselves, could benefit from—and cross-fertilize with—the advancements derived from the informational approach to living systems. The great advantage fuelling the expansion of the bioinformation paradigm is that, today, cellular information processes may be defined almost to completion at the molecular scale (at least in the case of prokaryotic cells). This is not the case, evidently, with nervous systems and the variety of human organizational, cultural, and social developments. Concretely, the crucial evolutionary phenomenon of protein-domain recombination—knowledge recombination—will be analyzed here as a showcase of, and even as a model for, the interdisciplinary and multidisciplinary mixing of the sciences so prevalent in contemporary societies. Scientomics will be proposed as a new research endeavor to assist advancement. Informationally, the “society of enzymes” appears as a forerunner of the “society of neurons”, and even of the “society of individuals”.
(This article belongs to the Special Issue Selected Papers from "FIS 2010 Beijing")
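The analogy at the heart of the abstract, protein-domain recombination as a model for the recombinatory mixing of disciplines ("scientomics"), can be pictured with a toy sketch. In the hypothetical Python snippet below, proteins and sciences are simply tuples of reusable modules, and "recombination" enumerates new pairings of modules already in circulation; the sample data and the enumeration scheme are our own illustration, not the authors' analysis.

```python
from itertools import product

# Toy "domain architectures": each entity is a tuple of reusable modules.
proteins = {
    "kinase_A": ("SH2", "kinase"),
    "adaptor_B": ("SH2", "SH3"),
}
sciences = {
    "bioinformatics": ("biology", "computing"),
    "econophysics": ("economics", "statistical physics"),
}

def recombine(architectures):
    """Enumerate novel two-module combinations built from modules already in use."""
    modules = {m for arch in architectures.values() for m in arch}
    existing = set(architectures.values())
    return [pair for pair in product(sorted(modules), repeat=2)
            if pair[0] != pair[1] and pair not in existing]

print(recombine(proteins)[:3])   # candidate new domain pairings
print(recombine(sciences)[:3])   # candidate new interdisciplinary pairings
```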
Article
Interdisciplinary Research between Theoretical Informatics and the Humanities
by Zong-Rong Li, Xiao Zhou and Ai-Jing Tian
Information 2011, 2(3), 546-559; https://doi.org/10.3390/info2030546 - 16 Sep 2011
Viewed by 7428
Abstract
This paper focuses on the interdisciplinary research between Theoretical Informatics (TI) and the Humanities (philosophy, history, literature, etc.). There are five main sections: 1. A brief introduction to TI and its functions with respect to worldview and methodology; 2. An illustration of the problems associated with dualism as set out by Plato and René Descartes, by means of a theoretical model of the mutual contact and interaction between the material world and the information world; 3. An explanation of the historical view of R. G. Collingwood through informationalism; 4. A discussion of the basic concepts for Humanistic Informatics, which is under construction; and 5. A proposal of an approach to this new subject within information science.
(This article belongs to the Special Issue Selected Papers from "FIS 2010 Beijing")
Article
Information Science: Its Past, Present and Future
by Xue-Shan Yan
Information 2011, 2(3), 510-527; https://doi.org/10.3390/info2030510 - 23 Aug 2011
Cited by 13 | Viewed by 21321
Abstract
Early in its history and development, there were three types of classical information sciences: computer and information science, library and information science, and telecommunications and information science. With the infiltration of the concept of information into various fields, an information discipline community of around 200 members was formed around sub-fields named information theory, informatics, or information science. For such a large community, this paper discusses a systematization, two trends of thought, and some perspectives and suggestions.
(This article belongs to the Special Issue Selected Papers from "FIS 2010 Beijing")
Article
Concept of Information as a Bridge between Mind and Brain
by Marcin J. Schroeder
Information 2011, 2(3), 478-509; https://doi.org/10.3390/info2030478 - 16 Aug 2011
Cited by 14 | Viewed by 7256
Abstract
The article focuses on the special role of the concept of information, understood in terms of the one-many categorical opposition, in building a bridge between mind and brain. This particular choice of the definition of information allows unification of the two main manifestations of information implicitly present in the literature, the selective and the structural. It is shown that the concept of information formulated this way, together with the concept of information integration, can be used to explain the unity of conscious experience and, furthermore, to resolve several fundamental problems, such as understanding the experiential aspect of consciousness without falling into the homunculus fallacy, defending free will from mechanistic determinism, and explaining symbolic representation and aesthetic experience. The dual character of the selective and structural manifestations opens a way between the orthodox information-scientific description of the brain in terms of the former and a description of the mind in terms of the latter.
(This article belongs to the Special Issue Selected Papers from "FIS 2010 Beijing")
Article
Dynamics of Information as Natural Computation
by Gordana Dodig Crnkovic
Information 2011, 2(3), 460-477; https://doi.org/10.3390/info2030460 - 04 Aug 2011
Cited by 28 | Viewed by 8001
Abstract
Processes considered to render information dynamics have been studied in, among other settings: questions and answers, observations, communication, learning, belief revision, logical inference, game-theoretic interactions and computation. This article puts the computational approaches into a broader context of natural computation, where information dynamics is found not only in human communication and computational machinery but also throughout the whole of nature. Information is understood as representing the world (reality as an informational web) for a cognizing agent, while information dynamics (information processing, computation) realizes physical laws through which all the changes of informational structures unfold. Computation as it appears in the natural world is more general than the human process of calculation modeled by the Turing machine. Natural computing is epitomized by the interactions of concurrent, in general asynchronous, computational processes, which are adequately represented by what Abramsky names “the second generation models of computation” [1], and which we argue to be the most general representation of information dynamics.
(This article belongs to the Special Issue Selected Papers from "FIS 2010 Beijing")
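To make the contrast concrete between a single Turing-style calculation and computation as interaction among concurrent, asynchronous processes, here is a toy asyncio sketch in Python. It is only an intuition pump under our own assumptions, not Abramsky's formal second-generation models: two processes change their informational states solely through the messages they exchange.

```python
import asyncio

async def agent(name, inbox, outbox, state, rounds=3):
    """A toy interactive process: its state changes only via message exchange."""
    for _ in range(rounds):
        await outbox.put((name, state))        # offer current information
        sender, received = await inbox.get()   # receive, possibly out of step
        state = state + received               # update the informational structure
        print(f"{name} heard {received} from {sender}; state is now {state}")
    return state

async def main():
    a_to_b, b_to_a = asyncio.Queue(), asyncio.Queue()
    # Two concurrent, asynchronous processes computing by interaction.
    await asyncio.gather(agent("A", inbox=b_to_a, outbox=a_to_b, state=1),
                         agent("B", inbox=a_to_b, outbox=b_to_a, state=2))

asyncio.run(main())
```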
Article
On Symmetries and the Language of Information
by György Darvas
Information 2011, 2(3), 455-459; https://doi.org/10.3390/info2030455 - 22 Jul 2011
Viewed by 5353
Abstract
Many writings on information mix information on a given system (IS), measurable information content of a given system (IM), and the (also measurable) information content that we communicate among us on a given system (IC). They belong to different levels and different aspects of information. The first (IS) involves everything that one possibly can, at least potentially, know about a system, but will never learn completely. The second (IM) contains quantitative data that one really learns about a system. The third (IC) relates to the language (including mathematical language) by which we transmit information on the system to one another, rather than to the system itself. The information content of a system (IM—this is what we generally mean by information) may include all (relevant) data on each element of the system. However, we can reduce the quantity of information we need to convey to each other (IC) if we refer to certain symmetry principles or natural laws to which the elements of the given system correspond. Instead of listing the data for all elements separately, even in a not very extreme case, we can give a short mathematical formula that informs us about the data of the individual elements of the system. This abbreviated form of information delivery relies on several conventions. These conventions are protocols that we have learnt before and that do not need to be repeated each time in the given community. They include the knowledge that the scientific community accumulated earlier when it discovered and formulated the regularity in question (for example, a symmetry principle or a law of nature), the language in which those regularities were formulated and then accepted by the community, and the mathematical marks and abbreviations that are known only to the members of the given scientific community. We do not need to repeat the rules of the convention each time, because the conveyed information includes them, and they are there in our minds behind our communicated data on the information content. I demonstrate this using two examples: Kepler’s laws, and the law of correspondence between the DNA codons’ triplet structure and the individual amino acids which they encode. The information content of the language by which we communicate the obtained information cannot be identified with the information content of the system that we want to characterize; moreover, it does not include all the possible information that we could potentially learn about the system. Symmetry principles and natural laws may reduce the information we need to communicate about a system, but we must keep in mind the conventions that we have learnt about the abbreviating mechanism of those principles, laws, and mathematical descriptions.
(This article belongs to the Special Issue Selected Papers from "FIS 2010 Beijing")
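The abbreviation mechanism described above can be illustrated with the author's first example, Kepler's laws. In the toy Python sketch below (with approximate planetary data), the communicated information IC reduces to a one-line formula, Kepler's third law T^2 = a^3 with periods in years and semi-major axes in astronomical units, while the shared convention is the prior agreement on those units and on the law itself; the element-by-element table of periods need not be transmitted at all.

```python
# Data we could transmit element by element (semi-major axes in AU, periods in years).
semi_major_axis = {"Mercury": 0.387, "Venus": 0.723, "Earth": 1.000,
                   "Mars": 1.524, "Jupiter": 5.203}

# Or we transmit only the law (Kepler's third law, T^2 = a^3 in these units)
# and rely on the shared convention about units and notation.
def period_from_law(a_in_au: float) -> float:
    return a_in_au ** 1.5

for planet, a in semi_major_axis.items():
    print(f"{planet}: predicted period = {period_from_law(a):.2f} yr")
```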
Article
Naturalizing Information
by Stanley N. Salthe
Information 2011, 2(3), 417-425; https://doi.org/10.3390/info2030417 - 07 Jul 2011
Cited by 8 | Viewed by 8760
Abstract
Certain definitions of information can be seen to be compatible with each other if their relationships are properly understood as referring to different levels of organization in a subsumptive hierarchy. The resulting hierarchy, with thermodynamics subsuming information theory, and that in turn subsuming semiotics, amounts to a naturalizing of the information concept.
(This article belongs to the Special Issue Selected Papers from "FIS 2010 Beijing")
Article
Unity-Based Diversity: System Approach to Defining Information
by Yixin Zhong
Information 2011, 2(3), 406-416; https://doi.org/10.3390/info2030406 - 05 Jul 2011
Cited by 6 | Viewed by 6131
Abstract
What is information? This is the first question that information science should answer clearly. However, the definitions of information have become so diversified that people question whether there is any unity among the diversity, leading to doubts about whether it is possible to establish a unified theory of information at all. To answer this question, a system approach to defining information is introduced in this paper. It is proved that the unity of information definitions can be maintained with this approach. As a by-product, an important concept, the information eco-system, is also obtained.
(This article belongs to the Special Issue Selected Papers from "FIS 2010 Beijing")
Article
Receptive Openness to a Message and Its Dative—Materialist Origin of Time
by Koichiro Matsuno
Information 2011, 2(3), 383-405; https://doi.org/10.3390/info2030383 - 01 Jul 2011
Cited by 3 | Viewed by 5626
Abstract
Information precipitates the flow of time from scratch. Information as a noun, equivalent of the transitive verb “inform”, stands out in the contrast between a direct and an indirect object of the verb, that is to say, between the messenger of a message and its dative. The root of the contrast is sought in the occurrence of the flow of time in the sense that the flow requires both the invariant reference and the dative being subject to something flowing through against the reference. Empirical evidence of the contrast is found in the class identity kept by a molecular aggregate that can constantly exchange the constituent molecular subunits with those of a similar kind available in the neighborhood. The exchange of the subunits derives from the action of pulling-in, originating from the inside of the body holding the class identity. The action of pulling-in that underlies the synthesis of the flow of time empirically in a bottom-up manner originates in the constant update of the present perfect tense in the present progressive tense. The material aggregate preserving the class identity at the cost of the vicissitudes of the constituent individual subunits serves as the dative of information. The unfathomable depth of information is associated with the immense multitude of the messengers in their kinds toward the likely datives having the capacity of receiving them. The bottom line is that being informed is materially being receptive to a flow of substrate, so the information is being embodied by the receptor.
(This article belongs to the Special Issue Selected Papers from "FIS 2010 Beijing")
Article
Toward a New Science of Information
by Wolfgang Hofkirchner
Information 2011, 2(2), 372-382; https://doi.org/10.3390/info2020372 - 16 Jun 2011
Cited by 5 | Viewed by 6569
Abstract
Currently, a Science of Information does not exist. What we have is Information Science that grew out of Library and Documentation Science with the help of Computer Science. The basic understanding of information in Information Science is the Shannon type of “information” at which numerous criticisms have been levelled so far. The task of an as-yet-to-be-developed Science of Information would be to study the feasibility of, and to advance, approaches toward a more general Theory of Information and toward a common concept of information. What scientific requirements need to be met when trying to develop a Science of Information? What are the aims of a Science of Information? What is the scope of a Science of Information? What tools should a Science of Information make use of? The present paper responds to these questions.
(This article belongs to the Special Issue Selected Papers from "FIS 2010 Beijing")
Article
Spencer-Brown vs. Probability and Statistics: Entropy’s Testimony on Subjective and Objective Randomness
by Julio Michael Stern
Information 2011, 2(2), 277-301; https://doi.org/10.3390/info2020277 - 04 Apr 2011
Cited by 3 | Viewed by 7457
Abstract
This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. “Objects as eigen-solutions” is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.
(This article belongs to the Special Issue Selected Papers from "FIS 2010 Beijing")

Other

Essay
Towards Quantifying a Wider Reality: Shannon Exonerata
by Robert E. Ulanowicz
Information 2011, 2(4), 624-634; https://doi.org/10.3390/info2040624 - 25 Oct 2011
Cited by 23 | Viewed by 6184
Abstract
In 1872 Ludwig Boltzmann derived a statistical formula to represent the entropy (an apophasis) of a highly simplistic system. In 1948 Claude Shannon independently formulated the same expression to capture the positivist essence of information. Such contradictory thrusts engendered decades of ambiguity concerning exactly what is conveyed by the expression. Resolution of widespread confusion is possible by invoking the third law of thermodynamics, which requires that entropy be treated in a relativistic fashion. Doing so parses the Boltzmann expression into separate terms that segregate apophatic entropy from positivist information. Possibly more importantly, the decomposition itself portrays a dialectic-like agonism between constraint and disorder that may provide a more appropriate description of the behavior of living systems than is possible using conventional dynamics. By quantifying the apophatic side of evolution, the Shannon approach to information achieves what no other treatment of the subject affords: It opens the window on a more encompassing perception of reality.
(This article belongs to the Special Issue Selected Papers from "FIS 2010 Beijing")
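The split described in the abstract can be followed numerically on a toy joint distribution. The Python sketch below uses the standard Shannon identities to separate a "constraint" term (average mutual information) from the residual conditional "disorder"; it is meant only as an illustration of this kind of decomposition, and the paper's own notation and its relativistic, third-law treatment may differ.

```python
import numpy as np

# Toy joint distribution p(a_i, b_j) over two variables A and B.
p = np.array([[0.30, 0.05],
              [0.10, 0.55]])

def H(dist):
    """Shannon entropy in bits, ignoring zero cells."""
    dist = dist[dist > 0]
    return float(-(dist * np.log2(dist)).sum())

pa, pb = p.sum(axis=1), p.sum(axis=0)   # marginal distributions
joint = H(p)                            # total uncertainty of the joint description
mutual = H(pa) + H(pb) - joint          # "constraint" shared between the variables
residual = joint - mutual               # leftover disorder, H(A|B) + H(B|A)

print(f"H(A,B) = {joint:.3f} bits = constraint {mutual:.3f} + disorder {residual:.3f}")
```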