Special Issue "Selected Papers from “FIS 2010 Beijing”"


A special issue of Information (ISSN 2078-2489).

Deadline for manuscript submissions: closed (30 June 2011)

Special Issue Editor

Guest Editor
Prof. Dr. Pedro C. Marijuán

Grupo de Bioinformación / Bioinformation Group, Instituto Aragonés de Ciencias de la Salud, Centro de Investigación Biomédica de Aragón (CIBA), Avda. San Juan Bosco, 13, planta X, 50009 Zaragoza, Spain
Interests: multidisciplinary research; systems biology; biology & information; scientomics; sensory-motor approach; laughter research; social information; information science; information philosophy

Special Issue Information

Conference website: http://www.sciforum.net/conf/fis2010

Published Papers (16 papers)


Editorial


Open Access Editorial: Introduction to the Special Issue on Information: Selected Papers from “FIS 2010 Beijing”
Information 2012, 3(1), 16-20; doi:10.3390/info3010016
Received: 15 December 2011 / Accepted: 30 December 2011 / Published: 4 January 2012
Abstract
During the last two decades, a systematic re-examination of the whole information science field has taken place around the FIS (Foundations of Information Science) initiative. On the occasion of its Fourth Conference, in Beijing in 2010, a group of selected contributors and leading practitioners of those fields were invited to contribute to this Special Issue. What is the status of information science today? What is the relationship between information and the laws of nature? Is information merely “physical”? What is the difference between information and computation? Has the genomic revolution changed contemporary views on information and life? And what about the nature of social information? Cogent answers to these questions, and to many others, are attempted in the contributions that follow.
(This article belongs to the Special Issue Selected Papers from "FIS 2010 Beijing")

Research


Open Access Article: Emergence and Evolution of Meaning: The General Definition of Information (GDI) Revisiting Program—Part 2: The Regressive Perspective: Bottom-up
Information 2013, 4(2), 240-261; doi:10.3390/info4020240
Received: 28 February 2013 / Revised: 28 April 2013 / Accepted: 8 May 2013 / Published: 30 May 2013
Abstract
In this second part of our inquiry into the emergence and evolution of meaning, the category of meaning is explored from the manifestation of reality at its corresponding level of interaction towards the interpretation of such reality (the first part deals correspondingly with an appropriate top-down approach). Based on the physical constraints of manifestation through electromagnetic waves, which constitute the basis of animal vision, we analyze the limits of the meaning-offer of such a manifestation. This allows us, on the one hand, to compare the efficiency of natural evolution in the reception of such meaning-offers and, on the other hand, to analyze the conditions for developing agency able to acknowledge the reality underlying its manifestation. Regarding the complexity of such an agency and its related pragmatic response, we distinguish different levels, which allow us to develop the General Definition of Information (GDI) properly with respect to interpretation, as advanced in the first part, throughout nature. As we show at the end, our approach provides new grounds for the Unified Theory of Information (UTI) Program, as well as the possibility of bridging other approaches in the converging fields of information, meaning, computation, and communication.

Open Access Article: Emergence and Evolution of Meaning: The General Definition of Information (GDI) Revisiting Program—Part I: The Progressive Perspective: Top-Down
Information 2012, 3(3), 472-503; doi:10.3390/info3030472
Received: 15 June 2012 / Revised: 28 July 2012 / Accepted: 31 July 2012 / Published: 19 September 2012
Cited by 1
Abstract
In this first part of the paper, the category of meaning is traced starting from the origin of the Universe itself, as well as its very grounding in pre-geometry (the second part deals with an appropriate bottom-up approach). In contrast to many former approaches in the theories of information and in biosemiotics, we show that the forms of meaning emerge simultaneously (alongside) with information and energy. Hence, information can be visualized as being always meaningful (in a sense to be explicated), rather than visualizing meaning as a later specification of information within social systems only. This perspective has two immediate consequences: (1) we follow the GDI as defined by Floridi, though we modify it somewhat with respect to the aspect of truthfulness; (2) we can conceptually solve Capurro’s trilemma. Hence, what we actually do is follow the strict (i.e., optimistic) line of UTI in the sense of Hofkirchner. While doing this, we treat energy and information as two different categorial aspects of one and the same underlying primordial structure. We thus demonstrate the presently developing convergence of physics, biology, and computer science (as well as the various theories of information) in some detail, and draft a line of argument eventually leading to the further unification of UTI and biosemiotics.
Open Access Article: Epistemic Information in Stratified M-Spaces
Information 2011, 2(4), 697-726; doi:10.3390/info2040697
Received: 15 September 2011 / Revised: 24 November 2011 / Accepted: 1 December 2011 / Published: 16 December 2011
Cited by 1
Abstract
Information is usually related to knowledge. However, the recent development of information theory has demonstrated that information is a much broader concept, being actually present in and virtually related to everything. As a result, many previously unknown types and kinds of information have been discovered. Nevertheless, information that acts on knowledge, bringing new knowledge and updating existing knowledge, is of primary importance to people. It is called epistemic information, and it is studied in this paper on the basis of the general theory of information, further developing its mathematical stratum. As a synthetic approach that reveals the essence of information, organizing and encompassing all main directions in information theory, the general theory of information provides efficient means for such a study. Different representations of information dynamics use tools from mathematical disciplines such as category theory, functional analysis, mathematical logic, and algebra. Here we employ algebraic structures for the exploration of information and knowledge dynamics. In the Introduction (Section 1), we discuss previous studies of epistemic information. Section 2 gives a compressed description of the parametric phenomenological definition of information in the general theory of information. In Section 3, anthropic information, which is received, exchanged, processed, and used by people, is singled out and studied on the basis of the Componential Triune Brain model. One of the basic forms of anthropic information, called epistemic information, which is related to knowledge, is analyzed in Section 4. Mathematical models of epistemic information are studied in Section 5. In the Conclusion, some open problems related to epistemic information are given.
Open Access Article: From Genomics to Scientomics: Expanding the Bioinformation Paradigm
Information 2011, 2(4), 651-671; doi:10.3390/info2040651
Received: 1 September 2011 / Revised: 31 October 2011 / Accepted: 1 November 2011 / Published: 9 November 2011
Cited by 5
Abstract
Contemporary biological research (particularly in systems biology and the “omic” disciplines) is factually answering some of the poignant questions associated with the information concept and the limitations of information theory. Here, rather than emphasizing and persisting on a focalized discussion about the i-concept, an ampler conception of “informational entities” will be advocated. The way living cells self-produce, interact with their environment, and collectively organize multi-cell systems becomes a paradigmatic case of what such informational entities consist of. Starting with the fundamentals of molecular recognition, and continuing with the basic cellular processes and subsystems, a new interpretation of the global organization of the living cell must be assayed, so that the equivalents of meaning, value, and intelligence will be approached along an emerging “bioinformational” perspective. Further insights on the informational processes of brains, companies, institutions and human societies at large, and even the sciences themselves, could benefit from—and cross-fertilize with—the advancements derived from the informational approach to living systems. The great advantage fuelling the expansion of the bioinformation paradigm is that, today, cellular information processes may be defined almost to completion at the molecular scale (at least in the case of prokaryotic cells). This is not the case, evidently, with nervous systems and the variety of human organizational, cultural, and social developments. Concretely, the crucial evolutionary phenomenon of protein-domain recombination—knowledge recombination—will be analyzed here as a showcase of, and even as a model for, the interdisciplinary and multidisciplinary mixing of the sciences so prevalent in contemporary societies. Scientomics will be proposed as a new research endeavor to assist advancement. Informationally, the “society of enzymes” appears as a forerunner of the “society of neurons”, and even of the “society of individuals”.
Open Access Article: Interdisciplinary Research between Theoretical Informatics and the Humanities
Information 2011, 2(3), 546-559; doi:10.3390/info2030546
Received: 26 July 2011 / Revised: 7 August 2011 / Accepted: 2 September 2011 / Published: 16 September 2011
Abstract
This paper focuses on interdisciplinary research between Theoretical Informatics (TI) and the Humanities (philosophy, history, literature, etc.). There are five main sections: 1. a brief introduction to TI and its functions in the aspects of worldview and methodology; 2. an illustration of the problems associated with dualism as set out by Plato and René Descartes, by means of a theoretical model of the mutual contact and interaction between the material world and the information world; 3. an explanation of the historical view of R. G. Collingwood through informationalism; 4. a discussion of the basic concepts of Humanistic Informatics, which is under construction; and 5. a proposal of an approach to this new subject in information science.
Open Access Article: Information Science: Its Past, Present and Future
Information 2011, 2(3), 510-527; doi:10.3390/info2030510
Received: 2 June 2011 / Revised: 3 August 2011 / Accepted: 10 August 2011 / Published: 23 August 2011
Cited by 1
Abstract
Early in its history and development, there were three types of classical information sciences: computer and information science, library and information science, and telecommunications and information science. With the infiltration of the concept of information into various fields, an information discipline community of around 200 members formed around the sub-fields of information theory, informatics, and information science. For such a large community, this paper discusses a systematization, two trends of thought, and some perspectives and suggestions.
Open Access Article: Concept of Information as a Bridge between Mind and Brain
Information 2011, 2(3), 478-509; doi:10.3390/info2030478
Received: 16 May 2011 / Revised: 26 July 2011 / Accepted: 3 August 2011 / Published: 16 August 2011
Cited by 3
Abstract
The article focuses on the special role of the concept of information, understood in terms of the one-many categorical opposition, in building a bridge between mind and brain. This particular choice of the definition of information allows unification of the two main manifestations of information implicitly present in the literature, the selective and the structural. It is shown that the concept of information formulated this way, together with the concept of information integration, can be used to explain the unity of conscious experience, and furthermore to resolve several fundamental problems, such as understanding the experiential aspect of consciousness without falling into the homunculus fallacy, defending free will from mechanistic determinism, and explaining symbolic representation and aesthetic experience. The dual character of the selective and structural manifestations opens the way between the orthodox information-scientific description of the brain in terms of the former, and the description of mind in terms of the latter.
Open Access Article: Dynamics of Information as Natural Computation
Information 2011, 2(3), 460-477; doi:10.3390/info2030460
Received: 30 May 2011 / Revised: 13 July 2011 / Accepted: 19 July 2011 / Published: 4 August 2011
Cited by 7
Abstract
Processes considered to render information dynamics have been studied in, among others: questions and answers, observations, communication, learning, belief revision, logical inference, game-theoretic interactions, and computation. This article puts the computational approaches into the broader context of natural computation, where information dynamics is found not only in human communication and computational machinery but throughout the whole of nature. Information is understood as representing the world (reality as an informational web) for a cognizing agent, while information dynamics (information processing, computation) realizes physical laws through which all changes of informational structures unfold. Computation as it appears in the natural world is more general than the human process of calculation modeled by the Turing machine. Natural computing is epitomized by the interactions of concurrent, in general asynchronous, computational processes, which are adequately represented by what Abramsky names “the second generation models of computation” [1], which we argue to be the most general representation of information dynamics.
Open Access Article: On Symmetries and the Language of Information
Information 2011, 2(3), 455-459; doi:10.3390/info2030455
Received: 19 May 2011 / Revised: 28 June 2011 / Accepted: 19 July 2011 / Published: 22 July 2011
Abstract
Many writings on information mix information on a given system (IS), the measurable information content of a given system (IM), and the (also measurable) information content that we communicate among us on a given system (IC). These belong to different levels and different aspects of information. The first (IS) involves everything that one can, at least potentially, know about a system, but will never learn completely. The second (IM) contains the quantitative data that one really learns about a system. The third (IC) relates to the language (including mathematics) by which we transmit information on the system to one another, rather than to the system itself. The information content of a system (IM; this is what we generally mean by information) may include all (relevant) data on each element of the system. However, we can reduce the quantity of information we need to mediate to each other (IC) if we refer to certain symmetry principles or natural laws to which the elements of the given system correspond. Instead of listing the data for all elements separately, even in a not very extreme case, we can give a short mathematical formula that informs about the data of the individual elements of the system. This abbreviated form of information delivery relies on several conventions. These conventions are protocols that we have learnt before and do not need to be repeated each time in the given community. They include the knowledge that the scientific community accumulated earlier when it discovered and formulated the symmetry principle or the law of nature, the language in which those regularities were formulated and then accepted by the community, and the mathematical marks and abbreviations that are known only to the members of the given scientific community. We do not need to repeat the rules of the convention each time, because the conveyed information includes them; they are there in our minds behind the communicated data on the information content. I demonstrate this using two examples: Kepler’s laws, and the law of correspondence between the DNA codons’ triplet structure and the individual amino acids which they encode. The information content of the language by which we communicate the obtained information cannot be identified with the information content of the system that we want to characterize; moreover, it does not include all the possible information that we could potentially learn about the system. Symmetry principles and natural laws may reduce the information we need to communicate about a system, but we must keep in mind the conventions that we have learnt about the abbreviating mechanism of those principles, laws, and mathematical descriptions.
Open Access Article: Naturalizing Information
Information 2011, 2(3), 417-425; doi:10.3390/info2030417
Received: 6 May 2011 / Revised: 20 June 2011 / Accepted: 1 July 2011 / Published: 7 July 2011
Abstract
Certain definitions of information can be seen to be compatible with each other if their relationships are properly understood as referring to different levels of organization in a subsumptive hierarchy. The resulting hierarchy, with thermodynamics subsuming information theory, and that in turn subsuming semiotics, amounts to a naturalizing of the information concept.
Open Access Article: Unity-Based Diversity: System Approach to Defining Information
Information 2011, 2(3), 406-416; doi:10.3390/info2030406
Received: 23 May 2011 / Revised: 19 June 2011 / Accepted: 20 June 2011 / Published: 5 July 2011
Cited by 2
Abstract
What is information? This is the first question that information science should answer clearly. However, the definitions of information have become so diversified that people question whether there is any unity among the diversity, leading to suspicion about whether it is possible to establish a unified theory of information at all. To answer this question, a system approach to defining information is introduced in this paper. It is proved that the unity of information definitions can be maintained with this approach. As a by-product, an important concept, the information eco-system, is also obtained.
Open Access Article: Receptive Openness to a Message and Its Dative—Materialist Origin of Time
Information 2011, 2(3), 383-405; doi:10.3390/info2030383
Received: 31 May 2011 / Revised: 20 June 2011 / Accepted: 22 June 2011 / Published: 1 July 2011
Cited by 1
Abstract
Information precipitates the flow of time from scratch. Information as a noun, the equivalent of the transitive verb “inform”, stands out in the contrast between a direct and an indirect object of the verb, that is to say, between the messenger of a message and its dative. The root of the contrast is sought in the occurrence of the flow of time, in the sense that the flow requires both the invariant reference and the dative being subject to something flowing through against the reference. Empirical evidence of the contrast is found in the class identity kept by a molecular aggregate that can constantly exchange its constituent molecular subunits with those of a similar kind available in the neighborhood. The exchange of the subunits derives from the action of pulling-in, originating from the inside of the body holding the class identity. The action of pulling-in, which underlies the synthesis of the flow of time empirically in a bottom-up manner, originates in the constant update of the present perfect tense in the present progressive tense. The material aggregate preserving the class identity at the cost of the vicissitudes of its constituent individual subunits serves as the dative of information. The unfathomable depth of information is associated with the immense multitude of messengers in their kinds toward the likely datives having the capacity of receiving them. The bottom line is that being informed is materially being receptive to a flow of substrate, so that information is embodied by the receptor.
Open Access Article: Toward a New Science of Information
Information 2011, 2(2), 372-382; doi:10.3390/info2020372
Received: 4 May 2011 / Revised: 8 June 2011 / Accepted: 9 June 2011 / Published: 16 June 2011
Cited by 1
Abstract
Currently, a Science of Information does not exist. What we have is Information Science, which grew out of Library and Documentation Science with the help of Computer Science. The basic understanding of information in Information Science is the Shannon type of “information”, at which numerous criticisms have been levelled so far. The task of an as-yet-to-be-developed Science of Information would be to study the feasibility of, and to advance, approaches toward a more general Theory of Information and toward a common concept of information. What scientific requirements need to be met when trying to develop a Science of Information? What are the aims of a Science of Information? What is the scope of a Science of Information? What tools should a Science of Information make use of? The present paper responds to these questions.
Open Access Article: Spencer-Brown vs. Probability and Statistics: Entropy’s Testimony on Subjective and Objective Randomness
Information 2011, 2(2), 277-301; doi:10.3390/info2020277
Received: 8 February 2011 / Revised: 22 March 2011 / Accepted: 23 March 2011 / Published: 4 April 2011
Abstract
This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. “Objects as eigen-solutions” is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.

Other


Open Access Essay: Towards Quantifying a Wider Reality: Shannon Exonerata
Information 2011, 2(4), 624-634; doi:10.3390/info2040624
Received: 14 July 2011 / Revised: 14 September 2011 / Accepted: 26 September 2011 / Published: 25 October 2011
Cited by 8
Abstract
In 1872 Ludwig Boltzmann derived a statistical formula to represent the entropy (an apophasis) of a highly simplistic system. In 1948 Claude Shannon independently formulated the same expression to capture the positivist essence of information. Such contradictory thrusts engendered decades of ambiguity concerning exactly what is conveyed by the expression. Resolution of the widespread confusion is possible by invoking the third law of thermodynamics, which requires that entropy be treated in a relativistic fashion. Doing so parses the Boltzmann expression into separate terms that segregate apophatic entropy from positivist information. Possibly more importantly, the decomposition itself portrays a dialectic-like agonism between constraint and disorder that may provide a more appropriate description of the behavior of living systems than is possible using conventional dynamics. By quantifying the apophatic side of evolution, the Shannon approach to information achieves what no other treatment of the subject affords: it opens a window on a more encompassing perception of reality.

Journal Contact

MDPI AG
Information Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
information@mdpi.com
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18