Entropy, Volume 5, Issue 2 (June 2003) – 13 articles, Pages 61-251

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open it.
Article
Phase Space Cell in Nonextensive Classical Systems
by Francesco Quarati and Piero Quarati
Entropy 2003, 5(2), 239-251; https://doi.org/10.3390/e5020239 - 30 Jun 2003
Cited by 6 | Viewed by 7574
Abstract
We calculate the phase space volume Ω occupied by a nonextensive system of N classical particles described by an equilibrium (or steady-state, or long-term stationary state of a nonequilibrium system) distribution function that slightly deviates from the Maxwell-Boltzmann (MB) distribution in the high-energy tail. We explicitly require that the number of accessible microstates does not change with respect to the extensive MB case. We also derive, within a classical scheme, an analytical expression for the elementary cell, which can be seen as a macrocell different from the third power of the Planck constant. Thermodynamic quantities of a classical ideal gas that depend on the elementary cell, such as entropy, chemical potential and free energy, are evaluated. Considering the fractional deviation from the MB distribution, we can deduce a physical meaning of the nonextensive parameter q of Tsallis nonextensive thermostatistics in terms of particle correlation functions (valid at least in the case, discussed in this work, of small deviations from the standard MB case). Full article
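The deviation in question is governed by the Tsallis parameter q. As a minimal numeric sketch (using the standard Tsallis q-exponential factor, which the abstract does not spell out, so the exact functional form here is our assumption), a small q > 1 leaves the statistical factor nearly unchanged at thermal energies while enhancing the high-energy tail:

```python
import math

def mb_factor(x: float) -> float:
    """Maxwell-Boltzmann statistical factor exp(-x), with x = E/kT."""
    return math.exp(-x)

def tsallis_factor(x: float, q: float) -> float:
    """Tsallis q-exponential [1 - (1-q)x]^(1/(1-q)), which reduces to
    exp(-x) as q -> 1 and modifies the high-energy tail for q != 1."""
    base = 1.0 - (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

# A small deviation q = 1.05 barely changes the factor near x ~ 1
# but enhances the tail at x = 10 by roughly an order of magnitude:
for x in (1.0, 5.0, 10.0):
    print(x, mb_factor(x), tsallis_factor(x, 1.05))
```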
Article
On the Measure Entropy of Additive Cellular Automata f
by Hasan Akın
Entropy 2003, 5(2), 233-238; https://doi.org/10.3390/e5020233 - 30 Jun 2003
Cited by 12 | Viewed by 6774
Abstract
We show that for an additive one-dimensional cellular automaton f on the space of all doubly infinite sequences with values in a finite set S = {0, 1, 2, ..., r-1}, determined by an additive automaton rule [equation] (mod r), and an f-invariant uniform Bernoulli measure μ, the measure-theoretic entropy of f with respect to μ is equal to hμ(f) = 2k log r, where k ≥ 1, r-1 ∈ S. We also show that the uniform Bernoulli measure is a measure of maximal entropy for additive one-dimensional cellular automata f. Full article
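The closed form hμ(f) = 2k log r depends only on the alphabet size r and the neighbourhood radius k. A quick sketch (assuming, as the elided rule suggests, a radius-k additive rule whose neighbourhood runs from -k to k, e.g. elementary rule 90 for r = 2, k = 1):

```python
import math

def additive_ca_entropy(r: int, k: int) -> float:
    """Measure-theoretic entropy h_mu(f) = 2k log r of an additive
    one-dimensional cellular automaton over the alphabet {0, ..., r-1}
    with neighbourhood radius k, taken with respect to the uniform
    Bernoulli measure."""
    if r < 2 or k < 1:
        raise ValueError("need alphabet size r >= 2 and radius k >= 1")
    return 2 * k * math.log(r)

# Elementary additive CA over {0, 1} with radius 1
# (e.g. rule 90, x_i -> x_{i-1} + x_{i+1} mod 2):
print(additive_ca_entropy(2, 1))  # 2 log 2
```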
Article
Extensive Generalization of Statistical Mechanics Based on Incomplete Information Theory
by Qiuping A. Wang
Entropy 2003, 5(2), 220-232; https://doi.org/10.3390/e5020220 - 30 Jun 2003
Cited by 49 | Viewed by 7933
Abstract
Statistical mechanics is generalized on the basis of an additive information theory for incomplete probability distributions. The incomplete normalization is used to obtain a generalized entropy. The concomitant incomplete statistical mechanics is applied to some physical systems in order to show the effect of the incompleteness of information. It is shown that this extensive generalized statistics can be useful for correlated electron systems in the weak coupling regime. Full article
Article
The Informational Patterns of Laughter
by José A. Bea and Pedro C. Marijuán
Entropy 2003, 5(2), 205-213; https://doi.org/10.3390/e5020205 - 30 Jun 2003
Cited by 15 | Viewed by 8679
Abstract
Laughter is one of the most characteristic - and enigmatic - communicational traits of human individuals. Its analysis has to take into account a variety of densely interconnected emotional, social, cognitive, and communicational factors. In this article we study laughter simply as an auditory signal (a 'neutral' information carrier), and we compare its structure with the regular traits of linguistic signals. In the experimental recordings of human laughter that we have made, the most noticeable trait is the disordered content of frequencies. The sonogram of a vowel is a characteristic, regular function of the first vibration modes of the dynamic system formed, for each vowel, by the vocal cords and the accompanying resonance of the vocalization apparatus; the sonograms of laughter, in comparison, are highly irregular. Episodes of laughter show a highly random frequency content, which is why laughter cannot be considered a genuine codification of patterned information like linguistic signals. In order to numerically gauge the disorder content of laughter frequencies, we have performed several "entropy" measures on the spectra, trying to unambiguously distinguish spontaneous laughter from "faked", articulated laughter. Interestingly, Shannon's entropy (the most natural candidate) performs rather poorly. Full article
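The basic measure being applied to the spectra can be sketched as follows (a minimal illustration, not the authors' actual analysis pipeline; the toy spectra and the function name are ours): treating the normalized power spectrum as a probability distribution, Shannon's entropy is low for a vowel-like spectrum concentrated in a few modes and high for a laughter-like spectrum spread over many frequencies.

```python
import math

def spectral_entropy(power: list[float]) -> float:
    """Shannon entropy (in bits) of a power spectrum, treating the
    normalized spectral power as a probability distribution."""
    total = sum(power)
    probs = [p / total for p in power if p > 0]
    return -sum(p * math.log2(p) for p in probs)

# Vowel-like spectrum: energy concentrated in a few vibration modes.
tonal = [0.0] * 64
tonal[3], tonal[6], tonal[9] = 10.0, 5.0, 2.0

# Laughter-like spectrum: energy spread across many frequencies.
noisy = [1.0] * 64

print(spectral_entropy(tonal))   # low: energy sits in 3 bins
print(spectral_entropy(noisy))   # high: log2(64) = 6 bits
```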
Article
Information and Meaning
by Christophe Menant
Entropy 2003, 5(2), 193-204; https://doi.org/10.3390/e5020193 - 30 Jun 2003
Cited by 17 | Viewed by 8002
Abstract
We propose here to clarify some of the relations existing between information and meaning by showing how meaningful information can be generated by a system subjected to a constraint. We build up definitions and properties for meaningful information, a meaning generator system, and the domain of efficiency of a meaning (to cover cases of meaningful information transmission). Basic notions of information processing are used. Full article
Article
Information Processing in Auto-regulated Systems
by Karl Javorszky
Entropy 2003, 5(2), 161-192; https://doi.org/10.3390/e5020161 - 30 Jun 2003
Cited by 2 | Viewed by 6681
Abstract
We present a model of information processing which is based on two concurrent ways of describing the world, where a description in one of the languages limits the possibilities for realisations in the other language. The two describing dimensions appear in our common sense as dichotomies of perspectives: subjective - objective; diversity - similarity; individual - collective. We abstract from the subjective connotations and treat the test-theoretical case of an interval on which several concurrent categories can be introduced. We investigate multidimensional partitions as potential carriers of information and compare their efficiency to that of sequenced carriers. We regard the same assembly once as a contemporary collection and once as a longitudinal sequence, and find promising inroads towards understanding information processing by auto-regulated systems. Information is understood to point out what is the case from among the alternatives that could be the case. We have translated these ideas into logical operations on the set of natural numbers and have found two equivalence points on N where matches between sequential and commutative ways of presenting a state of the world can agree in a stable fashion: a flip-flop mechanism is envisioned. This new approach allows a mathematical treatment of some poignant biomathematical problems. The concepts presented in this treatise may also have relevance and applications within the fields of information processing and the theory of language. Full article
Article
Information Theory: a Multifaceted Model of Information
by Mark Burgin
Entropy 2003, 5(2), 146-160; https://doi.org/10.3390/e5020146 - 30 Jun 2003
Cited by 25 | Viewed by 9792
Abstract
A contradictory and paradoxical situation that currently exists in information studies can be improved by the introduction of a new information approach, which is called the general theory of information. The main achievement of the general theory of information is the explication of a relevant and adequate definition of information. This theory is built as a system of two classes of principles (ontological and axiological) and their consequences. Axiological principles, which explain how to measure and evaluate information and information processes, are presented in the second section of this paper. These principles systematize and unify different approaches, existing as well as possible, to the construction and utilization of information measures. Examples of such measures are given by Shannon’s quantity of information, the algorithmic quantity of information, or the volume of information. It is demonstrated that all other known directions of information theory may be treated inside the general theory of information as its particular cases. Full article
Article
From Data to Semantic Information
by Luciano Floridi
Entropy 2003, 5(2), 125-145; https://doi.org/10.3390/e5020125 - 30 Jun 2003
Cited by 12 | Viewed by 9373
Abstract
There is no consensus yet on the definition of semantic information. This paper contributes to the current debate by criticising and revising the Standard Definition of semantic Information (SDI) as meaningful data, in favour of the Dretske-Grice approach: meaningful and well-formed data constitute semantic information only if they also qualify as contingently truthful. After a brief introduction, SDI is criticised for providing necessary but insufficient conditions for the definition of semantic information. SDI is incorrect because truth-values do not supervene on semantic information, and misinformation (that is, false semantic information) is not a type of semantic information but pseudo-information, that is, not semantic information at all. This is shown by arguing that none of the reasons for interpreting misinformation as a type of semantic information is convincing, whilst there are compelling reasons to treat it as pseudo-information. As a consequence, SDI is revised to include a necessary truth-condition. The last section summarises the main results of the paper and indicates the important implications of the revised definition for the analysis of the deflationary theories of truth, the standard definition of knowledge and the classic, quantitative theory of semantic information. Full article
Article
Information Seen as Part of the Development of Living Intelligence: the Five-Leveled Cybersemiotic Framework for FIS
by Soren Brier
Entropy 2003, 5(2), 88-99; https://doi.org/10.3390/e5020088 - 30 Jun 2003
Cited by 11 | Viewed by 6322
Abstract
It is argued that a true transdisciplinary information science going from physical information to phenomenological understanding needs a metaphysical framework. Three different kinds of causality are implied: efficient, formal and final. And at least five different levels of existence are needed: 1. The quantum vacuum fields with entangled causation. 2. The physical level with its energy- and force-based efficient causation. 3. The informational-chemical level with its formal causation based on pattern fitting. 4. The biological-semiotic level with its non-conscious final causation. 5. The social-linguistic level of self-consciousness with its conscious, goal-oriented final causation. To integrate these consistently in an evolutionary theory as emergent levels, neither mechanical determinism nor complexity theory is sufficient, because neither can be a foundation for a theory of lived meaning. C. S. Peirce's triadic semiotic philosophy combined with a cybernetic and systemic view, like N. Luhmann's, could create the framework I call Cybersemiotics. Full article
(This article belongs to the Special Issue Recent Advances in Entanglement and Quantum Information Theory)
Article
Living Systems are Dynamically Stable by Computing Themselves at the Quantum Level
by Abir U. Igamberdiev
Entropy 2003, 5(2), 76-87; https://doi.org/10.3390/e5020076 - 30 Jun 2003
Cited by 5 | Viewed by 6402
Abstract
The smallest details of living systems are molecular devices that operate between the classical and quantum levels, i.e. between the potential dimension (microscale) and the actual three-dimensional space (macroscale). They realize non-demolition quantum measurements in which time appears as a mesoscale dimension separating contradictory statements in the course of actualization. These smaller devices form larger devices (macromolecular complexes), up to the living body. The quantum device possesses its own potential internal quantum state (IQS), which is maintained for a prolonged time via error correction, a reflection over this state. A decoherence-free IQS can exhibit itself through a creative generation of iteration limits in the real world. To avoid a collapse of the quantum information in the process of correcting errors, it is possible to make a partial measurement that extracts only the error information and leaves the encoded state untouched. In natural quantum computers, which are living systems, the error correction is internal. It is a result of reflection, given as a sort of subjective process allotting optimal limits of iteration. The IQS resembles a quasi-particle that interacts with its surroundings, applying decoherence commands to them. In this framework, enzymes are molecular automata of the extremal quantum computer, the set of which maintains a stable, highly ordered coherent state, and the genome represents a concatenation of error-correcting codes into a single reflective set. Biological systems, being autopoietic in physical space, control quantum measurements in the physical universe. Biological evolution is really a functional evolution of measurement constraints in which limits of iteration are established, possessing criteria of perfection and having selective value. Full article
Article
Hierarchical Dynamical Information Systems With a Focus on Biology
by John Collier
Entropy 2003, 5(2), 100-124; https://doi.org/10.3390/e5020100 - 26 Jun 2003
Cited by 30 | Viewed by 7185
Abstract
A system of a number of relatively stable units that can combine more or less freely to form somewhat less stable structures has a capacity to carry information in a more or less arbitrary way. I call such a system a physical information system if its properties are dynamically specified. All physical information systems have certain general dynamical properties. DNA can form such a system, but so can, to a lesser degree, RNA, proteins, cells and cellular subsystems, various immune system elements, organisms in populations and in ecosystems, as well as other higher-level phenomena. These systems are hierarchical structures with respect to the expression of lower level information at higher levels. This allows a distinction between macro and microstates within the system, with resulting statistical (entropy driven) dynamics, including the possibility of self-organization, system bifurcation, and the formation of higher levels of information expression. Although lower-level information is expressed in an information hierarchy, this in itself is not sufficient for reference, function, or meaning. Nonetheless, the expression of information is central to the realization of all of these. 'Biological information' is thus ambiguous between syntactic information in a hierarchical modular system, and functional information. However, the dynamics of hierarchical physical information systems is of interest to the study of how functional information might be embodied physically. 
I will address 1) how to tighten the relative terms in the characterizations of 'information system' and 'informational hierarchy' above, 2) how to distinguish between components of an information system combining to form more complex informational modules and the expression of information, 3) some aspects of the dynamics of such systems that are of biological interest, 4) why information expression in such systems is not sufficient for functional information, and 5) what further might be required for functional information. Full article
Article
Mediated Character of Economic Interactions
by Josip Stepanic, Jr., Igor Bertovic and Josip Kasac
Entropy 2003, 5(2), 61-75; https://doi.org/10.3390/e5020061 - 11 Jun 2003
Cited by 2 | Viewed by 6698
Abstract
Economic interactions are conducted between economic agents - individuals and collectives - through the exchange of natural or artificial entities - goods, services and money - in a myriad of combinations. In this article we adopt a microscopic point of view, concentrate on the exchanged entities, and extract their relevant attributes as seen in structurally simple economic processes. We then interpret economic interactions with their mediated character emphasized. The mediators of the interaction are locally available environment units; they are locally recognized and, within a given value set, appropriately interpreted as goods and money. The overall intensity of the economic interactions considered is related to the mediators' spatial and temporal characteristics. The extracted characteristics of mediators and economic processes are compacted into a set of formal rules. The approach is connected with similar approaches in economics and physics. Full article
Article
Foundations of Information Science: Selected Papers from FIS 2002
by Pedro C. Marijuán
Entropy 2003, 5(2), 214-219; https://doi.org/10.3390/e5020214 - 30 Jan 2003
Cited by 8 | Viewed by 6063
Abstract
The accompanying papers in the first issue of Entropy, volume 5, 2003 were presented at the electronic conference on Foundations of Information Science, FIS 2002 (http://www.mdpi.net/fis2002/). The running title of this FIS e-conference was THE NATURE OF INFORMATION: CONCEPTIONS, MISCONCEPTIONS, AND PARADOXES. It was held on the Internet from 6 to 10 May 2002 and was followed by a series of discussions, structured as focused sessions, which took place on the net from 10 May 2002 until 31 January 2003 (more than 400 messages were exchanged; see: http://fis.iguw.tuwien.ac.at/mailings/). This Introduction will briefly survey the problems around the concept of information, present the central ideas of the FIS initiative, and contrast some of the basic differences between information and mechanics (reductionism). Full article