Entropy, Volume 14, Issue 10 (October 2012) – 11 articles, Pages 1813–2035

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • Papers are published in both HTML and PDF formats; the PDF is the official version. To view a paper in PDF format, click its "PDF Full-text" link and open the file with the free Adobe Reader.
Article
Accelerating Universe and the Scalar-Tensor Theory
by Yasunori Fujii
Entropy 2012, 14(10), 1997-2035; https://doi.org/10.3390/e14101997 - 19 Oct 2012
Cited by 3 | Viewed by 7292
Abstract
To understand the accelerating universe discovered observationally in 1998, we develop the scalar-tensor theory of gravitation originally due to Jordan, extended only minimally. The unique role of the conformal transformation and frames is discussed particularly from a physical point of view. We show the theory to provide us with a simple and natural way of understanding the core of the measurements, Λ_obs ∼ t_0^−2, where the observed value of the cosmological constant and today's age of the universe are both expressed in Planckian units. According to this scenario of a decaying cosmological constant, Λ_obs is this small only because we are old, not because we fine-tune the parameters. It also follows that the scalar field is simply the pseudo Nambu–Goldstone boson of broken global scale invariance, based on the way astronomers and astrophysicists measure the expansion of the universe in reference to microscopic length units. A rather phenomenological trapping mechanism is assumed for the scalar field around the epoch of the observed mini-inflation, still maintaining the unmistakable behavior of the scenario stated above. Experimental searches for the scalar field, as light as ∼10^−9 eV, as part of the dark energy, are also discussed.
(This article belongs to the Special Issue Modified Gravity: From Black Holes Entropy to Current Cosmology)
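As a rough check of the scaling quoted in the abstract (an illustrative back-of-the-envelope in Planck units, not a computation taken from the paper), today's age of the universe already reproduces the notorious smallness of the observed cosmological constant:

```latex
t_0 \approx 13.8\ \mathrm{Gyr} \approx 4.3\times10^{17}\ \mathrm{s}
    \approx 8\times10^{60}\ t_{\mathrm{Pl}},
\qquad
\Lambda_{\mathrm{obs}} \sim t_0^{-2} \approx \bigl(8\times10^{60}\bigr)^{-2}
    \sim 10^{-122} \quad \text{(Planck units)}.
```

On this reading, no fine-tuning is needed: Λ_obs is small simply because t_0 is large.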

Review
Conformal Relativity versus Brans–Dicke and Superstring Theories
by David B. Blaschke and Mariusz P. Dąbrowski
Entropy 2012, 14(10), 1978-1996; https://doi.org/10.3390/e14101978 - 18 Oct 2012
Cited by 8 | Viewed by 6456
Abstract
We show how conformal relativity is related to Brans–Dicke theory and to low-energy-effective superstring theory. Conformal relativity, or the Hoyle–Narlikar theory, is invariant with respect to conformal transformations of the metric. We show that the conformal relativity action is equivalent to the transformed Brans–Dicke action for ω = −3/2 (the border between a standard scalar field and a ghost), in contrast to the reduced (graviton-dilaton) low-energy-effective superstring action, which corresponds to the Brans–Dicke action with ω = −1. We show that, as in ekpyrotic/cyclic models, the transition through the singularity in conformal cosmology in the string frame takes place in the weak coupling regime. We also find interesting self-duality and duality relations for the graviton-dilaton actions.
(This article belongs to the Special Issue Modified Gravity: From Black Holes Entropy to Current Cosmology)
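For readers unfamiliar with the ω parameter quoted above, here is a minimal sketch of the Brans–Dicke action in one standard convention (normalizations may differ from the authors'):

```latex
S_{\mathrm{BD}} = \frac{1}{16\pi}\int d^4x\,\sqrt{-g}\,
    \Bigl(\varphi R - \frac{\omega}{\varphi}\,
    g^{\mu\nu}\,\partial_\mu\varphi\,\partial_\nu\varphi\Bigr) + S_{\mathrm{matter}}.
```

The value ω = −3/2 sits exactly on the boundary where the scalar kinetic term changes character (standard field versus ghost), while ω = −1 reproduces the graviton-dilaton sector of the low-energy-effective superstring action, as the abstract notes.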
Review
Impaired Sulfate Metabolism and Epigenetics: Is There a Link in Autism?
by Samantha Hartzell and Stephanie Seneff
Entropy 2012, 14(10), 1953-1977; https://doi.org/10.3390/e14101953 - 18 Oct 2012
Cited by 19 | Viewed by 24164
Abstract
Autism is a brain disorder involving social, memory, and learning deficits that normally develops prenatally or early in childhood. Frustratingly, extensive research funding has so far failed to identify the cause of autism. While twin concordance studies indicate a strong genetic component, the alarming rise in the incidence of autism over the last three decades suggests that environmental factors also play a key role. This dichotomy can be easily explained if we invoke a heritable epigenetic effect as the primary factor. Researchers are just beginning to realize the huge significance of epigenetic effects taking place during gestation in influencing phenotypic expression. Here, we propose the novel hypothesis that sulfate deficiency in both the mother and the child, brought on mainly by excess exposure to environmental toxins and inadequate sunlight exposure to the skin, leads to widespread hypomethylation in the fetal brain with devastating consequences. We show that many seemingly disparate observations regarding serum markers, neuronal pathologies, and nutritional deficiencies associated with autism can be integrated to support our hypothesis.
(This article belongs to the Special Issue Biosemiotic Entropy: Disorder, Disease, and Mortality)

Article
Programming Unconventional Computers: Dynamics, Development, Self-Reference
by Susan Stepney
Entropy 2012, 14(10), 1939-1952; https://doi.org/10.3390/e14101939 - 17 Oct 2012
Cited by 21 | Viewed by 7669
Abstract
Classical computing has well-established formalisms for specifying, refining, composing, proving, and otherwise reasoning about computations. These formalisms have matured over the past 70 years or so. Unconventional Computing includes the use of novel kinds of substrates – from black holes and quantum effects, through chemicals and biomolecules, to slime moulds – to perform computations that do not conform to the classical model. Although many of these unconventional substrates can be coerced into performing classical computation, this is not how they "naturally" compute. Our ability to exploit unconventional computing is partly hampered by a lack of corresponding programming formalisms: we need models for building, composing, and reasoning about programs that execute in these substrates. What might, say, a slime mould programming language look like? Here I outline some of the issues and properties of these unconventional substrates that need to be addressed to find "natural" approaches to programming them. Important concepts include embodied real values, processes and dynamical systems, generative systems and their meta-dynamics, and embodied self-reference.

Article
Infrared Cloaking, Stealth, and the Second Law of Thermodynamics
by Daniel P. Sheehan
Entropy 2012, 14(10), 1915-1938; https://doi.org/10.3390/e14101915 - 15 Oct 2012
Cited by 33 | Viewed by 8443
Abstract
Infrared signature management (IRSM) has been a primary aeronautical concern for over 50 years. Most strategies and technologies are limited by the second law of thermodynamics. In this article, IRSM is considered in light of theoretical developments over the last 15 years that have put the absolute status of the second law into doubt and that might open the door to a new class of broadband IR stealth and cloaking techniques. Following a brief overview of IRSM and its current thermodynamic limitations, theoretical and experimental challenges to the second law are reviewed. One proposal is treated in detail: a high-power-density, solid-state power source that converts thermal energy into electrical or chemical energy. Next, second-law based infrared signature management (SL-IRSM) strategies are considered for two representative military scenarios: an underground installation and an SL-based jet engine. It is found that SL-IRSM could be technologically disruptive across the full spectrum of IRSM modalities, including camouflage, surveillance, night vision, target acquisition, tracking, and homing.
(This article belongs to the Special Issue Advances in Applied Thermodynamics)

Article
Utilizing the Exergy Concept to Address Environmental Challenges of Electric Systems
by Cornelia A. Bulucea, Marc A. Rosen, Doru A. Nicola, Nikos E. Mastorakis and Carmen A. Bulucea
Entropy 2012, 14(10), 1894-1914; https://doi.org/10.3390/e14101894 - 11 Oct 2012
Cited by 1 | Viewed by 6146
Abstract
Theoretically, the concepts of energy, entropy, exergy and embodied energy are founded in the fields of thermodynamics and physics. Yet, over the decades, these concepts have been applied in numerous fields of science and engineering, playing a key role in the analysis of processes, systems and devices in which energy transfers and energy transformations occur. The research reported here aims to demonstrate, in terms of sustainability, the usefulness of the embodied energy and exergy concepts for analyzing electric devices which convert energy, particularly the electromagnet. This study relies on a dualist view, incorporating technical and environmental dimensions. The information provided by energy assessments is shown to be less useful than that provided by exergy, and prone to be misleading. The electromagnet force and torque (representing the driving force of output exergy), accepted as both environmental and technical quantities, are expressed as functions of the electric current and the magnetic field, supporting the view that the interrelations between science and the environment must be discerned. This research suggests that a useful step in assessing the viability of electric devices in concert with ecological systems might be to view the magnetic flux density B and the electric current intensity I as environmental parameters. In line with this idea, the study encompasses an overview of the potential human health risks and effects of extremely low frequency electromagnetic fields (ELF EMFs) caused by the operation of electric systems. It is concluded that exergy has a significant role to play in evaluating and increasing the efficiencies of electrical technologies and systems. The article also aims to demonstrate the need for joint efforts by researchers in electric and environmental engineering, and in medicine and health fields, to enhance knowledge of the impacts of environmental ELF EMFs on humans and other life forms.
(This article belongs to the Special Issue Exergy: Analysis and Applications)
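As background for the exergy-versus-energy comparison above (a standard textbook relation, not a result of this paper), the exergy content of a heat flow Q available at temperature T in an environment at temperature T_0 is

```latex
\mathrm{Ex}_Q = Q\left(1 - \frac{T_0}{T}\right),
```

so an exergy balance discounts heat by its Carnot factor and exposes where usable work potential is destroyed, which a plain energy balance cannot show.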

Article
Quantum Theory, Namely the Pure and Reversible Theory of Information
by Giulio Chiribella, Giacomo Mauro D’Ariano and Paolo Perinotti
Entropy 2012, 14(10), 1877-1893; https://doi.org/10.3390/e14101877 - 8 Oct 2012
Cited by 35 | Viewed by 8435
Abstract
More than a century after its birth, Quantum Theory still eludes our understanding. If asked to describe it, we have to resort to abstract and ad hoc principles about complex Hilbert spaces. How is it possible that a fundamental physical theory cannot be described using the ordinary language of Physics? Here we offer a contribution to the problem from the angle of Quantum Information, providing a short non-technical presentation of a recent derivation of Quantum Theory from information-theoretic principles. The broad picture emerging from the principles is that Quantum Theory is the only standard theory of information that is compatible with the purity and reversibility of physical processes.

Article
Maximum-Entropy Method for Evaluating the Slope Stability of Earth Dams
by Chuanqi Li, Wei Wang and Shuai Wang
Entropy 2012, 14(10), 1864-1876; https://doi.org/10.3390/e14101864 - 2 Oct 2012
Cited by 8 | Viewed by 7084
Abstract
Slope stability is an important problem in geotechnical engineering. This paper presents an approach to slope reliability analysis based on the maximum-entropy method. The key idea is to apply the maximum entropy principle to estimating the probability density function: the performance function is formulated using the simplified Bishop method, and the maximum-entropy method is used to estimate the probability density function (PDF) of the performance function subject to moment constraints, from which the slope failure probability follows. A numerical example is calculated and compared against Monte Carlo simulation (MCS) and the advanced first-order second-moment (AFOSM) method. The results demonstrate the accuracy and efficiency of the proposed approach, which should be valuable for probabilistic analyses.
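A minimal sketch of the maximum-entropy step described above, with hypothetical numbers standing in for the paper's slope data (the sampling model, moment order, and all parameters below are illustrative assumptions): fit p(z) ∝ exp(−Σ_k λ_k z^k) to the first four raw moments of the performance function Z by minimizing the convex dual, then integrate the fitted density over Z < 0 to estimate the failure probability.

```python
import numpy as np
from scipy.optimize import minimize

def maxent_pdf(moments, grid):
    """Maximum-entropy density p(z) ~ exp(-sum_k lam_k z^k) matching the
    raw moments mu_1..mu_K, found by minimizing the convex dual
    ln Z(lam) + lam . mu over the Lagrange multipliers lam."""
    mu = np.asarray(moments, dtype=float)
    K = len(mu)
    dz = grid[1] - grid[0]                        # uniform grid spacing
    powers = np.vstack([grid**k for k in range(1, K + 1)])

    def dual(lam):
        expo = -lam @ powers
        m = expo.max()                            # stabilize the exponential
        logZ = m + np.log(np.sum(np.exp(expo - m)) * dz)
        return logZ + lam @ mu

    lam = minimize(dual, np.zeros(K), method="Nelder-Mead",
                   options={"maxiter": 20_000, "fatol": 1e-12}).x
    expo = -lam @ powers
    p = np.exp(expo - expo.max())
    return p / (p.sum() * dz)                     # normalized density

# Hypothetical performance-function samples standing in for the moments a
# simplified-Bishop analysis would supply (Z > 0 means the slope holds).
rng = np.random.default_rng(0)
z = rng.normal(1.2, 0.5, 100_000)
moments = [np.mean(z**k) for k in range(1, 5)]    # first four raw moments
grid = np.linspace(z.min(), z.max(), 2001)
p = maxent_pdf(moments, grid)
pf = p[grid < 0].sum() * (grid[1] - grid[0])      # P(Z < 0)
print(f"estimated failure probability: {pf:.4f}")
```

For the Gaussian stand-in above, the estimate should land near the exact value Φ(−2.4) ≈ 0.008.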

Review
A Survey on Interference Networks: Interference Alignment and Neutralization
by Sang-Woon Jeon and Michael Gastpar
Entropy 2012, 14(10), 1842-1863; https://doi.org/10.3390/e14101842 - 28 Sep 2012
Cited by 27 | Viewed by 6804
Abstract
In recent years, there has been rapid progress in understanding Gaussian networks with multiple unicast connections, and new coding techniques have emerged. The essence of multi-source networks is how to efficiently manage the interference that arises from the transmission of other sessions. Classically, interference is removed by orthogonalization (in time or frequency). This means that the rate per session drops in inverse proportion to the number of sessions, suggesting that interference is a strong limiting factor in such networks. However, recently discovered interference management techniques have led to a paradigm shift: interference might not be quite as detrimental after all. The aim of this paper is to provide a review of these new coding techniques as they apply to the case of time-varying Gaussian networks with multiple unicast connections. Specifically, we review interference alignment and ergodic interference alignment for multi-source single-hop networks, and interference neutralization and ergodic interference neutralization for multi-source multi-hop networks. We mainly focus on the "degrees of freedom" perspective and also discuss an approximate capacity characterization.
(This article belongs to the Special Issue Information Theory Applied to Communications and Networking)
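To make the degrees-of-freedom contrast concrete (quoting the well-known Cadambe–Jafar result for the K-user time-varying Gaussian interference channel as a reference point, not this survey's own material): orthogonalization splits one channel's worth of degrees of freedom among K sessions, whereas interference alignment gives every session half a channel, independent of K:

```latex
\underbrace{\sum_{k=1}^{K} d_k = K \cdot \tfrac{1}{K} = 1}_{\text{orthogonalization (TDMA/FDMA)}}
\qquad\text{versus}\qquad
\underbrace{\sum_{k=1}^{K} d_k = \tfrac{K}{2}}_{\text{interference alignment}}
```

so the per-session rate no longer vanishes as the number of sessions grows.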

Article
On Extracting Probability Distribution Information from Time Series
by Andres M. Kowalski, Maria Teresa Martin, Angelo Plastino and George Judge
Entropy 2012, 14(10), 1829-1841; https://doi.org/10.3390/e14101829 - 28 Sep 2012
Cited by 18 | Viewed by 6437
Abstract
Time series (TS) are employed in a variety of academic disciplines. In this paper we focus on extracting probability density functions (PDFs) from TS to gain insight into the underlying dynamic processes. In discussing this "extraction" problem, we consider two popular approaches, which we identify as histograms and Bandt–Pompe. We use an information-theoretic method to objectively compare the information content of the concomitant PDFs.
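A minimal sketch of the Bandt–Pompe construction named above (the embedding dimension d and delay tau are illustrative choices, not the paper's settings): map each length-d window of the series to its ordinal pattern, count pattern frequencies, and read the PDF off the counts.

```python
import numpy as np
from itertools import permutations
from math import factorial

def bandt_pompe_pdf(ts, d=4, tau=1):
    """Bandt-Pompe PDF: relative frequency of each ordinal pattern of
    embedding dimension d and time delay tau occurring in the series."""
    counts = {pat: 0 for pat in permutations(range(d))}
    n = len(ts) - (d - 1) * tau
    for i in range(n):
        window = ts[i:i + d * tau:tau]          # d samples, spaced by tau
        counts[tuple(np.argsort(window))] += 1  # ordinal (rank) pattern
    return np.array(list(counts.values()), dtype=float) / n

# White noise visits all patterns roughly uniformly, so its normalized
# permutation entropy should come out close to 1.
rng = np.random.default_rng(1)
ts = rng.standard_normal(10_000)
p = bandt_pompe_pdf(ts, d=4)
p_nz = p[p > 0]
h = -np.sum(p_nz * np.log(p_nz)) / np.log(factorial(4))
print(f"normalized permutation entropy: {h:.3f}")
```

Unlike a histogram, this PDF is invariant under monotone rescalings of the data and captures temporal ordering rather than amplitude.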

Article
Network Coding for Line Networks with Broadcast Channels
by Gerhard Kramer and Seyed Mohammadsadegh Tabatabaei Yazdi
Entropy 2012, 14(10), 1813-1828; https://doi.org/10.3390/e14101813 - 28 Sep 2012
Viewed by 4820
Abstract
An achievable rate region for line networks with edge and node capacity constraints and broadcast channels (BCs) is derived. The region is shown to be the capacity region if the BCs are orthogonal, deterministic, physically degraded, or packet erasure with one-bit feedback. If the BCs are physically degraded with additive Gaussian noise, then independent Gaussian inputs achieve capacity.
(This article belongs to the Special Issue Information Theory Applied to Communications and Networking)
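The rate region above is specific to line networks, but the basic network-coding idea it builds on can be illustrated with the textbook two-way-relay XOR example (purely illustrative; this is not the paper's construction):

```python
# Two nodes exchange packets through a relay with a broadcast channel.
# Routing would need two relay transmissions; network coding needs one:
# the relay broadcasts the XOR, and each node cancels its own packet.
a = bytes([0x13, 0x37])                          # packet held by node A
b = bytes([0xBE, 0xEF])                          # packet held by node B
coded = bytes(x ^ y for x, y in zip(a, b))       # single relay broadcast
assert bytes(x ^ y for x, y in zip(coded, a)) == b   # A recovers b
assert bytes(x ^ y for x, y in zip(coded, b)) == a   # B recovers a
```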