Table of Contents

Entropy, Volume 7, Issue 1 (March 2005), Pages 1-121

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view the papers in PDF format, click on the "PDF Full-text" link, and use the free Adobe Reader to open them.
Displaying articles 1-6

Research

Open Access Article: Lagrangian submanifolds generated by the Maximum Entropy principle
Entropy 2005, 7(1), 1-14; doi:10.3390/e7010001
Received: 25 October 2004 / Accepted: 12 January 2005 / Published: 12 January 2005
PDF Full-text (263 KB)
Abstract
We show that the Maximum Entropy principle (E.T. Jaynes, [8]) has a natural description in terms of Morse Families of a Lagrangian submanifold. This geometric approach becomes useful when dealing with the M.E.P. with nonlinear constraints. Examples are presented using the Ising and Potts models of a ferromagnetic material.
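As general background (not taken from the paper), the Maximum Entropy principle with a single linear moment constraint already yields the familiar Gibbs form p_i ∝ exp(λ x_i); the sketch below, with hypothetical helper names `gibbs` and `maxent_mean`, solves for λ by bisection over a finite state space. The paper's geometric treatment of nonlinear constraints goes well beyond this elementary case.

```python
import math

# Background sketch only: MaxEnt with one linear constraint E[x] = mu over a
# finite state space has the Gibbs solution p_i proportional to exp(lam * x_i).

def gibbs(states, lam):
    """Gibbs distribution p_i ∝ exp(lam * x_i) over the given states."""
    w = [math.exp(lam * x) for x in states]
    z = sum(w)
    return [wi / z for wi in w]

def maxent_mean(states, mu, lo=-50.0, hi=50.0, iters=200):
    """Find lam by bisection so the Gibbs distribution has mean mu."""
    def mean(lam):
        p = gibbs(states, lam)
        return sum(pi * x for pi, x in zip(p, states))
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean(mid) < mu:
            lo = mid
        else:
            hi = mid
    return gibbs(states, 0.5 * (lo + hi))

# Three-state spin variable x in {-1, 0, 1} with target mean 0.3:
p = maxent_mean([-1.0, 0.0, 1.0], mu=0.3)
```

Among all distributions on {−1, 0, 1} with mean 0.3, the one returned here maximizes the Shannon entropy.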
Open Access Article: The entropy of a mixture of probability distributions
Entropy 2005, 7(1), 15-37; doi:10.3390/e7010015
Received: 13 September 2004 / Accepted: 20 January 2005 / Published: 20 January 2005
Cited by 2 | PDF Full-text (330 KB)
Abstract
If a message can have n different values and all values are equally probable, then the entropy of the message is log(n). In the present paper, we investigate the expectation value of the entropy for an arbitrary probability distribution. For that purpose, we apply mixed probability distributions. The mixing distribution is represented by a point on an infinite-dimensional hypersphere in Hilbert space. During an `arbitrary' calculation, this mixing distribution has the tendency to become uniform over a flat probability space of ever decreasing dimensionality. Once such a smeared-out mixing distribution is established, subsequent computing steps introduce an entropy loss expected to equal $\frac{1}{m+1} + \frac{1}{m+2} + \cdots + \frac{1}{n}$, where n is the number of possible inputs and m the number of possible outcomes of the computation.
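The entropy-loss sum quoted in the abstract is a difference of harmonic numbers, H_n − H_m. A minimal sketch in exact rational arithmetic (the function name `expected_entropy_loss` is ours, not the paper's):

```python
from fractions import Fraction

# Entropy loss quoted in the abstract: 1/(m+1) + 1/(m+2) + ... + 1/n (in nats),
# i.e. the harmonic-number difference H_n - H_m, for n possible inputs and
# m possible outcomes of the computation.
def expected_entropy_loss(n, m):
    return sum(Fraction(1, k) for k in range(m + 1, n + 1))

loss = expected_entropy_loss(4, 2)  # 1/3 + 1/4 = 7/12
```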
Open Access Article: Numerical Study On Local Entropy Generation In Compressible Flow Through A Suddenly Expanding Pipe
Entropy 2005, 7(1), 38-67; doi:10.3390/e7010038
Received: 8 December 2004 / Accepted: 8 February 2005 / Published: 11 February 2005
Cited by 6 | PDF Full-text (816 KB)
Abstract
This study presents an investigation of the local entropy generation in compressible flow through a suddenly expanding pipe. Air is used as the fluid. The air enters the pipe with a turbulent profile following the 1/7th power law. The simulations are extended to include different expansion ratios, reduced gradually from 5 to 1. To determine the effects of the mass flux, φ″, the ambient heat transfer coefficient, hamb, and the inlet temperature, Tin, on the entropy generation rate, the compressible flow is examined for various cases of these parameters. The flow and temperature fields are computed numerically with the help of the Fluent computational fluid dynamics (CFD) code. In addition to this CFD code, a computer program has been developed to calculate numerically the entropy generation and other thermodynamic parameters using the results of the calculations performed for the flow and temperature fields. The values of the thermodynamic parameters in the sudden expansion (SE) case are normalized by dividing by their base quantities obtained from the calculations in the uniform cross-section (UC) case. The contraction of the radius of the throat (from 0.05 to 0.01 m) significantly increases the maximum value of the volumetric entropy generation rate (by about 60%) and raises the total entropy generation rate to 11 times its base value. The normalized merit number decreases by 73% and 40% with the contraction of the cross-section and with the increase of the ambient heat transfer coefficient (from 20 to 100 W/m²·K), respectively, whereas it rises by 226% and 43% with the decrease of the maximum mass flux (from 5 to 1 kg/m²·s) and with the increase of the inlet temperature (from 400 to 1000 K), respectively. Consequently, the ratio of the useful energy transfer rate to the irreversibility rate improves as the mass flux decreases and as the inlet temperature increases.
Open Access Article: The meanings of entropy
Entropy 2005, 7(1), 68-96; doi:10.3390/e7010068
Received: 19 November 2004 / Accepted: 14 February 2005 / Published: 14 February 2005
Cited by 29 | PDF Full-text (227 KB)
Abstract
Entropy is a basic physical quantity that has led to various, and sometimes apparently conflicting, interpretations. It has been successively assimilated to different concepts such as disorder and information. In this paper we revisit these conceptions and establish the following three results: entropy measures lack of information, but it also measures information, and these two conceptions are complementary; entropy measures freedom, and this allows a coherent interpretation of entropy formulas and of experimental facts; to associate entropy with disorder implies defining order as absence of freedom. Disorder or agitation is shown to be more appropriately linked with temperature.
Open Access Article: Physical Premium Principle: A New Way for Insurance Pricing
Entropy 2005, 7(1), 97-107; doi:10.3390/e7010097
Received: 24 November 2004 / Accepted: 22 February 2005 / Published: 28 February 2005
Cited by 6 | PDF Full-text (581 KB)
Abstract
In our previous work we suggested a way of computing the non-life insurance premium. The probable surplus of the insurance company was assumed to be distributed according to the canonical ensemble theory, and the Esscher premium principle appeared as a special case. The difference between our method and traditional principles for premium calculation was shown by simulation. Here we construct a theoretical foundation for the main assumption in our method; in this respect, we present a new (physical) definition of economic equilibrium. This approach lets us apply the maximum entropy principle to economic systems. We also extend our method to deal with the problem of premium calculation for correlated risk categories. Like the Bühlmann economic premium principle, our method considers the effect of the market on the premium, but in a different way.
Open Access Article: Van der Waals gas as working substance in a Curzon and Ahlborn-Novikov engine
Entropy 2005, 7(1), 108-121; doi:10.3390/e7010108
Received: 8 January 2005 / Accepted: 8 March 2005 / Published: 18 March 2005
Cited by 6 | PDF Full-text (171 KB)
Abstract
Using a van der Waals gas as the working substance, the so-called Curzon and Ahlborn-Novikov engine is studied. It is shown that some previous results found in the finite-time thermodynamics literature can be written in a more general form by means of this gas and by taking a nonlinear heat transfer law.
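For context (the standard finite-time thermodynamics result, not reconstructed from this paper), the ideal-gas Curzon-Ahlborn engine attains efficiency at maximum power η = 1 − √(T_c/T_h), compared with the Carnot bound 1 − T_c/T_h. A quick numerical comparison:

```python
import math

# Classical Curzon-Ahlborn efficiency at maximum power versus the Carnot bound.
# Background only: the paper's van der Waals / nonlinear-heat-transfer
# generalization is not reproduced here.

def carnot_efficiency(t_cold, t_hot):
    return 1.0 - t_cold / t_hot

def curzon_ahlborn_efficiency(t_cold, t_hot):
    return 1.0 - math.sqrt(t_cold / t_hot)

eta_c = carnot_efficiency(300.0, 1200.0)           # 0.75
eta_ca = curzon_ahlborn_efficiency(300.0, 1200.0)  # 0.5
```

The paper's point is that such expressions generalize once the ideal gas is replaced by a van der Waals gas and the heat transfer law is taken to be nonlinear.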

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
entropy@mdpi.com
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18