Article

The meanings of entropy

by
Jean-Bernard Brissaud
Lab/UFR High Energy Physics, Physics Department, Faculty of Sciences, Rabat, Morocco
Entropy 2005, 7(1), 68-96; https://doi.org/10.3390/e7010068
Submission received: 19 November 2004 / Accepted: 14 February 2005 / Published: 14 February 2005

Abstract

Entropy is a basic physical quantity that has led to various, and sometimes apparently conflicting, interpretations. It has successively been assimilated to different concepts, such as disorder and information. In this paper we revisit these conceptions and establish the following three results: entropy measures lack of information, but it also measures information, and these two conceptions are complementary; entropy measures freedom, which allows a coherent interpretation of entropy formulas and of experimental facts; associating entropy with disorder implies defining order as absence of freedom. Disorder, or agitation, is shown to be more appropriately linked with temperature.
Keywords: entropy; freedom; information; disorder
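The complementary readings mentioned in the abstract — entropy as information gained and as prior lack of information — both rest on the standard Shannon formula H = −Σ pᵢ log₂ pᵢ. The paper itself gives no code; the following is a minimal illustrative sketch of that formula, where the function name and example distributions are our own:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits.

    H can be read two complementary ways: the average information
    gained when the outcome is revealed, or the lack of information
    (uncertainty) before it is revealed.
    """
    # Terms with p == 0 contribute nothing (lim p*log p = 0).
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A uniform distribution (maximal "freedom" among outcomes) maximizes H:
print(shannon_entropy([0.25] * 4))  # 2.0 bits for 4 equiprobable outcomes

# A certain outcome (no freedom) has zero entropy:
print(shannon_entropy([1.0]))       # 0.0 bits
```

The two limiting cases mirror the freedom interpretation argued in the paper: the more freely the system can occupy its states, the higher the entropy.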

