1. Introduction
Urban meteorology can be characterized, to a first approximation, by the variables that most strongly affect the dynamics of the atmospheric boundary layer: temperature (T), relative humidity (RH) and the magnitude of the wind speed (WS) [1,2,3,4,5]. Their behavior, measured in six communes of Santiago de Chile in three different periods of 3.25 years each (2010/2013, 2017/2020 and 2019/2022), is recorded as time series of 28,463 data points per variable, yielding a total of 1,537,002 urban-meteorology data points, which gives greater robustness to the analysis of the data and their trends [6,7,8,9,10,11]. The same number of measurements was carried out for three pollutants that are highly present in urban environments and have a strong impact on human health: 10 µm particulate matter (PM10), 2.5 µm particulate matter (PM2.5) and carbon monoxide (CO) [12,13,14]. These pollutants were also monitored for 3.25 years in the three periods already mentioned and in the same six communes where the urban meteorology was measured. The recording instruments for the six variables, all with similar technical characteristics (standardized and certified according to EPA), were located at six sites within a basin geomorphology with a very rough natural bottom, so the measurements were recorded at different heights above sea level (between 450 and 800 m) [15].
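For reference, the total quoted above is consistent with the sampling scheme described (three meteorological variables, six communes, three measurement periods):

\[
3 \times 6 \times 3 \times 28{,}463 = 54 \times 28{,}463 = 1{,}537{,}002 \ \text{data points}.
\]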
The behavior of urban meteorology and of pollutants near the Earth's surface is turbulent and involves interactive processes of an irreversible nature. These characteristics make it appropriate to analyze the time series from the perspective of chaos theory [16]. All of the series prove to be chaotic, since the calculated characteristic parameters lie within the ranges appropriate to this type of process: a Lyapunov exponent (λ) greater than 0, a correlation dimension (DC) less than 5, a Kolmogorov entropy (SK) greater than 0, a Hurst exponent (H) greater than 0.5 and less than 1, a Lempel–Ziv complexity (LZ) greater than 0, an information loss (<ΔI>) less than 0, and a fractal dimension (D) [17].
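As an illustration of how these criteria are applied jointly, the following minimal Python sketch flags a series as chaotic only when all of the stated conditions hold; the numerical values in the example are hypothetical placeholders, not results from this study.

```python
# Minimal sketch: joint check of the chaoticity criteria listed above.
# The example values are hypothetical placeholders, not measured results.

def is_chaotic(lyapunov, corr_dim, kolmogorov_entropy, hurst, lempel_ziv, info_loss):
    """Return True only if every stated criterion for a chaotic series holds."""
    return (
        lyapunov > 0.0              # positive largest Lyapunov exponent
        and corr_dim < 5.0          # low correlation dimension
        and kolmogorov_entropy > 0.0
        and 0.5 < hurst < 1.0       # persistent, long-memory series
        and lempel_ziv > 0.0
        and info_loss < 0.0         # average information loss per step
    )

# Example with illustrative values only:
print(is_chaotic(lyapunov=0.30, corr_dim=2.4, kolmogorov_entropy=0.25,
                 hurst=0.92, lempel_ziv=0.48, info_loss=-0.25))  # -> True
```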
Entropy is a variable that allows the disorder associated with long-term processes to be studied, and it fits well with chaotic analysis, which requires series with a very large amount of data (more than 5000 data points, owing to the requirement of stability in the calculation of the Lyapunov exponents). Entropy is a fundamental variable of nature that has been investigated and applied in many different areas, but its application to the study of communications, urban dimensioning, fluids, the Earth–atmosphere system, medicine, biology, etc., is relatively recent [18,19,20,21]. Its application to pollutants and their interaction with the atmosphere in the boundary layer is even more recent [22,23].
Chaotic and predictable systems differ in that the trajectories of the former generate new information, whereas those of the latter do not [24,25,26]. The Kolmogorov–Sinai (or metric) entropy KS, symbolized by SK, places an upper bound on the information-gain process; the underlying notion was developed by Shannon [27] as the rate of information creation as a system evolves (Shannon entropy), although not in the field of dynamical systems. KS has since been applied to dynamical systems, and Kolmogorov [28] and Sinai [29] proved that it does not change under continuous deformations of space (a topological invariant). Adding the positive Lyapunov exponents [30,31] gives a bound for KS [32]; the two quantities become equal under continuous measurement along the unstable directions, which is usual in chaotic flows. KS is connected with traditional thermodynamic entropy, an indicator of the disorder of a system, because it measures the dispersion of neighboring trajectories towards new domains of the state space; since trajectories depend on the initial conditions (IC), two points that are initially very close in state space separate over time, so the apparently unimportant digits of the IC become essential to knowing the IC and influence the evolution of the system. KS has units of inverse time (or inverse iterations) and indicates the average rate at which predictability deteriorates; its inverse is the expected duration of a prediction. The entropy allows systems to be classified as follows: (i) infinite, completely random; (ii) zero, periodic or regular; (iii) greater than zero and less than infinity, chaotic. For Shannon, information quantifies what is distinguishable about a configuration, with entropy being the information not yet available. The process can be viewed as a reduction of information, given that predictions from an initial state become less precise over time.
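The relation between KS and the Lyapunov spectrum invoked above can be written, in the form commonly associated with the Ruelle inequality and the Pesin identity (a standard result, stated here for reference), as

\[
S_K \;\le\; \sum_{\lambda_i > 0} \lambda_i ,
\]

with equality under the conditions just mentioned (continuous measurement along the unstable directions); this equality is what later allows SK to be estimated from the positive Lyapunov exponents of each series.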
A Markov process can be viewed as a data source in which one of l symbols arises randomly in a discrete time sequence [29,33]. The writing of a newspaper, for example, conforms to a Markov process. The symbols can be viewed as serial measurements. Assuming that the l possible symbols are the integers from 0 to l − 1, if the occurrence of a given symbol depends on the n − 1 previous symbols, the Markov process is of order n. A Markov process implies that time averages are independent of the starting time and equal to the ensemble (ergodic) averages. For a process of order n whose current symbol is p_n, the sequence that includes the previous n − 1 symbols (p_1 p_2 … p_n) can be represented as the base-l number with digits p_1 p_2 … p_n, denoted P_n. This number specifies the state of the source (which is not the state of a dynamical system) [25]. In a Markovian source, in the sense described, the numbers are the product of a series of measurements, each assigned a different symbol of the partition. A measurement at time t = 0 indicates that the dynamical state of the system lies within one component of the partition; at a time Δt > 0, a new measurement may place it in another component. The sequence of measurements thus produces a series of symbols, so a sequence of numbers obtained from a series of measurements can be regarded as the symbols emitted by a Markov source, with a different symbol assigned to each element of the partition induced by the measuring instrument. A series of measurements therefore generates a sequence of symbols whose temporal information rate is obtained in the limit n → ∞ of ΔI_n/(nΔt), where ΔI_n is the information gained after n measurements separated by Δt. KS can then be defined as [29]

\[
S_K = \lim_{n \to \infty} \frac{\Delta I_n}{n\,\Delta t} . \qquad (1)
\]
With suitable measuring instruments recording samples at an adequate rate, KS is the average amount of new information acquired per sample.
Given the trajectories x(t) = [x_1(t), x_2(t), …, x_d(t)] (Figure 1), the d-dimensional phase space of a dynamical system is divided into boxes of size l^d, and the state of the system is measured at uniformly spaced times τ. Let P_{i_0…i_m} be the joint probability that at time t = 0 the system is in box i_0, at t = τ in box i_1, …, and at t = mτ in box i_m. The magnitude K_m [27] is

\[
K_m = -\,k \sum_{i_0 \ldots i_m} P_{i_0 \ldots i_m} \ln P_{i_0 \ldots i_m} . \qquad (2)
\]

This quantity is proportional to the information needed to locate the system on a specific trajectory that passes through the boxes i_0 … i_m with a certain precision. The coefficient k sets the unit of measurement.
The difference K_{m+1} − K_m corresponds to the information needed to know in which cell i_{m+1} the system will be, given that it was previously in i_0 … i_m, and it measures the loss of information about the system as time goes from mτ to (m + 1)τ. SK, the Kolmogorov entropy, is the average rate of information loss in the limits l → 0 and τ → 0:

\[
S_K = -\lim_{\tau \to 0}\,\lim_{l \to 0}\,\lim_{m \to \infty} \frac{k}{m\tau} \sum_{i_0 \ldots i_m} P_{i_0 \ldots i_m} \ln P_{i_0 \ldots i_m} . \qquad (3)
\]

SK has units of information per unit time: bits per second or, in the case of a discrete system, bits per iteration [34,35,36]. The order of the limit process in (3) is (i) m → ∞, (ii) l → 0 (which removes the dependence on the selected partition) and (iii) τ → 0 for continuous systems.
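As a practical, finite-data illustration of the limiting procedure in (3), the following Python sketch symbolizes a scalar series over a partition of l bins and estimates the information gain per step as the difference of successive block (word) entropies. The logistic-map data, quantile partition and block length are choices made here for illustration; this is not the estimator used in the study.

```python
# Simplified finite-data analogue of Eq. (3): per-step information gain of a
# symbolized series, estimated from block (word) Shannon entropies.
import numpy as np
from collections import Counter

def symbolize(x, l=4):
    """Map a scalar series onto l symbols using equiprobable (quantile) bins."""
    edges = np.quantile(x, np.linspace(0, 1, l + 1)[1:-1])
    return np.digitize(x, edges)

def block_entropy(symbols, m):
    """Shannon entropy (bits) of words of length m."""
    words = [tuple(symbols[i:i + m]) for i in range(len(symbols) - m + 1)]
    counts = np.array(list(Counter(words).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def entropy_rate(symbols, m=5):
    """Approximate K_{m+1} - K_m (bits per sample) for block length m."""
    return block_entropy(symbols, m + 1) - block_entropy(symbols, m)

# Example with a synthetic chaotic signal (logistic map, r = 4):
x = np.empty(20000); x[0] = 0.3
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
# Expected to be near 1 bit/iteration (the known entropy of this map),
# up to finite-sample bias from the non-generating partition.
print(entropy_rate(symbolize(x, l=4), m=5))
```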
For two interacting systems, such as the atmosphere (represented by the meteorological variables) and the hourly concentrations of pollutants, the rate at which each system renders information obsolete can be evaluated. From KS, the predictability horizon of each system is calculated [25].
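Following the interpretation of KS as a rate of information loss whose inverse is the expected prediction time, a minimal expression for the two horizons (using the notation SK,MV and SK,P introduced later in the text) is

\[
t_{h,\mathrm{MV}} \approx \frac{1}{S_{K,\mathrm{MV}}}, \qquad
t_{h,\mathrm{P}} \approx \frac{1}{S_{K,\mathrm{P}}},
\]

so the system with the larger Kolmogorov entropy exhausts its predictive information sooner.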
Forces of an entropic nature have been discussed, and have found applications, for some time. In 2010, Verlinde [37] argued that gravity is an entropic force. Entropic forces are not exotic. Why, for example, does an elastic band pull back when stretched? A first argument might be that a stretched elastic band has more energy than an unstretched one; that explanation fits a metal spring, but it is not how rubber works, rubber being a natural polymer of isoprene (polyisoprene) and an elastomer (elastic polymer). A stretched elastic band has less entropy than an unstretched one, and this, too, can give rise to a force (see Figure 2).
Rubber molecules are long chains. When unstretched, these chains can coil in many random ways, which corresponds to a large entropy. Stretching a chain reduces the number of shapes it can adopt, since it is tensioned in one direction. Beyond that point, considerable energy is required to stretch the molecule further, but the dominant effect is the decrease in its entropy. Thus, the force exerted by a stretched elastic band is an entropic force. But how can changes in energy or entropy generate forces? Various approaches agree on the formal structure of the entropic force: one uses entropic pressure and the first and second laws of thermodynamics; another uses the Helmholtz free energy and canonical coordinates.
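A minimal sketch of the second route (Helmholtz free energy in a canonical coordinate x, assuming constant temperature) makes the entropic contribution explicit:

\[
A = U - TS, \qquad
F_x = -\left(\frac{\partial A}{\partial x}\right)_T
    = -\left(\frac{\partial U}{\partial x}\right)_T + T\left(\frac{\partial S}{\partial x}\right)_T .
\]

When the energy term is negligible, as in the idealized rubber band, the restoring force is purely the entropic term T(∂S/∂x)_T.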
In the first approach, using the differential form of the first and second principles of thermodynamics,

\[
dU = \delta Q - \delta W . \qquad (4)
\]

Heat (Q) is not a state function, so δQ is used instead of dQ. The physical entropy, S, in its classical form, is defined for a reversible process by

\[
dS = \frac{\delta Q}{T} . \qquad (5)
\]

Applying δW = P dV, we obtain

\[
dU = T\,dS - P\,dV . \qquad (6)
\]

This result is independent of the coordinates used, with S depending on the temperature and the volume. Differentiating U(V, T) and S(V, T), and using (6), we obtain (Appendix A)

\[
P = T\left(\frac{\partial S}{\partial V}\right)_T - \left(\frac{\partial U}{\partial V}\right)_T . \qquad (7)
\]
In the other approach, one calculates the pressure of an ideal gas and shows that its origin is completely entropic: only the first term on the right-hand side of (7) is nonzero (Appendix B). For this purpose, the force is expressed in canonical coordinates using the Helmholtz energy [38,39].
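As a minimal illustration (a standard textbook result, assuming an ideal gas of N particles, whose entropy depends on volume as S = N k_B ln V + f(T)), the first term of (7) gives

\[
P = T\left(\frac{\partial S}{\partial V}\right)_T = \frac{N k_B T}{V},
\]

recovering the ideal-gas law with no contribution from the internal-energy term, since (∂U/∂V)_T = 0 for an ideal gas.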
Geometrically, the entropic force is, in general, tangent to the entropic surface S:

\[
\mathbf{F} = T\,\nabla S
           = T\left(\frac{\partial S}{\partial x}\,\hat{\mathbf{x}}
           + \frac{\partial S}{\partial y}\,\hat{\mathbf{y}}
           + \frac{\partial S}{\partial z}\,\hat{\mathbf{z}}\right). \qquad (8)
\]

This force is directly proportional to the variation of the entropy with position, which indicates that its origin is associated with thermodynamic processes.
Figure 3 presents the tangents to the entropic surface according to the variables x, y and z.
Each value of the Kolmogorov entropy, SK, is obtained as the sum of the positive Lyapunov exponents (bits/h) of each of the time series (each with 28,463 data points) used in this research, which came from measurements of urban meteorology (MV) and pollutants (P) at the different heights of the measurement locations:

\[
S_K = \sum_{\lambda_i > 0} \lambda_i . \qquad (9)
\]

These values represent an hourly entropic flow, which is also extended, below, to the position (height) derivatives. The resulting SK, for each period of 3.25 years, shows that the processes are chaotic, both for the urban meteorology and for the pollutants. The hourly entropic force can be estimated from the Kolmogorov entropy resulting from the irreversible processes of urban meteorology, SK,MV, and from the Kolmogorov entropy resulting from the irreversible processes of the pollutants, SK,P, where the subscript K indicates its origin in the entropic flow. Entropy is a state function that accounts for the outcome of a process. In this research, SK is the entropy per unit time; it represents the processes experienced by the chain of data measured at a given height, and it is shown that this entropic flow (localized and hourly) has an associated entropic force that varies with height (z = h).
In the case of a curve in the SK versus h plane, the entropic force is the tangent to the curve, which is differentiated according to the MV and P cases:

\[
F_{\mathrm{MV}} = K_{\mathrm{MV}}\,\frac{dS_{K,\mathrm{MV}}}{dh}, \qquad
F_{\mathrm{P}} = K_{\mathrm{P}}\,\frac{dS_{K,\mathrm{P}}}{dh}, \qquad (10)
\]

where KMV and KP are constants of proportionality for the meteorological variables and the pollutants, respectively.
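As a minimal numerical sketch of this estimate (with hypothetical station heights, Lyapunov exponents and proportionality constant, not the values measured in this study), SK can be accumulated per monitoring site and the entropic force approximated by a finite-difference tangent of the SK versus h curve:

```python
# Minimal sketch of Eqs. (9)-(10): S_K per site from positive Lyapunov exponents,
# and the entropic force as a finite-difference tangent of S_K versus height.
# Heights, exponents and K_const are hypothetical placeholders, not measured values.
import numpy as np

heights = np.array([450.0, 520.0, 600.0, 680.0, 750.0, 800.0])   # site heights (m a.s.l.)
# One row of Lyapunov exponents (bits/h) per site; only positive ones contribute.
lyapunov = np.array([
    [0.31, 0.12, -0.05],
    [0.28, 0.10, -0.07],
    [0.26, 0.08, -0.04],
    [0.22, 0.09, -0.06],
    [0.20, 0.05, -0.03],
    [0.18, 0.04, -0.02],
])

s_k = np.where(lyapunov > 0, lyapunov, 0.0).sum(axis=1)   # Eq. (9): sum of positive exponents
K_const = 1.0                                              # arbitrary proportionality constant
force = K_const * np.gradient(s_k, heights)                # Eq. (10): tangent dS_K/dh

for h, s, f in zip(heights, s_k, force):
    print(f"h = {h:5.0f} m   S_K = {s:.3f} bits/h   F = {f:+.2e} bits/(h·m)")
```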
4. Discussion
The following considerations thus apply:
(i) If the entropic force F < 0, it can be very fast and turbulent (compatible with high Kolmogorov turbulence at the surface level) and reflects an energy loss; it then tends to restrict diffusion (making it more difficult, or containing it);
(ii) If the entropic force F > 0, it can be very fast and turbulent (compatible with high Kolmogorov turbulence at the surface level) and reflects an energy gain; it is then more permissive and contributes to diffusion.
Figure 5, which depicts the entropies of the meteorological variables versus the height, together with Figure 8, Figure 10 and Figure 12, shows that the entropic forces due to the meteorological variables present a regime that decays as we advance towards the current period, leading to weaker dynamics. Figure 6, showing the entropies of the pollutants versus the height, together with Figure 14, Figure 16 and Figure 18, shows that the entropic forces due to the pollutants present a regime that favors the persistence of the studied pollutants. This is corroborated by Figure 20, Figure 21 and Figure 22, which compare the dynamics of the meteorological variables and pollutants in the three periods, from the past to the present, considered in the study. These figures, together with Table 9, show the growing dominance of the pollutant dynamics over the urban meteorology dynamics, with more negative proportions (Table 9, periods 2017/2020 and 2019/2022), generating a sink-like effect for pollutants.
The nature of entropic forces has been analyzed for many years [37,49,50,51], but this application to the interactive problem between urban meteorology and pollutants in a basin geomorphology is not common, so it is not easy to find specialized literature for comparative purposes. Atmospheric pollution caused by human activity has acquired an influential role in urban meteorology, altering its historical cycles and behaviors and favoring the formation of extreme events that end up harming human beings themselves [42,52]. Air pollution is a direct indicator of urban densification, the use of concrete in buildings, changes in soil roughness due to high-rise buildings, the still-persistent increase in internal combustion vehicles, the elimination and/or channeling into concrete pipes of the many natural water courses present in a geographic basin, etc. Among the varied consequences is the presence of thermal flows that are irregular over time and heterogeneous in magnitude, which lead to events with a low horizon of predictability but that are repetitive, such as pandemics, heat waves, heat islands, droughts, declines in relative humidity, etc.
With the methodology applied, these actions can be classified, definitively, as extreme; their consequences include the irregular and heterogeneous thermal flows noted above, as well as excessive rainfall in very short periods.
Considering the squared variance ⟨x²⟩ of the displacements [43] and the calculated value of CK (Figure 7) for the six communes, the diffusion is anomalous and of a sub-diffusive nature. Heavy-tail probability studies carried out for a basin geomorphology (Figure 25, Figure 26 and Figure 27) confirm the lower entropic influence of urban meteorology compared with that of the pollutants, and this type of probability is also suited to the study of the propagation of a pandemic (Figure 8, [52]), showing its persistence in areas of high urban densification, high pollution and low ventilation, where pollution stagnates for long periods of time.
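For reference, the usual classification of diffusion regimes in terms of the growth of the squared variance with time (a standard convention, with α the anomalous-diffusion exponent) is

\[
\langle x^2 \rangle \propto t^{\alpha}:
\qquad \alpha < 1 \ \text{(sub-diffusive)}, \quad
\alpha = 1 \ \text{(normal diffusion)}, \quad
\alpha > 1 \ \text{(super-diffusive)}.
\]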
The magnitude of the average fractal dimensions of the polluting system and of the urban meteorology system is lower for all measurement locations in the 2010/2013 period than in the other two data-recording periods for the same locations. The lower fractality of the polluting system favors the entropic dynamics of urban meteorology, making it more dominant, as reflected in Figure 20. As the average fractal dimension of the polluting system increases in the 2017/2020 period, its dynamics are mostly seen to "flatten" the dynamics of the urban meteorology system (Figure 21). This trend was broken, to a certain extent, in the 2019/2022 period, which coincided with the coronavirus pandemic and the strict confinement applied in the city of Santiago de Chile, with the consequent drop in the influence of the polluting system. Even so, the dynamics of the polluting system remain strong with respect to the urban meteorology (Figure 22). This is confirmed by the average value of the Hurst exponent for the polluting system compared with that of the urban meteorology system; the Hurst exponent is a measure of long-range dependence within a time series. That is, the events of the past influence the events of the future, and there is a "base state" of the polluting system that prevails even when confinement is applied and activities are reduced to a minimum (2019/2022).