Article

Applications of the Information Theory to Problems of Molecular Electronic Structure and Chemical Reactivity

by
Roman F. Nalewajski
Faculty of Chemistry, Jagiellonian University, R. Ingardena 3, 30-060 Cracow, Poland
Int. J. Mol. Sci. 2002, 3(4), 237-259; https://doi.org/10.3390/i3040237
Submission received: 28 September 2001 / Accepted: 7 January 2002 / Published: 25 April 2002
(This article belongs to the Special Issue Application of Density Functional Theory)

Abstract
Recent studies on applications of the information theoretic concepts to molecular systems are reviewed. This survey covers the information theory basis of the Hirshfeld partitioning of molecular electron densities, its generalization to many-electron probabilities, the local information distance analysis of molecular charge distributions, the charge transfer descriptors of the donor-acceptor reactive systems, the elements of a “thermodynamic” description of molecular charge displacements, both “vertical” (between molecular fragments for the fixed overall density) and “horizontal” (involving different molecular densities), with the entropic representation description provided by the information theory. The average uncertainty measures of bond multiplicities in molecular “communication” systems are also briefly summarized. After an overview of alternative indicators of the information distance (entropy deficiency, missing information) between probability distributions, the properties of the “stockholder” densities, which minimize the entropy deficiency relative to the promolecule reference, are summarized. In particular, the surprisal analysis of molecular densities is advocated as an attractive information-theoretic tool in the electronic structure theory, supplementary to the familiar density difference diagrams. The subsystem information density equalization rules satisfied by the Hirshfeld molecular fragments are emphasized: the local values of alternative information distance densities of subsystems are equal to the corresponding global value, characterizing the molecule as a whole. These local measures of the information content are semi-quantitatively related to the molecular density difference function. In the density functional theory the effective external potentials of molecular fragments are defined, for which the subsystem densities are the ground-state densities. The nature of the energetic and “entropic” equilibrium conditions is reexamined and the entropy representation forces driving the charge transfer in molecular systems are introduced. The latter combine the familiar Fukui functions of subsystems with the information densities, the entropy representation “intensive” conjugates of the subsystem electron densities, and are shown to exactly vanish for the “stockholder” charge distribution. The proportionality relations between charge response characteristics of reactants, e.g., the Fukui functions, are derived. They are shown to follow from the minimum entropy deficiency principles formulated in terms of both the subsystem electron densities and Fukui functions, respectively.

1. Introduction

In chemistry an understanding of the electronic structure of molecules and reactive systems comes from transforming the experimental or computational results into statements in terms of chemical concepts, such as atoms-in-molecules (AIM), the building blocks of molecules, their collections, e.g., the functional groups, and the chemical bonds representing the AIM “connectivities”. The bonded atoms are known to be only slightly changed relative to the corresponding free atoms. The collection of the constituent free atoms, shifted to the actual positions R in a molecule, determines the “promolecule”, which constitutes the standard reference state for extracting changes in the electron distribution due to the formation of chemical bonds, represented by the familiar density difference function,
Δρ(r) = Δρ(r; R) = ρ(r; R) − ρ0(r; R),
where ρ(r) = ρ(r; R) and ρ0(r) = ρ0(r; R) = ∑α ρα0(r; R) stand for the molecular and promolecular electron densities, respectively, with the latter being determined by the free atom densities {ρα0(r) = ρα0(r; R)}.
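As a concrete numerical illustration of Eq. (1), the following minimal sketch builds a one-dimensional “promolecule” from two Gaussian “free-atom” densities and evaluates the density difference function on a grid. All positions, widths and electron numbers are hypothetical model choices introduced here for illustration only.

```python
import numpy as np

# Hypothetical 1D model: two "atoms" at x = -1 and x = +1 (arbitrary units).
x = np.linspace(-6.0, 6.0, 2001)
dx = x[1] - x[0]

def atomic_density(center, n_el, width):
    """Gaussian model density normalized to n_el electrons."""
    g = np.exp(-((x - center) / width) ** 2)
    return n_el * g / (g.sum() * dx)

# Free-atom densities rho_alpha^0 and the promolecule rho^0 = sum_alpha rho_alpha^0.
rho_A0 = atomic_density(-1.0, 1.0, 1.0)
rho_B0 = atomic_density(+1.0, 1.0, 1.0)
rho_0 = rho_A0 + rho_B0

# A mock "molecular" density: same electron number, slightly contracted and
# polarized towards the bond region, mimicking covalent bond formation.
rho = atomic_density(-0.8, 1.0, 0.9) + atomic_density(+0.8, 1.0, 0.9)

# Density difference function of Eq. (1).
delta_rho = rho - rho_0
print("N  =", rho.sum() * dx)                            # ~2 electrons
print("N0 =", rho_0.sum() * dx)                          # ~2 electrons
print("integral of delta_rho =", delta_rho.sum() * dx)   # ~0: charge is only redistributed
```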
The information theory (IT) [1,2,3] provides both the entropic measures of the information distance (similarity) between the compared distributions of electrons in a given molecular system and the associated promolecule [2,4,5,6,7,8,9,10,11], respectively, and a convenient device, the information entropy variational principle, for assimilating in the optimized electron density (or probability) distribution the physical information contained in the constraints and the appropriate references, in the most unbiased manner. This theoretical framework can also be used to extract the entropic (information) characteristics of the probability distributions of simultaneously finding several electrons in a molecule and the associated promolecule, respectively [7,10,11], both continuous and discrete, e.g., in the AIM or molecular fragment resolutions. Such an approach can also facilitate a development of the information-theoretic indices of the chemical bond multiplicities [10,11]. One can attempt to formulate within the IT a thermodynamic-like description of molecular systems and their fragments [5], by complementing the familiar energetic variational principles of the wave-function quantum chemistry or the density functional theory (DFT) [12] with the corresponding entropy representation principles from the IT. As it will be argued in the present work, such an approach is vital for extracting chemical concepts from the calculated molecular electron distributions.
In this survey we review recent applications of the information theoretic concepts and principles to typical problems of a chemical interpretation of the electronic structure, including the definition of AIM [4,5,6] at various stages of their reconstruction in a molecular environment, with particular emphasis on the Hirshfeld [13] (“stockholder”) partitioning, and the chemical bond multiplicities [10,11]. We shall also briefly address the surprisal analysis of molecular electron densities [5,8,9], and concepts combining the familiar charge response indices of DFT and the relevant information-distance densities [8,9]. We shall conclude with elements of a more general “thermodynamic” description within the information theory of molecular and reactive systems, including both the “vertical” displacements of the electronic structure (for the constant molecular density) and the “horizontal” transitions from one ground-state density to another [5].

2. Information Distance Measures for Probability Distributions

The Kullback-Leibler (KL) [2a] missing information (entropy deficiency, directed divergence) between the current [p(r)] and reference [p0(r)] normalized probability distributions, ∫ p(r) dr = ∫ p0(r) dr = 1,
ΔSKL[p|p0] = ∫ p(r) log[p(r)/p0(r)] dr ≡ ∫ p(r) I[p(r)/p0(r)] dr ≥ 0,
where the logarithmic part of the integrand determines the surprisal function I[p(r)/p0(r)], reflects the information content in p relative to that in p0. In other words, the functional ΔSKL[p|p0] measures the information “distance” or likeness of both distributions. Notice that its integrand is negative when I[p(r)/p0(r)] < 0, i.e., when p(r) < p0(r).
The integrand of the symmetrized entropy deficiency of Kullback (K) [2b],
ΔSK[p, p0] = ΔSKL[p|p0] + ΔSKL[p0|p] = ∫ [p(r) − p0(r)] I[p(r)/p0(r)] dr ≥ 0,
called divergence, is always non-negative.
An alternative information distance quantity is defined by Fisher’s (F) [3] referenced entropy for locality, called intrinsic accuracy,
ΔSF[p|p0] = ∫ p(r) {∂I[p(r)/p0(r)]/∂r}² dr ≥ 0.
It can be easily verified that, as intuitively expected, the minimum (zero) value of all these entropy deficiency measures, ΔS[p|p0] = {ΔSKL[p|p0], ΔSK[p, p0], ΔSF[p|p0]}, obtained in the minimum information distance principle including the global probability normalization constraint, ∫ p(r) dr = 1,
δ{ΔS[p|p0] − λ ∫ p(r) dr} = 0,
is reached when the two distributions are identical, i.e., for p(r) = p0(r) [4].
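A minimal numerical sketch of the three entropy deficiency measures of Eqs. (2)-(4), assuming two arbitrary normalized Gaussian model distributions on a one-dimensional grid (the distributions and their parameters are illustrative assumptions, not data from this survey):

```python
import numpy as np

x = np.linspace(-8.0, 8.0, 4001)
dx = x[1] - x[0]

def normalized_gaussian(mu, sigma):
    g = np.exp(-0.5 * ((x - mu) / sigma) ** 2)
    return g / (g.sum() * dx)

p = normalized_gaussian(0.3, 1.0)     # current distribution p(r)
p0 = normalized_gaussian(0.0, 1.2)    # reference distribution p0(r)

I = np.log(p / p0)                    # surprisal function I[p/p0]

dS_KL = (p * I).sum() * dx                          # directed divergence, Eq. (2)
dS_K = ((p - p0) * I).sum() * dx                    # symmetrized divergence, Eq. (3)
dS_F = (p * np.gradient(I, x) ** 2).sum() * dx      # Fisher-type measure, Eq. (4)

print(dS_KL, dS_K, dS_F)   # all non-negative; all vanish when p coincides with p0
```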
One also defines the associated missing information quantities using the AIM resolved one-electron probabilities, p = {pi} and p0 = {pi0}, ∑i pi = ∑i pi0 = 1, of finding an electron on the i-th AIM and the i-th free atom of the promolecule, respectively [10,11]:
ΔHKL(p|p0) = ∑i pi log [pi/pi0] ≡ ∑i pi I[pi/pi0] ≥ 0,
ΔHK(p, p0) = ΔHKL(p|p0) + ΔHKL(p0|p) = ∑i [pi − pi0] I[pi/pi0] ≥ 0.
These probabilities of the discrete atomic description determine the corresponding one-electron Shannon entropies,
H1(p) = − ∑i pi log pi,   H1(p0) = − ∑i pi0 log pi0,
and the corresponding displacement due to the formation of chemical bonds in a molecule:
ΔH1 = H1(p) − H1(p0).
The corresponding two-electron joint probabilities P = {Pi,j}, of simultaneously finding a pair of electrons on atoms i and j in a molecule, and the corresponding promolecule probabilities P0 = {Pi,j0}, where ∑i,j Pi,j = ∑i,j Pi,j0 = 1, ∑i Pi,j = pj, ∑i Pi,j0 = pj0, etc., define the associated displacement of the Shannon two-electron entropy in atomic resolution,
H2(P) = − ∑i,j Pi,j log Pi,j, H2(P0) = − ∑i,j Pi,j0 log Pi,j0, ΔH2 = H2(P) − H2(P0),
the average conditional entropy [14],
Hc(P|p) = − ∑i,j Pi,j log [Pi,j/pi] = − ∑i,j Pi,j log P(j|i)
     = − ∑i,j Pi,j log Pi,j + ∑i pi log pi = H2(P) − H1(p),
and the average mutual information [14],
ΔHKL(P|Pind) = ∑i,j Pi,j log [Pi,j/(pi pj)]
     = ∑i,j Pi,j log P(j|i) − ∑j pj log pj = H1(p) − Hc(P|p) = 2H1(p) − H2(P),
measuring the information distance between the molecular two-electron probabilities and the corresponding distribution of independent electrons: Pind = {pi pj} [10,11]. One also defines a related quantity
ΔHKL(P|PM,P)= ∑i,j Pi,j log [Pi,j/(pi pj0)] = H1(p) + H1(p0) − H2(P)
= H1(p) − Hc(P|p0) = H1(p0) − Hc(P|p),
reflecting the entropy deficiency of the molecular two-electron probabilities in atomic resolution relative to the product of the independent one electron probability schemes, characterizing the promolecular (P) input (I, source) and the molecular (M) output (O, receiver), respectively: PM,P = {pi pj0} [10,11].
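The entropy identities quoted above are easily checked numerically. The sketch below uses a hypothetical two-atom (A, B) molecule with an assumed symmetric joint probability matrix P and assumed free-atom probabilities p0 (all numbers are illustrative), and evaluates the one- and two-electron Shannon entropies, the conditional entropy of Eq. (8) and the information-distance measures of Eqs. (9) and (10):

```python
import numpy as np

# Hypothetical two-atom joint two-electron probabilities P = {P_ij} in AIM resolution.
P = np.array([[0.30, 0.20],
              [0.20, 0.30]])           # sum_ij P_ij = 1
p = P.sum(axis=0)                      # molecular one-electron probabilities p_j
p0 = np.array([0.60, 0.40])            # assumed promolecular probabilities p_j^0

def H(prob):
    """Shannon entropy in bits (log base 2)."""
    prob = prob[prob > 0]
    return -np.sum(prob * np.log2(prob))

H1_mol, H1_pro, H2 = H(p), H(p0), H(P.ravel())

# Average conditional entropy, Eq. (8): Hc(P|p) = H2(P) - H1(p).
Hc = -np.sum(P * np.log2(P / p[:, None]))
print(np.isclose(Hc, H2 - H1_mol))                   # True

# Mutual information relative to independent electrons, Eq. (9).
dH_ind = np.sum(P * np.log2(P / np.outer(p, p)))
print(np.isclose(dH_ind, 2 * H1_mol - H2))           # True

# Promolecule-referenced entropy deficiency of Eq. (10), with PM,P = {p_i p_j^0}.
dH_MP = np.sum(P * np.log2(P / np.outer(p, p0)))
print(Hc, dH_ind, dH_MP)
```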
The conditional probabilities characterize the “communication” channels [14] in the molecular probability network [10], in which a message of the AIM assignment of electrons is transmitted from the molecular “source” (promolecule) to the molecular “receiver” (molecule). In the molecule this signal is disturbed, compared to the promolecule, by the chemical “noise” reflecting an additional uncertainty in attributing electrons to AIM created by the delocalization of electrons through the network of chemical bonds. Therefore, the one-electron entropies of Eq. (6a) characterize the molecular output (or input),
H1(p) ≡ H(IM) ≡ H(OM),
and the promolecular input,
H1(p0) ≡ H(IP),
probability schemes, respectively. Similarly, the two electron entropy determines the average uncertainty associated with probabilities that two electrons are simultaneously found on the molecular input (I) and output (O):
H2(P) ≡ H(IMOM).
One similarly interprets the remaining average uncertainties. The conditional entropy of Eq. (8) provides the average uncertainty of the molecular output given the molecular input,
Hc(P|p) ≡ H(OM|IM) ≡ H(IM|OM),
while the mutual information measure of Eq. (10) represents the difference
ΔHKL(P|PM,P) ≡ H1(IM) − Hc(OM|IP) ≡ H1(IP) − Hc(OM|IM).
The corresponding information theoretic quantities involving three-electron probabilities in atomic resolution have also been explored within the orbital approximation for model systems [11].
The above (physically dimensionless) uncertainty/information quantities are expressed in bits (a contraction of binary digit) when the logarithm is taken to the base 2: log ≡ log2. If any other base had been chosen, the result would be to multiply the entropy by an appropriate constant, which is equivalent to a scale change. This, to quote Shannon, “merely amounts to a choice of a unit of measure”. In discussions of the information distance variational principles we put, for simplicity, log ≡ ln.

3. Hirshfeld (“Stockholder”) Subsystems and “Vertical” Displacements of Electronic Structure

Hirshfeld [13] has approached the classical problem of partitioning the known molecular ground-state density ρ(r) into the corresponding AIM densities, ρ(r) ≡ {ρα(r)},
ρ(r) = ∑α ρα(r),
using the common-sense assumption that each AIM participates in ρ(r) (the molecular “profit”) in proportion to its share wα0(r) = ρα0(r)/ρ0(r) in ρ0(r) (the promolecule “investment”):
ρα(r) = ραH(r) = wα0(r) ρ(r) ≡ W(r) ρα0(r),  ∑α wα0(r) = 1,
where W(r) = ρ(r)/ρ0(r) = ραH(r)/ρα0(r) ≡ WαH(r), represents the Hirshfeld AIM enhancement factor, common to all “stockholder” AIM.
It has been shown [4] that this division scheme has a sound information theoretic basis, as minimizing the Kullback-Leibler missing information functional of Eq. (2), conveniently formulated in terms of the subsystem electron densities ρ and ρ0 ≡ {ρα0}, instead of the corresponding one-electron probability densities (shape factors), σ(r) = ρ(r)/N and σ0(r) = ρ0(r)/N0, where N = ∫ ρ(r) dr and N0 = ∫ ρ0(r) dr denote the number of electrons in the molecule and promolecule, respectively, ∑α σα(r) = ρ(r)/N ≡ σ(r) and ∑α σα0(r) = ρ0(r)/N0 ≡ σ0(r):
ΔSKL[ρ|ρ0] = ∑α ∫ ρα(r) ln [ρα(r)/ρα0(r)] dr ≡ ∑α ∫ ρα(r) Iα[ρα(r)/ρα0(r)] dr
≡ ∑α ∫ Δsα[ρα; r] dr ≡ ∑α ΔSα[ρα|ρα0]
= N {∑α ∫ σα(r) Iα[σα(r)/σα0(r)] dr + ln(N/N0)} ≡ N {ΔSKL[σ|σ0] + ln(N/N0)}.
It follows from the above expression that for the fixed N and N0 these normalizations of the subsystem electron densities do not affect the optimum solutions of a variational principle involving the Kullback-Leibler entropy deficiency functional. Nalewajski and Parr [4] have demonstrated that the optimum densities resulting from the minimum entropy deficiency principle including the Lagrange term associated with the local constraint of the exhaustive partitioning of the known molecular density, ∑α ρα(r) = ρ(r),
δ{ΔSKL[ρ|ρ0] − ∫ λ(r) [∑α ρα(r)] dr} = 0,
are the Hirshfeld subsystem densities of Eq. (13), ρ(r) = ρH(r) ≡ {ραH(r)}, for which the value of the information distance between the two sets of densities is determined by the corresponding global value of the entropy deficiency in ρ relative to ρ0:
ΔSKL[ρH|ρ0] = ∫ ρ(r) ln[ρ(r)/ρ0(r)] dr ≡ ∫ ρ(r) I[ρ(r)/ρ0(r)] dr ≡ ∫ Δs[ρ; r] dr ≡ ΔS[ρ|ρ0].
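A minimal numerical sketch of the stockholder division of Eq. (13) for a toy one-dimensional system (all densities below are hypothetical Gaussian models); it also verifies that the subsystem entropy deficiencies of Eq. (14) sum to the global value quoted in the preceding equation:

```python
import numpy as np

x = np.linspace(-6.0, 6.0, 3001)
dx = x[1] - x[0]

def gauss(center, n_el, width):
    g = np.exp(-((x - center) / width) ** 2)
    return n_el * g / (g.sum() * dx)

# Hypothetical free-atom densities and the promolecule reference.
rho_A0 = gauss(-1.0, 1.0, 1.0)
rho_B0 = gauss(+1.0, 1.0, 1.0)
rho_0 = rho_A0 + rho_B0

# A mock molecular ground-state density carrying the same number of electrons.
rho = gauss(-0.8, 1.0, 0.9) + gauss(+0.8, 1.0, 0.9)

# Stockholder shares and Hirshfeld AIM densities, Eq. (13).
w_A0, w_B0 = rho_A0 / rho_0, rho_B0 / rho_0
rho_AH, rho_BH = w_A0 * rho, w_B0 * rho

# Subsystem entropy deficiencies, Eq. (14), and the global entropy deficiency.
dS_A = (rho_AH * np.log(rho_AH / rho_A0)).sum() * dx
dS_B = (rho_BH * np.log(rho_BH / rho_B0)).sum() * dx
dS_global = (rho * np.log(rho / rho_0)).sum() * dx

print(np.isclose(dS_A + dS_B, dS_global))        # True
print(rho_AH.sum() * dx, rho_BH.sum() * dx)      # Hirshfeld AIM electron populations
```

The equality holds because every Hirshfeld fragment shares the common enhancement factor W(r) = ρ(r)/ρ0(r), so its surprisal coincides with the global one.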
The associated minimum entropy deficiency principle in terms of the shape factors, with incorporated Lagrange term of the local constraint ∑α σα(r) = σ(r),
δ{ΔSKL[σ|σ0] − ∫ ν(r) [∑α σα(r)] dr } = 0,
similarly gives:
σα(r) = σαH(r) = [σα0(r)/σ0(r)] σ(r) = W(r) σα0(r),
and thus NσαH(r) = ραH(r).
The same result is obtained when other information distance measures are used in the minimum entropy deficiency principle of Eq. (15) [5,8,9], e.g., the promolecule referenced Fisher [3] information measure for locality (intrinsic accuracy) of Eq. (4), formulated in terms of subsystem electron densities [5],
ΔSF[ρ|ρ0] = ∑α ∫ ρα(r) {∇Iα[ρα(r)/ρα0(r)]}² dr,
or the Kullback’s [2b] divergence of Eq. (3) [8,9]:
ΔSK[ρ, ρ0] = ΔSKL[ρ|ρ0] + ΔSKL[ρ0|ρ]
= ∑α ∫ [ρα(r) − ρα0(r)] ln [ρα(r)/ρα0(r)] dr ≡ ∑α ∫ Δdα[ρα; r] dr ≡ ∑α ΔDα[ρα, ρα0],
ΔSK[ρH, ρ0] = ∫ [ρ(r) − ρ0(r)] I[ρ(r)/ρ0(r)] dr ≡ ∫ Δd[ρ; r] dr ≡ ΔD[ρ, ρ0].
The Hirshfeld AIM exhibit several important properties [5], which make them attractive candidates for the atomic interpretation in chemistry. They preserve as much as possible of the information contained in the electron densities of the free atoms, exhibit a single cusp at the atomic nucleus and decay exponentially at large distances from it [6,8,9]. The bonded “stockholder” atoms in H2 [8,9] reflect the intuitively expected changes due to formation of a single covalent bond: the overall contraction of the AIM electron distribution and its polarization towards the bonding partner.
In DFT [12] such molecular fragments have also been shown to be effective external potential representable [5]. More specifically, it has been demonstrated that the partial functional derivative with respect to the subsystem density of the non-additive part,
Fn[ρ] = F[ρ] − Fa[ρ],
of the universal Hohenberg-Kohn-Levy functional F[ρ] = F[∑αρα] ≡ F[ρ], where the additive part Fa[ρ] = ∑α F[ρα], determines the embedding correction vαe(r), due to the subsystem chemical environment in a molecule, to the molecular external potential v(r) generated by the nuclei of all constituent atoms:
vαeff(r) = vαeff[ρ; r] = v(r) + {∂Fn[ρ]/∂ρα(r)}β≠α ≡ v(r) + vαe(r);
here the subscript β ≠ α denotes the fixed densities of the remaining subsystems.
In other words, each embedded subsystem density can be viewed as representing the separate (free) system defined by the appropriate effective external potential. This observation introduces an important element of causality into the subsystem description. Namely, each manipulation on the molecular fragment densities can now be interpreted as the ground-state response to the concomitant displacement in the effective external potential. Moreover, a non-equilibrium set of subsystem densities can be attributed an effective ground-state (equilibrium) interpretation, which is vital for the thermodynamic-like description of intermediate reconstructions of the electron distributions in molecular processes.
The embedding energy Fn[ρ] determines the effective energy of the subsystem in presence of its molecular environment [5]:
εα[ρ, v] = {∫ v(r) ρα(r) dr + F[ρα]} + Fn[ρ] ≡ Ev[ρα] + Fn[ρ],
where Ev[ρα] is the energy due to the ρα alone. Indeed, for such an external potential the subsystem density satisfies the global-like, ground-state Euler equation:
μα[ρ, v] ≡ {∂εα[ρ, v]/∂ρα(r)}v,β≠α = vαeff(r) + δF[ρα]/δρα(r)
  = μ[ρ] ≡ {∂Ev[ρ]/∂ρ(r)}v = v(r) + δF[ρ]/δρ(r).
Thus, as in all mutually open subsystems giving rise to a given molecular ground-state density, ∑α ρα = ρ, the stockholder fragments exhibit the subsystem chemical potentials μ ≡ {μα[ρ, v] ≡ μα} = μ1, where 1 = (1, 1, …, 1), equalized at the global value μ[ρ] ≡ μ in accordance with the Sanderson principle of the electronegativity equalization (EE) [15]:
μα = μβ = … = μ.
It should be emphasized, however, that the above EE criterion does not distinguish within the fixed molecular density one set of mutually open subsystems from another. Only the complementary entropic description of the information theory identifies the Hirshfeld subsystems as the equilibrium pieces of the molecular density [4,5]. The IT entropy representation within the subsystem resolution identifies the stockholder AIM as the stable, equilibrium subsystems [5], for which the non-additive part of the entropy deficiency functional,
ΔSn[ρ|ρ0] ≡ ΔS[ρ|ρ0] − ∑α ΔSα[ρα|ρα0],
exactly vanishes: ΔSn[ρH|ρ0] = 0.
The effective potentials of Eq. (22) can be determined for any set of well behaving, smooth and continuous fragment densities ρ, not only for the equilibrium ρ = ρH ones, so that trial subsystem densities ρ(r) can always be viewed as the equilibrium, ground-state densities for the corresponding effective external potentials veff[ρ; r] ≡ {vαeff[ρ; r]}, with the one-to-one ground-state mapping
veff[ρ; r] ↔ ρ(r).
Therefore, for a given ground-state molecular density, corresponding to the fixed external potential due to the nuclei, a set of non-equilibrium subsystem densities ρ(r) can be attributed to the unique external potential constraints, veff[ρ; r], which determine ρ(r) through the Euler Eq. (24) as the equilibrium ground-state densities.
It follows from the Hohenberg-Kohn variational principle of DFT [12],
δ{Ev[ρ] − μN[ρ]} = 0,
where Ev[ρ] = ∫ v(r) ρ(r) dr + F[ρ], N[ρ] = ∫ ρ(r) dr, and the global chemical potential μ is the Lagrange multiplier for the density normalization constraint N[ρ] = N, from which the global Euler Eq. (24) directly follows, that the ground state molecular density minimizes the energy density functional Ev[ρ] subject to the subsidiary condition of the specified number of electrons:
minρ {Ev[ρ] − μ(N[ρ] − N)} = Ev[ρ[N, v]] ≡ E[N, v],
and hence
μ = μ[ρ[N, v]] ≡ μ[N, v] = (∂E[N, v]/∂N)v.
This energetic variational principle searches for the minimum of the electronic energy of a molecular system, and delivers the ground-state density matching the fixed external potential of the Born-Oppenheimer approximation: ρ = ρ[N, v].
The Levy constrained search construction [16] of the universal functional F[ρ] ≡ F[ρN],
F[ρN] = minψ→ρ(N) 〈ψ(N)|Te(N) + Vee(N)|ψ(N)〉 ≡ F[ρ, N],
where Te(N) and Vee(N) are the operators of the kinetic and repulsion energies of N electrons, respectively, searches over all wavefunctions of N electrons yielding the specified electron density ρN. Since a given ground-state density also fixes the system electronic energy, the Levy construction can be considered as “entropic” in character, by analogy to the ordinary thermodynamics [17], with the value of the universal functional being determined by the search for constant energy E[N, v] = Ev[ρ[N, v]]. The physical nature of this search is revealed through the Legendre transformed interpretation of F[ρ, N] [18,19,20,21], which defines the thermodynamic potential for the system specified by the ground-state density alone, in the spirit of the Hohenberg-Kohn theorems [12]:
F[ρ, N] = maxϕ {E[N, ϕ] − ∫ {∂E[N, ϕ]/∂ϕ(r)}N ϕ(r) dr}
  = maxϕ {E[N, ϕ] − ∫ ρ(r) ϕ(r) dr} = E[N, v] − ∫ ρ(r) v(r) dr,
where we have used the Hellmann-Feynman theorem:
{∂E[N, v]/∂v(r)}N = ρ(r).
Therefore, the construction of F[ρN] can be viewed as a search for the external potential ϕ(r), which matches a given ground-state density: ϕ = v[ρN] = v[ρ].
It follows from Eqs. (21) and (23) that the total electronic energy of a molecular system,
Ev[ρ] = ∫ v(r) ρ(r) dr+ F[ρ] = ∑α Ev[ρα] + Fn[ρ] ≡ Ev[ρ],
so that [see Eqs. (23)-(25)]
{∂Ev[ρ]/∂ρα(r)}v,β≠α = {∂εα[ρ, v]/∂ρα(r)}v,β≠α = μα[ρ, v] = μ[ρ],  α = A, B, C, …
These Euler equations, resulting from the associated variational principle of the system energy in the subsystem resolution,
δ{Ev[ρ] − μα N[ρα]} = 0,  or  vαeff(r) = μ − δF[ρα]/δρα(r), α = A, B, C, …,
indicate that such a search for the minimum electronic energy Ev[ρ(N)] ≡ E[N, v], of the mutually open embedded subsystems in the externally closed molecule, can be interpreted as determining the subsystem densities matching the effective subsystem potentials of Eq. (22) for the specified subsystem electron populations N = {Nα} and the fixed molecular external potential v(r) due to the atomic nuclei: veff(r) = veff[ρ[N, v]; r] ≡ veff[N, v; r].
Replacing the external potential v(r) by its conjugate ρ(r) [Eq. (32)] in the list of state parameters again defines the Legendre transform F[ρN] = Ev[ρ] − ∫ {δEv[ρ]/δv(r)}ρ v(r) dr = F[ρ] as the relevant thermodynamic potential for this representation, in which the subsystem densities are the only state variables that completely determine both the state of all subsystems and the molecular system as a whole. One can similarly define the related Legendre transform of the embedded subsystem energy of Eq. (23), for which the v-conjugate is the subsystem density [compare Eq. (32)]:
{∂εα[ρ, v]/∂v(r)}ρ = ρα(r),
Fα[ρN] = εα[ρ, v] − ∫ {∂εα[ρ, v]/∂v(r)}ρ v(r) dr = F[ρα] + Fn[ρ].
This functional also results from the following extremum principle [see Eq. (31)]:
maxϕ {εα[ρ, ϕ] − ∫ ρα(r) ϕ(r) dr} = εα[ρ, v] − ∫ ρα(r) v(r) dr = Fα[ρN], α = A, B, C, …,
Finally, as we have already remarked above [see Eqs. (22), (24) and (34)], a collection of subsystem densities can be viewed as consisting of independent components of the overall molecular density, each coupled to its own effective external potential. In the language of DFT we could regard such a description as resulting from the adiabatic connection [22] from the real system, consisting of interacting subsystems, to the hypothetical system, consisting of non-interacting subsystems, with the same subsystem densities as those in the real system [23]. This collection of non-interacting subsystems can be obtained by scaling the inter-subsystem electronic repulsion to zero, while retaining the full electron interaction within each subsystem, and by simultaneously (and separately) modifying the scaled external potentials of subsystems φ(r) = {φα(r)} in such a way that the interacting subsystem densities will remain unchanged. It then follows from our previous discussion that the matching subsystem external potentials in the non-interacting subsystem limit, φs(r) = {φαs(r)}, must be identical with veff[ρ; r] of Eq. (22): φs(r) = veff[ρ; r].
The energy of such an effectively decoupled, open subsystem α, corresponding to a given ground-state molecular density ρ, ρα = ρα [ρ[ρ]],
Eα[ρα, φαs] = ∫ ρα(r) φαs(r) dr + F[ρα],
determines the conjugates of these two local state-variables [see Eqs. (24) and (32, 36)]:
{∂Eα/∂φαs(r)}ρ = ρα(r)  and  {∂Eα/∂ρα(r)}φ = φαs(r) + δF[ρα]/δρα(r) = μα = μ,
where the subscripts ρρα and φφαs. The Legendre transform of the subsystem energy, corresponding to the representation, in which φαs = φαs[ρα] is replaced by ρα in the list of the subsystem state-functions,
F[ρα] = Eα[ρα, φαs] − ∫ {∂Eα/∂φαs(r)}ρ φαs(r) dr = Eα[ρα, φαs] − ∫ ρα(r) φαs(r) dr,
is determined by the maximum principle with respect to the subsystem effective external potential,
maxφ {Eα[ρα, φ] − ∫ ρα(r) φ(r) dr} = Eα[ρα, φαs] − ∫ ρα(r) φαs(r) dr = F[ρα],
in which one searches for the effective external potential of the embedded subsystem which matches its electron density ρα(r), φαs = vαeff = vαeff[ρ[ρ]].
The above “vertical” development can be summarized in terms of the following three basic postulates [5] of the information-theoretic, entropic theory of partitioning the fixed molecular density into densities of molecular fragments, e.g., AIM, reactants, functional groups, etc. These elements of the local “thermodynamic” description of the equilibrium partitioning of molecular density are in close analogy to the basic postulates of the ordinary thermodynamics [17]:
  • Postulate I: Equilibrium Partitionings.
    Among all possible divisions of the molecular density ρ into the subsystem densities there exist particular fragments (called the equilibrium ones) that are characterized completely by ρ and the reference densities of free subsystems, represented by the Hirshfeld fragments of Eqs. (13) and (15, 16).
  • Postulate II: Minimum Entropy Deficiency.
    There exists a functional called entropy deficiency, ΔS, of the extensive subsystem parameters ρ = {ρα}, α = A, B, C, …, of any composite molecular system M = ABC…, defined for the equilibrium partitioning of ρ and having the following property: the values assumed by the extensive state-parameters in the absence of the internal constraints veff = {vαeff} are those that minimize ΔS over the manifold of the constrained equilibrium states.
  • Postulate III: Additivity of Entropy Deficiency.
    The entropy deficiency of a composite system is additive over the constituent components.

4. Information Distance Analysis of Molecular Electron Densities

An important property of the “stockholder” molecular fragments is manifested by their equalization of the local values of a measure of the entropy deficiency density at the corresponding global value, for the system as a whole [5,8,9]. An example of such a local information quantity is the Kullback-Leibler (directed divergence) integrand Δsα[ραH; r] [Eq. (14)] per single electron of the promolecule:
sα[ραH; r] ≡ Δsα[ραH; r]/ρα0(r) = [ραH(r)/ρα0(r)] ln[ραH(r)/ρα0(r)] ≡ WαH(r) Iα [WαH(r)]
  = s[ρ; r] ≡ Δs[ρ; r]/ρ0(r) = [ρ(r)/ρ0(r)] ln[ρ(r)/ρ0(r)] ≡ W(r)I(r),
where I(r) ≡ ln[ρ(r)/ρ0(r)] = ln[W(r)] is the global surprisal function, identical with that characterizing any Hirshfeld subsystem: Iα[WαH(r)] = ln[ραH(r)/ρα0(r)] = I(r). Therefore, the following equalization of the local missing information takes place for the “stockholder” fragment densities:
sA [ρAH; r] = sB [ρBH; r] = … = s[ρ; r].
The corresponding Kullback (divergence) density [see Eq. (20)],
dα [ραH; r] ≡ Δdα [ραH; r]/ρα0(r) = d[ρ; r] ≡ Δd[ρ; r]/ρ0(r) = [W(r) − 1] I(r),
is also inter-subsystem equalized:
dA [ρAH; r] = dB [ρBH; r] = … = d[ρ; r].
Alternative measures of the local information distance relative to the corresponding reference density are defined by the entropy deficiency intensive conjugates of the extensive (density) state-variables. The Kullback-Leibler functional gives:
Sα [ραH; r] ≡ δΔSα[ραH|ρα0]/δραH(r) = S[ρ; r] ≡ δΔS[ρ|ρ0]/δρ(r) = I(r) + 1,
and hence:
SA[ρAH; r] = SB [ρBH; r] = ... = S[ρ; r], or IA [WAH(r)] = IB [WBH(r)] = ... = I(r).
The entropy deficiency intensive conjugate resulting from Kullback’s divergence functional,
Dα [ραH; r] ≡ δΔDα[ραH, ρα0]/δραH(r) = D[ρ; r] ≡ δΔD[ρ, ρ0]/δρ(r) = I(r) + 1 − W(r)⁻¹,
is also inter-subsystem equalized:
DA [ρAH; r] = DB [ρBH; r] = ... = D[ρ; r].
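These equalization rules can be verified pointwise with the same kind of toy model (the Gaussian densities below are hypothetical): for the stockholder fragments the local quantities sα, dα, Sα and Dα coincide at every grid point with their global counterparts built from ρ and ρ0 alone.

```python
import numpy as np

x = np.linspace(-6.0, 6.0, 3001)

def gauss(center, n_el, width):
    g = np.exp(-((x - center) / width) ** 2)
    return n_el * g / np.sum(g * (x[1] - x[0]))

rho_A0 = gauss(-1.0, 1.0, 1.0)
rho_B0 = gauss(+1.0, 1.0, 1.0)
rho_0 = rho_A0 + rho_B0
rho = gauss(-0.8, 1.0, 0.9) + gauss(+0.8, 1.0, 0.9)

W = rho / rho_0                       # enhancement factor W(r)
I = np.log(W)                         # global surprisal I(r)
rho_AH = (rho_A0 / rho_0) * rho       # Hirshfeld fragment A

# Local information-distance measures of fragment A ...
ratio = rho_AH / rho_A0
s_A = ratio * np.log(ratio)                     # directed-divergence density per electron
d_A = (ratio - 1.0) * np.log(ratio)             # divergence density per electron
S_A = np.log(ratio) + 1.0                       # entropy-deficiency conjugate
D_A = np.log(ratio) + 1.0 - 1.0 / ratio         # divergence conjugate

# ... and the corresponding global quantities.
s, d, S, D = W * I, (W - 1.0) * I, I + 1.0, I + 1.0 - 1.0 / W

for local, glob in [(s_A, s), (d_A, d), (S_A, S), (D_A, D)]:
    print(np.allclose(local, glob))             # True for every stockholder fragment
```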
Several approximate, semi-quantitative relations linking the above information distance densities with the density difference function Δρ(r) of Eq. (1) have been derived and numerically tested for selected linear molecules [8,9]. They result from the observation, that in general the molecular density is only slightly changed relative to the promolecule density, as a result of the mainly valence shell reconstruction of the electron distribution:
|Δρ(r)| ≡ |ρ(r) − ρ0(r)| << ρ(r) ≅ ρ0(r),  W(r) = ρ(r)/ρ0(r) ≈ 1.
Therefore, the first-order Taylor expansion of the global surprisal function gives:
I(r) ≅ Δρ(r)/ρ0(r) ≈ Δρ(r)/ρ(r),
and the associated approximate expressions for the KL and K integrands:
Δs[ρ; r] = ρ(r) ln[ρ(r)/ρ0(r)] ≅ [ρ(r)/ρ0(r)] Δρ(r) ≈ Δρ(r),
Δd[ρ; r] = [ρ(r) − ρ0(r)] ln[ρ(r)/ρ0(r)] ≅ [Δρ(r)]²/ρ0(r).
The corresponding approximate expressions in terms of Δρ(r) for the remaining global entropy deficiency densities, identical with the corresponding quantities for the Hirshfeld subsystems, read:
s[ρ; r] ≡ W(r) I(r) ≅ W(r) Δρ(r)/ρ0(r) ≈ Δρ(r)/ρ0(r),  S[ρ; r] = I(r) + 1 ≅ W(r),
d[ρ; r] = [W(r) − 1] I(r) ≅ [Δρ(r)/ρ0(r)]²,
D[ρ; r] = I(r) + 1 − W(r)⁻¹ ≅ Δρ(r)/{ρ(r) ρ0(r)/[ρ(r) + ρ0(r)]}
= Δρ(r)/Avh [ρ(r), ρ0(r)] ≈ I(r),
where in the last equation Avh[ρ(r), ρ0(r)] stands for the harmonic average of the two densities, defining the so called reduced density.
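The quality of these first-order relations can be probed numerically. In the sketch below a model promolecule density is weakly perturbed (the size of the perturbation is an arbitrary assumption), and the exact surprisal and entropy-deficiency integrands are compared with their leading-order estimates in Δρ:

```python
import numpy as np

x = np.linspace(-6.0, 6.0, 3001)
dx = x[1] - x[0]

def gauss(center, n_el, width):
    g = np.exp(-((x - center) / width) ** 2)
    return n_el * g / (g.sum() * dx)

rho_0 = gauss(-1.0, 1.0, 1.0) + gauss(+1.0, 1.0, 1.0)
# Weakly perturbed "molecular" density, so that |delta_rho| << rho ~ rho_0.
rho = gauss(-0.98, 1.0, 0.99) + gauss(+0.98, 1.0, 0.99)

delta_rho = rho - rho_0
W = rho / rho_0
I = np.log(W)                                 # exact global surprisal

I_approx = delta_rho / rho_0                  # I(r) ~ delta_rho / rho_0
ds_exact, ds_approx = rho * I, delta_rho      # Delta_s ~ delta_rho (leading order)
dd_exact, dd_approx = delta_rho * I, delta_rho ** 2 / rho_0   # Delta_d ~ (delta_rho)^2 / rho_0

mask = rho_0 > 1e-6                           # ignore the exponential tails
for exact, approx in [(I, I_approx), (ds_exact, ds_approx), (dd_exact, dd_approx)]:
    print("max abs deviation:", np.abs(exact - approx)[mask].max())
```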
The approximate expressions of Eq. (55) also attribute to the familiar Δρ(r) function a new information theoretic content. It follows from the above expressions that the dominant feature of these alternative missing information densities is the global surprisal function I(r), related to the density difference per electron [Eq. (52)] and indicating the regions of increased, I(r) > 0, or decreased, I(r) < 0, entropy deficiency with respect to the atomic promolecule reference. Information distance density plots can thus serve as additional tools, complementary to the familiar density difference diagrams, for diagnosing the electronic origins of the chemical bond. They exhibit typical displacements, reminiscent of those observed in the associated density difference diagrams, e.g., the contraction of the overall electron distribution in a molecule, the changes due to the bond covalency and/or ionicity [charge transfer (CT)], and those due to the accompanying atomic orbital hybridization, etc.

5. Information Distance Affinities for the Charge-Transfer in the Donor-Acceptor Reactive Systems

Consider now the CT for the fixed external potential v(r) in the A→B reactive system, consisting of the B(basic) and A(acidic) reactants. For such processes in the externally closed A−B system, for which NA + NB = N = const., the current overall electron populations in both complementary subsystems, N = (NA, NB), resulting from the integration of the reactant densities ρ = (ρA, ρB), determine the current amount of CT,
NCT = NANA0 = NB0NB > 0,
which represents the independent reaction “coordinate” for such an internal electronic displacement.
The subsystem densities for the specified N, ρ = ρ(N), can be obtained from the following minimum entropy deficiency principle (ΔS[ρ|ρ0] ≡ ΔSKL[ρ|ρ0]) [5,8,9]:
δ{ΔS[ρ|ρ0] − ∑α λα ∫ ρα(r) dr} = 0,
where, for the case of the externally closed A-B system, i.e., for the fixed value of N = NA + NB, only one Lagrange term, due to the complementary subsystem density-normalization constraints, say for A, is required to simultaneously enforce the specified numbers of electrons on both reactants.
Should one additionally require that the optimum subsystem densities reproduce a given molecular density, ∑β ρβ (r) = ρ(r), as is the case in the Hirshfeld division scheme, one has to include the additional local constraint of the exhaustive partitioning [as in Eq. (15)]:
δ{ΔS[ρ|ρ0] − ∫ λ(r) [ρA(r) + ρB(r)] dr − λA ∫ ρA(r) dr} = 0,
since the overall density already constrains the overall number of electrons: ∫ ρ dr = N. The corresponding Euler equations for the unknowns {NA, ρA, ρB} are [9]:
FA − ∫ λ(r) [fA,A(r) + fA,B(r)] dr − λA = 0,  SA(r) − λ(r) − λA = 0,  SB(r) − λ(r) = 0,
where the Fukui functions (FF) [24] of reactants, f(r) ≡ {fα,β(r) = [∂ρβ(r)/∂Nα]β≠α}, with the subscript β ≠ α denoting the fixed electron population of the other subsystem, satisfy the usual normalizations [20,25]:
∫ fβ≠α(r) dr = [∂Nβ/∂Nα]β≠α = 0  and  ∫ fβ=α(r) dr = [∂Nα/∂Nα]β≠α = 1.
Equations (59) have the following solutions:
ρA(r) ≡ ρA[N, ρ; r] = ρA0(r) C(r) D  and  ρB(r) ≡ ρB[N, ρ; r] = ρB0(r) C(r),
where ln C(r) = λ(r) − 1 and ln D = λA. The C(r) function can be determined from the local constraint:
C(r) = ρ(r)/[D ρA0(r) + ρB0(r)],
with the constant D satisfying the integral equation: NA = ∫ ρA dr, to be solved numerically. It should be observed that the Hirshfeld densities are recovered for D = 1, i.e., for λA = 0, when the extra global constraint term of Eq. (58) identically vanishes, thus yielding the minimum entropy deficiency principle of Eq. (15).
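Because the resulting electron population NA is a monotonically increasing function of D, the integral equation for D can be solved by a simple one-dimensional root search. The sketch below uses hypothetical Gaussian reactant densities, an assumed amount of CT and, for simplicity, a molecular density equal to the promolecular one; the Hirshfeld case is recovered for D = 1.

```python
import numpy as np

x = np.linspace(-8.0, 8.0, 4001)
dx = x[1] - x[0]

def gauss(center, n_el, width):
    g = np.exp(-((x - center) / width) ** 2)
    return n_el * g / (g.sum() * dx)

# Hypothetical free-reactant densities: A (acid, acceptor) and B (base, donor).
rho_A0 = gauss(+1.5, 4.0, 1.2)
rho_B0 = gauss(-1.5, 6.0, 1.4)
rho = rho_A0 + rho_B0                   # assumed molecular density (promolecule-like)
N_A_target = rho_A0.sum() * dx + 0.25   # prescribed N_A = N_A^0 + N_CT, with N_CT = 0.25

def N_A_of_D(D):
    C = rho / (D * rho_A0 + rho_B0)     # local constraint rho_A + rho_B = rho
    return (D * C * rho_A0).sum() * dx  # electron population of subsystem A

# Bisection on D (N_A_of_D increases monotonically from ~0 towards N).
lo, hi = 1e-3, 1e3
for _ in range(200):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if N_A_of_D(mid) < N_A_target else (lo, mid)
D = 0.5 * (lo + hi)

C = rho / (D * rho_A0 + rho_B0)
rho_A, rho_B = D * C * rho_A0, C * rho_B0
print("D =", D, " N_A =", rho_A.sum() * dx, " N_B =", rho_B.sum() * dx)
```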
Let us now introduce the entropy deficiency conjugates of the subsystem numbers of electrons [8,9],
Fα ≡ ∂ΔS/∂Nα = ∑β ∫ [∂ρβ(r)/∂Nα] [δΔS/δρβ(r)] dr = ∑β ∫ fα,β(r) Sβ(r) dr, α = A, B,
which define the local “intensive”, “force” parameters associated with these “extensive” state variables of reactants. The derivatives FA and FB determine the entropic force FCT for the internal CT inside the externally closed A−B system [9], for which N = NA + NB = const. or dNA = −dNB = dNCT > 0; this force is defined as the entropy deficiency representation “intensive” conjugate of the amount of CT of Eq. (56):
FCT ≡ ∂ΔS/∂NCT = ∑β (∂Nβ/∂NCT) (∂ΔS/∂Nβ) = FA − FB
= ∫ {[fA,A(r) − fB,A(r)] SA(r) + [fA,B(r) − fB,B(r)] SB(r)} dr
≡ ∫ {fACT(r) SA(r) + fBCT(r) SB(r)} dr,
where the reactant in situ FF {fαCT(r) = ∂ρα(r)/∂NCT} = fCT(r) = [fA,A(r) − fB,A(r), fA,B(r) − fB,B(r)] [20,25]. The derivatives of Eqs. (63) and (64) combine the entropic densities {Sα(r)} of reactants and the corresponding subsystem FF. The generalized force FCT can be called, by analogy to the irreversible thermodynamics [17], the CT affinity of the A−B system.
Consider now the molecular density constrained CT of the “vertical” electronic structure problem, for which the Hirshfeld electron populations of both subsystems, NH = {NαH = ∫ ραH dr}, with SA(r) = SB(r) = S(r), determine the optimum amount of CT: NCTH = NAH − NA0 = NB0 − NBH. Indeed, from the Euler Eqs. (59) it immediately follows that the CT affinity exactly vanishes for the Hirshfeld reactants [9]:
(FCT)ρH = (∂ΔS/∂NCT)ρH = ∫ {[fA,A(r) + fA,B(r)] − [fB,B(r) + fB,A(r)]}ρ S(r) dr
  = ∫ {[∂ρ(r)/∂NA]ρ − [∂ρ(r)/∂NB]ρ} S(r) dr = ∫ [∂ρ(r)/∂NCT]ρ,N S(r) dr = 0.

6. Fukui Function Descriptors of Hirshfeld Reactants

Consider again the A−B molecular reactive system and its A0−B0 promolecule reference, the latter consisting of the free reactant densities brought to their current positions at a finite separation between the two subsystems. It should be observed, that this hypothetical state also corresponds to the electrostatic stage of the interaction between the two complementary subsystems in A−B, when their electron distributions and geometries of the separated reactant limit (SRL) are held “frozen” in the reactive system at finite inter-reactant separations. When the Hirshfeld partitioning of the known overall ground-state density of A−B is performed, one obtains the uniquely defined, equilibrium subsystems of such a molecular reactive system.
Both A0−B0 and A−B then constitute a collection of the unique reactant subsystems, before and after their interaction at finite distances, respectively. It is of interest in the theory of chemical reactivity [20,25] to determine how the reactivity indices of these reactant pieces of the overall density change as a result of this interaction, and how their response properties relate to those of the system as a whole, at both these limits: the molecular, in A−B, and the corresponding SRL quantities, in A0−B0 [9].
Let us first examine the promolecule A0−B0, defined by the subsystem densities {ρα0 = ρα0(N0)} ≡ ρ0 = (ρA0, ρB0) and the overall density ρ0 = ρA0 + ρB0. A number of related FF-type derivatives of the electronic densities with respect to either N0 = (NA0, NB0) or N0 = NA0 + NB0 can be defined for this reference system [9], e.g.,
f0(r) ≡ {fα,β0(r) ≡ ∂ρβ0(r)/∂Nα0} ≡ ∂ρ0(r)/∂N0, f0(r) ≡ ∂ρ0(r)/∂N0, Φ0(r) = ∂ρ0(r)/∂N0.
Similar density derivatives, with respect to either NH = (NAH, NBH) or N = NAH + NBH = N0, can be defined for the molecular system A−B, consisting of the corresponding Hirshfeld reactant densities, {ραH = ραH(NH)} ≡ ρH = (ρAH, ρBH), which sum up to the overall molecular density ρ = ρAH + ρBH:
fH(r) ≡ {fα,βH(r) ≡ ∂ρβH(r)/∂NαH} ≡ ∂ρH(r)/∂NH, f(r) ≡ ∂ρ(r)/∂N, ΦH(r) = ∂ρH(r)/∂N.
It directly follows from the explicit expressions of Eq. (13) for the densities of the Hirshfeld reactant subsystems,
ρH(r) = [ρAH(r), ρBH(r)] = [ρA0(r)/ρ0(r), ρB0(r)/ρ0(r)] ρ(r) ≡ w0(r) ρ(r),  w0(r) = [wA0(r), wB0(r)],
that the FF quantities of the Hirshfeld (“stockholder”) reactant subsystems are the w0(r) fractions of the overall FF [7]:
ΦH(r) = w0(r) [∂ρ(r)/∂N] ≡ w0(r) f(r).
A similar relation can be derived for the partitioning of the promolecule FF, by reversing the roles of the Hirshfeld and the free reactant densities in the minimum entropy deficiency principle [9], so that now ρH(r) play the role of the reference densities, while ρ0 are the optimized densities satisfying the constraint ∑α ρα0(r) = ρ0(r) [compare Eq. (15)]:
δ{ΔS[ρ0|ρH] − ∫ ξ(r) [ρA0(r) + ρB0(r)] dr } = 0.
As expected by analogy to the ordinary Hirshfeld partitioning problem, the solutions of this modified missing information variational rule give:
ρ0(r) = [ρA0(r), ρB0(r)] = [ρAH(r)/ρ(r), ρBH(r)/ρ(r)] ρ0(r) ≡ wH(r) ρ0(r) = w0(r) ρ0(r).
Hence, the differentiation of these optimum free subsystem densities for the frozen Hirshfeld reference, gives:
Φ0(r) = w0(r) [∂ρ0(r)/∂N0] = w0(r) f0(r).
Finally, combining Eqs. (69) and (72) gives the following proportionality relation between FF of the reactant subsystems in these two reactive systems [9]:
Φα0(r)/ΦαH(r) = f0(r)/ f(r),   α = A, B.
It implies that the Hirshfeld subsystems change their FF in the molecular reactive system, relative to the promolecule reference, in the same proportion determined by the ratio of the corresponding global FF:
ΦA0(r)/ΦAH(r) = ΦB0(r)/ΦBH(r) = f0(r)/ f(r).
It thus follows from this relation that the locally soft (hard) free reactant, of the reactive system promolecule, remains locally soft (hard) as the Hirshfeld reactant subsystem, of the molecular reactive system.
The same proportionality relations follow from the entropy deficiency rules, in which the KL functional is formulated directly in terms of the subsystem FF distributions, instead of the densities used in Eqs. (15) and (70) [9]:
δ{ΔS[ΦH|Φ0] − ∫ ζ(r) [ΦAH(r) + ΦBH(r)] dr} = 0  ⇒  ΦαH(r) = Φα0(r) [f(r)/f0(r)];
δ{ΔS[Φ0|ΦH] − ∫ ν(r) [ΦA0(r) + ΦB0(r)] dr} = 0  ⇒  Φα0(r) = ΦαH(r) [f0(r)/f(r)].
Dividing the solutions of these two variational principles indeed yields Eq. (74). It should be realized, however, that the FF cannot be considered a “probability” distribution, since it can assume negative values. We observe, nevertheless, that this quantity has indeed been successfully used in several overlap criteria of molecular similarity.
It has also been demonstrated elsewhere [9] that an analogous proportionality rule holds for the local softnesses of the reactant subsystems. It too results from the constrained minimum information distance principle, using the entropy deficiency functional formulated directly in terms of the local softness distribution.

7. “Horizontal” Displacements of the Electronic Structure

The previously discussed “vertical” displacements of the electronic structure, from one partitioning of the fixed molecular density ρ(r) to another, are carried out for the constant energy of the system as a whole; only the missing entropy, energies of the embedded subsystems and related effective external potentials, distinguish one partitioning from another. We now turn to a more general problem of the “horizontal” displacement of the molecular electronic structure, along the ground-state energy “surface” [5],
E[ρ] ≡ ∫ v[ρ; r] ρ(r) dr + F[ρ] ≡ E[N[ρ], v[ρ]],
where the external potential v[ρ ; r] changes in such a way, that it always matches a given v-representable ground-state density: ρg.s(r) = ρ[N, v; r] ≡ ρ(r). We therefore consider in the “horizontal” development a transition between the two ground-state densities:
ρ1(r) ≡ ρ[N1, v1; r] → ρ2(r) ≡ ρ[N2, v2; r].
It should be emphasized that the generalized density functional for the ground-state energy [Eq. (76)] differs from the familiar fixed-v density functional [12] of Hohenberg and Kohn [Eq. (28)]: Ev[ρ] ≡ ∫ v(r) ρ(r) dr + F[ρ] ≡ E[N[ρ], v]. Only for the true ground-state, ρ(r) = ρ[N, v; r], Ev[ρ] = E[ρ]; for trial densities ρ’, which do not match the external potential v(r), Ev[ρ’] ≠ E[ρ’].
Let us briefly reexamine the Euler equation determining a given ground-state density ρ. The trial ground-state density ρ’ can be forced to give ρ as the solution of the variational principle through the local constraint ρ’(r) = ρ(r) built into the auxiliary density functional through an appropriate local Lagrange multiplier, ω(r) = ω[ρ; r] [5], as in the ZMP procedure [26,27]. The resulting variational principle for E[ρ],
δ{E[ρ’] – ∫ ω[ρ; r] ρ’(r) dr} = 0,
identifies the Lagrange multiplier function as [see Eq. (24)]:
ω[ρ; r] ≡ δE/δρ(r) = [∂E/∂ρ(r)]v + ∫ [∂E/∂v(r’)]ρ [∂v(r’)/∂ρ(r)]μ dr’
= {v[ρ; r] + δF[ρ]/δρ(r)} + ∫ ρ(r’) [∂v(r’)/∂ρ(r)]μ dr’
≡ μ[ρ] + ∫ η(r, r’) ρ(r’) dr’ ≡ μ[ρ] + N h(r),
where the hardness kernel η(r, r’) ≡ [∂v(r’)/∂ρ(r)]μ = δ²F[ρ]/δρ(r)δρ(r’) [20,24] and the local hardness h(r) ≡ ∫ η(r, r’) [ρ(r’)/N] dr’ [28].
Therefore, the local quantity ω[ρ; r] is not equalized throughout the space, since in addition to the global chemical potential, the equalized level of the local chemical potential [Eq. (24)], it also includes the local hardness contribution. The latter vanishes only when one fixes the external potential, as in the Hohenberg-Kohn functional [12], by putting [∂v(r’)/∂ρ(r)]μ = 0.
A general information entropy S (or entropy deficiency ΔS) variational principle [5]:
δ{S[ρ] − ∑k λk Ik[ρ]} = 0,
where λk is the Lagrange multiplier for the k-th constraint, Ik[ρ] = Ik0, represents a device allowing one to assimilate in the optimum density ρ the physical information contained in the constraints (or in the reference densities of ΔS) in the most unbiased manner possible. In the single-component molecular system the natural “thermodynamic” constraints are the fixed number of electrons, N[ρ] = N0, and the fixed energy of the system, E[ρ] = E0. In this particular case the information entropy principle of Eq. (80) reads:
δ{S[ρ] − τ⁻¹E[ρ] + τ⁻¹κ N[ρ]} = 0,
where the global “temperature” related Lagrange multiplier τ⁻¹ = (∂S/∂E)N and the global “chemical potential” related Lagrange multiplier τ⁻¹κ = − (∂S/∂N)E. It should be realized, however, that the above constraints do not identify a single admissible ground-state density, but rather the ensemble of them.
The conjugate, minimum energy principle for constant information entropy then reads:
δ{E[ρ] − τS [ρ] − κ N[ρ]} = 0.
It identifies the two Lagrange multipliers as: τ = (∂E/∂S)N and κ = (∂E/∂N)S, in perfect analogy with ordinary thermodynamics [17].
Consider now the limiting case of a single admissible density ρ’ = ρ. Again, this solution of the information entropy extremum principle of Eq. (80) can be enforced through the local constraint term, as in Eq. (78), including the Lagrange multiplier function ω[ρ; r] of Eq. (79). However, by fixing the ground-state, v-representable density one automatically fixes the number of electrons and the energy of the system, so that the two global constraints in Eq. (81) are redundant. They vanish identically when τ−1 = 0, or τ → ∞. This infinite information theoretic “temperature” then implies the infinite entropy “penalty” in Eq. (82), when the trial density deviates from the exact one. This is reminiscent of the infinite values of the Lagrange multipliers in the ZMP procedure [26,27], which also introduces such a “penalty” in the variational procedure determining the effective one-body potential for a given molecular density.
The above “thermodynamic” description of molecules is in the spirit of earlier thermodynamical transcriptions of DFT [29].

8. Information Distance Approach to Bond Multiplicities and Many-Electron Probabilities

We conclude this review with a brief summary of the information-theoretic approach to the chemical bond “order” problem [10,11]. The entropic character of the bond multiplicity concept of chemistry has been explored within the information theory by interpreting a molecule as a “communication” system (see also Section 2), in which the signals are being transmitted in terms of a finite set of possible allocations of N electrons to m constituent atoms. The one-electron probability schemes, of finding a single electron on the free atoms of the promolecule, or on the bonded atoms in a molecule, respectively, determine the input (source) and output (receiver) probability schemes in such a molecular “communication” system. The corresponding two-electron probabilities, which define the associated conditional two-electron probabilities in atomic resolution, similarly determine the network of communication channels, through which a unit signal (message) is transmitted from the promolecular input to the molecular output. As in real communication channels, the molecular system is characterized by disturbances of a random character (noise), which perturb the transmitted signal. It originates from the electron delocalization throughout the molecule, due to the formation of chemical bonds.
The information theoretic concepts [10,11,14] used to characterize chemical bond multiplicity and its covalent/ionic composition include: the conditional entropy [Eqs. (8), (11d)], the mutual information (information distance) [Eqs. (9), (10), (11e)], and the entropy displacements relative to the corresponding promolecule (separated atom limit, SAL) values [Eqs. (6b), (7)]. The average conditional entropy was found to reflect well the covalent component in model systems, while the mutual information has generated a satisfactory estimate of the ionic part of the chemical bond multiplicity. In the open-shell transition states, which involve the concerted bond-breaking and bond-forming mechanism, e.g., in the three-atom system, the entropic contributions from the simultaneous distribution of three electrons are needed for a correct reproduction of the intuitive chemical bond orders in the SAL and in the atom-diatom limit [11].
Such an information theoretic treatment of chemical bond multiplicities calls for the Hirshfeld fragment resolution of the two- or three-electron probabilities. This has recently been achieved [7] by an appropriate extension of the “stockholder” division principle to many-electron probability distributions. The optimum division scheme is obtained as the solution of the following minimum entropy deficiency principle:
δ{ΔSKL[Θ|Θ0] − ∫ λ(r, r’, r’’, …) ∑αβ … ∑γ Θαβ...γ(r, r’, r’’, …) dr dr’ dr’’...} = 0,
where Θ(r, r’, r’’, …) = ∑α∑β … ∑γ Θαβ...γ(r, r’, r’’, …) and Θ0(r, r’, r’’, …) = ∑α∑β … ∑γ Θαβ...γ0(r, r’, r’’, …) are the known k-electron probability densities of the molecular system and its promolecule reference, respectively, given by the corresponding sums, over the AIM indices (α, β, ..., γ) = 1, 2, …, m, of the k-electron probability densities in atomic resolution: the molecular pieces {Θαβ...γ(r, r’, r’’, …)} to be determined and the known free-atom distributions {Θαβ...γ0(r, r’, r’’, …)}.
ΔSKL[Θ|Θ0] = ∑αβ … ∑γ∫ Θαβ...γ(r, r’, r’’, …) Iαβ...γ (r, r’, r’’, …) dr dr’ dr’’...,
where the k-electron surprisal Iαβ...γ (r, r’, r’’, …) = ln[Θαβ...γ (r, r’, r’’, …)/Θαβ...γ0 (r, r’, r’’, …)]. The k-electron Lagrange-multiplier function λ(r, r’, r’’, … ) enforces the local constraint
αβ … ∑γ Θαβ...γ (r, r’, r’’, …) = Θ(r, r’, r’’, …),
where Θ(r, r’, r’’, …) stands for the known, molecular k-electron simultaneous probability density, which is to be divided into the optimum AIM resolved pieces, the least distant in their information content from the corresponding atomic contributions of the promolecule.
The Hirshfeld-type solution of the variational principle of Eq. (83) reads [7]:
Θαβ...γ H(r, r’, r’’, …) = [Θαβ...γ0 (r, r’, r’’, …)/Θ0(r, r’, r’’, …)] Θ(r, r’, r’’, …)
≡ wαβ...γ(k),0(r, r’, r’’, …) Θ(r, r’, r’’, …) ≡ Θαβ...γ0(r, r’, r’’, …) W(k)(r, r’, r’’, …).
This information theoretic prescription again calls for the participation of the (αβ...γ) atomic cluster in the k-electron molecular “profit”, Θ(r, r’, r’’, …), in accordance with the cluster local share wαβ...γ(k),0(r, r’, r’’, …) in the overall promolecular “investment”, Θ0(r, r’, r’’, …), determined solely by the relevant k-electron promolecule probability distributions. As already indicated in Eq. (86) this partitioning scheme can also be viewed as locally amplifying all the promolecule cluster probabilities {Θαβ...γ0 (r, r’, r’’, … )} with the same unbiased amplifying factor:
W(k)(r, r’, r’’, …) = Θ(r, r’, r’’, …) / Θ0(r, r’, r’’, …)
  = Θαβ...γ H(r, r’, r’’, …)/Θαβ...γ0 (r, r’, r’’, …) ≡ Wαβ...γ H(k)(r, r’, r’’, …),
common to all mᵏ atomic clusters, representing independent selections of k atoms from m constituent AIM.
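A compact numerical sketch of this generalized stockholder division for k = 2, using a discretized two-atom model in which the promolecular and molecular pair distributions are random positive arrays standing in for genuine two-electron probability densities (everything below is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 2, 40                              # 2 atoms, 40 grid points per electron coordinate

# Hypothetical promolecular pair densities Theta0[a, b, i, j] ~ Theta_ab^0(r_i, r_j).
Theta0 = rng.random((m, m, n, n)) + 0.1
Theta0 /= Theta0.sum()
# A "molecular" pair density Theta(r_i, r_j) on the same grid, to be partitioned.
Theta = rng.random((n, n)) + 0.1
Theta /= Theta.sum()

# Stockholder division: each atomic-pair cluster takes the share of Theta given by
# its promolecular share at (r, r'), exactly as in the one-electron Hirshfeld rule.
Theta0_total = Theta0.sum(axis=(0, 1))    # promolecular Theta^0(r, r')
w0 = Theta0 / Theta0_total                # cluster shares w_ab^(2),0(r, r')
ThetaH = w0 * Theta                       # Theta_ab^H(r, r')

print(np.allclose(ThetaH.sum(axis=(0, 1)), Theta))    # exhaustive partition of Theta
W2 = Theta / Theta0_total                              # common amplification factor W^(2)
print(np.allclose(ThetaH / Theta0, W2))                # the same W^(2) for every cluster
```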

9. Conclusion

As we have demonstrated in this survey, there is a wide range of problems in the theory of electronic structure and chemical reactivity, which can already be tackled using concepts and techniques of the information theory. They include the entropic definition of AIM, criteria of molecular similarity, the polarization promotion and the CT stage of the reorganization of atoms, when they form chemical bonds in a molecule, a thermodynamic-like description of molecular systems and the electron transfer phenomena in reactive systems, bond multiplicities, charge sensitivities, etc.
The common-sense Hirshfeld partitioning scheme, which uses the free atom reference of the promolecule, has been given a solid information theoretic basis by demonstrating, that it results from the minimum entropy deficiency (information distance) principle, relative to the promolecule densities of the free atomic fragments. The same approach has resulted in a generalized “stockholder” scheme for dividing molecular many-electron probabilities. This information theoretic treatment of molecular subsystems also allows one to derive useful relations between the response properties (local softnesses or Fukui functions) exhibited by the Hirshfeld molecular fragments.
Several important properties of these entropy deficiency equilibrium and stable “stockholder” pieces of the molecular electron density have been discussed in some detail, which make these molecular fragments attractive concepts for chemical interpretations. The Hirshfeld subsystems satisfy the chemical potential equalization principle, as do all the mutually open fragments of the molecular ground-state density, and they locally equalize the subsystem information distance densities with the information distance density for the system as a whole. These missing information densities have been semiquantitatively related to the density difference function, which uses the same promolecule reference and is widely used by chemists in their interpretation of the electronic origins of the chemical bond. With this novel development the importance of the surprisal function of the molecular electron density has been stressed and the density difference function has been attributed a new missing information interpretation.
The presented information theoretic elements of a “thermodynamic” description of the electronic structure of molecules and reactive systems cover both the “vertical”, fixed ground-state density problems, and the “horizontal” transitions between the two ground-state densities. This development emphasizes the importance of the complementary energetic and entropic descriptions, with the information theory providing the hitherto missing entropic part of the electronic structure interpretations in chemistry. The energetic and “entropic” variation principles in DFT have been discussed. It has been argued, using the relevant Legendre transformed representations of the theory, that the energy minimum principle of DFT yields the ground-state density matching a given external potential due to the nuclei, while the “entropic”, fixed density search of Levy delivers the external potential matching a given v-representable density. The equilibrium criteria for electron distributions in molecular systems have been reexamined and the effective external potential representability of the molecular fragment densities has been discussed within DFT. The generalized forces driving changes in the electronic structure, e.g., the CT affinities, have been defined, which combine the familiar Fukui function response properties of molecular fragments with their information distance densities.
These illustrative applications of the information theory to the electronic structure phenomena demonstrate the potential of the theory for extracting the chemical interpretation from the calculated electron distributions, in terms of atoms and bonds which connect them in a given molecular environment. It allows one to describe various stages of the atomic density reconstruction and to determine the average uncertainties in transmission of the AIM allocation signals throughout the molecular “communication” system, which can be used to probe the covalent and ionic bond components. We have amply demonstrated how important this novel, complementary tool is for gaining a better understanding of the “chemistry” contained in the calculated molecular electron densities and probability distributions. In the future these information theoretic concepts should facilitate a more direct linkage between the ab initio results of computational quantum chemistry and the intuitive language of chemistry, in which such concepts as AIM, bond multiplicities, promotion energy, amount of charge transfer, electronegativity, and the hardness/softness characteristics of the electron gas in a molecule, are paramount [20,30]. Central to chemistry is also the transferability of characteristic properties of functional groups in a variety of molecular environments. The axiomatic approach to the theory of molecular subsystems [31] reveals that the Hirshfeld partitioning indeed yields AIM and molecular fragments satisfying the objective criteria of transferability developed in this analysis. The information theoretic approach to the instantaneous distributions of electrons in a molecule and the charge flows between molecular fragments has also been developed [32], following the thermodynamic theory of fluctuations and irreversible processes [17].

Acknowledgement

This work was supported by the research grant No. 3T09A14119 from the State Committee for Scientific Research in Poland.

References

  1. Shannon, C. E. Bell System Tech. J. 1948, 27, 379, 623. Shannon, C. E.; Weaver, W. A Mathematical Theory of Communication; University of Illinois Press: Urbana, 1949. Abramson, N. Information Theory and Coding; McGraw-Hill: New York, 1963. Ash, R. B. Information Theory; Interscience: New York, 1965.
  2. Kullback, S.; Leibler, R. A. Ann. Math. Stat. 1951, 22, 79. Kullback, S. Information Theory and Statistics; Wiley: New York, 1959.
  3. Fisher, R. A. Proc. Cambridge Phil. Soc. 1925, 22, 700.
  4. Nalewajski, R.F.; Parr, R.G. Proc. Natl. Acad. Sci. USA 2000, 97, 8879.
  5. Nalewajski, R.F.; Parr, R.G. J. Phys. Chem. A 2001, 105, 7391.
  6. Nalewajski, R.F.; Loska, R. Theoret. Chem. Acc. 2001, 105, 374.
  7. Nalewajski, R.F. Phys. Chem. Chem. Phys. in press. Adv. Quant. Chem. in press.
  8. Nalewajski, R.F.; Świtka, E.; Michalak, A. Int. J. Quantum Chem. 2002, 87, 198.
  9. Nalewajski, R.F. Phys. Chem. Chem. Phys. submitted.
  10. Nalewajski, R.F. J. Phys. Chem. A. 2000, 104, 11940.
  11. Nalewajski, R.F.; Jug, K. Reviews in Modern Quantum Chemistry: A Celebration of the Contributions of R.G. Parr; Sen, K. D., Ed.; World Scientific: Singapore, 2002; p. 148.
  12. Hohenberg, P.; Kohn, W. Phys. Rev. 1964, 136B, 864. Kohn, W.; Sham, L. Phys. Rev. 1965, 140A, 1133.
  13. Hirshfeld, F. L. Theoret. Chim. Acta (Berlin) 1977, 44, 129.
  14. Pfeiffer, P. E. Concepts of Probability Theory; Dover: New York, 1978.
  15. Sanderson, R. T. J. Am. Chem. Soc. 1952, 74, 272. Parr, R. G.; Donnelly, R. A.; Levy, M.; Palke, W. E. J. Chem. Phys. 1978, 68, 801.
  16. Levy, M. Proc. Natl. Acad. Sci. USA 1979, 76, 6062.
  17. Callen, H. B. Thermodynamics, an Introduction to the Physical Theories of Equilibrium Thermostatics and Irreversible Thermodynamics; Wiley: New York, 1960.
  18. Nalewajski, R.F.; Parr, R.G. J. Chem. Phys. 1982, 77, 399.
  19. Lieb, E. Int. J. Quant. Chem. 1983, 24, 243.
  20. Nalewajski, R. F.; Korchowiec, J. Charge Sensitivity Approach to Electronic Structure and Chemical Reactivity; World Scientific: Singapore, 1997.
  21. Colonna, F.; Savin, A. J. Chem. Phys. 1999, 110, 2828.
  22. See, e.g.: Dreizler, R. M.; Gross, E.K.U. Density Functional Theory: An Approach to the Quantum Many-Body Problem; Springer-Verlag: Berlin, 1990.
  23. Nalewajski, R. F. Adv. Quant. Chem. 2000, 38, 217.
  24. Parr, R. G.; Yang, W. J. Am. Chem. Soc. 1984, 106, 4049.
  25. Nalewajski, R. F.; Korchowiec, J.; Michalak, A. Density Functional Theory IV: Theory of Chemical Reactivity; Nalewajski, R. F., Ed.; Topics in Current Chemistry; Springer Verlag, 1996; Vol. 183, p. 25. Nalewajski, R. F. Int. J. Quant. Chem. 1997, 61, 181. Nalewajski, R. F. Development in the Theory of Chemical Reactivity and Heterogeneous Catalysis; Mortier, W. M., Schoonheydt, R. A., Eds.; Research Signpost: Trivandrum, 1977; p. 135. Nalewajski, R. F. Phys. Chem. Chem. Phys. 1999, 1, 1049. Nalewajski, R. F. Reviews in Modern Quantum Chemistry: A Celebration of the Contributions of R.G. Parr; Sen, K. D., Ed.; World Scientific: Singapore, 2002; p. 1071. Nalewajski, R. F. Chem. Phys. Lett. 2002, 353, 143.
  26. Zhao, Q.; Parr, R. G. Phys. Rev. A 1992, 46, 237. J. Chem. Phys. 1993, 98, 543. Zhao, Q.; Morrison, R. C.; Parr, R. G. Phys. Rev. A 1994, 50, 2138.
  27. Parr, R. G.; Wang, Y. A. Phys. Rev. A 1997, 55, 3226.
  28. Berkowitz, M.; Ghosh, S. K.; Parr, R. G. J. Am. Chem. Soc. 1985, 107, 6811.
  29. Ghosh, S. K.; Berkowitz, M.; Parr, R.G. Proc. Natl. Acad. Sci. USA 1984, 81, 8018. Nagy, A.; Parr, R. G. Indian Acad. Sci. (Chem. Sci.) 1994, 106, 217.
  30. Parr, R. G.; Yang, W. Density Functional Theory of Atoms and Molecules; Oxford: New York, 1989.
  31. Ayers, P. W. J. Chem. Phys. 2000, 113, 10886.
  32. Nalewajski, R. F. J. Phys. Chem. A. submitted.
