Article

Statistical Correlations of the N-particle Moshinsky Model

1
Institute of Atomic and Molecular Sciences, Academia Sinica, Taipei 10617, Taiwan
2
Department of Physics, National Taiwan University, Taipei 10617, Taiwan
*
Author to whom correspondence should be addressed.
Entropy 2015, 17(4), 1882-1895; https://doi.org/10.3390/e17041882
Submission received: 17 February 2015 / Revised: 27 March 2015 / Accepted: 27 March 2015 / Published: 31 March 2015
(This article belongs to the Special Issue Quantum Computation and Information: Multi-Particle Aspects)

Abstract

We study the correlation of the ground state of an N-particle Moshinsky model by computing the Shannon entropy in both position and momentum spaces. We derive analytical forms of the Shannon entropy and the mutual information for this N-particle Moshinsky model, which allows us to test the entropic uncertainty principle. The Shannon entropy in position space decreases as the interaction strength increases, while the Shannon entropy in momentum space shows the opposite trend. The Shannon entropy of the whole system saturates the entropic uncertainty relation. Our results also indicate that, independent of the sizes of the two subsystems, the mutual information increases monotonically as the interaction strength increases.

1. Introduction

Understanding correlation in quantum many-body problems is crucial to quantum information processing, and quantum systems described as harmonically confined systems with tunable interaction parameters are promising platforms for its development. This motivates us to study correlation in an exactly solvable many-body system: the N-particle Moshinsky model.
In the Moshinsky model [1], the system is confined in harmonic traps and the inter-particle interaction also takes a harmonic form. The entropies of this system can be solved exactly, helping us investigate correlations and test the entropic uncertainty principle for two subsystems containing arbitrary numbers of particles. Several topics concerning correlation in the Moshinsky model have been studied recently, such as the statistical and quantum correlations of the two-electron Moshinsky model [2–4], the three-electron Moshinsky model and the two-electron model in a uniform magnetic field [5], and the quantum correlation in the N-particle Moshinsky model [6]. In the present work, we focus on three topics: understanding statistical correlations, testing the entropic uncertainty principle, and comparing the statistical correlation with the quantum correlation of an N-particle Moshinsky model.
For the first topic, to describe the statistical information in the system we introduce the Shannon entropy, a measure of the uncertainty of random variables. Shannon entropy is usually applied to describe the delocalization or localization of a system, and measuring it in different spaces leads to different expressions. Studies of the Shannon entropy in position space and momentum space for atomic systems have been carried out [7–14]. We calculate the Shannon entropy in the position and momentum bases to discuss correlation. From information theory [15], mutual information [16] is a general measure of the correlation of two subsystems, and it has been applied to study correlation in various systems [17–19]. In the N-particle Moshinsky model, we discuss the correlation between two subsystems, one containing p particles and the other containing N−p particles. We derive analytic wave functions in position and momentum space, from which the Shannon entropy in both spaces can be calculated, leading to an understanding of the relationship between the three factors p, N and g (the coupling coefficient of the Moshinsky model).
In order to discuss the statistical correlations of the system, we first give the definitions of the quantities considered here. In [2], the definitions of the Shannon entropy, the one-particle Shannon entropy and the mutual information in position space are given; we extend these definitions to a system with N particles as follows:
$$
\begin{aligned}
S_{pos} &= -\int dx_1\cdots dx_N\,\Gamma(x_1,\dots,x_N)\ln\Gamma(x_1,\dots,x_N),\\
S_{pos}^{(p)} &= -\int dx_1\cdots dx_p\,\gamma(x_1,\dots,x_p)\ln\gamma(x_1,\dots,x_p),\\
I_{pos}^{(p,N-p)} &= S\!\left(\Gamma(x_1,\dots,x_N)\,\big\|\,\gamma(x_1,\dots,x_p)\,\gamma(x_{p+1},\dots,x_N)\right),
\end{aligned}
\tag{1}
$$
where $x_i$ is the position of the $i$-th particle. $S_{pos}$ is the Shannon entropy of the whole system, calculated from $\Gamma(x_1,\dots,x_N)$, the probability density function in position space, and $S_{pos}^{(p)}$ is the p-particle Shannon entropy in position space, calculated from the reduced probability density function $\gamma(x_1,\dots,x_p)=\int dx_{p+1}\cdots dx_N\,\Gamma(x_1,\dots,x_N)$. $I_{pos}^{(p,N-p)}$ is the mutual information of the composite system consisting of two groups, one with p particles and the other with N−p particles; it is defined as the relative entropy between the distributions $\Gamma(x_1,\dots,x_N)$ and $\gamma(x_1,\dots,x_p)\,\gamma(x_{p+1},\dots,x_N)$, and it can be computed by the simple formula $I_{pos}^{(p,N-p)}=S_{pos}^{(p)}+S_{pos}^{(N-p)}-S_{pos}$. Note that when $p=N$, the p-particle Shannon entropy is just the Shannon entropy of the whole system, i.e., $S_{pos}^{(N)}=S_{pos}$.
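For Gaussian distributions, the form relevant to the Moshinsky ground state, these definitions reduce to closed form: the differential entropy of an n-dimensional Gaussian with covariance matrix Σ is ½ ln[(2πe)^n det Σ]. The following is a minimal numerical sketch of the identity I = S^(p) + S^(N−p) − S; the covariance entries are illustrative, not the actual Moshinsky covariance matrix.

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential Shannon entropy of a multivariate Gaussian:
    S = (1/2) ln((2*pi*e)^n det(cov))."""
    n = cov.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** n * np.linalg.det(cov))

# Toy 3-particle Gaussian with pairwise correlation c (illustrative numbers).
c = 0.3
cov = np.array([[1.0, c, c],
                [c, 1.0, c],
                [c, c, 1.0]])

S_full = gaussian_entropy(cov)
S_1 = gaussian_entropy(cov[:1, :1])   # subsystem with p = 1 particle
S_2 = gaussian_entropy(cov[1:, 1:])   # remaining N - p = 2 particles
I = S_1 + S_2 - S_full                # mutual information of the bipartition
print(I)                              # positive whenever c != 0
```

The mutual information vanishes for an uncorrelated (diagonal) covariance and is positive otherwise, as expected of a relative entropy.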
Also, all the counterpart quantities in momentum space can be defined as:
$$
\begin{aligned}
S_{mom} &= -\int dq_1\cdots dq_N\,\Lambda(q_1,\dots,q_N)\ln\Lambda(q_1,\dots,q_N),\\
S_{mom}^{(p)} &= -\int dq_1\cdots dq_p\,\lambda(q_1,\dots,q_p)\ln\lambda(q_1,\dots,q_p),\\
I_{mom}^{(p,N-p)} &= S\!\left(\Lambda(q_1,\dots,q_N)\,\big\|\,\lambda(q_1,\dots,q_p)\,\lambda(q_{p+1},\dots,q_N)\right),
\end{aligned}
\tag{2}
$$
where $q_i$ is the momentum of the $i$-th particle. $S_{mom}$ is the Shannon entropy of the whole system, calculated from $\Lambda(q_1,\dots,q_N)$, the probability density function in momentum space, and $S_{mom}^{(p)}$ is the p-particle Shannon entropy in momentum space, calculated from the reduced probability density function $\lambda(q_1,\dots,q_p)=\int dq_{p+1}\cdots dq_N\,\Lambda(q_1,\dots,q_N)$. $I_{mom}^{(p,N-p)}$ is the mutual information in momentum space, and it can likewise be computed by the simple formula $I_{mom}^{(p,N-p)}=S_{mom}^{(p)}+S_{mom}^{(N-p)}-S_{mom}$.
For the second topic, the entropic uncertainty principle has been investigated in [20], and entropic uncertainty relations in atomic systems were discussed in [2,21]. In this model, by calculating the Shannon entropies in position and momentum space, we can test the entropic uncertainty principle [20]:
$$
S_{pos}^{(p)}+S_{mom}^{(p)}\ge p\,(1+\ln\pi),\qquad S_{pos}+S_{mom}\ge N\,(1+\ln\pi).
\tag{3}
$$
From the entropic uncertainty principle, the sum of the entropies in the two spaces, $S_{pos}+S_{mom}$, can be regarded as the entropy of the product distribution $\Gamma(x_1,\dots,x_N)\,\Lambda(q_1,\dots,q_N)$; the same interpretation is valid for the reduced distributions.
For the third topic, in contrast to the Shannon entropy, the von Neumann entropy is a measure of quantum information and is widely used in many atomic systems [22–27]. For a bipartite pure state, the von Neumann entropy is half of the quantum mutual information [28]; therefore it can also serve as a good measure of quantum correlation. The eigenvalue structure of the N-particle Moshinsky model has been studied in [29], and results for the von Neumann entropy have been given in [6]. The Shannon entropy does not equal the von Neumann entropy in general; however, we can compare the behavior of the statistical and quantum correlations with respect to the three factors p, N and g. In this article, we discuss the first topic in Sections 3.1 and 3.2, the second topic in Section 3.3, and the third topic in Section 3.4.

2. Moshinsky Model

For the N-particle Moshinsky model, the Hamiltonian is:
$$
H=\sum_{i=1}^{N}\left(-\frac{\hbar^2}{2m}\frac{d^2}{dx_i^2}+\frac{1}{2}m\omega^2x_i^2\right)+\sum_{1\le i<j\le N}\kappa\,\omega^2(x_i-x_j)^2.
\tag{4}
$$
Taking the scaled units ($x\to\sqrt{m\omega/\hbar}\,x$, $E\to E/\hbar\omega$) and letting $g=\kappa/m$, the Hamiltonian of Equation (4) becomes:
$$
H=\sum_{i=1}^{N}\left(-\frac{1}{2}\frac{d^2}{dx_i^2}+\frac{1}{2}x_i^2\right)+\sum_{1\le i<j\le N}g\,(x_i-x_j)^2.
\tag{5}
$$
In order to solve the wave function of this system, we transform the original coordinates into Jacobi coordinates:
$$
X=\frac{x_1+\cdots+x_N}{\sqrt{N}},\qquad
X_i=\sqrt{\frac{i-1}{i}}\left(x_i-\frac{1}{i-1}\sum_{k=1}^{i-1}x_k\right),\quad i=2,3,\dots,N.
\tag{6}
$$
Under this transformation, the Hamiltonian separates in Jacobi coordinates:
$$
H=\left(-\frac{1}{2}\frac{d^2}{dX^2}+\frac{1}{2}X^2\right)+\sum_{i=2}^{N}\left(-\frac{1}{2}\frac{d^2}{dX_i^2}+\frac{1}{2}\Omega^2X_i^2\right)=H_X+\sum_{i=2}^{N}H_{X_i}.
\tag{7}
$$
Here $\Omega=\sqrt{1+2Ng}$, and we can now derive the exact wave function for this system:
$$
\psi_{(n_{CM},n_2,\dots,n_N)}(X,X_2,\dots,X_N)=\left(\frac{1}{2^{n_{CM}}n_{CM}!}\right)^{1/2}\left(\frac{1}{\pi}\right)^{1/4}e^{-X^2/2}H_{n_{CM}}(X)\prod_{i=2}^{N}\left(\frac{1}{2^{n_i}n_i!}\right)^{1/2}\left(\frac{\Omega}{\pi}\right)^{1/4}e^{-\Omega X_i^2/2}H_{n_i}(\sqrt{\Omega}\,X_i),
\tag{8}
$$
where the quantum numbers $(n_{CM}, n_2,\dots,n_N)$ label the state in position space, all the quantum numbers must be non-negative integers, and $H_n(x)$ is the Hermite polynomial.
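As a numerical sketch (not part of the paper), one can verify that the transformation of Equation (6) is orthogonal and separates the potential of Equation (5) into one center-of-mass mode of frequency 1 and N−1 relative modes of frequency $\Omega=\sqrt{1+2Ng}$:

```python
import numpy as np

def jacobi_matrix(N):
    """Matrix J whose rows express (X, X_2, ..., X_N) in terms of
    (x_1, ..., x_N), following Equation (6)."""
    J = np.zeros((N, N))
    J[0, :] = 1.0 / np.sqrt(N)                  # center-of-mass coordinate X
    for i in range(2, N + 1):                   # relative coordinates X_i
        J[i - 1, :i - 1] = -np.sqrt((i - 1.0) / i) / (i - 1)
        J[i - 1, i - 1] = np.sqrt((i - 1.0) / i)
    return J

N, g = 5, 0.7
J = jacobi_matrix(N)

# The potential (1/2) sum_i x_i^2 + g sum_{i<j} (x_i - x_j)^2 equals
# (1/2) x^T V x, using sum_{i<j}(x_i - x_j)^2 = N sum_i x_i^2 - (sum_i x_i)^2.
V = np.eye(N) + 2 * g * (N * np.eye(N) - np.ones((N, N)))
W = J @ V @ J.T                                 # potential in Jacobi coordinates
print(np.diag(W))                               # one mode at 1, N-1 at 1 + 2Ng
```

Because the transformation is orthogonal, the kinetic term keeps its form, and W comes out exactly diagonal.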
On the other hand, to derive the wave function in momentum space, we could apply the Fourier transform to Equation (8). However, a simpler way is to rewrite the Hamiltonian in momentum coordinates, which is given by:
$$
H=\frac{1}{2}Q^2-\frac{1}{2}\frac{d^2}{dQ^2}+\sum_{i=2}^{N}\left(\frac{1}{2}Q_i^2-\frac{\Omega^2}{2}\frac{d^2}{dQ_i^2}\right).
\tag{9}
$$
In Equation (9), we use the Jacobi basis in momentum space, $Q=\frac{q_1+q_2+\cdots+q_N}{\sqrt{N}}$, $Q_i=\sqrt{\frac{i-1}{i}}\left(q_i-\frac{1}{i-1}\sum_{k=1}^{i-1}q_k\right)$. Thus, the wave function in momentum space is:
$$
\psi_{(m_{CM},m_2,\dots,m_N)}(Q,Q_2,\dots,Q_N)=\left(\frac{1}{2^{m_{CM}}m_{CM}!}\right)^{1/2}\left(\frac{1}{\pi}\right)^{1/4}e^{-Q^2/2}H_{m_{CM}}(Q)\prod_{i=2}^{N}\left(\frac{1}{2^{m_i}m_i!}\right)^{1/2}\left(\frac{1}{\pi\Omega}\right)^{1/4}e^{-Q_i^2/(2\Omega)}H_{m_i}\!\left(\frac{Q_i}{\sqrt{\Omega}}\right),
\tag{10}
$$
where the quantum numbers $(m_{CM}, m_2,\dots,m_N)$ label the state in momentum space, and all the quantum numbers must be non-negative integers. Having obtained the wave functions in position and momentum space, we can calculate the Shannon entropies in both spaces.

3. Shannon Entropy and Testing Entropic Uncertainty Principle

3.1. Position Space

The ground-state wave function in position space, in Jacobi coordinates, is:
$$
\psi_{(0,0,\dots,0)}(X,X_2,\dots,X_N)=\left(\frac{1}{\pi}\right)^{1/4}e^{-X^2/2}\left(\frac{\Omega}{\pi}\right)^{(N-1)/4}e^{-\frac{\Omega}{2}\sum_{i=2}^{N}X_i^2}.
\tag{11}
$$
To calculate p-particle Shannon entropy, we first construct the p-particle reduced probability density function, as:
$$
\gamma(x_1,x_2,\dots,x_p)=C_p\exp\left[-\frac{1+\Omega(N-1)}{N}\sum_{1\le i\le p}x_i^2+\frac{2(\Omega-1)}{N}\sum_{1\le i<j\le p}x_ix_j+\frac{(N-p)(\Omega-1)^2}{N(N-p+p\Omega)}\Bigl(\sum_{1\le i\le p}x_i\Bigr)^2\right],
\tag{12}
$$
where $C_p=\pi^{-p/2}\,\Omega^{p/2}\left[\frac{N-p}{N}+\frac{p}{N}\Omega\right]^{-1/2}$.
Next, we calculate the p-particle Shannon entropy appearing in the second line of Equation (1), with some special treatment. Let the Jacobi coordinates of the p-particle system be $Y=\frac{1}{\sqrt{p}}\sum_{i=1}^{p}x_i$, $y_i=\sqrt{\frac{i-1}{i}}\left(x_i-\frac{1}{i-1}\sum_{k=1}^{i-1}x_k\right)$, and define the following three quantities:
$$
r^2=\sum_{i=2}^{p}y_i^2,\qquad a=\sum_{i=1}^{p}x_i^2,\qquad b=\sum_{1\le i<j\le p}x_ix_j.
\tag{13}
$$
Thus, we can write down the relation between Y, r and a, b, as:
$$
\begin{cases}
Y^2=\dfrac{1}{p}a+\dfrac{2}{p}b\\[4pt]
r^2=\dfrac{p-1}{p}a-\dfrac{2}{p}b
\end{cases}
\quad\Longleftrightarrow\quad
\begin{cases}
a=Y^2+r^2\\[4pt]
b=\dfrac{(p-1)Y^2-r^2}{2}.
\end{cases}
\tag{14}
$$
Since $\gamma(x_1,x_2,\dots,x_p)=C_p\exp\left[-\frac{1+\Omega(N-1)}{N}a+\frac{2(\Omega-1)}{N}b+\frac{(N-p)(\Omega-1)^2}{N(N-p+p\Omega)}(a+2b)\right]$, we can now substitute (14) into the reduced probability density function and obtain:
$$
\gamma(Y,y_2,\dots,y_p)=C_p\exp\left[-\frac{\Omega N}{N-p+p\Omega}Y^2-\Omega r^2\right]=C_p\exp\left[-C_1Y^2-C_2r^2\right],
\tag{15}
$$
where $C_1=\frac{\Omega N}{N-p+p\Omega}$ and $C_2=\Omega$.
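The coordinate relations of Equation (14) can be checked numerically for arbitrary positions; a small illustrative sketch (not part of the derivation), using the fact that the orthogonal Jacobi transformation preserves $\sum_i x_i^2$, so $r^2=a-Y^2$:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 4
x = rng.normal(size=p)          # arbitrary particle positions

Y = x.sum() / np.sqrt(p)        # p-particle center-of-mass coordinate
a = np.sum(x**2)                # a = sum_i x_i^2
b = sum(x[i] * x[j] for i in range(p) for j in range(i + 1, p))  # pair sum
r2 = a - Y**2                   # r^2 = sum_{i>=2} y_i^2 = a - Y^2
```

Both directions of Equation (14) then hold identically for any sample of positions.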
We then calculate the p-particle Shannon entropy in the $Y, y_2,\dots,y_p$ coordinates (the determinant of the Jacobian is 1), and the result becomes:
$$
S_{pos}^{(p)}=-\int dY\,dy_2\cdots dy_p\,\gamma(Y,\dots,y_p)\ln\gamma(Y,\dots,y_p)=C_p\sqrt{\frac{\pi}{C_1}}\left(\frac{\pi}{C_2}\right)^{\frac{p-1}{2}}\left(\frac{p}{2}-\ln C_p\right)=\frac{p}{2}-\ln C_p.
\tag{16}
$$
Taking p = N in Equation (16), we obtain the Shannon entropy of the whole system in position space. The mutual information in position space can now be computed as follows:
$$
I_{pos}^{(p,N-p)}=S_{pos}^{(p)}+S_{pos}^{(N-p)}-S_{pos}=-\frac{1}{2}\ln\Omega+\frac{1}{2}\ln\left[\frac{N^2\Omega+(N-p)p(1-\Omega)^2}{N^2}\right].
\tag{17}
$$
Next, we plot the results for these quantities. The Shannon entropy of the whole system is shown in Figure 1. In the weak interaction region (small g, i.e., almost no interaction), the larger the total number of particles, the larger the Shannon entropy; in the strong interaction region (large g), the trend is reversed. This can be deduced from Equation (16) by taking p = N:
$$
S_{pos}=\frac{N}{2}-\ln C_N=\frac{N}{2}(1+\ln\pi)-\frac{N-1}{2}\ln\sqrt{1+2Ng},\qquad
\frac{dS_{pos}}{dg}=-\frac{N(N-1)}{2}\,\frac{1}{1+2Ng}.
\tag{18}
$$
Therefore, for small g the Shannon entropy is proportional to N, and the derivative of the Shannon entropy with respect to the interaction strength g is always negative. For larger N the rate of decrease is greater, and the distribution localizes more readily as the interaction increases.
If we consider the p-particle Shannon entropy with the total number of particles fixed, it decreases as the interaction gets stronger, and the larger p is, the faster the decrease. The trend is similar for different N, so we show results only for a system with N = 8 particles in Figure 2, as an example.
For the mutual information, the closer p is to N−p, the greater the mutual information, with the maximum at p = N/2. Different partitions give different amounts of total interaction between the two subsystems: when the sizes of the two subsystems are closer, the total interaction between them is stronger, and so the mutual information is greater. While the Shannon entropy decreases as the interaction gets stronger, the mutual information increases with interaction strength. Since the mutual information can be considered as the correlation between the two subsystems (one containing p particles, the other N−p particles), this shows that as the interaction strength increases, the correlation between the two subsystems becomes stronger. Figure 3 shows the mutual information for N = 8.
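The closed form of Equation (17) is easy to evaluate; a short numerical sketch (the sample values of g are arbitrary) reproduces the trends just described: the mutual information vanishes at g = 0, peaks at p = N/2, and grows with g.

```python
import numpy as np

def mutual_info_pos(p, N, g):
    """Mutual information of Equation (17):
    I = -1/2 ln(Omega) + 1/2 ln[(N^2 Omega + (N-p) p (1-Omega)^2) / N^2],
    with Omega = sqrt(1 + 2 N g)."""
    Om = np.sqrt(1.0 + 2.0 * N * g)
    return -0.5 * np.log(Om) + 0.5 * np.log(
        (N**2 * Om + (N - p) * p * (1.0 - Om)**2) / N**2)

N = 8
for g in (0.0, 0.5, 2.0):
    print(g, [round(mutual_info_pos(p, N, g), 4) for p in range(1, N)])
```

The formula is symmetric under p ↔ N−p, so the curves for partitions (1,7) and (7,1), etc., coincide.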

3.2. Momentum Space

We can compute Shannon entropy in momentum space in a similar manner. The ground state wave function of momentum space in Jacobi coordinates is:
$$
\phi_{(0,0,\dots,0)}(Q,Q_2,\dots,Q_N)=\left(\frac{1}{\pi}\right)^{1/4}e^{-Q^2/2}\left(\frac{1}{\pi\Omega}\right)^{(N-1)/4}e^{-\frac{1}{2\Omega}\sum_{i=2}^{N}Q_i^2}.
\tag{19}
$$
Following the same procedure as that in the position space, we calculate Shannon entropy for whole system and p-particle Shannon entropy respectively. The p-particle Shannon entropy is:
$$
S_{mom}^{(p)}=D_p\sqrt{\frac{\pi}{D_1}}\left(\frac{\pi}{D_2}\right)^{\frac{p-1}{2}}\left(\frac{p}{2}-\ln D_p\right)=\frac{p}{2}-\ln D_p,
\tag{20}
$$
where $D_p=\left(\frac{1}{\pi\Omega}\right)^{p/2}\left(\frac{N-p}{N}+\frac{p}{N\Omega}\right)^{-1/2}$, $D_1=\frac{N}{\Omega(N-p)+p}$, and $D_2=\frac{1}{\Omega}$. Taking p = N in Equation (20) gives the Shannon entropy of the whole system.
The mutual information can be derived as well:
$$
I_{mom}^{(p,N-p)}=S_{mom}^{(p)}+S_{mom}^{(N-p)}-S_{mom}=-\frac{1}{2}\ln\Omega+\frac{1}{2}\ln\left[\frac{(N-p)p(\Omega^2+1)+\bigl((N-p)^2+p^2\bigr)\Omega}{N^2}\right]=I_{pos}^{(p,N-p)}.
\tag{21}
$$
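The final equality in Equation (21), i.e., that the momentum-space and position-space closed forms coincide, can be verified numerically by comparing the two expressions over a range of (p, N, g); a brief sketch:

```python
import numpy as np

def mi_pos(p, N, g):
    """Position-space mutual information, Equation (17)."""
    Om = np.sqrt(1.0 + 2.0 * N * g)
    return -0.5 * np.log(Om) + 0.5 * np.log(
        (N**2 * Om + (N - p) * p * (1.0 - Om)**2) / N**2)

def mi_mom(p, N, g):
    """Momentum-space mutual information, Equation (21)."""
    Om = np.sqrt(1.0 + 2.0 * N * g)
    return -0.5 * np.log(Om) + 0.5 * np.log(
        ((N - p) * p * (Om**2 + 1.0) + ((N - p)**2 + p**2) * Om) / N**2)

print(mi_pos(4, 8, 1.0), mi_mom(4, 8, 1.0))
```

Algebraically the two numerators agree because $N^2\Omega+(N-p)p(1-\Omega)^2=(N-p)p(\Omega^2+1)+\bigl((N-p)^2+p^2\bigr)\Omega$.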
The Shannon entropy of the whole system in momentum space is shown in Figure 4. Unlike in position space, the Shannon entropy increases as the interaction strength increases; the momentum-space distribution delocalizes as the interaction increases. This can be derived from Equation (20):
$$
S_{mom}=\frac{N}{2}-\ln D_N=\frac{N}{2}(1+\ln\pi)+\frac{N-1}{2}\ln\sqrt{1+2Ng},\qquad
\frac{dS_{mom}}{dg}=\frac{N(N-1)}{2}\,\frac{1}{1+2Ng}.
\tag{22}
$$
Since the derivative of the Shannon entropy with respect to the interaction strength is positive, the Shannon entropy in momentum space increases with interaction strength. Furthermore, for larger N the rate of increase of the Shannon entropy is greater, and the delocalization is more sensitive to the interaction strength.
The p-particle Shannon entropy increases with interaction strength, and for a given interaction strength, a system with a larger p value has a larger Shannon entropy than one with a smaller p value. We show results in Figure 5 for a fixed total number of particles N = 8, as an example.
Next, we discuss the trends of the Shannon entropy in the two spaces. As the interaction gets stronger, the Shannon entropy in momentum space increases, while in position space it decreases. An explanation of this opposite trend is that when the interaction becomes stronger, the particles tend to come closer in position space (concentrating in a small region); therefore the uncertainty in position space decreases, and so does the Shannon entropy. In momentum space, however, stronger interaction makes the motion of the particles more erratic, so the uncertainty is greater and the Shannon entropy increases. Moreover, by taking suitable parameters, our results reduce to the case of N = 2 particles and agree with those in [2].
For the mutual information in momentum space, the results are exactly the same as those in position space. Although the entropies behave differently with interaction in the two spaces, the mutual information is identical, independent of which space is chosen. It is further observed that the mutual information increases as the interaction strength is increased.
In Figure 6, we show results for a system with N = 8 in momentum space; they are exactly the same as those in Figure 3 for position space.

3.3. Relation of Two Spaces and Testing Entropic Uncertainty Principle

The relationship between the Shannon entropies in the two spaces is shown in Figure 7. For g ≥ 0, the Shannon entropy in momentum space is greater than or equal to that in position space; they are equal only when g = 0. The sum of the two Shannon entropies is constant only when p = N (the whole-system Shannon entropy); in all other cases, the entropy sum increases with interaction strength.
According to the entropic uncertainty principle [20], inequality (3) should be satisfied, and we now show that our results indeed satisfy both inequalities. From Equations (18) and (22), we obtain $S_{pos}+S_{mom}=N-\ln(C_ND_N)=N+N\ln\pi$, which is the equality case of the uncertainty principle. Furthermore, the sum of the Shannon entropies in position and momentum space is independent of the interaction strength. On the other hand, from Equations (16) and (20), we can prove that:
$$
S_{pos}^{(p)}+S_{mom}^{(p)}=p-\ln(C_pD_p)=p-\ln\left\{\left(\frac{1}{\pi}\right)^{p}\left[\left(\frac{N-p}{N}+\frac{p\Omega}{N}\right)\left(\frac{N-p}{N}+\frac{p}{N\Omega}\right)\right]^{-1/2}\right\}=p+p\ln\pi+\frac{1}{2}\ln\left[1+\frac{(N-p)p}{N^2}\left(\sqrt{\Omega}-\frac{1}{\sqrt{\Omega}}\right)^2\right]\ge p\,(1+\ln\pi).
\tag{23}
$$
The last inequality holds because the argument of the logarithm is at least 1, so the logarithmic term is non-negative; equality holds only when p = N, p = 0, or there is no interaction. Also, from Equation (23) it is straightforward to see that the sum of the p-particle Shannon entropies in the two spaces increases monotonically with the interaction strength.

3.4. Comparing Statistical Correlation to Quantum Correlation

As mentioned in the Introduction, for a bipartite pure state the quantum mutual information is twice the von Neumann entropy. Results for the von Neumann entropy of the N-particle Moshinsky model are given in [6], so we can compare the classical mutual information with the quantum mutual information. Since the mutual information in position space is the same as in momentum space, we denote it by $I_C^{(p,N-p)}$, and the quantum mutual information by $I_Q^{(p,N-p)}$, where (p, N−p) indicates the partition into the two subsystems. In Figure 8, we show the comparison of the classical and quantum mutual information for a total number of particles N = 8, as an example.
For other N, the trends are similar. The classical and quantum mutual information share two trends: both grow monotonically as the interaction strength increases, and for a given interaction strength, the closer p is to N−p, the greater the mutual information. On the other hand, the quantum mutual information grows faster than the classical mutual information, which implies that quantum correlation is more sensitive to the interaction strength than statistical correlation.

4. Summary and Conclusions

In the present work, we have analytically derived the Shannon entropy in position and momentum spaces for the ground state of an N-particle Moshinsky model. We show results of these entropies for total number of particles N and arbitrary number of particles p in subsystems, and apply such results to three topics: discussing the statistical correlations, testing the entropic uncertainty principle, and comparing the classical mutual information to quantum mutual information for this model.
For the first topic, we have observed that the Shannon entropy behaves differently in the two spaces, whereas the mutual information is the same in both. When the interaction gets stronger, the Shannon entropy in momentum space increases (delocalization), while in position space it decreases (localization). Moreover, the Shannon entropy depends on N and p: the rate of change is larger when N or p is larger, positive in momentum space and negative in position space. The Shannon entropies in the two spaces coincide when there is no interaction, and in that case the entropy grows with p or N. Using the mutual information as a measure of correlation, the statistical correlation is the same in both spaces, which implies that correlation in the ground state of an N-particle Moshinsky model is independent of the space chosen. The mutual information depends on the interaction between particles and on the partition into two subsystems; it increases monotonically with interaction strength, and it is larger when the particle numbers of the two subsystems are closer to each other, with the maximum occurring at equal sizes, i.e., p = N−p.
For the second topic, the Shannon entropy of the whole system and the p-particle Shannon entropy satisfy inequality (3), the entropic uncertainty principle. The whole-system Shannon entropy saturates the entropic uncertainty relation regardless of the interaction strength, while the p-particle Shannon entropy saturates it only when the interaction is zero. Furthermore, the sum of the p-particle Shannon entropies in the two spaces increases with interaction strength, while the sum for the whole system remains constant, independent of the interaction strength.
For the third topic, we show that the classical and quantum mutual information behave similarly with respect to the interaction strength and the partition of the subsystems: both increase monotonically with interaction strength, and both are larger when the two subsystems are closer in size. The rate of increase of the quantum mutual information is greater than that of the classical mutual information.

Acknowledgments

This work was supported by the Ministry of Science and Technology in Taiwan. We are thankful to the anonymous referees for their valuable suggestions.
PACS code: 03.65.-w; 03.67.-a

Author Contributions

Yew Kam Ho set the research direction and managed the overall progress of this project. Hsuan Tung Peng conceived the present theoretical frame, and carried out numerical calculations for the results presented in the present work. Both authors contributed to analyzing the numerical data and provided physical interpretation of the present results. Both authors contributed to writing the paper, have read and approved the final manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Moshinsky, M. How good is the Hartree-Fock approximation. Am. J. Phys. 1968, 36, 52–53. [Google Scholar]
  2. Laguna, H.G.; Sagar, R.P. Statistical correlations in the Moshinsky atom. Phys. Rev. A 2011, 84, 012502. [Google Scholar]
  3. Yanez, R.J.; Plastino, A.R.; Dehesa, J.S. Quantum entanglement in a soluble two-electron model atom. Eur. Phys. J. D 2010, 56, 141–150. [Google Scholar]
  4. Manzano, D.; Plastino, A.R.; Dehesa, J.S.; Koga, T. Quantum entanglement in two-electron atomic models. J. Phys. A Math. Theor. 2010, 43, 275301. [Google Scholar]
  5. Bouvrie, P.A.; Majtey, A.P.; Plastino, A.R.; Sanchez-Moreno, P.; Dehesa, J.S. Quantum entanglement in exactly soluble atomic models: the Moshinsky model with three electrons, and with two electrons in a uniform magnetic field. Eur. Phys. J. D 2012, 66. [Google Scholar] [CrossRef]
  6. Kościk, P.; Okopińska, A. Correlation effects in the Moshinsky model. Few-Body Syst 2013, 54, 1637–1640. [Google Scholar]
  7. Laguna, H.G.; Sagar, R.P. Indistinguishability and correlation in model systems. J. Phys. A Math. Theor. 2011, 44, 185302. [Google Scholar]
  8. Laguna, H.G.; Sagar, R.P. Phase-space position-momentum correlation and potentials. Entropy 2013, 15, 1516–1527. [Google Scholar]
  9. Laguna, H.G.; Sagar, R.P. Position–momentum correlations in the Moshinsky atom. J. Phys. A Math. Theor. 2012, 45, 025307. [Google Scholar]
  10. Laguna, H.G.; Sagar, R.P. Wave function symmetry, symmetry holes, interaction and statistical correlation in the Moshinsky atom. Physica A 2014, 396, 267–279. [Google Scholar]
  11. Guevara, N.L.; Sagar, R.P.; Esquivel, R.O. Shannon-information entropy sum as a correlation measure in atomic systems. Phys. Rev. A 2003, 67, 012507. [Google Scholar]
  12. Shi, Q.; Kais, S. Finite Size Scaling for the atomic Shannon-information entropy. J. Chem. Phys. 2004, 121, 5611–5617. [Google Scholar]
  13. Sen, K.D. Characteristic features of Shannon information entropy of confined atoms. J. Chem. Phys. 2005, 123, 074110. [Google Scholar]
  14. Chatzisavvas, K.C.; Moustakidis, C.C.; Panos, C.P. Information entropy, information distances, and complexity in atoms. J. Chem. Phys. 2005, 123, 174111. [Google Scholar]
  15. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar]
  16. Cover, T.M.; Thomas, J.A. Elements of Information Theory, 2nd ed.; Wiley: Hoboken, NJ, USA, 1991; pp. 12–49. [Google Scholar]
  17. Sagar, R.P.; Guevara, N.L. Mutual information and correlation measures in atomic systems. J. Chem. Phys. 2005, 123, 044108. [Google Scholar]
  18. Sagar, R.P.; Guevara, N.L. Mutual information and electron correlation in momentum space. J. Chem. Phys. 2006, 124, 134101. [Google Scholar]
  19. Sagar, R.P.; Laguna, H.G.; Guevara, N.L. Conditional entropies and position-momentum correlations in atomic systems. Mol. Phys. 2009, 107, 2071–2080. [Google Scholar]
  20. Bialynicki-Birula, I.; Mycielski, J. Uncertainty relations for information entropy in wave mechanics. Commun. Math. Phys. 1975, 44, 129–132. [Google Scholar]
  21. Guevara, N.L.; Sagar, R.P.; Esquivel, R.O. Information uncertainty-type inequalities in atomic systems. J. Chem. Phys. 2003, 119, 7030–7036. [Google Scholar]
  22. Lin, C.H.; Ho, Y.K. Quantification of entanglement entropy in helium by the Schmidt–Slater decomposition method. Few-Body Syst. 2014, 55, 1141–1149. [Google Scholar]
  23. Lin, C.H.; Ho, Y.K. Calculation of von Neumann entropy for hydrogen and positronium negative ions. Phys. Lett. A 2014, 378, 2861–2865. [Google Scholar]
  24. Lin, C.H.; Lin, Y.C.; Ho, Y.K. Quantification of linear entropy for quantum entanglement in He, H and Ps ions using highly-correlated Hylleraas functions. Few-Body Syst. 2013, 54, 2147–2153. [Google Scholar]
  25. Lin, Y.C.; Ho, Y.K. Quantum entanglement for two electrons in the excited states of helium-like systems. Can. J. Phys. 2015. [Google Scholar] [CrossRef]
  26. Lin, Y.C.; Lin, C.Y.; Ho, Y.K. Spatial entanglement in two-electron atomic systems. Phys. Rev. A 2013, 87, 022316. [Google Scholar]
  27. Majtey, A.P.; Plastino, A.R.; Dehesa, J.S. The relationship between entanglement, energy and level degeneracy in two-electron systems. J. Phys. A Math. Theor. 2012, 45, 115309. [Google Scholar]
  28. Jaeger, G. Quantum Information—An Overview, 1st ed.; Springer: New York, NY, USA, 2006; Chapter 5; pp. 85–86. [Google Scholar]
  29. Pruski, S.; Maćkowiak, J.; Missuno, O. Reduced density matrices of a system of N coupled oscillators. III. The eigenstructure of the p-particle matrix for the ground state. Rep. Math. Phys. 1972, 3, 241–246. [Google Scholar]
Figure 1. Shannon entropy of whole system for N = 2 to 5.
Figure 2. p-particle Shannon entropy for fixed N=8, p=1 to 8.
Figure 3. Mutual information in position space for total particles number N = 8, the black line labeled as square is for partition of (p, N-p) = (1,7), and the pink line labeled as down triangle is for partition (p, N-p) = (4,4), which are the lowest and highest mutual information, respectively, in position space.
Figure 4. Shannon entropy of the whole system in momentum space, from N = 2 to 5.
Figure 5. p-particle Shannon entropy for fixed N = 8, and p = 1 to 8.
Figure 6. Mutual information in momentum space for total particles number N = 8, the black line labeled as square is for partition of (p, N-p) = (1,7), and the pink line labeled as down triangle is for partition (p, N-p) = (4,4) which are the lowest and highest mutual information, respectively, in momentum space.
Figure 7. All the four figures are the case of total number of particles N = 4, and (a), (b), (c), (d) are the cases of p = 1, p = 2, p = 3, p = 4, respectively. The red line labeled as circle is for Shannon entropy in position space, the black line labeled as square is for Shannon entropy in momentum space, and the blue line labeled as up triangle is for the sum of these two quantities.
Figure 8. The comparison of classical and quantum mutual information with total number of particles N = 8. The lower four lines are classical mutual information, and the upper four lines are quantum mutual information.
