Robust Inference of Dynamic Covariance Using Wishart Processes and Sequential Monte Carlo
Abstract
1. Introduction
2. Bayesian Inference of Wishart Processes
2.1. Wishart Processes
2.2. Bayesian Inference
2.2.1. Markov Chain Monte Carlo and Variational Inference
2.2.2. A Sequential Monte Carlo Sampler for Wishart Processes
- In the weighting step, each particle is assigned a weight based on how well it fits the data under the current tempered likelihood.
- Next, the particles are resampled with replacement in proportion to their weights. This means that particles with small weights are discarded, and particles with large weights are duplicated.
- Lastly, the particles are mutated by performing a number of Gibbs cycles (see Equation (8)) for each particle, using the tempered distribution as the likelihood. This avoids the risk of all particles receiving identical parameters after a few iterations.
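The weight–resample–mutate cycle above can be sketched in a few lines of NumPy. This is an illustrative stand-in under stated assumptions, not the authors' implementation: a random-walk Metropolis kernel replaces the paper's Gibbs cycles in the mutation step, a linear tempering schedule is assumed, and all function and parameter names are hypothetical.

```python
import numpy as np

def tempered_smc(log_prior, log_likelihood, sample_prior,
                 n_particles=500, n_steps=20, n_mutations=5,
                 step_size=0.5, seed=0):
    """Likelihood-tempered SMC sampler (weight / resample / mutate).

    Moves particles from the prior (beta = 0) towards the full
    posterior (beta = 1) along a linear tempering schedule.
    """
    rng = np.random.default_rng(seed)
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    particles = np.array([sample_prior(rng) for _ in range(n_particles)])
    loglik = np.array([log_likelihood(p) for p in particles])

    for beta_prev, beta in zip(betas[:-1], betas[1:]):
        # 1. Weighting: incremental importance weights from the change
        #    in the tempering exponent.
        log_w = (beta - beta_prev) * loglik
        w = np.exp(log_w - log_w.max())   # subtract max for stability
        w /= w.sum()

        # 2. Resampling with replacement, proportional to the weights:
        #    low-weight particles die, high-weight ones are duplicated.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles, loglik = particles[idx], loglik[idx]

        # 3. Mutation: a few MCMC steps targeting the tempered
        #    distribution p(theta) * p(y | theta)^beta, which breaks
        #    the ties created by resampling.  (The paper uses Gibbs
        #    cycles here; random-walk Metropolis is a stand-in.)
        for _ in range(n_mutations):
            prop = particles + step_size * rng.standard_normal(n_particles)
            prop_ll = np.array([log_likelihood(p) for p in prop])
            log_a = (beta * (prop_ll - loglik)
                     + np.array([log_prior(p) for p in prop])
                     - np.array([log_prior(p) for p in particles]))
            accept = np.log(rng.uniform(size=n_particles)) < log_a
            particles[accept] = prop[accept]
            loglik[accept] = prop_ll[accept]

    return particles
```

On a toy problem (inferring the mean of Gaussian data under a wide Gaussian prior), the returned particle cloud concentrates around the posterior mean, illustrating how tempering lets the sampler handle likelihoods that would overwhelm plain importance sampling in one step.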
3. Simulation Studies
3.1. Implementation Details
3.2. Synthetic Data
3.3. Performance Metrics
3.4. Simulation Study 1: Learning the Model Parameters
3.5. Simulation Study 2: State Switching and Out-of-Sample Prediction
4. Empirical Application: Dynamic Correlations in Depression
4.1. Dataset, Preprocessing, and Model Choices
4.2. Hypothesis Test for Dynamic Covariance
4.3. Modelling of Dynamic Correlations between Mental States
4.3.1. Dynamics between Mental States over Time
4.3.2. Dynamics between Mental States as a Function of Venlafaxine Dosage
4.3.3. Differences between Dynamics in Mental State Correlations over Time and Dosage
5. Discussion
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
Appendix A. Markov Chain Monte Carlo for Wishart Processes
Appendix B. Variational Wishart Processes
Appendix C. Covariance Process Estimates of the Second Simulation Study
Appendix D. Convergence of the Covariance Process Samples
Appendix E. Multivariate Generalised Autoregressive Conditional Heteroscedastic Models
References
| Method | | | | | Runtime (Minutes) |
|---|---|---|---|---|---|
| MCMC | 0.45 ± 0.36 | 0.00 ± 0.00 | 0.28 ± 0.26 | 0.59 ± 0.51 | 308.29 ± 7.13 |
| VI | 0.44 ± 0.29 | 0.10 ± 0.05 | 0.55 ± 0.36 | 0.51 ± 0.37 | 54.84 ± 0.82 |
| SMC | 0.45 ± 0.35 | 0.00 ± 0.00 | 0.31 ± 0.28 | 0.61 ± 0.52 | 105.40 ± 2.08 |
| Method | Training | | | | Out-of-Sample LL | Out-of-Sample KL | Runtime (Minutes) |
|---|---|---|---|---|---|---|---|
| Periodic function | | | | | | | |
| MCMC | 0.08 ± 0.02 | 0.13 ± 0.01 | 0.19 ± 0.02 | 0.34 ± 0.07 | −4.10 ± 0.07 | 0.52 ± 0.07 | 280.88 ± 4.13 |
| VI | 0.06 ± 0.02 | 0.08 ± 0.02 | 0.14 ± 0.09 | 0.22 ± 0.15 | −4.01 ± 0.16 | 0.47 ± 0.20 | 30.72 ± 7.19 |
| SMC | 0.04 ± 0.01 | 0.06 ± 0.01 | 0.11 ± 0.08 | 0.13 ± 0.09 | −3.94 ± 0.21 | 0.48 ± 0.40 | 152.49 ± 2.11 |
| LP function | | | | | | | |
| MCMC | 0.05 ± 0.01 | 0.08 ± 0.01 | 0.12 ± 0.04 | 0.24 ± 0.08 | −3.98 ± 0.13 | 0.42 ± 0.10 | 291.20 ± 3.97 |
| VI | 0.05 ± 0.01 | 0.08 ± 0.02 | 0.10 ± 0.04 | 0.17 ± 0.08 | −3.97 ± 0.15 | 0.43 ± 0.14 | 31.62 ± 6.43 |
| SMC | 0.04 ± 0.01 | 0.06 ± 0.01 | 0.09 ± 0.02 | 0.19 ± 0.04 | −3.93 ± 0.08 | 0.36 ± 0.06 | 154.98 ± 2.97 |
| DCC-GARCH | Wishart Process (MCMC) | Wishart Process (VI) | Wishart Process (SMC) |
|---|---|---|---|
| −6.19 ± 2.75 | −5.29 ± 2.85 | −7.29 ± 4.21 | −5.82 ± 3.39 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Huijsdens, H.; Leeftink, D.; Geerligs, L.; Hinne, M. Robust Inference of Dynamic Covariance Using Wishart Processes and Sequential Monte Carlo. Entropy 2024, 26, 695. https://doi.org/10.3390/e26080695