
Applications of Statistical Methods and Machine Learning in the Space Sciences

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Astrophysics, Cosmology, and Black Holes".

Deadline for manuscript submissions: closed (10 September 2023)

Special Issue Editor

Dr. Hang Yu
The Division of Physics, Mathematics and Astronomy, California Institute of Technology, Pasadena, CA 91125, USA
Interests: gravitational wave sources; gravitational wave detectors; LIGO; general relativity; close binary stars; trinary stars; neutron stars; white dwarfs; exoplanets; exoplanet tides; hot Jupiters; deep learning; time series analysis

Special Issue Information

Dear Colleagues,

Contemporary astronomy and cosmology have entered an era of big data, as new results are constantly delivered by missions across scales, ranging from table-top experiments to space-borne observatories. To maximize the information extracted from these data and to shed light on the underlying physical laws, we must develop novel analysis methods that utilize cutting-edge statistical and machine learning approaches.

In this Special Issue, we would like to explore the applications of statistical and machine learning methods in astronomy and the space sciences, broadly defined. We are interested both in the development of analysis techniques, including but not limited to entropy and information theory, Bayesian statistics, pattern recognition, intelligent sampling, and neural networks, and in their applications to a variety of systems and phenomena, such as gravitational-wave events, accretion-disc variability, supernova explosions, pulsating stars, eclipsing binaries, exoplanets, and many more. The goal of this Special Issue is to create a strong synergy among researchers across different fields by sharing and discussing their latest research breakthroughs in tackling the challenges of the big data era.

Dr. Hang Yu
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  •  information theory
  •  big data
  •  Bayesian statistics
  •  machine learning
  •  neural networks
  •  transient sources
  •  time domain astronomy
  •  high energy astrophysics
  •  gravitational wave astronomy
  •  exoplanet astronomy

Published Papers (1 paper)

Research

24 pages, 2093 KiB  
Article
Statistical Significance Testing for Mixed Priors: A Combined Bayesian and Frequentist Analysis
by Jakob Robnik and Uroš Seljak
Entropy 2022, 24(10), 1328; https://doi.org/10.3390/e24101328 - 21 Sep 2022
Abstract
In many hypothesis testing applications, we have mixed priors, with well-motivated informative priors for some parameters but not for others. The Bayesian methodology uses the Bayes factor and is helpful for the informative priors, as it incorporates Occam’s razor via the multiplicity or trials factor in the look-elsewhere effect. However, if the prior is not known completely, the frequentist hypothesis test via the false-positive rate is a better approach, as it is less sensitive to the prior choice. We argue that when only partial prior information is available, it is best to combine the two methodologies by using the Bayes factor as a test statistic in the frequentist analysis. We show that the standard frequentist maximum likelihood-ratio test statistic corresponds to the Bayes factor with a non-informative Jeffreys prior. We also show that mixed priors increase the statistical power in frequentist analyses over the maximum likelihood test statistic. We develop an analytic formalism that does not require expensive simulations and generalize Wilks’ theorem beyond its usual regime of validity. In specific limits, the formalism reproduces existing expressions, such as the p-value of linear models and periodograms. We apply the formalism to an example of exoplanet transits, where the multiplicity can be more than 10⁷. We show that our analytic expressions reproduce the p-values derived from numerical simulations. We offer an interpretation of our formalism based on statistical mechanics. We introduce the counting of states in a continuous parameter space, using the uncertainty volume as the quantum of a state. We show that both the p-value and the Bayes factor can be expressed as an energy versus entropy competition.
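To make the combined approach concrete, the following is a minimal sketch in Python (not the authors' implementation) of using a Bayes factor as a frequentist test statistic. It uses a Gaussian toy model with an informative prior on the mean; the prior width tau, the sample size, and the injected signal are all illustrative assumptions, and the p-value is calibrated here by brute-force null simulations, which is precisely the expense the paper's analytic formalism is designed to avoid.

```python
# Minimal sketch, not the authors' implementation: use the Bayes factor as a
# frequentist test statistic and calibrate its p-value with null simulations.
# The Gaussian toy model, prior width tau, sample size n, and injected signal
# are illustrative assumptions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

n = 50      # sample size (assumed)
tau = 1.0   # informative prior on the mean: mu ~ N(0, tau^2) (assumed)

def log_bayes_factor(x, tau=tau):
    """log BF_10 for H1: x_i ~ N(mu, 1) with mu ~ N(0, tau^2), vs H0: mu = 0.

    The sample mean is sufficient; its marginal distribution is
    N(0, tau^2 + 1/n) under H1 and N(0, 1/n) under H0.
    """
    n_obs = len(x)
    xbar = x.mean()
    return (norm.logpdf(xbar, 0.0, np.sqrt(tau**2 + 1.0 / n_obs))
            - norm.logpdf(xbar, 0.0, np.sqrt(1.0 / n_obs)))

# "Observed" data with a small injected signal (mu = 0.3, assumed).
x_obs = rng.normal(0.3, 1.0, size=n)
t_obs = log_bayes_factor(x_obs)

# Frequentist calibration: distribution of the test statistic under H0.
n_sims = 20_000
t_null = np.array([log_bayes_factor(rng.normal(0.0, 1.0, size=n))
                   for _ in range(n_sims)])
p_value = (1 + np.sum(t_null >= t_obs)) / (1 + n_sims)  # add-one avoids p = 0

print(f"log Bayes factor (observed): {t_obs:.2f}")
print(f"simulation-based p-value:    {p_value:.4f}")
```

In this single-parameter conjugate example the log Bayes factor is a monotonically increasing function of the squared sample mean, so it orders the data exactly as the likelihood-ratio statistic does and yields the same p-value; the gains described in the abstract arise in mixed-prior settings, where informative priors on some parameters change that ordering.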