Exploring the Intersection of Statistical Estimation Theory and Machine Learning

A special issue of Mathematics (ISSN 2227-7390). This special issue belongs to the section "Probability and Statistics".

Deadline for manuscript submissions: 31 January 2025

Special Issue Editor


Dr. Jason M. Hite
Guest Editor
Oak Ridge National Laboratory, Oak Ridge, TN 37830, USA
Interests: statistics; modeling; radiation detection

Special Issue Information

Dear Colleagues,

Machine learning methods in their various incarnations have become ubiquitous in nearly every branch of the mathematical sciences, engineering, and even popular culture. These methods are deeply tied to the theory of random variables, and many techniques from statistical estimation theory have informed the growth and development of machine learning. Concepts from fundamental mathematical statistics, Bayesian estimation theory, and information geometry have obvious and very intimate ties to the field of machine learning. The reverse is also true: the development of tools such as Bayesian networks, deep Gaussian processes, and probabilistic programming has made many of the rich results of statistical estimation theory feasibly applicable to an ever-growing range of practical tasks. In this issue, we solicit articles exploring the relationship between the two fields, with a special emphasis on techniques that lend rigor, insight, and depth to the often semi-empirical field of machine learning.

We also welcome papers with an applied focus that combine statistical estimation techniques with ML to solve new and interesting problems in domain-specific applications.

Dr. Jason M. Hite
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • estimation theory
  • machine learning
  • statistics

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (1 paper)


Research

18 pages, 1525 KiB  
Article
Contrastive Machine Learning with Gamma Spectroscopy Data Augmentations for Detecting Shielded Radiological Material Transfers
by Jordan R. Stomps, Paul P. H. Wilson and Kenneth J. Dayman
Mathematics 2024, 12(16), 2518; https://doi.org/10.3390/math12162518 - 15 Aug 2024
Abstract
Data analysis techniques can be powerful tools for rapidly analyzing data and extracting information that can be used in a latent space for categorizing observations between classes of data. Machine learning models that exploit learned data relationships can address a variety of nuclear nonproliferation challenges, such as the detection and tracking of shielded radiological material transfers. The high resource cost of manually labeling radiation spectra is a hindrance to the rapid analysis of data collected from persistent monitoring and to the adoption of supervised machine learning methods that require large volumes of curated training data. Instead, contrastive self-supervised learning on unlabeled spectra can enhance models that are built on limited labeled radiation datasets. This work demonstrates that contrastive machine learning is an effective technique for leveraging unlabeled data to detect and characterize nuclear material transfers, demonstrated on radiation measurements collected at an Oak Ridge National Laboratory testbed, where sodium iodide detectors measure gamma radiation emitted by material transfers between the High Flux Isotope Reactor and the Radiochemical Engineering Development Center. Label-invariant data augmentations tailored for gamma radiation detection physics are applied to unlabeled spectra to contrastively train an encoder, learning a complex, embedded state space with self-supervision. A linear classifier is then trained on a limited set of labeled data to distinguish transfer spectra between byproducts and tracked nuclear material, using representations from the contrastively trained encoder. The hyperparameter-optimized model achieves a balanced accuracy score of 80.30%. Any given model (that is, a trained encoder and classifier) shows preferential treatment for specific subclasses of transfer types. Regardless of the classifier complexity, a supervised classifier using contrastively trained representations achieves higher accuracy than one trained directly on spectra when trained and tested on limited labeled data.
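The augmentation idea in the abstract can be illustrated with a small, hypothetical sketch (this is not the authors' code, and the 128-channel synthetic spectra and specific augmentations are illustrative assumptions): two physics-motivated, label-invariant perturbations of a gamma spectrum, Poisson count resampling and a small gain shift, produce views that remain closer to each other than to views of a different spectrum, which is precisely the signal a contrastive loss exploits when training an encoder.

```python
import numpy as np

rng = np.random.default_rng(0)

def poisson_resample(spectrum, rng):
    """Label-invariant augmentation: redraw channel counts from Poisson
    statistics, mimicking counting noise in a repeated measurement."""
    return rng.poisson(spectrum).astype(float)

def gain_shift(spectrum, rng, max_shift=2):
    """Label-invariant augmentation: shift channels slightly to mimic
    detector gain drift."""
    shift = int(rng.integers(-max_shift, max_shift + 1))
    return np.roll(spectrum, shift)

def cosine_sim(a, b):
    """Cosine similarity between two spectra."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Two toy 128-channel "spectra" with photopeaks at different channels.
channels = np.arange(128)
spec_a = 50.0 * np.exp(-0.5 * ((channels - 40) / 3.0) ** 2) + 5.0
spec_b = 50.0 * np.exp(-0.5 * ((channels - 90) / 3.0) ** 2) + 5.0

# Positive pair: two augmented views of the same spectrum.
view1 = gain_shift(poisson_resample(spec_a, rng), rng)
view2 = gain_shift(poisson_resample(spec_a, rng), rng)
# Negative example: an augmented view of a different spectrum.
view3 = gain_shift(poisson_resample(spec_b, rng), rng)

pos = cosine_sim(view1, view2)
neg = cosine_sim(view1, view3)
print(f"positive-pair similarity: {pos:.3f}, negative-pair similarity: {neg:.3f}")
```

In a full contrastive setup, an encoder would be trained so that embeddings of positive pairs are pulled together and negatives pushed apart; the raw-spectrum similarity gap shown here is what makes that objective learnable from unlabeled data.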
