Reprint

Information Theory and Machine Learning

Edited by
September 2022
254 pages
  • ISBN 978-3-0365-5307-8 (Hardback)
  • ISBN 978-3-0365-5308-5 (PDF)

This is a Reprint of the Special Issue Information Theory and Machine Learning that was published in

Summary

The recent successes of machine learning, especially systems based on deep neural networks, have encouraged further research and raised a new set of challenges in understanding and designing complex machine learning algorithms. New applications require learning algorithms that are distributed, produce transferable results, use computational resources efficiently, converge quickly in online settings, offer performance guarantees, satisfy fairness or privacy constraints, incorporate domain knowledge on model structure, and so on. A new wave of developments in statistical learning theory and information theory has set out to address these challenges. This Special Issue, "Information Theory and Machine Learning", collects recent results in this direction, reflecting a diverse spectrum of visions and efforts to extend conventional theories and to develop analysis tools for these complex machine learning systems.

Format
  • Hardback
License and Copyright
© 2022 by the authors; CC BY-NC-ND license
Keywords
supervised classification; independent and non-identically distributed features; analytical error probability; empirical risk; generalization error; K-means clustering; model compression; population risk; rate distortion theory; vector quantization; overfitting; information criteria; entropy; model-based clustering; merging mixture components; component overlap; interpretability; time series prediction; finite state machines; hidden Markov models; recurrent neural networks; reservoir computers; long short-term memory; deep neural network; information theory; local information geometry; feature extraction; spiking neural network; meta-learning; information theoretic learning; minimum error entropy; artificial general intelligence; closed-loop transcription; linear discriminative representation; rate reduction; minimax game; fairness; HGR maximal correlation; independence criterion; separation criterion; pattern dictionary; atypicality; Lempel–Ziv algorithm; lossless compression; anomaly detection; information-theoretic bounds; distributed and federated learning
